I'm using an SKVideoNode as the material for a sphere to display 360° video, but the video only renders on the x/y-positive part of the sphere. I'm streaming the video from a URL; it's an .m3u8 (HLS) stream.
For reference see: SKVideoNode as texture for SCNSphere
Multiple people seem to be having the same issue as me.
import UIKit
import SceneKit
import SpriteKit
import AVFoundation

// Assumed to be instance properties of the view controller:
// var sphereGeometry: SCNSphere!
// var sphereNode: SCNNode!
// var constraint: SCNLookAtConstraint!

func makeSphere() {
    // Scene view filling the screen
    let sceneView = SCNView(frame: self.view.frame)
    self.view.addSubview(sceneView)
    let screenSize: CGRect = UIScreen.mainScreen().bounds
    let screenWidth = screenSize.width
    let screenHeight = screenSize.height
    sceneView.frame.size.height = screenHeight
    sceneView.frame.size.width = screenWidth
    sceneView.center.x = screenWidth * 0.5

    let scene = SCNScene()
    sceneView.scene = scene

    // Sphere that the video is projected onto
    sphereGeometry = SCNSphere(radius: 5)
    sphereNode = SCNNode(geometry: sphereGeometry)
    sphereNode.position = SCNVector3(x: 0, y: 0, z: 0)
    sphereGeometry.segmentCount = 55

    // Camera at the centre of the sphere, constrained to look at it
    constraint = SCNLookAtConstraint(target: sphereNode)
    let camera = SCNCamera()
    let cameraNode = SCNNode()
    cameraNode.camera = camera
    cameraNode.position = SCNVector3(x: 0, y: 0, z: 0)

    // Omni light (note: lightNode is never added to the scene)
    let light = SCNLight()
    light.type = SCNLightTypeOmni
    let lightNode = SCNNode()
    lightNode.light = light
    lightNode.position = SCNVector3(x: 0, y: 0, z: 0)

    cameraNode.constraints = [constraint]
    scene.rootNode.addChildNode(cameraNode)
    scene.rootNode.addChildNode(sphereNode)

    // Video material: an SKScene containing an SKVideoNode backed by an AVPlayer
    let videoMaterial = SCNMaterial()
    let path = "http://video-url.m3u8"
    let url = NSURL(string: path)
    let asset = AVURLAsset(URL: url!, options: nil)
    let playerItem = AVPlayerItem(asset: asset)
    let player = AVPlayer(playerItem: playerItem)
    let videoNode = SKVideoNode(AVPlayer: player)
    let size = CGFloat(100.0)
    let spriteScene = SKScene(size: CGSizeMake(size, size))
    videoNode.size.width = size
    videoNode.size.height = size
    spriteScene.addChild(videoNode)

    videoMaterial.diffuse.contents = spriteScene
    videoMaterial.specular.contents = UIColor.redColor()
    videoMaterial.shininess = 1.0
    videoMaterial.doubleSided = true
    sphereGeometry.materials = [videoMaterial]
    videoNode.play()
}
You can use the code above to reproduce my problem. If it makes a difference: when I display an image instead of the video, it works just fine.
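For comparison, this is roughly the image case that renders correctly for me ("pano" is just a placeholder asset name for an equirectangular image):

    // Works fine: a still equirectangular image as the sphere's material.
    let imageMaterial = SCNMaterial()
    imageMaterial.diffuse.contents = UIImage(named: "pano")
    imageMaterial.doubleSided = true
    sphereGeometry.materials = [imageMaterial]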
EDIT
Using videoMaterial.diffuse.contentsTransform = SCNMatrix4MakeScale(0, -1, 1)
together with videoMaterial.diffuse.wrapT = SCNWrapMode.Repeat
causes the video to be projected onto the lower half of the sphere, but instead of appearing correctly all I can see is stretched rings; changing the wrap mode makes it so that the screen only shows one colour.
Using videoMaterial.diffuse.contentsTransform = SCNMatrix4MakeScale(1, 0, 1)
together with videoMaterial.diffuse.wrapT = SCNWrapMode.Repeat
renders the video on the left side of the sphere, but stretches the texture/video.
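Written out in full (with the property name corrected to contentsTransform), the two attempts look like this:

    // Attempt 1: video ends up on the lower half of the sphere as stretched rings.
    videoMaterial.diffuse.contentsTransform = SCNMatrix4MakeScale(0, -1, 1)
    videoMaterial.diffuse.wrapT = SCNWrapMode.Repeat

    // Attempt 2: video renders on the left side of the sphere, but stretched.
    videoMaterial.diffuse.contentsTransform = SCNMatrix4MakeScale(1, 0, 1)
    videoMaterial.diffuse.wrapT = SCNWrapMode.Repeat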
It's hard to say specifically what's going wrong, but I have a working solution here: https://github.com/alfiehanssen/ThreeSixtyPlayer
It uses an SKVideoNode for both monoscopic and stereoscopic spherical 360 video.
I do notice that you're not setting the position or anchorPoint of your SKScene, and this is something I believe you must do in order to get the SKVideoNode (material) positioned properly.
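As a rough sketch of what that could look like inside your makeSphere() (exact values depend on your video and scene size):

    // Centre the video node in the sprite scene so the whole texture is used.
    // Alternatively, set spriteScene.anchorPoint = CGPoint(x: 0.5, y: 0.5)
    // and leave the node at the default position.
    videoNode.position = CGPoint(x: spriteScene.size.width / 2,
                                 y: spriteScene.size.height / 2)
    videoNode.size = spriteScene.size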