I have a Rajawali Cardboard renderer (a texture painted on a sphere, similar to the example), and it works totally fine with sensor-based navigation. I'm also receiving touch input and accumulating the total horizontal and vertical angular movement in two variables, x and y. In monocular view, the idea is to allow both forms of navigation.
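For context, the accumulation looks roughly like this (a simplified sketch in the hosting Activity, not my exact code; TOUCH_SENSITIVITY is just a placeholder scale factor that turns swipe pixels into radians):

import android.view.MotionEvent;

private static final double TOUCH_SENSITIVITY = 0.005; // placeholder pixels-to-radians factor
private double x; // accumulated horizontal angle, radians
private double y; // accumulated vertical angle, radians
private float mLastTouchX, mLastTouchY;

@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
            mLastTouchX = event.getX();
            mLastTouchY = event.getY();
            return true;
        case MotionEvent.ACTION_MOVE:
            // Accumulate the swipe deltas since the last event.
            x += (event.getX() - mLastTouchX) * TOUCH_SENSITIVITY;
            y += (event.getY() - mLastTouchY) * TOUCH_SENSITIVITY;
            mLastTouchX = event.getX();
            mLastTouchY = event.getY();
            return true;
        default:
            return super.onTouchEvent(event);
    }
}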
If, before setting the camera orientation in onRenderEye, I do:
mSphere.rotateAround(Vector3.getAxisVector(Vector3.Axis.Y), Math.toDegrees(x), false); // rotate about the Y axis by the accumulated horizontal swipe angle
mSphere.rotateAround(Vector3.getAxisVector(Vector3.Axis.X), Math.toDegrees(y), true);  // rotate about the X axis by the accumulated vertical swipe angle
Then I get the desired effect. However, if the device has significantly changed its physical orientation before I start swiping (e.g. after spinning around in a chair), these manipulations cause the view to rotate in a different plane than I would expect. What would be the most general way to combine x and y with Cardboard's supplied eye view, or otherwise get the two forms of navigation to work together?
It seems like I need to incrementally apply a rotation, but I can't figure out what the standard way to do this would be.
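To make the question concrete, the kind of composition I have in mind looks roughly like the sketch below. This is only an illustration, not working code: sensorOrientation stands for whatever Quaternion the renderer derives from the Cardboard eye view matrix, and x/y are my accumulated swipe angles in radians.

import org.rajawali3d.math.Quaternion;
import org.rajawali3d.math.vector.Vector3;

// Hypothetical helper: compose the sensor-derived orientation with the swipe
// rotations, instead of rotating the sphere itself.
private Quaternion combineWithSwipe(Quaternion sensorOrientation, double x, double y) {
    // Angles in degrees, matching the Math.toDegrees(...) usage above.
    Quaternion swipeYaw   = new Quaternion(Vector3.getAxisVector(Vector3.Axis.Y), Math.toDegrees(x));
    Quaternion swipePitch = new Quaternion(Vector3.getAxisVector(Vector3.Axis.X), Math.toDegrees(y));
    // Copy first so the sensor quaternion isn't mutated in place.
    return new Quaternion(sensorOrientation).multiply(swipeYaw).multiply(swipePitch);
}

The camera would then get getCurrentCamera().setOrientation(combineWithSwipe(sensorOrientation, x, y)) in onRenderEye instead of the sphere rotation, but I don't know whether this multiplication order, or this approach at all, is the standard way to do it.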