After reading this documentation, I still don't understand whether there is an easy way to move a ModelEntity along the Y-axis. I expected one to exist because in AR Quick Look this functionality works together with the .scale
and .rotate
gestures, which are also listed in Apple's documentation.
If there is an easy or similar way to install these gestures, please let me know.
In RealityKit 2.0, unlike AR Quick Look, only a single-finger drag gesture is implemented for moving a model (a two-finger gesture for vertical drag isn't implemented at the moment). With a single-finger gesture you can move an entity along its anchoring plane – as a rule, the XZ plane – so there is no Y-axis drag.
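For context, here is a minimal sketch of how RealityKit's built-in gestures are installed on an entity (the function name `setupGestures` is my own; the rest is the standard `ARView.installGestures(_:for:)` API). The `.translation` gesture this installs is the single-finger drag described above, constrained to the anchoring plane:

```swift
import RealityKit
import UIKit

func setupGestures(arView: ARView, model: ModelEntity) {
    // An entity must have collision shapes to receive gestures.
    model.generateCollisionShapes(recursive: true)
    // Single-finger drag (.translation), two-finger rotate, pinch-to-scale.
    arView.installGestures([.translation, .rotation, .scale], for: model)
}
```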
Despite this, you have the option to additionally implement a 2D
UIGestureRecognizer
. (In the same way, you can implement a two-finger pan gesture – also known as a levitation gesture – like in AR Quick Look apps.)
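A hedged sketch of such a two-finger "levitation" gesture follows. The class name, the sensitivity factor, and the direct mutation of `position.y` are my own assumptions, not Apple API; only `UIPanGestureRecognizer` and the `ARView`/`ModelEntity` types come from the frameworks:

```swift
import RealityKit
import UIKit

// Hypothetical controller: drags the model along the world Y-axis
// with a two-finger vertical pan.
class LevitationController: NSObject {
    let arView: ARView
    let model: ModelEntity

    init(arView: ARView, model: ModelEntity) {
        self.arView = arView
        self.model = model
        super.init()
        let pan = UIPanGestureRecognizer(target: self,
                                         action: #selector(handlePan(_:)))
        pan.minimumNumberOfTouches = 2
        pan.maximumNumberOfTouches = 2
        arView.addGestureRecognizer(pan)
    }

    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        // Convert screen points to meters; 0.001 is an arbitrary
        // sensitivity factor you would tune for your scene.
        let translation = gesture.translation(in: arView)
        model.position.y += Float(-translation.y) * 0.001
        // Reset so each callback delivers an incremental delta.
        gesture.setTranslation(.zero, in: arView)
    }
}
```

Keep a strong reference to the controller (gesture recognizers do not retain their targets), and note that moving `position.y` works because the entity's parent anchor keeps the model's XZ placement while you offset it vertically.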
P. S.
This post also shows how raycasting works in conjunction with RealityKit gestures.