Server for AR and VR features.
The AR/VR server is the heart of our Augmented and Virtual Reality solution and handles all the processing.
$DOCS_URL/tutorials/xr/index.html
Registers a new [XRFaceTracker] that tracks the blend shapes of a face.
Registers a new [XRHandTracker] that tracks the joints of a hand.
Registers an [XRInterface] object.
Registers a new [XRPositionalTracker] that tracks a spatial location in real space.
This is an important function to understand correctly. AR and VR platforms all handle positioning slightly differently.
For platforms that do not offer spatial tracking, our origin point (0, 0, 0) is the location of our HMD, but you have little control over the direction the player is facing in the real world.
For platforms that do offer spatial tracking, our origin point depends very much on the system. For OpenVR, our origin point is usually the center of the tracking space, on the ground. For other platforms, it's often the location of the tracking camera.
This method allows you to center your tracker on the location of the HMD. It will take the current location of the HMD and use that to adjust all your tracking data; in essence, realigning the real world to your player's current position in the game world.
For this method to produce usable results, tracking information must be available. This often takes a few frames after starting your game.
You should call this method after a few seconds have passed. For example, when the user requests a realignment of the display by holding a designated button on a controller for a short period of time, or when implementing a teleport mechanism.
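A minimal sketch of recentering when the player holds a designated input action (the action name "recenter" and the chosen rotation mode are assumptions, not part of this API):
[codeblock]
var hold_time := 0.0

func _process(delta):
    # Recenter when the player holds the hypothetical "recenter" action for one second.
    if Input.is_action_pressed("recenter"):
        hold_time += delta
        if hold_time > 1.0:
            # Reset the heading but keep the device tilt and the height offset.
            XRServer.center_on_hmd(XRServer.RESET_BUT_KEEP_TILT, true)
            hold_time = 0.0
    else:
        hold_time = 0.0
[/codeblock]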
Clears the reference frame that was set by previous calls to [method center_on_hmd].
Finds an interface by its [param name]. For example, if your project uses capabilities of an AR/VR platform, you can find the interface for that platform by name and initialize it.
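For example, a sketch of finding and initializing the OpenXR interface by name (assuming OpenXR is enabled in the project):
[codeblock]
func _ready():
    var interface = XRServer.find_interface("OpenXR")
    if interface and interface.initialize():
        get_viewport().use_xr = true
[/codeblock]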
Returns the [XRFaceTracker] with the given tracker name.
Returns a dictionary of the registered face trackers. Each element of the dictionary is a tracker name mapping to the [XRFaceTracker] instance.
Returns the [XRHandTracker] with the given tracker name.
Returns a dictionary of the registered hand trackers. Each element of the dictionary is a tracker name mapping to the [XRHandTracker] instance.
Returns the primary interface's transformation.
Returns the interface registered at the given [param idx] index in the list of interfaces.
Returns the number of interfaces currently registered with the AR/VR server. If your project supports multiple AR/VR platforms, you can look through the available interfaces, and either present the user with a selection or simply try to initialize each interface and use the first one that returns [code]true[/code].
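The "try each interface" approach described above might be sketched as follows:
[codeblock]
func _ready():
    for i in XRServer.get_interface_count():
        var interface = XRServer.get_interface(i)
        # Use the first interface that initializes successfully.
        if interface.initialize():
            get_viewport().use_xr = true
            break
[/codeblock]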
Returns a list of available interfaces, including the ID and name of each interface.
Returns the reference frame transform. Mostly used internally and exposed for GDExtension build interfaces.
Returns the positional tracker with the given [param tracker_name].
Returns a dictionary of trackers for [param tracker_types].
Removes a registered [XRFaceTracker].
Removes a registered [XRHandTracker].
Removes this [param interface].
Removes this positional [param tracker].
The primary [XRInterface] currently bound to the [XRServer].
The current origin of our tracking space in the virtual world. This is used by the renderer to properly position the camera with new tracking data.
[b]Note:[/b] This property is managed by the current [XROrigin3D] node. It is exposed for access from GDExtensions.
The scale of the game world compared to the real world. By default, most AR/VR platforms assume that 1 game unit corresponds to 1 real world meter.
Emitted when a new face tracker is added.
Emitted when a face tracker is removed.
Emitted when an existing face tracker is updated.
Emitted when a new hand tracker is added.
Emitted when a hand tracker is removed.
Emitted when an existing hand tracker is updated.
Emitted when a new interface has been added.
Emitted when an interface is removed.
Emitted when a new tracker has been added. If you don't use a fixed number of controllers or if you're using [XRAnchor3D]s for an AR solution, it is important to react to this signal to add the appropriate [XRController3D] or [XRAnchor3D] nodes related to this new tracker.
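A sketch of reacting to this signal by spawning a controller node (the [code]$XROrigin3D[/code] node path is an assumption about your scene setup):
[codeblock]
func _ready():
    XRServer.tracker_added.connect(_on_tracker_added)

func _on_tracker_added(tracker_name: StringName, type: int):
    if type == XRServer.TRACKER_CONTROLLER:
        # Add an XRController3D under the XROrigin3D for this new tracker.
        var controller := XRController3D.new()
        controller.tracker = tracker_name
        $XROrigin3D.add_child(controller)
[/codeblock]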
Emitted when a tracker is removed. You should remove any [XRController3D] or [XRAnchor3D] points if applicable. This is not mandatory; the nodes simply become inactive and will be made active again when a new tracker becomes available (i.e. a new controller is switched on that takes the place of the previous one).
Emitted when an existing tracker has been updated. This can happen if the user switches controllers.
The tracker tracks the location of the player's head. This is usually a location centered between the player's eyes. Note that for handheld AR devices this can be the current location of the device.
The tracker tracks the location of a controller.
The tracker tracks the location of a base station.
The tracker tracks the location and size of an AR anchor.
Used internally to filter trackers of any known type.
Used internally if we haven't set the tracker type yet.
Used internally to select all trackers.
Fully resets the orientation of the HMD. Regardless of which direction the user is looking in the real world, the user will look dead ahead in the virtual world.
Resets the orientation but keeps the tilt of the device. So if we're looking down, we keep looking down, but the heading will be reset.
Does not reset the orientation of the HMD; only the position of the player is centered.