UnityEngine.XRModule The XR module contains the VR and AR related platform support functionality. A tracked bone on the device at an XR.XRNode in the XR input subsystem. Get the child bones of this bone. A list of bones that will be filled out with the child bones of this bone. true if the bone can be queried for child bones; otherwise false. Gets the parent of this bone. Bone struct that receives the parent bone of this bone. true if the parent bone was retrieved, false otherwise. Gets the world position of the bone. Vector3 to receive the position of the bone in Unity world space. true if the position was retrieved, false otherwise. Gets the world rotation of the bone. Quaternion to receive the rotation of the bone in Unity world space. true if the rotation was retrieved, false otherwise. Defines static variables that are used to retrieve input features from XR.InputDevice.TryGetFeatureValue. Value representing the current battery life of this device. The acceleration of the center eye on this device. The angular acceleration of the center eye on this device, formatted as euler angles. The angular velocity of the center eye on this device, formatted as euler angles. The position of the center eye on this device. The rotation of the center eye on this device. The velocity of the center eye on this device. The acceleration of the color camera on this device. The angular acceleration of the color camera on this device, formatted as euler angles. The angular velocity of the color camera on this device, formatted as euler angles. The position of the color camera on this device. The rotation of the color camera on this device. The velocity of the color camera on this device. The acceleration of the device. The angular acceleration of this device, formatted as euler angles. The angular velocity of this device, formatted as euler angles. The position of the device. The rotation of this device. The velocity of the device. A non-handed 2D axis. 
An Eyes struct containing eye tracking data collected from the device. Represents the user's grip on the controller. A binary measure of whether the device is being gripped. Value representing the hand data for this device. Represents the grip pressure or angle of the index finger. Represents a touch of the trigger or index finger. Indicates to the developer whether the device is currently being tracked. The acceleration of the left eye on this device. The angular acceleration of the left eye on this device, formatted as euler angles. The angular velocity of the left eye on this device, formatted as euler angles. The position of the left eye on this device. The rotation of the left eye on this device. The velocity of the left eye on this device. Represents a menu button, used to pause, go back, or otherwise exit gameplay. Represents the grip pressure or angle of the middle finger. Represents the grip pressure or angle of the pinky finger. The primary touchpad or joystick on a device. Represents the primary 2D axis being clicked or otherwise depressed. Represents the primary 2D axis being touched. The primary face button being pressed on a device, or the sole button if only one is available. The primary face button being touched on a device. The acceleration of the right eye on this device. The angular acceleration of the right eye on this device, formatted as euler angles. The angular velocity of the right eye on this device, formatted as euler angles. The position of the right eye on this device. The rotation of the right eye on this device. The velocity of the right eye on this device. Represents the grip pressure or angle of the ring finger. A secondary touchpad or joystick on a device. Represents the secondary 2D axis being clicked or otherwise depressed. Represents the secondary 2D axis being touched. The secondary face button being pressed on a device. The secondary face button being touched on a device. Represents a thumbrest or light thumb touch. 
Represents the thumb pressing any input or feature. Represents the values being tracked for this device. A trigger-like control, pressed with the index finger. A binary measure of whether the index finger is activating the trigger. Use this property to test whether the user is currently wearing and/or interacting with the XR device. The exact behavior of this property varies with each type of device: some devices have a sensor specifically to detect user proximity; however, you can reasonably infer that a user is present with the device when the property is UserPresenceState.Present. Contains eye tracking data from the device at an XR.XRNode in the XR input subsystem. Gets the point that represents the convergence of the line of sight for both eyes. A Vector3 struct that is filled in with the fixation position. true if eyes can be queried for the fixation point; otherwise false. Gets a value that represents how far the left eye is open. A float value, with a range of 0.0 to 1.0, that indicates how open the left eye is. A value of 0.0 indicates that the eye is fully closed, while a value of 1.0 indicates that the eye is fully open. true if eyes can be queried for the amount that the left eye is open; otherwise false. Gets the Vector3 that describes the position of the left eye. A Vector3 struct to receive the left eye position. true if eyes can be queried for the left eye position; otherwise false. Gets the Quaternion that describes the rotation of the left eye. A Quaternion struct to receive the left eye rotation. true if eyes can be queried for the left eye rotation; otherwise false. Gets a value that represents how far the right eye is open. A float value, with a range of 0.0 to 1.0, that indicates how open the right eye is. A value of 0.0 indicates that the eye is fully closed, while a value of 1.0 indicates that the eye is fully open. true if eyes can be queried for the amount that the right eye is open; otherwise false. 
Gets the Vector3 that describes the position of the right eye. A Vector3 struct to receive the right eye position. true if eyes can be queried for the right eye position; otherwise false. Gets the Quaternion that describes the rotation of the right eye. A Quaternion struct to receive the right eye rotation. true if eyes can be queried for the right eye rotation; otherwise false. A tracked hand on the device at an XR.XRNode in the XR input subsystem. Gets a list of the finger bones for a finger on this hand. HandFinger enum value for this finger. A list of bones that will be filled out for this finger. true if hand can be queried for this finger; otherwise false. Gets the root bone for this hand. A Bone struct to receive the root bone. true if hand can be queried for the root bone; otherwise false. Enumeration describing the fingers on a hand, used with XR.Hand. Index finger on a hand. Middle finger on a hand. Pinky finger on a hand. Ring finger on a hand. Thumb finger on a hand. Describes the haptic capabilities of the device at an XR.XRNode in the XR input subsystem. The frequency (in Hz) at which this device plays back buffered haptic data. The maximum amount of data that can be sent to an InputDevice via InputDevice.SendHapticBuffer. The optimal buffer size an InputDevice expects to be sent via InputDevice.SendHapticBuffer in order to provide a continuous rumble between individual frames. The number of channels on which this device plays back haptic data. True if this device supports sending a haptic buffer. True if this device supports sending a haptic impulse. Defines an input device in the XR input subsystem. Read Only. A bitmask of enumerated flags describing the characteristics of this InputDevice. Read Only. True if the device is currently a valid input device; otherwise false. The manufacturer of the connected Input Device. Read Only. The name of the device in the XR system. This is a platform-provided unique identifier for the device. Read Only. 
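As a sketch of how the Bone, Eyes, and Hand queries above fit together (a minimal example assuming an XR runtime with hand-tracking support; device and feature availability vary by platform, so every accessor is checked):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class HandTrackingSample : MonoBehaviour
{
    void Update()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!device.isValid)
            return;

        // Hand is itself an input feature; query it via CommonUsages.handData.
        if (device.TryGetFeatureValue(CommonUsages.handData, out Hand hand))
        {
            var bones = new List<Bone>();
            if (hand.TryGetFingerBones(HandFinger.Index, bones))
            {
                foreach (Bone bone in bones)
                {
                    // Each Bone exposes its pose through Try* accessors
                    // that return false when the data is unavailable.
                    if (bone.TryGetPosition(out Vector3 position) &&
                        bone.TryGetRotation(out Quaternion rotation))
                    {
                        Debug.Log($"Index bone at {position}, rotation {rotation}");
                    }
                }
            }
        }
    }
}
```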
The InputDeviceRole of the device in the XR system. This is a platform-provided description of how the device is used. The serial number of the connected Input Device. Blank if no serial number is available. Gets the XRInputSubsystem that reported this InputDevice. Sends a raw buffer of haptic data to the device. The channel to receive the data. A raw byte buffer that contains the haptic data to send to the device. Returns true if successful. Returns false otherwise. Sends a haptic impulse to a device. The channel to receive the impulse. The normalized (0.0 to 1.0) amplitude value of the haptic impulse to play on the device. The duration in seconds that the haptic impulse will play. Only supported on Oculus. Returns true if successful. Returns false otherwise. Stops all haptic playback for a device. Gets a list of all the input feature usages available on this device. For example, "Trigger" or "Device Position". A List of InputFeatureUsage structures to receive the available features on this device. true if the device can be queried; otherwise false. Retrieves information about the input feature specified by the Usage parameter. This method is overloaded for each supported feature value type; the overloads that take a time parameter allow querying for that feature at a particular point in time. Usage that describes the feature to retrieve. A DateTime struct with the local time at which to query for data. A variable of the appropriate type to receive the information about the feature. True if the feature information is retrieved; otherwise false. Gets the haptic capabilities of the device. A HapticCapabilities struct to receive the capabilities of this device. Returns true if the device supports any form of haptics. Returns false otherwise. A set of bit flags describing XR.InputDevice characteristics. The InputDevice has a camera and associated camera tracking information. The InputDevice is a game controller. The InputDevice provides eye tracking information via an Eyes input feature. The InputDevice provides hand tracking information via a Hand input feature. The InputDevice is attached to the head. The InputDevice is held in the user's hand. Typically, a tracked controller. The InputDevice is associated with the left side of the user. A default value specifying no flags. The InputDevice is associated with the right side of the user. The InputDevice reports software-approximated positional data. The InputDevice provides 3DOF or 6DOF tracking data. The InputDevice is an unmoving reference object used to locate and track other objects in the world. Enumeration describing the role of an XR.InputDevice in providing input. This device is a game controller. This device is typically an HMD or Camera. This device is a hardware tracker. This device is a controller that represents the left hand. This device is a legacy controller. 
This device is a controller that represents the right hand. This device is a tracking reference used to track other devices in 3D. This device does not have a known role. An interface for accessing devices in the XR input subsystem. Defines the delegate to use to register events when an InputDevice's configuration changes. The InputDevice whose configuration has changed. Defines the delegate to use to register events when an InputDevice is connected. The InputDevice that just connected. Defines the delegate to use to register events when an InputDevice is disconnected. The InputDevice that just disconnected. Gets the input device at a given XR.XRNode endpoint. The XRNode that owns the requested device. An XR.InputDevice at this XR.XRNode. Gets a list of active input devices available to the XR Input Subsystem. A List of type InputDevice to receive the available input devices. Gets a list of active input devices available to the XR Input Subsystem at a given XR.XRNode endpoint. The XRNode that owns the requested device. A List of type InputDevice to receive the available input devices. Gets the list of active XR input devices that match the specified InputDeviceCharacteristics. A bitwise combination of the characteristics you are looking for. A List<InputDevice> object to receive the available input devices. Gets a list of active input devices available to the XR Input Subsystem that match the specified role. XR.InputDeviceRole that is defined for the devices returned. A List of type InputDevice to receive the available input devices. Defines a generic usage that maps to an input feature on a device. Use the As method to turn this into a generic usage. The string name of this usage feature; used internally to map to an input feature on a device. The type of this usage feature; used internally to map to an input feature on a device. Returns the generic version of this type for retrieving a feature value from a device. 
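A minimal sketch tying the InputDevices queries above to InputFeatureUsage reads and haptics (feature support varies per device, so every call is guarded by its Try* return value; the amplitude and duration values are illustrative):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class ControllerSample : MonoBehaviour
{
    void Update()
    {
        // Find all right-handed controllers via characteristics flags.
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.Right | InputDeviceCharacteristics.Controller,
            devices);

        foreach (InputDevice device in devices)
        {
            // Read the analog trigger through a predefined CommonUsages entry.
            if (device.TryGetFeatureValue(CommonUsages.trigger, out float trigger) &&
                trigger > 0.5f)
            {
                // Rumble only if the device reports haptic impulse support.
                if (device.TryGetHapticCapabilities(out HapticCapabilities caps) &&
                    caps.supportsImpulse)
                {
                    device.SendHapticImpulse(0, 0.7f, 0.1f);
                }
            }
        }
    }
}
```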
Defines a generic usage that maps to an input feature on a device. The string name of this usage feature; used internally to map to an input feature on a device. Construct a usage from a usage name. The name of the feature usage to query for. Converts a generic InputFeatureUsage<T> into an InputFeatureUsage. The generic InputFeatureUsage<T> to convert into an InputFeatureUsage. A collection of methods and properties for accessing XR input devices by their XR Node representation. Disables positional tracking in XR. This takes effect the next time the head pose is sampled. If set to true, the camera only tracks headset rotation. Called when a tracked node is added to the underlying XR system. Describes the node that has been added. Called when a tracked node is removed from the underlying XR system. Describes the node that has been removed. Called when a tracked node begins reporting tracking information. Describes the node that has begun being tracked. Called when a tracked node stops reporting tracking information. Describes the node that has lost tracking. Note: This API has been marked as obsolete in code, and is no longer in use. Please use InputTracking.GetNodeStates and look for the XRNodeState with the corresponding XRNode type instead. Gets the position of a specific node. Specifies which node's position should be returned. The position of the node in its local tracking space. Note: This API has been marked as obsolete in code, and is no longer in use. Please use InputTracking.GetNodeStates and look for the XRNodeState with the corresponding XRNode type instead. Gets the rotation of a specific node. Specifies which node's rotation should be returned. The rotation of the node in its local tracking space. Accepts the unique identifier for a tracked node and returns a friendly name for it. The unique identifier for the Node index. The name of the tracked node if the given 64-bit identifier maps to a currently tracked node. Empty string otherwise. 
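As the obsolete notes above suggest, InputTracking.GetLocalPosition and GetLocalRotation should be replaced by InputTracking.GetNodeStates. A sketch of that replacement (positions are reported in local tracking space):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class NodeStateSample : MonoBehaviour
{
    void Update()
    {
        // Snapshot the state of every currently tracked node.
        var states = new List<XRNodeState>();
        InputTracking.GetNodeStates(states);

        foreach (XRNodeState state in states)
        {
            // Look for the XRNodeState with the node type you need,
            // then use its Try* accessors instead of the obsolete getters.
            if (state.nodeType == XRNode.Head &&
                state.TryGetPosition(out Vector3 position))
            {
                string name = InputTracking.GetNodeName(state.uniqueID);
                Debug.Log($"Head node '{name}' at {position}");
            }
        }
    }
}
```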
Describes all currently connected XRNodes and provides available tracking states for each. A list that is populated with XR.XRNodeState objects. Centers tracking on the current position and orientation of the HMD. Represents the values being tracked for this device. Represents acceleration being tracked for this device. Represents all InputTrackingState values being tracked for this device. Represents angular acceleration being tracked for this device. Represents angular velocity being tracked for this device. Represents no values being tracked for this device. Represents position being tracked for this device. Represents rotation being tracked for this device. Represents velocity being tracked for this device. The state of a tracked mesh since the last query. The mesh has been added since the last call to XRMeshSubsystem.TryGetMeshInfos. The mesh has been removed since the last call to XRMeshSubsystem.TryGetMeshInfos. The mesh has not changed since the last call to XRMeshSubsystem.TryGetMeshInfos. The mesh has been updated since the last call to XRMeshSubsystem.TryGetMeshInfos. Options for generating meshes. Indicates you plan to consume the resulting mesh's transform. No options are specified. Contains event information related to a generated mesh. The MeshVertexAttributes that were written to the MeshGenerationResult.Mesh. If the generation was successful, data has been written to this Mesh. If the generation was successful, physics data has been written to this MeshCollider. The MeshId of the tracked mesh that was generated. The position associated with the generated mesh relative to the session origin. The rotation associated with the generated mesh relative to the session origin. The scale associated with the generated mesh relative to the session origin. The MeshGenerationStatus of the mesh generation task. The timestamp associated with the generated mesh. The status of an XRMeshSubsystem.GenerateMeshAsync call. The mesh generation was canceled. 
The XRMeshSubsystem was already generating the requested mesh. The mesh generation failed because the mesh does not exist. The mesh generation was successful. The mesh generation failed for unknown reasons. A session-unique identifier for trackables in the environment, e.g., planes and feature points. Represents an invalid id. Generates a nicely formatted version of the id. A string unique to this id. Contains state information related to a tracked mesh. The change state (e.g., Added, Removed) of the tracked mesh. The MeshId of the tracked mesh. A hint that can be used to determine when this mesh should be processed. Contains transform information related to a tracked mesh. Creates a new MeshTransform. The identifier of the mesh. The timestamp for the mesh's transform. Larger values indicate newer transforms. The position of the mesh relative to the session origin. The rotation of the mesh relative to the session origin. The scale of the mesh relative to the session origin. The session-unique identifier of the tracked mesh. The position of the mesh, relative to the session origin. The rotation of the mesh, relative to the session origin. The scale of the mesh, relative to the session origin. The timestamp associated with this transform. A set of vertex attributes. Vertex colors. No vertex attributes. Vertex normals. Vertex tangents. Vertex UVs. Provides timing and other statistics from XR subsystems. Retrieve a statistic for an XR subsystem. The subsystem with which the stat is registered. The tag used to query for a statistic. Receives the current value of the requested statistic. Contains a valid value when this method returns true. True if the requested statistic is available, false otherwise. This enum describes where the (0, 0, 0) point of tracking for InputDevices is located. XRInputSubsystem tracks all InputDevices in reference to the first known location of a specific InputDevice when set to TrackingOriginModeFlags.Device. 
XRInputSubsystem tracks all InputDevices in reference to a point on the floor when set to TrackingOriginModeFlags.Floor. XRInputSubsystem tracks all InputDevices in reference to an InputDevice with the InputDeviceCharacteristics.TrackingReference flag set when set to TrackingOriginModeFlags.TrackingReference. XRInputSubsystem tracks all InputDevices in relation to a world anchor. This world anchor can change at any time, and is chosen by the runtime. TrackingOriginModeFlags.Unknown indicates that the XRInputSubsystem was not able to set its tracking origin or has no tracking. An XRDisplaySubsystem controls rendering to a head tracked display. Sets or gets the state of content protection for the current active provider. For most providers, content protection allows you to use write-only textures for rendering. This prevents apps from reading textures back from the graphics card to view or record images that may be protected in some way. Disables the legacy renderer while this XRDisplaySubsystem is active. Event sent when XR display focus changes. Delegate method to call when the event is sent. Determines if the currently attached device has an opaque display. Most VR devices are opaque in order to increase the immersive experience; AR devices are transparent to allow for interaction with an augmentation of the current environment. Controls optional behavior of the foveated rendering system. Controls the intensity of the foveated rendering system. The HDROutputSettings for the XR Display Subsystem. A scale applied to the standard occlusion mask. The kind of reprojection the app requests to stabilize its holographic rendering relative to the user's head motion. Controls the size of the textures submitted to the display as a multiplier of the display's default resolution. Controls how much of the allocated display texture should be used for rendering. Returns true when single-pass stereo rendering is disabled; returns false otherwise. 
Specifies all texture layouts supported by this display subsystem. This value is a bit field that can be a combination of XRDisplaySubsystem.TextureLayout flags. Sets the DisplaySubsystem to use a certain texture layout. Query the supported texture layouts through XRDisplaySubsystem.supportedTextureLayouts first to determine the capabilities. Sets the DisplaySubsystem to use zFar for rendering. Sets the DisplaySubsystem to use zNear for rendering. This function records the display subsystem's native blit event to the target command buffer. This function is typically called by a scriptable rendering pipeline. The target CommandBuffer that records the native blit event. True causes the graphics device to invalidate internal states before and after calling into the provider's native blit; this ensures the consistency of the GFX internal states at the cost of some runtime performance. The XRMirrorViewBlitMode the XR display should perform. Returns true if the native blit event is successfully recorded. Returns false otherwise. This function enables late latching recording of constant buffer memory locations, which are later patched with the latest pose data. The camera where late latch recording is to be enabled. This function disables late latching recording of constant buffer locations. The camera where late latch recording is to be ended. Optional flags to control the foveated rendering system. 
Allows the platform to use eye tracking to optimize foveated rendering. The default behavior with no extra configuration flags. Gets culling parameters for a specific culling pass index. Camera for the basis of the culling view and frustum. Index of the culling pass obtained from XRRenderPass.cullingPassIndex. Scriptable culling parameters to populate. Get a mirror view blit operation descriptor from the current display subsystem. A render texture representing the mirror view's render target. Information that describes the desired mirror view blit operation. The XRMirrorViewBlitMode the XR display should perform. Returns true if the information is retrieved successfully, false otherwise. Returns the XR display's preferred mirror blit mode. Display subsystem's preferred blit mode. Gets an XRRenderPass of a specific index. The index of the render pass to get. Must be less than GetRenderPassCount. Render pass to populate. The number of XRRenderPass entries for this XR Display. Count of render passes. Given the UnityXRRenderTextureID returned by IUnityXRDisplayInterface::CreateTexture, return the managed UnityEngine.RenderTexture instance. The ID number identifying the render texture. The managed UnityEngine.RenderTexture instance associated with the UnityXRRenderTextureID. Given a render pass, return the RenderTexture instance backing that render pass. If the render pass is invalid, or if the render texture does not exist, return null. The render pass index to get the render texture for. The render texture associated with that render pass, or null if not found. 
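A hedged sketch of how a scriptable render pipeline might walk the render passes described above (assumes an active XRDisplaySubsystem; details differ per pipeline and this only logs what it finds):

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.XR;

public static class RenderPassWalker
{
    public static void LogPasses(XRDisplaySubsystem display, Camera camera)
    {
        // Each XR frame may consist of several render passes
        // (for example, one per eye in a multi-pass layout).
        for (int i = 0; i < display.GetRenderPassCount(); i++)
        {
            display.GetRenderPass(i, out XRDisplaySubsystem.XRRenderPass pass);

            // Each pass carries its own culling pass index, which feeds
            // the culling parameters for that view.
            display.GetCullingParameters(camera, pass.cullingPassIndex,
                out ScriptableCullingParameters cullingParams);

            Debug.Log($"Pass {i}: target {pass.renderTarget}, " +
                      $"{pass.GetRenderParameterCount()} render parameter(s)");
        }
    }
}
```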
Given a render pass, return the shared depth buffer RenderTexture instance backing that render pass. If the render pass is invalid, or if the render texture does not exist, return null. The render pass index to get the shared depth buffer render texture for. The shared depth buffer render texture associated with that render pass, or null if not found. Type of node to be late latched. Head node type for late latching. This represents the camera node in the pose hierarchy. Left hand node type for late latching. This represents the left hand anchor node in the pose hierarchy. Right hand node type for late latching. This represents the right hand anchor node in the pose hierarchy. This marks a given GameObject's transform to be late latched in the next frame. Once marked for late latching, the GameObject transform and its descendants will be updated with the latest VR pose updates before rendering is submitted to the GPU. The transform of the GameObject to be late latched. The late latch node type to be associated with the transform. The kind of reprojection the app requests to stabilize its holographic rendering relative to the user's head motion. Does not stabilize the image for the user's head motion and instead fixes it in the display. Note that this is only comfortable for users when you use it sparingly, for example when the only visible content is a small cursor. Stabilizes the image only for changes to the user's head orientation, ignores changes in position. This is best for body-locked content that you want to move with the user as they walk around, such as a 360-degree video. Stabilizes the image for changes to both the user's head position and orientation. This is best for world-locked content that you want to remain stationary as the user walks around. Does not specify the type of reprojection mode to use. Sets a point in 3D space that acts as the focal point of the Scene for this frame. 
This helps to improve the visual fidelity of content around this point. You must set this value every frame. Note that specifying body-locked content in focus improves the fidelity of body-locked content at the expense of content not locked to the body. This is especially apparent when the user moves. The position of the focal point in the Scene, relative to the Camera. Surface normal of the plane being viewed at the focal point. A vector that describes how the focus point moves in the Scene at this point in time. This allows the device to compensate for both your head movement and the movement of the object in the Scene. Sets the MSAA level for the DisplaySubsystem's render texture. The MSAA level. Overrides the XR display's preferred mirror blit mode from script. XRMirrorViewBlitMode to set. Flags that represent the supported texture layouts. Textures can be configured as multiple Texture2D objects. Textures can be configured as a single Texture2D that represents multiple views. Textures can be configured as a Texture2DArray. Retrieves the time the GPU has spent on executing commands from the application's last frame, as reported by the XR Plugin. Measured in seconds. Outputs the time spent by the GPU during the last frame. Returns true if the GPU time spent on the last frame is available. Returns false if that time is unavailable. Retrieves the amount of time that the GPU spent executing the compositor renderer during the last frame, as reported by the XR Plugin. Measured in seconds. Outputs the time spent by the GPU for the compositor during the last frame. Returns true if the GPU time spent on the last frame is available. Returns false if that time is unavailable. Retrieves the refresh rate of the display as reported by the XR Plugin. Outputs the display refresh rate in Hz. Returns true if the display refresh rate is available. Returns false if that rate is unavailable. Retrieves the number of dropped frames reported by the XR Plugin. 
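Since the focus point must be supplied every frame, the natural place to call SetFocusPlane is an Update loop. A minimal sketch, assuming an already-resolved XRDisplaySubsystem and a hypothetical `focusTarget` transform for the object the user is looking at (this code only compiles inside a Unity project):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class FocusPointUpdater : MonoBehaviour
{
    public Transform focusTarget;   // hypothetical: the object to stabilize around
    public XRDisplaySubsystem display; // assumed to be assigned from the active subsystem
    Vector3 m_LastPosition;

    void Update()
    {
        if (display == null || focusTarget == null)
            return;

        // Velocity lets the device compensate for the object's own motion.
        Vector3 velocity = (focusTarget.position - m_LastPosition) / Time.deltaTime;

        // Per the description above, the point is interpreted relative to the Camera;
        // the normal here faces back toward the viewer.
        display.SetFocusPlane(focusTarget.position, -Camera.main.transform.forward, velocity);
        m_LastPosition = focusTarget.position;
    }
}
```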
Outputs the number of frames dropped since the last update. Returns true if the dropped frame count is available. Returns false otherwise. Retrieves the number of times the current frame has been drawn to the device as reported by the XR Plugin. Outputs the number of times the current frame has been presented. Returns true if the current frame count is available. Returns false otherwise. Retrieves the motion-to-photon value as reported by the XR Plugin. Outputs the motion-to-photon value. Returns true if the motion-to-photon value is available. Returns false otherwise. This struct holds data for a single blit operation. Destination Rect area that the blit operation wants to blit to. A pointer to a native struct containing platform-specific data for foveated rendering. The ColorGamut of the source texture if srcHdrEncoded is true. Specifies whether the source texture is encoded for use with an HDR display and might require decoding during the blit process. The maximum luminance in nits of the encoding used for the source texture if srcHdrEncoded is true. Source Rect area that the blit operation wants to blit from. Source render texture that the blit operation wants to blit from. Describes the source texture's desired array slice. A Texture2D will have array slice 1. All information in this struct describes the desired mirror view blit operation. The number of XRBlitParams entries for this XRMirrorViewBlitDesc. When this is true, the current display subsystem supports native blit and AddGraphicsThreadMirrorViewBlit must be called to perform native blit. When this is true, the display subsystem will modify the graphics state. Gets an XRBlitParams for a specific XRMirrorViewBlitDesc. Index of the blit parameter to get. XRBlitParams to populate. A single viewpoint that must be rendered by the render pipeline. Contains a target viewport and texture array slice within a corresponding XRRenderPass.renderTarget. 
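The TryGet* statistics described above correspond to the static methods on UnityEngine.XR.XRStats; each returns false when the active XR plugin does not report that statistic. A minimal polling sketch (this code only compiles inside a Unity project):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class FrameStatsLogger : MonoBehaviour
{
    void Update()
    {
        // GPU time is reported in seconds; convert to milliseconds for logging.
        if (XRStats.TryGetGPUTimeLastFrame(out float gpuTime))
            Debug.Log($"GPU time last frame: {gpuTime * 1000f:F2} ms");

        // Frames dropped since the last update.
        if (XRStats.TryGetDroppedFrameCount(out int dropped))
            Debug.Log($"Dropped frames: {dropped}");

        // Times the current frame has been presented to the device.
        if (XRStats.TryGetFramePresentCount(out int presents))
            Debug.Log($"Frame present count: {presents}");
    }
}
```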
Determines whether XR.XRDisplaySubsystem.XRRenderParameter.previousView is valid for use in a frame. Represents the area in screen-space that is not visible on the XR Display. Previous frame view matrix for use in motion vector calculation. Use XR.XRDisplaySubsystem.XRRenderParameter.isPreviousViewValid to determine if previous view is valid for use. When late latching is enabled, previous view is also adjusted for late latching. The projection matrix that the render pipeline should use to render to the XRRenderPass.renderTarget. The slice of the output texture array that the render pipeline should render to. World transform that the render pipeline should use to render to the XRRenderPass.renderTarget. Selects the viewport of the output texture XRRenderPass.renderTarget. Contains configuration parameters about which view into the Scene the renderer should rasterize, and a render target (which can be a texture array) for the result of the rasterization. An index that a render pipeline can pass to XR.XRDisplaySubsystem.GetCullingParameters to obtain culling information. A pointer to a native struct containing platform-specific data for foveated rendering. A boolean indicating if this render pass contains a motion-vector generation pass. The output render-texture target for the motion-vector generation render pass. The render texture description for the target texture for the motion-vector render pass. The index of the render pass (originally passed in to XRDisplaySubsystem.GetRenderPass). The output target for the render pass. Descriptor that can be passed to RenderTexture.GetTemporary to create temporary textures that match the XR Display render target. When this is false, an optimal renderer can avoid resolving the depth buffer. When true, the SpaceWarp motion vector data is in the right-handed normalized device coordinate (NDC) space. 
When false, the motion vector data is in the left-handed NDC space. Gets an XRRenderParameter for a specific XRRenderPass. Camera for the basis of the view and projection. Index of the render parameter to get. Must be less than GetRenderParameterCount. XRRenderParameter to populate. The number of XRRenderParameter entries for this XRRenderPass. Count of render parameters. Class providing information about XRDisplaySubsystem registration. Indicates whether legacy VR settings must be disabled for the subsystem. Set to true if the Editor must disable the legacy VR settings; otherwise false. Indicates whether MSAA must be resolved in the back buffer. Set to true if MSAA needs to be resolved in the back buffer; otherwise false. Get the current display subsystem's total number of supported mirror blit modes. Number of supported mirror blit modes. Get a supported mirror view blit mode from the current display subsystem descriptor. XRMirrorViewBlitMode to populate. Index of the mirror blit mode to get. An XRInputSubsystem instance is used to enable and disable the inputs coming from a specific plugin. An event that takes the delegate instance that the XRInputSubsystem calls when it changes its tracking boundary. Unity calls this delegate when the tracking boundary changes. An event that takes the delegate instance that the XRInputSubsystem calls when it changes the origin it reports devices at. Unity calls this delegate when the TrackingOriginModeFlags change. Gets all TrackingOriginModeFlags that this subsystem supports. A single series of flags that contains all supported TrackingOriginModeFlags. Gets the Tracking Origin Mode. The Tracking Origin Mode that this subsystem is in. Gets the list of 3D position values that represents the SDK-set boundary. The list of boundary points. True if this XRInputSubsystem supports boundary points and they are available. Returns false otherwise. Gets a list of all connected InputDevices reported by this XRInputSubsystem. 
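TryGetBoundaryPoints fills a caller-supplied list and returns false when the subsystem does not support or cannot currently provide a boundary. A minimal sketch, assuming the XRInputSubsystem instance is passed in by the caller (this code only compiles inside a Unity project):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public static class BoundaryHelper
{
    // Returns the SDK-set boundary points, or an empty list when unavailable.
    public static List<Vector3> GetBoundary(XRInputSubsystem input)
    {
        var points = new List<Vector3>();
        if (!input.TryGetBoundaryPoints(points))
            Debug.Log("Boundary points are unavailable on this subsystem.");
        return points;
    }
}
```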
The list of devices reported by this subsystem. True if the XRInputSubsystem retrieves any devices. Returns false otherwise. Centers the tracking features on all InputDevices to the current position and orientation of the head-mounted device. True if the method recenters the XRInputSubsystem. Returns false otherwise. Attempts to set the TrackingOriginModeFlags of the subsystem. The new TrackingOriginModeFlags that you'd like to change to. True if the method changes the origin. Returns false otherwise. Information about an Input subsystem. When true, prevents the legacy support for Daydream, Oculus, OpenVR, and Windows MR built directly into the Unity runtime from generating input. This is useful when adding an XRInputSubsystem that supports these devices. Allows external systems to provide dynamic meshes to Unity. Call this function to request a change in the density of the generated Meshes. Unity gives the density level as a value within the range 0.0 to 1.0 and the provider determines how to map that value to their implementation. Setting this value does not guarantee an immediate change in the density of any currently created Mesh and may only change the density for new or updated Meshes. Requests the generation of the Mesh with MeshId meshId. Unity calls onMeshGenerationComplete when generation finishes. The MeshId of the mesh you wish to generate. The Mesh to write the results into. (Optional) The MeshCollider to populate with physics data. This may be null. The vertex attributes you'd like to use. The delegate to invoke when the generation completes. Requests the generation of the Mesh with MeshId meshId. Unity calls onMeshGenerationComplete when generation finishes. The MeshId of the mesh you wish to generate. The Mesh to write the results into. (Optional) The MeshCollider to populate with physics data. This may be null. The vertex attributes you'd like to use. The delegate to invoke when the generation completes. The mesh generation options. 
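GetSupportedTrackingOriginModes, TrySetTrackingOriginMode, and TryRecenter are typically used together when configuring a play space. A minimal sketch, assuming the XRInputSubsystem instance is supplied by the caller (this code only compiles inside a Unity project):

```csharp
using UnityEngine;
using UnityEngine.XR;

public static class TrackingOriginHelper
{
    // Prefers floor-relative tracking when supported, otherwise device-relative.
    public static void ConfigureOrigin(XRInputSubsystem input)
    {
        TrackingOriginModeFlags supported = input.GetSupportedTrackingOriginModes();

        if ((supported & TrackingOriginModeFlags.Floor) != 0)
            input.TrySetTrackingOriginMode(TrackingOriginModeFlags.Floor);
        else
            input.TrySetTrackingOriginMode(TrackingOriginModeFlags.Device);

        // Recentering may be unsupported in some origin modes; check the result.
        if (!input.TryRecenter())
            Debug.Log("Recenter failed or is unsupported on this subsystem.");
    }
}
```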
Gets the updated mesh transforms. The allocator to use for the returned NativeArray. A new NativeArray of MeshTransforms. Set the bounding volume to restrict the space in which Unity generates and tracks Meshes. The bounding volume is an Axis Aligned Bounding Box (AABB) centered at the origin and extends in each dimension as defined in extents. The units of measurement depend on the provider. Gets information about every Mesh the system currently tracks. A List of MeshInfos to be filled. Passing null will throw an ArgumentNullException. True if the List was populated. Information about an XRMeshSubsystem. Engine-reserved blit modes. Blit mode capabilities should be queried from XRDisplaySubsystemDescriptor.GetAvailableMirrorBlitModeCount and XRDisplaySubsystemDescriptor.GetMirrorBlitModeByIndex. Mirror view pass should blit the platform default image to the mirror target. Mirror view pass should blit the after-distortion pass image to the mirror target. Mirror view pass should blit the left eye image to the mirror target. Mirror view pass should blit the left eye image and right eye image in a side-by-side fashion to the mirror target, displaying motion vectors. Mirror view pass should not be performed. Mirror view pass should blit the right eye image to the mirror target. Mirror view pass should blit the left eye image and right eye image in a side-by-side fashion to the mirror target. Mirror view pass should blit similar to side-by-side mode, but also show the non-rendered pixels saved by the occlusion mesh. Struct that describes the mirror view blit mode. Mirror view blit mode Id. For details, see XRMirrorViewBlitMode. If a provider defines a custom blit mode, the value will not be one of the reserved XRMirrorViewBlitMode values. String that describes the mirror view blit mode. Enumeration of XR nodes which can be updated by XR input or sent haptic data. Node representing a point between the left and right eyes. Represents a tracked game Controller not associated with a specific hand. 
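A common pattern with XRMeshSubsystem is to poll TryGetMeshInfos each frame and request generation for meshes whose change state indicates they are new or updated. A minimal sketch, assuming the subsystem instance is assigned from the active XR loader (this code only compiles inside a Unity project):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class MeshTracker : MonoBehaviour
{
    public XRMeshSubsystem meshSubsystem; // assumed to be assigned from the active loader
    readonly List<MeshInfo> m_MeshInfos = new List<MeshInfo>();

    void Update()
    {
        if (meshSubsystem == null || !meshSubsystem.TryGetMeshInfos(m_MeshInfos))
            return;

        foreach (MeshInfo info in m_MeshInfos)
        {
            if (info.ChangeState == MeshChangeState.Added ||
                info.ChangeState == MeshChangeState.Updated)
            {
                // The MeshCollider argument may be null if physics data is not needed.
                var mesh = new Mesh();
                meshSubsystem.GenerateMeshAsync(info.MeshId, mesh, null,
                    MeshVertexAttributes.Normals, OnMeshGenerated);
            }
        }
    }

    void OnMeshGenerated(MeshGenerationResult result)
    {
        if (result.Status == MeshGenerationStatus.Success)
            Debug.Log($"Generated mesh {result.MeshId}");
    }
}
```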
Represents a physical device that provides tracking data for objects to which it is attached. Node representing the user's head. Node representing the left eye. Node representing the left hand. Node representing the right eye. Node representing the right hand. Represents a stationary physical device that can be used as a point of reference in the tracked area. Describes the state of a node tracked by an XR system. Sets the vector representing the current acceleration of the tracked node. Sets the vector representing the current angular acceleration of the tracked node. Sets the vector representing the current angular velocity of the tracked node. The type of the tracked node as specified in XR.XRNode. Sets the vector representing the current position of the tracked node. Sets the quaternion representing the current rotation of the tracked node. Set to true if the node is presently being tracked by the underlying XR system, and false if the node is not presently being tracked by the underlying XR system. The unique identifier of the tracked node. Sets the vector representing the current velocity of the tracked node. Attempt to retrieve a vector representing the current acceleration of the tracked node. True if the acceleration was set in the output parameter. False if the acceleration is not available due to limitations of the underlying platform or if the node is not presently tracked. Attempt to retrieve a Vector3 representing the current angular acceleration of the tracked node. True if the angular acceleration was set in the output parameter. False if the angular acceleration is not available due to limitations of the underlying platform or if the node is not presently tracked. Attempt to retrieve a Vector3 representing the current angular velocity of the tracked node. True if the angular velocity was set in the output parameter. False if the angular velocity is not available due to limitations of the underlying platform or if the node is not presently tracked. 
Attempt to retrieve a vector representing the current position of the tracked node. True if the position was set in the output parameter. False if the position is not available due to limitations of the underlying platform or if the node is not presently tracked. Attempt to retrieve a quaternion representing the current rotation of the tracked node. True if the rotation was set in the output parameter. False if the rotation is not available due to limitations of the underlying platform or if the node is not presently tracked. Attempt to retrieve a vector representing the current velocity of the tracked node. True if the velocity was set in the output parameter. False if the velocity is not available due to limitations of the underlying platform or if the node is not presently tracked.
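The XRNodeState TryGet* pattern described above is typically driven from InputTracking.GetNodeStates, which fills a list with the current state of every tracked node. A minimal sketch reading the head pose (this code only compiles inside a Unity project):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class NodePoseReader : MonoBehaviour
{
    readonly List<XRNodeState> m_States = new List<XRNodeState>();

    void Update()
    {
        InputTracking.GetNodeStates(m_States);
        foreach (XRNodeState state in m_States)
        {
            // TryGetPosition/TryGetRotation return false when the node is not
            // presently tracked or the platform cannot supply the value.
            if (state.nodeType == XRNode.Head &&
                state.TryGetPosition(out Vector3 position) &&
                state.TryGetRotation(out Quaternion rotation))
            {
                Debug.Log($"Head pose: {position}, {rotation.eulerAngles}");
            }
        }
    }
}
```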