libovr - Oculus Rift interface using LibOVR¶
Classes and functions for using the Oculus Rift (DK2, CV1, and S) HMD and associated peripherals via the official runtime/SDK (LibOVRRT). Currently, only OpenGL is supported for rendering VR scenes.
This extension module makes use of the official Oculus PC SDK, a C/C++ interface for tracking, rendering, and VR math for Oculus products. The Oculus PC SDK is Copyright (c) Facebook Technologies, LLC and its affiliates. All rights reserved. You must accept the ‘EULA’, ‘Terms of Use’ and ‘Privacy Policy’ associated with the Oculus PC SDK to use this module in your software, see Legal Documents to access those documents.
Overview¶
Classes¶
LibOVRPose ([pos, ori]) |
Class for representing rigid body poses. |
LibOVRPoseState ([thePose, linearVelocity, …]) |
Class for representing rigid body poses with additional state information. |
LibOVRTrackingState () |
Class for tracking state information. |
LibOVRBounds ([extents]) |
Class for constructing and representing 3D axis-aligned bounding boxes. |
LibOVRHmdInfo () |
Class for general HMD information and capabilities. |
LibOVRHmdColorSpace |
Class for HMD color space data. |
LibOVRTrackerInfo |
Class for storing tracker (sensor) information such as pose, status, and camera frustum information. |
LibOVRSessionStatus |
Class for storing session status information. |
LibOVRBoundaryTestResult |
Class for boundary collision test data. |
LibOVRPerfStatsPerCompositorFrame () |
Class for frame performance statistics per compositor frame. |
LibOVRPerfStats () |
Class for frame performance statistics. |
LibOVRHapticsInfo |
Class for touch haptics engine information. |
LibOVRHapticsBuffer (buffer) |
Class for haptics buffer data for controller vibration. |
Functions¶
success (int result) |
Check if an API return indicates success. |
unqualifiedSuccess (int result) |
Check if an API return indicates unqualified success. |
failure (int result) |
Check if an API return indicates failure (error). |
getBool (bytes propertyName, …) |
Read a LibOVR boolean property. |
setBool (bytes propertyName, bool value=True) |
Write a LibOVR boolean property. |
getInt (bytes propertyName, int defaultVal=0) |
Read a LibOVR integer property. |
setInt (bytes propertyName, int value) |
Write a LibOVR integer property. |
getFloat (bytes propertyName, …) |
Read a LibOVR floating point number property. |
setFloat (bytes propertyName, float value) |
Write a LibOVR floating point number property. |
getFloatArray (bytes propertyName, ndarray values) |
Read a LibOVR float array property. |
setFloatArray (bytes propertyName, ndarray values) |
Write a LibOVR float array property. |
getString (bytes propertyName[, defaultVal]) |
Read a LibOVR string property. |
setString (bytes propertyName, value) |
Write a LibOVR string property. |
isOculusServiceRunning (int timeoutMs=100) |
Check if the Oculus Runtime is loaded and running. |
isHmdConnected (int timeoutMs=100) |
Check if an HMD is connected. |
getHmdInfo () |
Get HMD information. |
initialize (bool focusAware=False, …[, …]) |
Initialize the session. |
create () |
Create a new session. |
checkSessionStarted () |
Check if a session has been created. |
destroyTextureSwapChain (int swapChain) |
Destroy a texture swap chain. |
destroyMirrorTexture () |
Destroy the mirror texture. |
destroy () |
Destroy a session. |
shutdown () |
End the current session. |
getGraphicsLUID () |
The graphics device LUID. |
setHighQuality (bool enable) |
Enable high quality mode. |
setHeadLocked (bool enable) |
Set the render layer state for head locking. |
getPixelsPerTanAngleAtCenter (int eye) |
Get pixels per tan angle (=1) at the center of the display. |
getTanAngleToRenderTargetNDC (int eye, tanAngle) |
Convert FOV tan angle to normalized device coordinates (NDC). |
getPixelsPerDegree (int eye) |
Get pixels per degree at the center of the display. |
getDistortedViewport (int eye) |
Get the distorted viewport. |
getEyeRenderFov (int eye) |
Get the field-of-view to use for rendering. |
setEyeRenderFov (int eye, fov, …) |
Set the field-of-view of a given eye. |
getEyeAspectRatio (int eye) |
Get the aspect ratio of an eye. |
getEyeHorizontalFovRadians (int eye) |
Get the angle of the horizontal field-of-view (FOV) for a given eye. |
getEyeVerticalFovRadians (int eye) |
Get the angle of the vertical field-of-view (FOV) for a given eye. |
getEyeFocalLength (int eye) |
Get the focal length of the eye’s frustum. |
calcEyeBufferSize (int eye, …) |
Get the recommended buffer (texture) sizes for eye buffers. |
getLayerEyeFovFlags () |
Get header flags for the render layer. |
setLayerEyeFovFlags (unsigned int flags) |
Set header flags for the render layer. |
createTextureSwapChainGL (int swapChain, …) |
Create a texture swap chain for eye image buffers. |
getTextureSwapChainLengthGL (int swapChain) |
Get the length of a specified swap chain. |
getTextureSwapChainCurrentIndex (int swapChain) |
Get the current buffer index within the swap chain. |
getTextureSwapChainBufferGL (int swapChain, …) |
Get the texture buffer as an OpenGL name at a specific index in the swap chain for a given swapChain. |
setEyeColorTextureSwapChain (int eye, …) |
Set the color texture swap chain for a given eye. |
createMirrorTexture (int width, int height, …) |
Create a mirror texture. |
getMirrorTexture () |
Mirror texture ID. |
getSensorSampleTime () |
Get the sensor sample timestamp. |
setSensorSampleTime (double absTime) |
Set the sensor sample timestamp. |
getTrackingState (double absTime, …) |
Get the current tracking state of the head and hands. |
getDevicePoses (deviceTypes, double absTime, …) |
Get tracked device poses. |
calcEyePoses (LibOVRPose headPose[, originPose]) |
Calculate eye poses using a given head pose. |
getHmdToEyePose (int eye) |
HMD to eye pose. |
setHmdToEyePose (int eye, LibOVRPose eyePose) |
Set the HMD-to-eye pose for a given eye. |
getEyeRenderPose (int eye) |
Get eye render poses. |
setEyeRenderPose (int eye, LibOVRPose eyePose) |
Set eye render pose. |
getEyeRenderViewport (int eye, ndarray out=None) |
Get the eye render viewport. |
setEyeRenderViewport (int eye, values) |
Set the eye render viewport. |
getEyeProjectionMatrix (int eye, ndarray out=None) |
Compute the projection matrix. |
getEyeViewMatrix (int eye, ndarray out=None) |
Compute a view matrix for a specified eye. |
getPredictedDisplayTime (…) |
Get the predicted time a frame will be displayed. |
timeInSeconds () |
Absolute time in seconds. |
waitToBeginFrame (unsigned int frameIndex=0) |
Wait until a buffer is available so frame rendering can begin. |
beginFrame (unsigned int frameIndex=0) |
Begin rendering the frame. |
commitTextureSwapChain (int eye) |
Commit changes to a given eye’s texture swap chain. |
endFrame (unsigned int frameIndex=0) |
Call when rendering a frame has completed. |
getTrackingOriginType () |
Get the current tracking origin type. |
setTrackingOriginType (int value) |
Set the tracking origin type. |
recenterTrackingOrigin () |
Recenter the tracking origin. |
specifyTrackingOrigin (LibOVRPose newOrigin) |
Specify a new tracking origin. |
clearShouldRecenterFlag () |
Clear the LibOVRSessionStatus.shouldRecenter flag. |
getTrackerCount () |
Get the number of attached trackers. |
getTrackerInfo (int trackerIndex) |
Get information about a given tracker. |
getSessionStatus () |
Get the current session status. |
getPerfStats () |
Get detailed compositor frame statistics. |
resetPerfStats () |
Reset frame performance statistics. |
getLastErrorInfo () |
Get the last error code and information string reported by the API. |
setBoundaryColor (float red, float green, …) |
Set the boundary color. |
resetBoundaryColor () |
Reset the boundary color to system default. |
getBoundaryVisible () |
Check if the Guardian boundary is visible. |
showBoundary () |
Show the boundary. |
hideBoundary () |
Hide the boundary. |
getBoundaryDimensions (int boundaryType) |
Get the dimensions of the boundary. |
testBoundary (int deviceBitmask, int boundaryType) |
Test collision of tracked devices on boundary. |
getConnectedControllerTypes () |
Get connected controller types. |
updateInputState (int controller) |
Refresh the input state of a controller. |
getButton (int controller, int button, …) |
Get a button state. |
getTouch (int controller, int touch, …) |
Get a touch state. |
getThumbstickValues (int controller, …) |
Get analog thumbstick values. |
getIndexTriggerValues (int controller, …) |
Get analog index trigger values. |
getHandTriggerValues (int controller, …) |
Get analog hand trigger values. |
setControllerVibration (int controller, …) |
Vibrate a controller. |
getHapticsInfo (int controller) |
Get information about the haptics engine for a particular controller. |
submitControllerVibration (int controller, …) |
Submit a haptics buffer to Touch controllers. |
getControllerPlaybackState (int controller) |
Get the playback state of a touch controller. |
cullPose (int eye, LibOVRPose pose) |
Test if a pose’s bounding box or position falls outside of an eye’s view frustum. |
Details¶
Classes¶
-
class
psychxr.drivers.libovr.
LibOVRPose
(pos=(0., 0., 0.), ori=(0., 0., 0., 1.))¶ Class for representing rigid body poses.
This class is an abstract representation of a rigid body pose, where the position of the body in a scene is represented by a vector/coordinate and the orientation with a quaternion. LibOVR uses this format for poses to represent the posture of tracked devices (e.g. HMD, touch controllers, etc.) and other objects in a VR scene. There are many class methods and properties provided to handle accessing, manipulating, and interacting with poses. Rigid body poses assume a right-handed coordinate system (-Z is forward and +Y is up).
Poses can be manipulated using operators such as *, ~, and *=. One pose can be transformed by another by multiplying them using the * operator:

newPose = pose1 * pose2

The above code returns pose2 transformed by pose1, putting pose2 into the reference frame of pose1. Using the inplace multiplication operator *=, you can transform a pose into another reference frame without making a copy. One can get the inverse of a pose by using the ~ operator:

poseInv = ~pose
Poses can be converted to 4x4 transformation matrices with getModelMatrix, getViewMatrix, and getNormalMatrix. One can use these matrices when rendering to transform the vertices and normals of a model associated with the pose by passing the matrices to OpenGL. The ctypes property eliminates the need to copy data by providing pointers to data stored by instances of this class. This is useful for some Python OpenGL libraries which require matrices to be provided as pointers.
Bounding boxes can be given to poses by assigning a LibOVRBounds instance to the bounds attribute. Bounding boxes are used for visibility culling, to determine if a mesh associated with a pose is visible to the viewer and whether it should be drawn or not. This aids in reducing workload for the application by only rendering objects that are visible from a given eye's view.
Parameters: - pos (array_like) – Initial position vector (x, y, z).
- ori (array_like) – Initial orientation quaternion (x, y, z, w).
-
alignTo
(self, alignTo)¶ Align this pose to another point or pose.
This sets the orientation of this pose to one which orients the forward axis towards alignTo.
Parameters: alignTo (array_like or LibOVRPose) – Position vector [x, y, z] or pose to align to.
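Examples
Orient this pose so its forward axis points at a target position (a minimal sketch; myPose is assumed to be an existing LibOVRPose):

targetPose = LibOVRPose((0.0, 1.5, -2.0))
myPose.alignTo(targetPose.pos)  # -Z axis of myPose now points at the target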
-
apply
(self, v, ndarray out=None)¶ Apply a transform to a position vector.
Parameters: - v (array_like) – Vector to transform (x, y, z).
- out (ndarray, optional) – Optional output array. Must have dtype=float32 and shape=(3,).
Returns: Vector transformed by the pose’s position and orientation.
Return type: ndarray
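Examples
Transform a point from this pose's local space into world coordinates (a minimal sketch; myPose is assumed to be an existing LibOVRPose):

point = (0., 0., -1.)  # one meter ahead of the pose
pointInWorld = myPose.apply(point)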
-
at
¶ Forward vector of this pose (-Z is forward) (read-only).
Type: ndarray
-
bounds
¶ Bounding object associated with this pose.
-
copy
(self)¶ Create an independent copy of this object.
Returns: Copy of this pose. Return type: LibOVRPose
-
ctypes
¶ Pointers to matrix data.
This attribute provides a dictionary of pointers to cached matrix data to simplify passing data to OpenGL. This is particularly useful when using pyglet which accepts matrices as pointers. Dictionary keys are strings sharing the same name as the attributes whose data they point to.
Examples
Setting the model matrix:
glMatrixMode(GL_MODELVIEW)
glPushMatrix()
glMultTransposeMatrixf(myPose.ctypes['modelMatrix'])
# run draw commands ...
glPopMatrix()
If using fragment shaders, the matrix can be passed on to them as such:
# after the program was installed in the current rendering state via
# `glUseProgram` ...
loc = glGetUniformLocation(program, b"m_Model")
# `transpose` must be `True`
glUniformMatrix4fv(loc, 1, GL_TRUE, myPose.ctypes['modelMatrix'])
-
distanceTo
(self, v)¶ Distance to a point or pose from this pose.
Parameters: v (array_like) – Vector to transform (x, y, z). Returns: Distance to a point or LibOVRPose. Return type: float Examples
Get the distance between poses:
distance = thisPose.distanceTo(otherPose)
Get the distance to a point coordinate:
distance = thisPose.distanceTo([0.0, 0.0, 5.0])
Do something if the tracked right hand pose is within 0.5 meters of some object:
# use 'getTrackingState' instead for hand poses, just an example
handPose = getDevicePoses(TRACKED_DEVICE_TYPE_RTOUCH, absTime, latencyMarker=False)
# object pose
objPose = LibOVRPose((0.0, 1.0, -0.5))
if handPose.distanceTo(objPose) < 0.5:
    # do something here ...
Vary the touch controller’s vibration amplitude as a function of distance to some pose. As the hand gets closer to the point, the amplitude of the vibration increases:
dist = handPose.distanceTo(objPose)
vibrationRadius = 0.5
if dist < vibrationRadius:  # inside vibration radius
    amplitude = 1.0 - dist / vibrationRadius
    setControllerVibration(CONTROLLER_TYPE_RTOUCH, 'low', amplitude)
else:  # turn off vibration
    setControllerVibration(CONTROLLER_TYPE_RTOUCH, 'off')
-
duplicate
(self)¶ Create a deep copy of this object.
Same as calling copy.deepcopy on an instance.
Returns: An independent copy of this object. Return type: LibOVRPose
-
getAngleTo
(self, target, dir=(0., 0., -1.), bool degrees=True)¶ Get the relative angle to a point in world space from the dir vector in the local coordinate system of this pose.
Parameters: - target (LibOVRPose or array_like) – Pose or point [x, y, z].
- dir (array_like) – Direction vector [x, y, z] within the reference frame of the pose to compute angle from. Default is forward along the -Z axis (0., 0., -1.).
- degrees (bool, optional) – Return angle in degrees if True, else radians. Default is True.
Returns: Angle between the forward vector of this pose and the target. Values are always positive.
Return type: float
Examples
Get the angle in degrees between the pose’s -Z axis (default) and a point:
point = [2, 0, -4]
angle = myPose.getAngleTo(point)
Get the angle from the up direction to the point in radians:
upAxis = (0., 1., 0.)
angle = myPose.getAngleTo(point, dir=upAxis, degrees=False)
-
getAt
(self, ndarray out=None)¶ Get the at vector for this pose.
Parameters: out (ndarray or None) – Optional array to write values to. Must have shape (3,) and a float32 data type. Returns: The vector for at. Return type: ndarray Examples
Setting the listener orientation for 3D positional audio (PyOpenAL):
myListener.set_orientation((*myPose.getAt(), *myPose.getUp()))
See also
getUp()
- Get the up vector.
-
getAzimuthElevation
(self, target, bool degrees=True)¶ Get the azimuth and elevation angles of a point relative to this pose’s forward direction (0., 0., -1.).
Parameters: - target (LibOVRPose or array_like) – Pose or point [x, y, z].
- degrees (bool, optional) – Return angles in degrees if True, else radians. Default is True.
Returns: Azimuth and elevation angles of the target point. Values are signed.
Return type: tuple (float, float)
-
getModelMatrix
(self, bool inverse=False, ndarray out=None)¶ Get this pose as a 4x4 transformation matrix.
Parameters: - inverse (bool) – If True, return the inverse of the matrix.
- out (ndarray, optional) – Alternative place to write the matrix values. Must be a ndarray of shape (4, 4,) and have a data type of float32. Values are written assuming row-major order.
Returns: 4x4 transformation matrix.
Return type: ndarray
Notes
- This function creates a new ndarray with data copied from cache. Use the modelMatrix or inverseModelMatrix attributes for direct cache memory access.
Examples
Using model matrices with PyOpenGL (fixed-function):

glMatrixMode(GL_MODELVIEW)
glPushMatrix()
glMultTransposeMatrixf(myPose.getModelMatrix())
# run draw commands ...
glPopMatrix()
For Pyglet (which is the standard GL interface for PsychoPy), you need to convert the matrix to a ctypes pointer before passing it to glMultTransposeMatrixf:

M = myPose.getModelMatrix().ctypes.data_as(ctypes.POINTER(ctypes.c_float))
glMatrixMode(GL_MODELVIEW)
glPushMatrix()
glMultTransposeMatrixf(M)
# run draw commands ...
glPopMatrix()
If using fragment shaders, the matrix can be passed on to them as such:
M = myPose.getModelMatrix().ctypes.data_as(ctypes.POINTER(ctypes.c_float))
# after the program was installed in the current rendering state via
# `glUseProgram` ...
loc = glGetUniformLocation(program, b"m_Model")
glUniformMatrix4fv(loc, 1, GL_TRUE, M)  # `transpose` must be `True`
-
getNormalMatrix
(self, ndarray out=None)¶ Get a normal matrix used to transform normals within a fragment shader.
Parameters: out (ndarray, optional) – Alternative place to write the matrix to values. Must be a ndarray of shape (4, 4,) and have a data type of float32. Values are written assuming row-major order. Returns: 4x4 normal matrix. Return type: ndarray Notes
- This function creates a new ndarray with data copied from cache. Use the normalMatrix attribute for direct cache memory access.
-
getOri
(self, ndarray out=None)¶ Orientation quaternion X, Y, Z, W. Components X, Y, Z are imaginary and W is real.
The returned object is a NumPy array which references data stored in an internal structure (ovrPosef). The array is conformal with the internal data’s type (float32) and size (length 4).
Parameters: out (ndarray or None) – Optional array to write values to. Must have a float32 data type. Returns: Orientation quaternion of this pose. Return type: ndarray Notes
- The orientation quaternion should be normalized.
-
getOriAngle
(self, bool degrees=True)¶ Get the angle of this pose’s orientation.
Parameters: degrees (bool, optional) – Return angle in degrees. Default is True.
Returns: Angle of quaternion ori.
Return type: float
-
getOriAxisAngle
(self, degrees=True)¶ The axis and angle of rotation for this pose’s orientation.
Parameters: degrees (bool, optional) – Return angle in degrees. Default is True.
Returns: Axis and angle.
Return type: tuple (ndarray, float)
-
getPos
(self, ndarray out=None)¶ Position vector X, Y, Z.
The returned object is a NumPy array which contains a copy of the data stored in an internal structure (ovrPosef). The array is conformal with the internal data’s type (float32) and size (length 3).
Parameters: out (ndarray or None) – Optional array to write values to. Must have a float32 data type. Returns: Position coordinate of this pose. Return type: ndarray Examples
Get the position coordinates:
x, y, z = myPose.getPos()  # Python float literals
# ... or ...
pos = myPose.getPos()  # NumPy array shape=(3,) and dtype=float32
Write the position to an existing array by specifying out:
position = numpy.zeros((3,), dtype=numpy.float32)  # mind the dtype!
myPose.getPos(position)  # position now contains myPose.pos
You can also pass a view/slice to out:
coords = numpy.zeros((100, 3,), dtype=numpy.float32)  # big array
myPose.getPos(coords[42, :])  # row 42
-
getSwingTwist
(self, twistAxis)¶ Swing and twist decomposition of this pose’s rotation quaternion.
Where twist is a quaternion which rotates about twistAxis and swing is perpendicular to that axis. When multiplied, the quaternions return the original quaternion at ori.
Parameters: twistAxis (array_like) – World referenced twist axis [ax, ay, az]. Returns: Swing and twist quaternions [x, y, z, w]. Return type: tuple Examples
Get the swing and twist quaternions about the up direction of this pose:
swing, twist = myPose.getSwingTwist(myPose.up)
-
getUp
(self, ndarray out=None)¶ Get the ‘up’ vector for this pose.
Parameters: out (ndarray, optional) – Optional array to write values to. Must have shape (3,) and a float32 data type. Returns: The vector for up. Return type: ndarray Examples
Using the up vector with gluLookAt:
up = myPose.getUp()  # myPose.up also works
eye = myPose.pos
target = targetPose.pos  # some target pose
gluLookAt(*eye, *target, *up)
See also
getAt()
- Get the forward (at) vector.
-
getViewMatrix
(self, bool inverse=False, ndarray out=None)¶ Convert this pose into a view matrix.
Creates a view matrix which transforms points into eye space using the current pose as the eye position in the scene. Furthermore, you can use view matrices for rendering shadows if light positions are defined as LibOVRPose objects. Using calcEyePoses() and getEyeViewMatrix() is preferred when rendering VR scenes since features like visibility culling are not available otherwise.
Parameters: - inverse (bool, optional) – Return the inverse of the view matrix. Default is False.
- out (ndarray, optional) – Alternative place to write the matrix values. Must be a ndarray of shape (4, 4,) and have a data type of float32. Values are written assuming row-major order.
Returns: 4x4 view matrix derived from the pose.
Return type: ndarray
Notes
- This function creates a new ndarray with data copied from cache. Use the viewMatrix attribute for direct cache memory access.
Examples
Compute eye poses from a head pose and compute view matrices:
iod = 0.062  # 62 mm
headPose = LibOVRPose((0., 1.5, 0.))  # 1.5 meters up from origin
leftEyePose = LibOVRPose((-(iod / 2.), 0., 0.))
rightEyePose = LibOVRPose((iod / 2., 0., 0.))

# transform eye poses relative to head poses
leftEyeRenderPose = headPose * leftEyePose
rightEyeRenderPose = headPose * rightEyePose

# compute view matrices
eyeViewMatrix = [leftEyeRenderPose.getViewMatrix(),
                 rightEyeRenderPose.getViewMatrix()]
-
getYawPitchRoll
(self, LibOVRPose refPose=None, bool degrees=True, ndarray out=None)¶ Get the yaw, pitch, and roll of the orientation quaternion.
Parameters: - refPose (LibOVRPose, optional) – Reference pose to compute angles relative to. If None is specified, computed values are referenced relative to the world axes.
- degrees (bool, optional) – Return angle in degrees. Default is True.
- out (ndarray) – Alternative place to write yaw, pitch, and roll values. Must have shape (3,) and a float32 data type.
Returns: Yaw, pitch, and roll of the pose in degrees.
Return type: ndarray
Notes
- Uses OVR::Quatf.GetYawPitchRoll, which is part of the Oculus PC SDK.
-
interp
(self, LibOVRPose end, float s, bool fast=False)¶ Interpolate between poses.
Linear interpolation is used on position (Lerp) while the orientation has spherical linear interpolation (Slerp) applied.
Parameters: - end (LibOVRPose) – End pose.
- s (float) – Interpolation factor between interval 0.0 and 1.0.
- fast (bool, optional) – If True, use fast interpolation which is quicker but less accurate over larger distances.
Returns: Interpolated pose at s.
Return type: LibOVRPose
Notes
- Uses OVR::Posef.Lerp and OVR::Posef.FastLerp, which are part of the Oculus PC SDK.
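Examples
Get the pose half-way between two poses (a minimal sketch; poseStart and poseEnd are assumed to be existing LibOVRPose instances):

poseMid = poseStart.interp(poseEnd, 0.5)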
-
inverseModelMatrix
¶ Pose as a 4x4 homogeneous inverse transformation matrix.
-
inverseRotate
(self, v, ndarray out=None)¶ Inverse rotate a position vector.
Parameters: - v (array_like) – Vector to inverse rotate (x, y, z).
- out (ndarray, optional) – Optional output array. Must have dtype=float32 and shape=(3,).
Returns: Vector rotated by the pose’s inverse orientation.
Return type: ndarray
Notes
- Uses OVR::Vector3f.InverseRotate, which is part of the Oculus PC SDK.
-
inverseTransform
(self, v, ndarray out=None)¶ Inverse transform a position vector.
Parameters: - v (array_like) – Vector to transform (x, y, z).
- out (ndarray, optional) – Optional output array. Must have dtype=float32 and shape=(3,).
Returns: Vector transformed by the inverse of the pose’s position and orientation.
Return type: ndarray
Notes
- Uses OVR::Vector3f.InverseTransform, which is part of the Oculus PC SDK.
-
inverseTransformNormal
(self, v, ndarray out=None)¶ Inverse transform a normal vector.
Parameters: - v (array_like) – Vector to transform (x, y, z).
- out (ndarray, optional) – Optional output array. Must have dtype=float32 and shape=(3,).
Returns: Normal vector transformed by the inverse of the pose’s position and orientation.
Return type: ndarray
Notes
- Uses OVR::Vector3f.InverseTransformNormal, which is part of the Oculus PC SDK.
-
inverseViewMatrix
¶ View matrix inverse.
-
invert
(self)¶ Invert this pose.
Notes
- Uses OVR::Posef.Inverted, which is part of the Oculus PC SDK.
-
inverted
(self)¶ Get the inverse of the pose.
Returns: Inverted pose. Return type: LibOVRPose Notes
- Uses OVR::Posef.Inverted, which is part of the Oculus PC SDK.
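Examples
Get the inverse of a pose without modifying the original (a minimal sketch; myPose is assumed to be an existing LibOVRPose):

poseInv = myPose.inverted()  # equivalent to ~myPose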
-
isEqual
(self, LibOVRPose pose, float tolerance=1e-5)¶ Check if poses are close to equal in position and orientation.
Same as using the equality operator (==) on poses, but you can specify an arbitrary value for tolerance.
Parameters: - pose (LibOVRPose) – The other pose.
- tolerance (float, optional) – Tolerance for the comparison, default is 1e-5 as defined in OVR_MATH.h.
Returns: True if pose components are within tolerance from this pose.
Return type: bool
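Examples
Compare two poses using a looser tolerance than the default (a minimal sketch; poseA and poseB are assumed to be existing LibOVRPose instances):

nearlyEqual = poseA.isEqual(poseB, tolerance=1e-3)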
-
isVisible
(self, int eye)¶ Check if this pose is visible to a given eye.
Visibility testing is done using the current eye render pose for eye. This pose must have a valid bounding box assigned to bounds. If not, this method will always return True. See cullPose() for more information about the implementation of visibility culling. Note this function only works if there is an active VR session.
Parameters: eye (int) – Eye index. Use either EYE_LEFT or EYE_RIGHT.
Returns: True if this pose's bounding box intersects the FOV of the specified eye. Returns False if the pose's bounding box does not intersect the viewing frustum for eye or if a VR session has not been started.
Return type: bool
Examples
Check if a pose should be culled (needs to be done for each eye):
if cullModel.isVisible(EYE_LEFT):
    # ... OpenGL calls to draw the model here ...
-
modelMatrix
¶ Pose as a 4x4 homogeneous transformation matrix.
-
normalMatrix
¶ Normal matrix for transforming normals of meshes associated with poses.
-
normalize
(self)¶ Normalize this pose.
Notes
Uses OVR::Posef.Normalize, which is part of the Oculus PC SDK.
-
ori
¶ Orientation quaternion [X, Y, Z, W].
Type: ndarray
-
pos
¶ Position vector [X, Y, Z].
Examples
Set the position of the pose:
myPose.pos = [0., 0., -1.5]
Get the x, y, and z coordinates of a pose:
x, y, z = myPose.pos
The ndarray returned by pos directly references the position field data in the pose data structure (ovrPosef). Updating values will directly edit the values in the structure. For instance, you can specify a component of a pose’s position:
myPose.pos[2] = -10.0 # z = -10.0
Assigning pos to a name will create a reference to that ndarray which can edit values in the structure:
p = myPose.pos
p[1] = 1.5  # sets the Y position of 'myPose' to 1.5
Type: ndarray
-
raycastPose
(self, LibOVRPose targetPose, rayDir=(0., 0., -1.), float maxRange=0.0)¶ Raycast a pose’s bounding box.
This function tests if and where a ray projected from the position of this pose in the direction of rayDir intersects the bounding box of another LibOVRPose. The bounding box of the target object will be oriented by the pose it's associated with.
Parameters: - targetPose (LibOVRPose) – Target pose with bounding box.
- rayDir (array_like) – Vector specifying the direction the ray should be projected. This direction is in the reference of the pose.
- maxRange (float) – Length of the ray. If 0.0, the ray will be assumed to have infinite length.
Returns: Position in scene coordinates the ray intersects the bounding box nearest to this pose. Returns None if there is no intersect or the target class does not have a valid bounding box.
Return type: ndarray
Examples
Test where a ray intersects another pose’s bounding box and create a pose object there:
intercept = thisPose.raycastPose(targetPose)
if intercept is not None:
    interceptPose = LibOVRPose(intercept)
Check if a user is touching a bounding box with their right index finger:
fingerLength = 0.1  # 10 cm
# check if making a pointing gesture with their right hand
if getTouch(CONTROLLER_TYPE_RTOUCH, TOUCH_RINDEXPOINTING):
    isTouching = handPose.raycastPose(targetPose, maxRange=fingerLength)
    if isTouching is not None:
        # run some code here for when touching ...
    else:
        # run some code here for when not touching ...
-
raycastSphere
(self, targetPose, float radius=0.5, rayDir=(0., 0., -1.), float maxRange=0.0)¶ Raycast to a sphere.
Project an invisible ray of finite or infinite length from this pose in rayDir and check if it intersects with the targetPose bounding sphere.
This method allows for very basic interaction between objects represented by poses in a scene, including tracked devices.
Specifying maxRange as >0.0 casts a ray of finite length in world units. The distance between the target and the ray origin is checked prior to casting the ray; the test automatically fails if the ray can never reach the edge of the bounding sphere centered about targetPose. This avoids having to do the costly transformations required for picking.
This raycast implementation can only determine if contact is being made with the object’s bounding sphere, not where on the object the ray intersects. This method might not work for irregular or elongated objects since bounding spheres may not approximate those shapes well. In such cases, one may use multiple spheres at different locations and radii to pick the same object.
Parameters: - targetPose (array_like) – Coordinates of the center of the target sphere (x, y, z).
- radius (float, optional) – The radius of the target.
- rayDir (array_like, optional) – Vector indicating the direction for the ray (default is -Z).
- maxRange (float, optional) – The maximum range of the ray. Ray testing will fail automatically if the target is out of range. Ray is infinite if maxRange=0.0.
Returns: True if the ray intersects anywhere on the bounding sphere, False in every other condition.
Return type: bool
Examples
Basic example to check if the HMD is aligned to some target:
targetPose = LibOVRPose((0.0, 1.5, -5.0))
targetRadius = 0.5  # 50 cm
isAligned = hmdPose.raycastSphere(targetPose.pos, radius=targetRadius)
Check if someone is touching a target with their finger when making a pointing gesture while providing haptic feedback:
targetPose = LibOVRPose((0.0, 1.5, -0.25))
targetRadius = 0.025  # 2.5 cm
fingerLength = 0.1  # 10 cm

# check if making a pointing gesture with their right hand
isPointing = getTouch(CONTROLLER_TYPE_RTOUCH, TOUCH_RINDEXPOINTING)
if isPointing:
    # do raycasting operation
    isTouching = handPose.raycastSphere(
        targetPose.pos, radius=targetRadius, maxRange=fingerLength)
    if isTouching:
        # do something here, like make the controller vibrate
        setControllerVibration(CONTROLLER_TYPE_RTOUCH, 'low', 0.5)
    else:
        # stop vibration if no longer touching
        setControllerVibration(CONTROLLER_TYPE_RTOUCH, 'off')
-
rotate
(self, v, ndarray out=None)¶ Rotate a position vector.
Parameters: - v (array_like) – Vector to rotate.
- out (ndarray, optional) – Optional output array. Must have dtype=float32 and shape=(3,).
Returns: Vector rotated by the pose’s orientation.
Return type: ndarray
Notes
- Uses OVR::Posef.Rotate, which is part of the Oculus PC SDK.
-
setIdentity
(self)¶ Clear this pose’s translation and orientation.
-
setOri
(self, ori)¶ Set the orientation of the pose in a scene.
Parameters: ori (array_like) – Orientation quaternion [X, Y, Z, W].
-
setOriAxisAngle
(self, axis, float angle, bool degrees=True)¶ Set the orientation of this pose using an axis and angle.
Parameters: - axis (array_like) – Axis of rotation [x, y, z].
- angle (float) – Angle of rotation.
- degrees (bool, optional) – Specify True if angle is in degrees, else radians. Default is True.
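Examples
Set the orientation to a 90 degree rotation about the +Y (up) axis (a minimal sketch; myPose is assumed to be an existing LibOVRPose):

myPose.setOriAxisAngle((0., 1., 0.), 90.0)  # angle given in degrees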
-
setPos
(self, pos)¶ Set the position of the pose in a scene.
Parameters: pos (array_like) – Position vector [X, Y, Z].
-
transform
(self, v, ndarray out=None)¶ Transform a position vector.
Parameters: - v (array_like) – Vector to transform [x, y, z].
- out (ndarray, optional) – Optional output array. Must have dtype=float32 and shape=(3,).
Returns: Vector transformed by the pose’s position and orientation.
Return type: ndarray
Notes
- Uses OVR::Vector3f.Transform, which is part of the Oculus PC SDK.
-
transformNormal
(self, v, ndarray out=None)¶ Transform a normal vector.
Parameters: - v (array_like) – Vector to transform (x, y, z).
- out (ndarray, optional) – Optional output array. Must have dtype=float32 and shape=(3,).
Returns: Normal vector transformed by the pose’s position and orientation.
Return type: ndarray
Notes
- Uses OVR::Vector3f.TransformNormal, which is part of the Oculus PC SDK.
-
translate
(self, v, ndarray out=None)¶ Translate a position vector.
Parameters: - v (array_like) – Vector to translate [x, y, z].
- out (ndarray, optional) – Optional output array. Must have dtype=float32 and shape=(3,).
Returns: Vector translated by the pose’s position.
Return type: ndarray
Notes
- Uses OVR::Vector3f.Translate, which is part of the Oculus PC SDK.
-
turn
(self, axis, float angle, bool degrees=True)¶ Turn (or rotate) this pose about an axis. Successive calls of turn are cumulative.
Parameters: - axis (array_like) – Axis of rotation [x, y, z].
- angle (float) – Angle of rotation.
- degrees (bool, optional) – Specify True if angle is in degrees, else radians. Default is True.
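Examples
Rotate a pose about its up axis, accumulating 45 degrees per call (a minimal sketch; myPose is assumed to be an existing LibOVRPose):

myPose.turn((0., 1., 0.), 45.0)  # successive calls keep adding 45 degrees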
-
up
¶ Up vector of this pose (+Y is up) (read-only).
Type: ndarray
-
viewMatrix
¶ View matrix.
-
class
psychxr.drivers.libovr.
LibOVRPoseState
(thePose=None, linearVelocity=(0., 0., 0.), angularVelocity=(0., 0., 0.), linearAcceleration=(0., 0., 0.), angularAcceleration=(0., 0., 0.), double timeInSeconds=0.0)¶ Class for representing rigid body poses with additional state information.
Pose states contain the pose of the tracked body, but also angular and linear motion derivatives experienced by the pose. The pose within a state can be accessed via the
thePose
attribute.Velocity and acceleration for linear and angular motion can be used to compute forces applied to rigid bodies and predict the future positions of objects (see
timeIntegrate()
). You can create LibOVRPoseState objects using data from other sources, such as nDOF IMUs for use with VR environments.Parameters: - thePose (LibOVRPose, list, tuple or None) – Rigid body pose this state refers to. Can be a LibOVRPose pose
instance or a tuple/list of a position coordinate (x, y, z) and
orientation quaternion (x, y, z, w). If
None
the pose will be initialized as an identity pose. - linearVelocity (array_like) – Linear acceleration vector [vx, vy, vz] in meters/sec.
- angularVelocity (array_like) – Angular velocity vector [vx, vy, vz] in radians/sec.
- linearAcceleration (array_like) – Linear acceleration vector [ax, ay, az] in meters/sec^2.
- angularAcceleration (array_like) – Angular acceleration vector [ax, ay, az] in radians/sec^2.
- timeInSeconds (float) – Time in seconds this state refers to.
-
angularAcceleration
¶ Angular acceleration vector in radians/s^2.
-
angularVelocity
¶ Angular velocity vector in radians/sec.
-
duplicate
(self)¶ Create a deep copy of this object.
Same as calling copy.deepcopy on an instance.
Returns: An independent copy of this object. Return type: LibOVRPoseState
-
linearAcceleration
¶ Linear acceleration vector in meters/s^2.
-
linearVelocity
¶ Linear velocity vector in meters/sec.
-
thePose
¶ Rigid body pose.
-
timeInSeconds
¶ Absolute time this data refers to in seconds.
-
timeIntegrate
(self, float dt)¶ Time integrate rigid body motion derivatives referenced by the current pose.
Parameters: dt (float) – Time delta in seconds. Returns: Pose at dt. Return type: LibOVRPose Examples
Time integrate a pose for 20 milliseconds (note the returned object is a LibOVRPose, not another LibOVRPoseState):

newPose = oldPose.timeIntegrate(0.02)
pos, ori = newPose.posOri  # extract components
Time integration can be used to predict the pose of an object at HMD V-Sync if velocity and acceleration are known. Usually we would pass the predicted time to getDevicePoses or getTrackingState for a more robust estimate of HMD pose at predicted display time. However, in most cases the following will yield the same position and orientation as LibOVR within a few decimal places:
tsec = timeInSeconds()
ptime = getPredictedDisplayTime(frame_index)

_, headPoseState = getDevicePoses(
    [TRACKED_DEVICE_TYPE_HMD],
    absTime=tsec,  # not the predicted time!
    latencyMarker=True)

dt = ptime - tsec  # time difference from now and v-sync

headPoseAtVsync = headPoseState[0].timeIntegrate(dt)
calcEyePoses(headPoseAtVsync)
-
class
psychxr.drivers.libovr.
LibOVRTrackingState
¶ Class for tracking state information.
Instances of this class are returned by getTrackingState() calls, with data referenced to the specified absolute time. Pose states with tracked position and orientation, as well as first and second order motion derivatives, for the head and hands can be accessed through attributes headPose and handPoses.
Status flags describe the status of sensor tracking when a tracking state was sampled, accessible for the head and hands through the statusFlags and handStatusFlags attributes, respectively. You can check each status bit by using the following values:
- STATUS_ORIENTATION_TRACKED: Orientation is tracked/reported.
- STATUS_ORIENTATION_VALID: Orientation is valid for application use.
- STATUS_POSITION_TRACKED: Position is tracked/reported.
- STATUS_POSITION_VALID: Position is valid for application use.
As of SDK 1.39, *_VALID flags should be used to determine if tracking data is usable by the application.
-
calibratedOrigin
¶ Pose of the calibrated origin.
This pose is used to find the calibrated origin in space if recenterTrackingOrigin() or specifyTrackingOrigin() was called. If those functions were never called during a session, this will return an identity pose, which reflects the tracking origin type.
-
handOrientationValid
¶ Hand orientation tracking is valid (bool, bool).
Examples
Check if orientation is valid for the right hand’s tracking state:
rightHandOriTracked = trackingState.handOrientationValid[HAND_RIGHT]
-
handPoses
¶ Hand pose states (LibOVRPoseState, LibOVRPoseState).
Examples
Get the left and right hand pose states:
leftHandPoseState, rightHandPoseState = trackingState.handPoses
-
handPositionValid
¶ Hand position tracking is valid (bool, bool).
Examples
Check if position is valid for the right hand’s tracking state:
rightHandPosValid = trackingState.handPositionValid[HAND_RIGHT]
-
handStatusFlags
¶ Hand tracking status flags (int, int).
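Examples
Check if the right hand's position was tracked when this state was sampled (a minimal sketch; trackingState is assumed to come from getTrackingState()):

rightHandFlags = trackingState.handStatusFlags[HAND_RIGHT]
rightHandPosTracked = (rightHandFlags & STATUS_POSITION_TRACKED) == STATUS_POSITION_TRACKED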
-
headPose
¶ Head pose state (LibOVRPoseState).
-
orientationValid
¶ True if orientation tracking is valid.
-
positionValid
¶ True if position tracking is valid.
-
statusFlags
¶ Head tracking status flags (int).
Examples
Check if orientation was tracked and data is valid for use:
# check if orientation is tracked and valid
statusFlags = STATUS_ORIENTATION_TRACKED | STATUS_ORIENTATION_VALID
if (trackingState.statusFlags & statusFlags) == statusFlags:
    print("Orientation is tracked and valid")
-
class
psychxr.drivers.libovr.
LibOVRBounds
(extents=None)¶ Class for constructing and representing 3D axis-aligned bounding boxes.
A bounding box is a construct which represents a 3D rectangular volume about some pose, defined by its minimum and maximum extents in the reference frame of the pose. The axes of the bounding box are aligned to the axes of the world or the associated pose.
Bounding boxes are primarily used for visibility testing; to determine if the extents of an object associated with a pose (e.g. the vertices of a model) fall completely outside of the viewing frustum. If so, the model can be culled during rendering to avoid wasting CPU/GPU resources on objects not visible to the viewer. See cullPose() for more information.
Parameters: extents (tuple, optional) – Minimum and maximum extents of the bounding box (mins, maxs) where mins and maxs are specified as coordinates [x, y, z]. If no extents are specified, the bounding box will be invalid until defined.
Examples
Create a bounding box and add it to a pose:
# minimum and maximum extents of the bounding box
mins = (-.5, -.5, -.5)
maxs = (.5, .5, .5)
bounds = (mins, maxs)

# create the bounding box and add it to a pose
bbox = LibOVRBounds(bounds)
modelPose = LibOVRPose()
modelPose.bounds = bbox
-
addPoint
(self, point)¶ Resize the bounding box to encompass a given point. Calling this function for each vertex of a model will create an optimal bounding box for it.
Parameters: point (array_like) – Vector/coordinate to add [x, y, z]. See also
fit()
- Fit a bounding box to enclose a list of points.
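Examples
Build a bounding box by adding model vertices one at a time (a minimal sketch; vertices is assumed to be a list of [x, y, z] coordinates):

bbox = LibOVRBounds()
for vertex in vertices:
    bbox.addPoint(vertex)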
-
clear
(self)¶ Clear the bounding box.
-
extents
¶ The extents of the bounding box (mins, maxs).
-
fit
(self, points, bool clear=True)¶ Fit an axis aligned bounding box to enclose specified points. The resulting bounding box is guaranteed to enclose all points, however volume is not necessarily minimized or optimal.
Parameters: - points (array_like) – 2D array of points [x, y, z] to fit, can be a list of vertices from a 3D model associated with the bounding box.
- clear (bool, optional) – Clear the bounding box prior to fitting. If False, the current bounding box will be re-sized to fit new points.
Examples
Create a bounding box around vertices specified in a list:
# model vertices
vertices = [[-1.0, -1.0, 0.0],
            [-1.0, 1.0, 0.0],
            [1.0, 1.0, 0.0],
            [1.0, -1.0, 0.0]]

# create an empty bounding box
bbox = LibOVRBounds()
bbox.fit(vertices)

# associate the bounding box to a pose
modelPose = LibOVRPose()
modelPose.bounds = bbox
-
isValid
¶ True
if a bounding box is valid. Bounding boxes are valid if all dimensions of mins are less than each of maxs which is the case afterclear()
is called.If a bounding box is invalid,
cullPose()
will always returnTrue
.
-
maxs
¶ Point defining the maximum extent of the bounding box.
-
mins
¶ Point defining the minimum extent of the bounding box.
-
-
class
psychxr.drivers.libovr.
LibOVRHmdInfo
¶ Class for general HMD information and capabilities. An instance of this class is returned by calling
getHmdInfo()
.-
defaultEyeFov
¶ Default or recommended eye field-of-views (FOVs) provided by the API.
Returns: Pair of left and right eye FOVs specified as tangent angles [Up, Down, Left, Right]. Return type: tuple (ndarray, ndarray)
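Examples
One might apply the recommended FOVs before rendering (a minimal sketch; assumes a session has been created):

hmdInfo = getHmdInfo()
for eye, fov in enumerate(hmdInfo.defaultEyeFov):
    setEyeRenderFov(eye, fov)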
-
firmwareVersion
¶ Firmware version for this device.
Returns: Firmware version (major, minor). Return type: tuple (int, int)
-
hasMagYawCorrection
¶ True
if this HMD supports yaw drift correction.
-
hasOrientationTracking
¶ True
if the HMD is capable of tracking orientation.
-
hasPositionTracking
¶ True
if the HMD is capable of tracking position.
-
hid
¶ USB human interface device class identifiers.
Returns: USB HIDs (vendor, product). Return type: tuple (int, int)
-
hmdType
¶ HMD type currently used.
Valid values returned are HMD_NONE, HMD_DK1, HMD_DKHD, HMD_DK2, HMD_CB, HMD_OTHER, HMD_E3_2015, HMD_ES06, HMD_ES09, HMD_ES11, HMD_CV1, HMD_RIFTS, HMD_QUEST, HMD_QUEST2.
-
isDebugDevice
¶ True
if the HMD is a virtual debug device.
-
manufacturer
¶ Get the device manufacturer name.
Returns: Manufacturer name string (utf-8). Return type: str
-
maxEyeFov
¶ Maximum eye field-of-views (FOVs) provided by the API.
Returns: Pair of left and right eye FOVs specified as tangent angles in radians [Up, Down, Left, Right]. Return type: tuple (ndarray, ndarray)
-
productName
¶ Get the product name for this device.
Returns: Product name string (utf-8). Return type: str
-
refreshRate
¶ Nominal refresh rate in Hertz of the display.
Returns: Refresh rate in Hz. Return type: ndarray
-
resolution
¶ Horizontal and vertical resolution of the display in pixels.
Returns: Resolution of the display [w, h]. Return type: ndarray
-
symmetricEyeFov
¶ Symmetric field-of-views (FOVs) for mono rendering.
By default, the Rift uses off-axis FOVs. These frustum parameters make it difficult to converge monoscopic stimuli.
Returns: Pair of left and right eye FOVs specified as tangent angles in radians [Up, Down, Left, Right]. Both FOV objects will have the same values. Return type: tuple (ndarray, ndarray)
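Examples
Using symmetric FOVs instead of the default off-axis ones (a minimal sketch; assumes a session has been created):

hmdInfo = getHmdInfo()
leftFov, rightFov = hmdInfo.symmetricEyeFov
setEyeRenderFov(EYE_LEFT, leftFov)
setEyeRenderFov(EYE_RIGHT, rightFov)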
-
-
class
psychxr.drivers.libovr.
LibOVRHmdColorSpace
¶ Class for HMD color space data.
This class is used to store color space information related to the HMD. The color space value is a symbolic constant accessed through the colorSpace property.
As of version 23.0 of the Oculus PC SDK, the API provides functions for specifying and retrieving data about the color space of the display. This is needed because the chromaticity coordinates of RGB primaries and the white points differ between models, causing differences in perceived color when content authored for one platform is viewed on another. To deal with this, the API allows you to specify the color space the content was intended for and the driver will remap colors to be best represented on the current display.
When developing an application to run across multiple HMD devices, the manufacturer recommends that you target the CV1 or Quest HMDs since the color gamut on those displays is wider than on other HMDs in their product lineup (such as the Rift S).
PsychXR provides additional information about these color spaces, such as the chromaticity coordinates used by various devices in the Oculus(tm) product lineup. These values can be accessed using properties associated to instances of this class.
-
blue
¶ Chromaticity coordinate for the blue primary (CIE 1931 xy) used by the display (ndarray). This is set by the value of LibOVRHmdColorSpace.colorSpace.
-
colorSpace
¶ The color space (int). A symbolic constant representing a color space.
Valid values returned are COLORSPACE_UNKNOWN, COLORSPACE_UNMANAGED, COLORSPACE_RIFT_CV1, COLORSPACE_RIFT_S, COLORSPACE_QUEST, COLORSPACE_REC_2020, COLORSPACE_REC_709, COLORSPACE_P3 or COLORSPACE_ADOBE_RGB.
Notes
If colorSpace is set to COLORSPACE_UNMANAGED, the chromaticity coordinates will be set to the defaults for the current HMD. For the DK2, Rec. 709 coordinates will be used (COLORSPACE_REC_709).
-
static
getRGBPrimaries
(int colorSpace)¶ Get chromaticity coordinates (CIE 1931 xy) RGB primaries for a given color model.
Parameters: colorSpace (int) – Symbolic constant representing a color space (e.g., COLORSPACE_RIFT_CV1
).Returns: 3x2 array of RGB primaries corresponding to the specified color model. Return type: ndarray
-
static
getWhitePoint
(int colorSpace)¶ Get chromaticity coordinates (CIE 1931 xy) of the white point for a given color model.
Parameters: colorSpace (int) – Symbolic constant representing a color space (e.g., COLORSPACE_RIFT_CV1
).Returns: Length 2 array of the white point corresponding to the specified color model. Return type: ndarray
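Examples
Retrieve the RGB primaries and white point for the CV1's color space (a minimal sketch; both methods are static, so no instance is needed):

primaries = LibOVRHmdColorSpace.getRGBPrimaries(COLORSPACE_RIFT_CV1)  # 3x2 array
whitePoint = LibOVRHmdColorSpace.getWhitePoint(COLORSPACE_RIFT_CV1)  # length 2 array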
-
green
¶ Chromaticity coordinate for the green primary (CIE 1931 xy) used by the display (ndarray). This is set by the value of LibOVRHmdColorSpace.colorSpace.
-
red
¶ Chromaticity coordinate for the red primary (CIE 1931 xy) used by the display (ndarray). This is set by the value of LibOVRHmdColorSpace.colorSpace.
-
whitePoint
¶ Chromaticity coordinate for the white point (CIE 1931 xy) used by the display (ndarray). This is set by the value of LibOVRHmdColorSpace.colorSpace.
-
-
class
psychxr.drivers.libovr.
LibOVRTrackerInfo
¶ Class for storing tracker (sensor) information such as pose, status, and camera frustum information. This object is returned by calling
getTrackerInfo()
. Attributes of this class are read-only.-
farZ
¶ Far clipping plane of the sensor frustum in meters (read-only).
-
horizontalFov
¶ Horizontal FOV of the sensor in radians (read-only).
-
isConnected
¶ True if the sensor is connected and available (read-only).
-
isPoseTracked
¶ True if the sensor has a valid pose (read-only).
-
leveledPose
¶ Gravity aligned pose of the sensor (read-only).
-
nearZ
¶ Near clipping plane of the sensor frustum in meters (read-only).
-
pose
¶ The pose of the sensor (read-only).
-
trackerIndex
¶ Tracker index this object refers to (read-only).
-
verticalFov
¶ Vertical FOV of the sensor in radians (read-only).
-
-
class
psychxr.drivers.libovr.
LibOVRSessionStatus
¶ Class for storing session status information. An instance of this class is returned when
getSessionStatus()
is called.
One can check if there was a status change between calls of getSessionStatus() by using the == and != operators on the returned LibOVRSessionStatus instances.
-
depthRequested
¶ True
if a depth texture is requested.
Notes
- This feature is currently unused by PsychXR.
-
displayLost
¶ True
if the display was lost.
If this occurs, the HMD was disconnected and the current session is invalid. You need to destroy all resources associated with the current session and call
create()
again. Alternatively, you can raise an error and shut down the application.
-
hasInputFocus
¶ True
if the application has input focus.
If the application has focus, the statistics presented by the performance HUD will reflect the current application's frame statistics.
-
hmdMounted
¶ True
if the HMD is being worn on the user’s head.
-
hmdPresent
¶ True
if the HMD is present.
-
isVisible
¶ True
if the application has focus and is visible in the HMD.
-
overlayPresent
¶ True
if the system UI is visible.
-
shouldQuit
¶ True
if the application was signaled to quit.
This can occur if the user requests the application exit through the system UI menu. You can ignore this flag if needed.
-
shouldRecenter
¶ True
if the application was signaled to recenter.
This happens when the user requests the application recenter the VR scene on their current physical location through the system UI. You can ignore this request or clear it by calling
clearShouldRecenterFlag()
.
-
-
class
psychxr.drivers.libovr.
LibOVRBoundaryTestResult
¶ Class for boundary collision test data. An instance of this class is returned when
testBoundary()
is called.-
closestDistance
¶ Distance to the closest boundary point in meters.
-
closestPoint
¶ Closest point on the boundary surface.
-
closestPointNormal
¶ Unit normal of the closest boundary surface.
-
isTriggering
¶ True
if the play area boundary is triggering. Since the boundary fades-in, it might not be perceptible when this is called.
-
-
class
psychxr.drivers.libovr.
LibOVRPerfStatsPerCompositorFrame
¶ Class for frame performance statistics per compositor frame.
Instances of this class are returned by calling
getPerfStats()
and accessing theLibOVRPerfStats.frameStats
field of the returnedLibOVRPerfStats
instance.Data contained in this class provide information about compositor performance. Metrics include motion-to-photon latency, dropped frames, and elapsed times of various stages of frame processing to the vertical synchronization (V-Sync) signal of the HMD.
Calling
resetFrameStats()
will reset integer fields of this class in successive calls togetPerfStats()
.-
appCpuElapsedTime
¶ Time in seconds the CPU spent between calls of endFrame(). From the point when endFrame() releases control back to the application, to the next time it is called.
-
appDroppedFrameCount
¶ If endFrame() is not called on time, this will increment (i.e. missed HMD vertical sync deadline).
Examples
Check if the application dropped a frame:
framesDropped = (frameStats.frameStats[0].appDroppedFrameCount >
                 lastFrameStats.frameStats[0].appDroppedFrameCount)
-
appFrameIndex
¶ Index increments after each call to
endFrame()
.
-
appGpuElapsedTime
¶ Time in seconds the GPU spent between calls of
endFrame()
.
-
appMotionToPhotonLatency
¶ Motion-to-photon latency in seconds computed using the marker set by
getTrackingState()
or the sensor sample time set bysetSensorSampleTime()
.
-
appQueueAheadTime
¶ Queue-ahead time in seconds. If >11 ms, the CPU is outpacing the GPU workload by 1 frame.
-
aswActivatedToggleCount
¶ How many frames ASW activated during the runtime of this application.
-
aswFailedFrameCount
¶ Number of frames the compositor failed to present extrapolated frames using ASW.
-
aswIsActive
¶ True
if Asynchronous Space Warp (ASW) was active this frame.
-
aswPresentedFrameCount
¶ Number of frames the compositor extrapolated using ASW.
-
compositorCpuElapsedTime
¶ Time in seconds the compositor spends on the CPU.
-
compositorCpuStartToGpuEndElapsedTime
¶ Time in seconds between the point the compositor executes and completes distortion/timewarp. Value is -1.0 if GPU time is not available.
-
compositorDroppedFrameCount
¶ Number of frames dropped by the compositor. This can happen spontaneously for reasons not related to application performance.
-
compositorFrameIndex
¶ Increments when the compositor completes a distortion pass, happens regardless if
endFrame()
was called late.
-
compositorGpuElapsedTime
¶ Time in seconds the compositor spends on the GPU.
-
compositorGpuEndToVsyncElapsedTime
¶ Time in seconds left between when the compositor completes and the target vertical synchronization (v-sync) on the HMD.
-
compositorLatency
¶ Motion-to-photon latency of the compositor, which includes the latency of ‘timewarp’ needed to correct for application latency and dropped application frames.
-
hmdVsyncIndex
¶ Increments every HMD vertical sync signal.
-
timeToVsync
¶ Total time elapsed from when CPU control is handed off to the compositor to HMD vertical synchronization signal (V-Sync).
-
-
class
psychxr.drivers.libovr.
LibOVRPerfStats
¶ Class for frame performance statistics.
Instances of this class are returned by calling
getPerfStats()
.-
adaptiveGpuPerformanceScale
¶ Adaptive performance scale value. This value ranges between 0.0 and 1.0. If the application is taking up too many GPU resources, this value will be less than 1.0, indicating the application needs to throttle GPU usage somehow to maintain performance. If the value is 1.0, the GPU is being utilized the correct amount for the application.
-
anyFrameStatsDropped
¶ True if compositor frame statistics have been dropped. This occurs if getPerfStats() is called at a rate less than 1/5th the refresh rate of the HMD. You can obtain the refresh rate for your model of HMD by calling getHmdInfo() and accessing the LibOVRHmdInfo.refreshRate field of the returned LibOVRHmdInfo instance.
-
aswIsAvailable
¶ True
if ASW is enabled.
-
frameStats
¶ Performance stats per compositor frame. Statistics are in reverse chronological order where the first index is the most recent. Only indices 0 to
LibOVRPerfStats.frameStatsCount
are valid.
-
frameStatsCount
¶ Number of compositor frame statistics available. The maximum number of frame statistics is 5. If 1 is returned, the application is calling
getFrameStats()
at a rate equal to or greater than the refresh rate of the display.
-
visibleProcessId
¶ Visible process ID.
Since performance stats can be obtained for any application running on the LibOVR runtime that has focus, this value should equal the current process ID returned by
os.getpid()
to ensure the statistics returned are for the current application.Examples
Check if frame statistics are for the present PsychXR application:
perfStats = getPerfStats()
if perfStats.visibleProcessId == os.getpid():
    # has focus, performance stats are for this application
-
-
class
psychxr.drivers.libovr.
LibOVRHapticsInfo
¶ Class for touch haptics engine information.
-
queueMinSizeToAvoidStarvation
¶ Queue size required to prevent starving the haptics engine.
-
sampleRateHz
¶ Haptics engine frequency/sample-rate.
-
sampleTime
¶ Time in seconds per sample. You can compute the total playback time of a haptics buffer with the formula
sampleTime * samplesCount
.
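For instance, one might compute the total playback duration of a haptics buffer using the formula above (a minimal sketch; hbuff is assumed to be an existing LibOVRHapticsBuffer and a Touch controller is connected):

hapticsInfo = getHapticsInfo(CONTROLLER_TYPE_RTOUCH)
playbackTime = hapticsInfo.sampleTime * hbuff.samplesCount  # seconds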
-
submitMaxSamples
¶ Maximum number of samples that can be sent to the haptics engine.
-
submitMinSamples
¶ Minimum number of samples that can be sent to the haptics engine.
-
submitOptimalSamples
¶ Optimal number of samples for the haptics engine.
-
-
class
psychxr.drivers.libovr.
LibOVRHapticsBuffer
(buffer)¶ Class for haptics buffer data for controller vibration.
Instances of this class store a buffer of vibration amplitude values which can be passed to the haptics engine for playback using the submitControllerVibration() function. Samples are stored as a 1D array of 32-bit floating-point values ranging between 0.0 and 1.0, with a maximum length of HAPTICS_BUFFER_SAMPLES_MAX - 1. You can access this buffer through the samples attribute.
One can use NumPy functions to generate samples for the haptics buffer. Here is an example where amplitude ramps down over playback:
samples = np.linspace(
    1.0, 0.0, num=HAPTICS_BUFFER_SAMPLES_MAX-1, dtype=np.float32)
hbuff = LibOVRHapticsBuffer(samples)
# vibrate right Touch controller
submitControllerVibration(CONTROLLER_TYPE_RTOUCH, hbuff)
For information about the haptics engine, such as sampling frequency, call
getHapticsInfo()
and inspect the returnedLibOVRHapticsInfo
object.Parameters: buffer (array_like) – Buffer of samples. Must be a 1D array of floating point values between 0.0 and 1.0. If an ndarray with dtype float32 is specified, the buffer will be set without copying. -
samples
¶ Haptics buffer samples. Each sample specifies the amplitude of vibration at a given point of playback. Must have a length less than
HAPTICS_BUFFER_SAMPLES_MAX
.
Warning
Do not change the value of samples during haptic buffer playback. This may crash the application. Check the playback status of the haptics engine before setting the array.
-
samplesCount
¶ Number of haptic buffer samples stored. This value will always be less than
HAPTICS_BUFFER_SAMPLES_MAX
.
-
Functions¶
-
psychxr.drivers.libovr.
success
(int result)¶ Check if an API return indicates success.
Returns: True
if the API call was successful (result > 0).Return type: bool
-
psychxr.drivers.libovr.
failure
(int result)¶ Check if an API return indicates failure (error).
Returns: True
if API call returned an error (result < 0).Return type: bool
-
psychxr.drivers.libovr.
getBool
(bytes propertyName, bool defaultVal=False)¶ Read a LibOVR boolean property.
Parameters: Returns: Value of the property. Returns defaultVal if the property does not exist.
Return type:
-
psychxr.drivers.libovr.
setBool
(bytes propertyName, bool value=True)¶ Write a LibOVR boolean property.
Parameters: Returns: True
if the property was set successfully,False
if the property was read-only or does not exist.Return type:
-
psychxr.drivers.libovr.
getInt
(bytes propertyName, int defaultVal=0)¶ Read a LibOVR integer property.
Parameters: Returns: Value of the property. Returns defaultVal if the property does not exist.
Return type:
-
psychxr.drivers.libovr.
setInt
(bytes propertyName, int value)¶ Write a LibOVR integer property.
Parameters: Returns: True
if the property was set successfully,False
if the property was read-only or does not exist.Return type: Examples
Set the performance HUD mode to show summary information:
setInt(PERF_HUD_MODE, PERF_HUD_PERF_SUMMARY)
Switch off the performance HUD:
setInt(PERF_HUD_MODE, PERF_OFF)
-
psychxr.drivers.libovr.
getFloat
(bytes propertyName, float defaultVal=0.0)¶ Read a LibOVR floating point number property.
Parameters: Returns: Value of the property. Returns defaultVal if the property does not exist.
Return type:
-
psychxr.drivers.libovr.
setFloat
(bytes propertyName, float value)¶ Write a LibOVR floating point number property.
Parameters: Returns: True
if the property was set successfully,False
if the property was read-only or does not exist.Return type:
-
psychxr.drivers.libovr.
getFloatArray
(bytes propertyName, ndarray values)¶ Read a LibOVR float array property.
Parameters: - propertyName (bytes) – Name of the property to get.
- values (ndarray) – Output array for values, must be 1-D and have dtype=float32.
Returns: Number of values successfully read from the property.
Return type: Examples
Get the position of the stereo debug guide:
guidePos = numpy.zeros((3,), dtype=np.float32)  # array to write to
result = getFloatArray(DEBUG_HUD_STEREO_GUIDE_POSITION, guidePos)
# check if the array we specified was long enough to store the values
if result <= len(guidePos):
    # success
    ...
-
psychxr.drivers.libovr.
setFloatArray
(bytes propertyName, ndarray values)¶ Write a LibOVR float array property.
Parameters: - propertyName (bytes) – Name of the property to set.
- values (ndarray) – Value to write, must be 1-D and have dtype=float32.
Returns: True
if the property was set successfully,False
if the property was read-only or does not exist.Return type: Examples
Set the position of the stereo debug guide:
guidePos = numpy.asarray([0., 0., -10.0], dtype=np.float32)
setFloatArray(DEBUG_HUD_STEREO_GUIDE_POSITION, guidePos)
-
psychxr.drivers.libovr.
getString
(bytes propertyName, defaultVal=u'')¶ Read a LibOVR string property.
Parameters: Returns: Value of the property. Returns defaultVal if the property does not exist.
Return type: Notes
- Strings passed to this function are converted to bytes before being passed to OVR::ovr_GetString.
-
psychxr.drivers.libovr.
setString
(bytes propertyName, value)¶ Write a LibOVR string property.
Parameters: Returns: True
if the property was set successfully,False
if the property was read-only or does not exist.Return type:
-
psychxr.drivers.libovr.
isOculusServiceRunning
(int timeoutMs=100)¶ Check if the Oculus Runtime is loaded and running.
Parameters: timeoutMs (int) – Timeout in milliseconds. Returns: True if the Oculus background service is running. Return type: bool
-
psychxr.drivers.libovr.
isHmdConnected
(int timeoutMs=100)¶ Check if an HMD is connected.
Parameters: timeoutMs (int) – Timeout in milliseconds. Returns: True if a LibOVR compatible HMD is connected. Return type: bool
-
psychxr.drivers.libovr.
getHmdInfo
()¶ Get HMD information.
Returns: HMD information. Return type: LibOVRHmdInfo
-
psychxr.drivers.libovr.
getHmdColorSpace
()¶ Get HMD colorspace information.
Upon starting a new session, the default colorspace used is for the CV1. Can only be called after
create()
was called.Returns: HMD colorspace information. Return type: LibOVRHmdColorSpace Examples
Get the current color space in use:
colorSpaceInfo = getHmdColorSpace()
Get the color coordinates of the RGB primaries:
redX, redY = colorSpaceInfo.red
greenX, greenY = colorSpaceInfo.green
blueX, blueY = colorSpaceInfo.blue
Get the white point in use:
whiteX, whiteY = colorSpaceInfo.whitePoint
-
psychxr.drivers.libovr.
setClientColorSpace
(colorSpace)¶ Set the colorspace used by the client.
This function is used by the driver to transform color values between spaces. The purpose of this is to allow content authored for one model of HMD to appear correctly on others. Can only be called after create() was called. Until this function is called, the color space will be assumed to be
COLORSPACE_UNKNOWN
, which defaults to COLORSPACE_RIFT_CV1
.New as of version 0.2.4
Parameters: colorSpace (LibOVRHmdColorSpace or int) – Color space information descriptor or symbolic constant (e.g., COLORSPACE_RIFT_CV1
).Returns: Return code for the ovr_SetClientColorDesc call. Return type: int Examples
Tell the driver to remap colors for an application authored using the Quest to be displayed correctly on the current device:
result = setClientColorSpace(COLORSPACE_QUEST)
-
psychxr.drivers.libovr.
initialize
(bool focusAware=False, int connectionTimeout=0, logCallback=None)¶ Initialize the session.
Parameters: - focusAware (bool, optional) – Client is focus aware.
- connectionTimeout (int, optional) – Timeout in milliseconds for connecting to the server.
- logCallback (object, optional) – Python callback function for logging. May be called at any time from
any thread until
shutdown()
is called. The function must accept arguments level and message, where level is the logging level and message is the message string. Callback message levels can be LOG_LEVEL_DEBUG
, LOG_LEVEL_INFO
, and LOG_LEVEL_ERROR
. The application can filter messages accordingly.
Returns: Return code of the LibOVR API call
OVR::ovr_Initialize
. ReturnsSUCCESS
if completed without errors. In the event of an error, possible return values are:ERROR_INITIALIZE
: Initialization error.ERROR_LIB_LOAD
: Failed to load LibOVRRT.ERROR_LIB_VERSION
: LibOVRRT version incompatible.ERROR_SERVICE_CONNECTION
: Cannot connect to OVR service.ERROR_SERVICE_VERSION
: OVR service version is incompatible.ERROR_INCOMPATIBLE_OS
: Operating system version is incompatible.ERROR_DISPLAY_INIT
: Unable to initialize the HMD.ERROR_SERVER_START
: Cannot start a server.ERROR_REINITIALIZATION
: Reinitialized with a different version.
Return type: Examples
Passing a callback function for logging:
def myLoggingCallback(level, message):
    level_text = {
        LOG_LEVEL_DEBUG: '[DEBUG]:',
        LOG_LEVEL_INFO: '[INFO]:',
        LOG_LEVEL_ERROR: '[ERROR]:'}
    # print message like '[INFO]: IAD changed to 62.1mm'
    print(level_text[level], message)

result = initialize(logCallback=myLoggingCallback)
-
psychxr.drivers.libovr.
create
()¶ Create a new session. Control is handed over to the application from Oculus Home.
Starting a session will initialize and create a new session; afterwards, API functions will return valid values. You can only create one session per interpreter thread. All other files/modules within the same thread which import PsychXR make API calls to the same session after create() is called.
Returns: Result of the OVR::ovr_Create
API call. A session was successfully created if the result is SUCCESS
.Return type: int
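Examples
A minimal session setup and teardown sketch using the lifecycle functions in this module (error handling simplified):
result = initialize()
if failure(result):
    raise RuntimeError('LibOVR failed to initialize')

result = create()
if failure(result):
    shutdown()
    raise RuntimeError('failed to create a session')

# ... set up swap chains and render frames here ...

destroy()   # must follow every successful create() call
shutdown()  # must follow every successful initialize() call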
-
psychxr.drivers.libovr.
checkSessionStarted
()¶ Check if a session has been created.
This function returns True between calls of
create()
and destroy()
. You can use this to determine if you can make API calls which require an active session.Returns: True if a session is present. Return type: bool
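Examples
Guard API calls which require an active session (a minimal sketch):
if checkSessionStarted():
    hmdInfo = getHmdInfo()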
-
psychxr.drivers.libovr.
destroyTextureSwapChain
(int swapChain)¶ Destroy a texture swap chain.
Once destroyed, the swap chain’s resources will be freed.
Parameters: swapChain (int) – Swap chain identifier/index.
-
psychxr.drivers.libovr.
destroyMirrorTexture
()¶ Destroy the mirror texture.
-
psychxr.drivers.libovr.
destroy
()¶ Destroy a session.
Must be called after every successful
create()
call. Calling destroy will invalidate the current session and all resources must be freed and re-created.
-
psychxr.drivers.libovr.
shutdown
()¶ End the current session.
Clean-up routines are executed that destroy all swap chains and mirror texture buffers; afterwards, control is returned to Oculus Home. This must be called after every successful
initialize()
call.
-
psychxr.drivers.libovr.
getGraphicsLUID
()¶ The graphics device LUID.
Returns: Reserved graphics LUID. Return type: str
-
psychxr.drivers.libovr.
setHighQuality
(bool enable)¶ Enable high quality mode.
This enables 4x anisotropic sampling by the compositor to reduce the appearance of high-frequency artifacts in the visual periphery due to distortion.
Parameters: enable (bool) – Enable high-quality mode.
-
psychxr.drivers.libovr.
setHeadLocked
(bool enable)¶ Set the render layer state for head locking.
Head-locking prevents the compositor from applying asynchronous time warp (ASW), which compensates for rendering latency. Under normal circumstances, where head pose data is retrieved from LibOVR using
getTrackingState()
or getDevicePoses()
calls, ASW should be left enabled to prevent juddering and improve visual stability. However, when using custom head poses (e.g., fixed, or from a motion tracker) ASW may cause the render layer to slip around, as internal IMU data will be incongruous with externally supplied head posture data. If you plan on passing custom poses to
calcEyePoses()
, ensure that head locking is enabled.
Parameters: enable (bool) – Enable head-locking when rendering to the eye render layer.
-
psychxr.drivers.libovr.
getPixelsPerTanAngleAtCenter
(int eye)¶ Get pixels per tan angle (=1) at the center of the display.
Values reflect the FOVs set by the last call to
setEyeRenderFov()
(or else the default FOVs will be used.)Parameters: eye (int) – Eye index. Use either EYE_LEFT
orEYE_RIGHT
.Returns: Pixels per tan angle at the center of the screen. Return type: tuple
-
psychxr.drivers.libovr.
getTanAngleToRenderTargetNDC
(int eye, tanAngle)¶ Convert FOV tan angle to normalized device coordinates (NDC).
Parameters: Returns: NDC coordinates X, Y [-1, 1].
Return type:
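Examples
A small sketch; here tanAngle is assumed to be a 2-element (horizontal, vertical) array of tangent angles, which is not spelled out above:
# NDC coordinates of the point at the center of the FOV
x, y = getTanAngleToRenderTargetNDC(EYE_LEFT, [0.0, 0.0])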
-
psychxr.drivers.libovr.
getPixelsPerDegree
(int eye)¶ Get pixels per degree at the center of the display.
Values reflect the FOVs set by the last call to
setEyeRenderFov()
(or else the default FOVs will be used.)Parameters: eye (int) – Eye index. Use either EYE_LEFT
orEYE_RIGHT
.Returns: Pixels per degree at the center of the screen (h, v). Return type: tuple
-
psychxr.drivers.libovr.
getDistortedViewport
(int eye)¶ Get the distorted viewport.
You must call
setEyeRenderFov()
first for values to be valid.Parameters: eye (int) – Eye index. Use either EYE_LEFT
orEYE_RIGHT
.
-
psychxr.drivers.libovr.
getEyeRenderFov
(int eye)¶ Get the field-of-view to use for rendering.
The FOV for a given eye is defined as a tuple of tangent angles (Up, Down, Left, Right). By default, this function will return the default (recommended) FOVs after
create()
is called.Parameters: eye (int) – Eye index. Use either EYE_LEFT
orEYE_RIGHT
.Returns: Eye FOV tangent angles [UpTan, DownTan, LeftTan, RightTan], distance to near and far clipping planes in meters. Return type: tuple Examples
Getting the tangent angles:
# left FOV tangent angles, do the same for the right
leftFov, nearClip, farClip = getEyeRenderFov(EYE_LEFT)
upTan, downTan, leftTan, rightTan = leftFov
-
psychxr.drivers.libovr.
setEyeRenderFov
(int eye, fov, float nearClip=0.01, float farClip=1000.)¶ Set the field-of-view of a given eye. This is used to compute the projection matrix.
By default, the recommended FOVs are used after
create()
is called (see LibOVRHmdInfo.defaultEyeFov
). You can override these values using LibOVRHmdInfo.maxEyeFov
and LibOVRHmdInfo.symmetricEyeFov
, or with custom values (see Examples below).Parameters: - eye (int) – Eye index. Values are
EYE_LEFT
andEYE_RIGHT
. - fov (array_like) – Eye FOV tangent angles [UpTan, DownTan, LeftTan, RightTan].
- nearClip, farClip (float) – Near and far clipping planes in meters. Used when computing the projection matrix.
Examples
Setting eye render FOVs to symmetric (needed for mono rendering):
leftFov, rightFov = getSymmetricEyeFOVs()
setEyeRenderFov(EYE_LEFT, leftFov)
setEyeRenderFov(EYE_RIGHT, rightFov)
Using custom values:
# Up, Down, Left, Right tan angles
setEyeRenderFov(EYE_LEFT, [1.0, -1.0, -1.0, 1.0])
- eye (int) – Eye index. Values are
-
psychxr.drivers.libovr.
getEyeAspectRatio
(int eye)¶ Get the aspect ratio of an eye.
Parameters: eye (int) – Eye index. Use either EYE_LEFT
orEYE_RIGHT
.Returns: Aspect ratio of the eye’s FOV (width / height). Return type: float
-
psychxr.drivers.libovr.
getEyeHorizontalFovRadians
(int eye)¶ Get the angle of the horizontal field-of-view (FOV) for a given eye.
Parameters: eye (int) – Eye index. Use either EYE_LEFT
orEYE_RIGHT
.Returns: Horizontal FOV of a given eye in radians. Return type: float
-
psychxr.drivers.libovr.
getEyeVerticalFovRadians
(int eye)¶ Get the angle of the vertical field-of-view (FOV) for a given eye.
Parameters: eye (int) – Eye index. Use either EYE_LEFT
orEYE_RIGHT
.Returns: Vertical FOV of a given eye in radians. Return type: float
-
psychxr.drivers.libovr.
getEyeFocalLength
(int eye)¶ Get the focal length of the eye’s frustum.
Parameters: eye (int) – Eye index. Use either EYE_LEFT
orEYE_RIGHT
.Returns: Focal length in meters. Return type: float Notes
- This does not reflect the optical focal length of the HMD.
-
psychxr.drivers.libovr.
calcEyeBufferSize
(int eye, float texelsPerPixel=1.0)¶ Get the recommended buffer (texture) sizes for eye buffers.
Should be called after
setEyeRenderFov()
. Returns buffer resolutions in pixels (w, h). The values can be used when configuring a framebuffer or swap chain for rendering.Parameters: - eye (int) – Eye index. Use either
EYE_LEFT
orEYE_RIGHT
- texelsPerPixel (float, optional) – Display pixels per texture pixel at the center of the display. Use a value less than 1.0 to improve performance at the cost of resolution. Specifying a larger texture is possible, but not recommended by the manufacturer.
Returns: Buffer widths and heights (w, h) for each eye.
Return type: Examples
Getting the buffer size for the swap chain:
# get HMD info
hmdInfo = getHmdInfo()

# eye FOVs must be set first!
leftFov, rightFov = hmdInfo.defaultEyeFov
setEyeRenderFov(EYE_LEFT, leftFov)
setEyeRenderFov(EYE_RIGHT, rightFov)

leftBufferSize = calcEyeBufferSize(EYE_LEFT)
rightBufferSize = calcEyeBufferSize(EYE_RIGHT)
leftW, leftH = leftBufferSize
rightW, rightH = rightBufferSize

# combined size if using a single texture buffer for both eyes
bufferW, bufferH = leftW + rightW, max(leftH, rightH)

# create a swap chain
createTextureSwapChainGL(TEXTURE_SWAP_CHAIN0, bufferW, bufferH)
Notes
- This function returns the recommended texture resolution for a specified eye. If you are using a single buffer for both eyes, that buffer should be as wide as the combined width of both eyes’ buffer sizes.
- eye (int) – Eye index. Use either
-
psychxr.drivers.libovr.
getLayerEyeFovFlags
()¶ Get header flags for the render layer.
Returns: Flags from OVR::ovrLayerEyeFov.Header.Flags
.Return type: unsigned int See also
setLayerEyeFovFlags()
- Set layer flags.
Examples
Check if a flag is set:
layerFlags = getLayerEyeFovFlags()
if (layerFlags & LAYER_FLAG_HIGH_QUALITY) == LAYER_FLAG_HIGH_QUALITY:
    print('high quality enabled!')
-
psychxr.drivers.libovr.
setLayerEyeFovFlags
(unsigned int flags)¶ Set header flags for the render layer.
Parameters: flags (int) – Flags to set. Flags can be ORed together to apply multiple settings. Valid values for flags are:
LAYER_FLAG_HIGH_QUALITY
: Enable high quality mode which tells the compositor to use 4x anisotropic filtering when sampling.LAYER_FLAG_TEXTURE_ORIGIN_AT_BOTTOM_LEFT
: Tell the compositor the texture origin is at the bottom left, required for using OpenGL textures.LAYER_FLAG_HEAD_LOCKED
: Enable head locking, which forces the render layer transformations to be head referenced.
See also
getLayerEyeFovFlags()
- Get layer flags.
Notes
LAYER_FLAG_HIGH_QUALITY
andLAYER_FLAG_TEXTURE_ORIGIN_AT_BOTTOM_LEFT
are recommended settings and are enabled by default.
Examples
Enable head-locked mode:
layerFlags = getLayerEyeFovFlags()  # get current flags
layerFlags |= LAYER_FLAG_HEAD_LOCKED  # set head-locking
setLayerEyeFovFlags(layerFlags)  # set the flags again
-
psychxr.drivers.libovr.
createTextureSwapChainGL
(int swapChain, int width, int height, int textureFormat=FORMAT_R8G8B8A8_UNORM_SRGB, int levels=1)¶ Create a texture swap chain for eye image buffers.
Parameters: - swapChain (int) – Swap chain handle to initialize, usually
SWAP_CHAIN*
. - width (int) – Width of texture in pixels.
- height (int) – Height of texture in pixels.
- textureFormat (int) –
Texture format to use. Valid color texture formats are:
FORMAT_R8G8B8A8_UNORM
FORMAT_R8G8B8A8_UNORM_SRGB
FORMAT_R16G16B16A16_FLOAT
FORMAT_R11G11B10_FLOAT
Depth texture formats:
FORMAT_D16_UNORM
FORMAT_D24_UNORM_S8_UINT
FORMAT_D32_FLOAT
Other Parameters: levels (int) – Mip levels to use, default is 1.
Returns: The result of the
OVR::ovr_CreateTextureSwapChainGL
API call.Return type: Examples
Create a texture swap chain:
result = createTextureSwapChainGL(
    TEXTURE_SWAP_CHAIN0, texWidth, texHeight, FORMAT_R8G8B8A8_UNORM)

# set the swap chain for each eye buffer
for eye in range(EYE_COUNT):
    setEyeColorTextureSwapChain(eye, TEXTURE_SWAP_CHAIN0)
- swapChain (int) – Swap chain handle to initialize, usually
-
psychxr.drivers.libovr.
getTextureSwapChainLengthGL
(int swapChain)¶ Get the length of a specified swap chain.
Parameters: swapChain (int) – Swap chain handle to query. Must be a swap chain initialized by a previous call to createTextureSwapChainGL()
.Returns: Result of the ovr_GetTextureSwapChainLength
API call and the length of that swap chain.Return type: tuple of int See also
getTextureSwapChainCurrentIndex()
- Get the current swap chain index.
getTextureSwapChainBufferGL()
- Get the current OpenGL swap chain buffer.
Examples
Get the swap chain length for the previously created
TEXTURE_SWAP_CHAIN0
:result, length = getTextureSwapChainLengthGL(TEXTURE_SWAP_CHAIN0)
-
psychxr.drivers.libovr.
getTextureSwapChainCurrentIndex
(int swapChain)¶ Get the current buffer index within the swap chain.
Parameters: swapChain (int) – Swap chain handle to query. Must be a swap chain initialized by a previous call to createTextureSwapChainGL()
.Returns: Result of the OVR::ovr_GetTextureSwapChainCurrentIndex
API call and the index of the buffer.Return type: tuple of int See also
getTextureSwapChainLengthGL()
- Get the length of a swap chain.
getTextureSwapChainBufferGL()
- Get the current OpenGL swap chain buffer.
-
psychxr.drivers.libovr.
getTextureSwapChainBufferGL
(int swapChain, int index)¶ Get the texture buffer as an OpenGL name at a specific index in a given swap chain.
Parameters: - swapChain (int) – Swap chain handle to query. Must be a swap chain initialized by a
previous call to
createTextureSwapChainGL()
. - index (int) – Index within the swap chain to retrieve its OpenGL texture name.
Returns: Result of the
OVR::ovr_GetTextureSwapChainBufferGL
API call and the OpenGL texture buffer name. An OpenGL buffer name of 0 is invalid; check the returned API call result for an error condition.Return type: Examples
Get the OpenGL texture buffer name associated with the swap chain index:
# get the current available index
swapChain = TEXTURE_SWAP_CHAIN0
result, currentIdx = getTextureSwapChainCurrentIndex(swapChain)

# get the OpenGL buffer name
result, texId = getTextureSwapChainBufferGL(swapChain, currentIdx)

# bind the texture
glFramebufferTexture2D(
    GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texId, 0)
- swapChain (int) – Swap chain handle to query. Must be a swap chain initialized by a
previous call to
-
psychxr.drivers.libovr.
setEyeColorTextureSwapChain
(int eye, int swapChain)¶ Set the color texture swap chain for a given eye.
Should be called after a successful
createTextureSwapChainGL()
call but before any rendering is done.Parameters: - eye (int) – Eye index. Use either
EYE_LEFT
orEYE_RIGHT
. - swapChain (int) – Swap chain handle to query. Must be a swap chain initialized by a
previous call to
createTextureSwapChainGL()
.
See also
createTextureSwapChainGL()
- Create an OpenGL buffer swap chain.
Examples
Associate the swap chain with both eyes (single buffer for stereo views):
setEyeColorTextureSwapChain(EYE_LEFT, TEXTURE_SWAP_CHAIN0)
setEyeColorTextureSwapChain(EYE_RIGHT, TEXTURE_SWAP_CHAIN0)

# same as above but with a loop
for eye in range(EYE_COUNT):
    setEyeColorTextureSwapChain(eye, TEXTURE_SWAP_CHAIN0)
Associate a swap chain with each eye (separate buffer for stereo views):
setEyeColorTextureSwapChain(EYE_LEFT, TEXTURE_SWAP_CHAIN0)
setEyeColorTextureSwapChain(EYE_RIGHT, TEXTURE_SWAP_CHAIN1)

# with a loop ...
for eye in range(EYE_COUNT):
    setEyeColorTextureSwapChain(eye, TEXTURE_SWAP_CHAIN0 + eye)
- eye (int) – Eye index. Use either
-
psychxr.drivers.libovr.
createMirrorTexture
(int width, int height, int textureFormat=FORMAT_R8G8B8A8_UNORM_SRGB, int mirrorOptions=MIRROR_OPTION_DEFAULT)¶ Create a mirror texture.
This displays the content of the rendered images being presented on the HMD. The image is automatically refreshed to reflect the current content on the display. By default, this is the post-distortion texture.
Parameters: - width (int) – Width of texture in pixels.
- height (int) – Height of texture in pixels.
- textureFormat (int) –
Color texture format to use, valid texture formats are:
FORMAT_R8G8B8A8_UNORM
FORMAT_R8G8B8A8_UNORM_SRGB
FORMAT_R16G16B16A16_FLOAT
FORMAT_R11G11B10_FLOAT
- mirrorOptions (int, optional) –
Mirror texture options. Specifies how to display the rendered content. By default,
MIRROR_OPTION_DEFAULT
is used which displays the post-distortion image of both eye buffers side-by-side. Other options are available by specifying the following flags:MIRROR_OPTION_POST_DISTORTION
- Barrel distorted eye buffer.MIRROR_OPTION_LEFT_EYE_ONLY
andMIRROR_OPTION_RIGHT_EYE_ONLY
- show rectilinear images of either the left or right eye. These values are mutually exclusive.MIRROR_OPTION_INCLUDE_GUARDIAN
- Show guardian boundary system in mirror texture.MIRROR_OPTION_INCLUDE_NOTIFICATIONS
- Show notifications received on the mirror texture.MIRROR_OPTION_INCLUDE_SYSTEM_GUI
- Show the system menu when accessed via the home button on the controller.MIRROR_OPTION_FORCE_SYMMETRIC_FOV
- Force mirror output to use symmetric FOVs. Only valid whenMIRROR_OPTION_POST_DISTORTION
is not specified.
Multiple option flags can be combined by using the
|
operator and passed to mirrorOptions. However, some options cannot be used in conjunction with each other; if so, this function may return ERROR_INVALID_PARAMETER
.
Returns: Result of API call
OVR::ovr_CreateMirrorTextureWithOptionsGL
.Return type:
-
psychxr.drivers.libovr.
getMirrorTexture
()¶ Mirror texture ID.
Returns: Result of API call OVR::ovr_GetMirrorTextureBufferGL
and the mirror texture ID. A mirror texture ID == 0 is invalid.Return type: tuple (int, int) Examples
Getting the mirror texture for use:
# get the mirror texture
result, mirrorTexId = getMirrorTexture()
if failure(result):
    # raise error ...
    ...

# bind the mirror texture to the framebuffer
glFramebufferTexture2D(
    GL_READ_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
    GL_TEXTURE_2D, mirrorTexId, 0)
-
psychxr.drivers.libovr.
getSensorSampleTime
()¶ Get the sensor sample timestamp.
The time when the source data used to compute the render pose was sampled. This value is used to compute the motion-to-photon latency. This value is set when
getDevicePoses()
and setSensorSampleTime()
are called. If getTrackingState()
was called with latencyMarker set, sensor sample time will be 0.0.Returns: Sample timestamp in seconds. Return type: float See also
setSensorSampleTime()
- Set sensor sample time.
-
psychxr.drivers.libovr.
setSensorSampleTime
(double absTime)¶ Set the sensor sample timestamp.
Specify the sensor sample time of the source data used to compute the render poses of each eye. This value is used to compute motion-to-photon latency.
Parameters: absTime (float) – Time in seconds. See also
getSensorSampleTime()
- Get sensor sample time.
getTrackingState()
- Get the current tracking state.
getDevicePoses()
- Get device poses.
Examples
Supplying sensor sample time from an external tracking source:
# get sensor time from the mocap system
sampleTime = timeInSeconds() - mocap.timeSinceMidExposure

# set sample time
setSensorSampleTime(sampleTime)
calcEyePoses(headRigidBody)

# get frame perf stats after calling `endFrame` to get last frame
# motion-to-photon latency
perfStats = getPerfStats()
m2p_latency = perfStats.frameStats[0].appMotionToPhotonLatency
-
psychxr.drivers.libovr.
getTrackingState
(double absTime, bool latencyMarker=True)¶ Get the current tracking state of the head and hands.
Parameters: Returns: Tracking state at absTime for head and hands.
Return type: Examples
Getting the head pose and calculating eye render poses:
t = hmd.getPredictedDisplayTime()
trackingState = hmd.getTrackingState(t)

# tracking state flags
flags = STATUS_ORIENTATION_TRACKED | STATUS_POSITION_TRACKED

# check if tracking
if (flags & trackingState.statusFlags) == flags:
    hmd.calcEyePoses(trackingState.headPose.pose)  # calculate eye poses
-
psychxr.drivers.libovr.
getDevicePoses
(deviceTypes, double absTime, bool latencyMarker=True)¶ Get tracked device poses.
Each pose in the returned array matches the device type at each index specified in deviceTypes. You need to call this function to get the poses for ‘objects’, which are additional Touch controllers that can be paired and tracked in the scene.
It is recommended that
getTrackingState()
is used for obtaining the head and hand poses.Parameters: - deviceTypes (list or tuple of int) –
List of device types. Valid device types identifiers are:
TRACKED_DEVICE_TYPE_HMD
: The head or HMD.TRACKED_DEVICE_TYPE_LTOUCH
: Left touch controller or hand.TRACKED_DEVICE_TYPE_RTOUCH
: Right touch controller or hand.TRACKED_DEVICE_TYPE_TOUCH
: Both touch controllers.
Up to four additional touch controllers can be paired and tracked, they are assigned as:
TRACKED_DEVICE_TYPE_OBJECT0
TRACKED_DEVICE_TYPE_OBJECT1
TRACKED_DEVICE_TYPE_OBJECT2
TRACKED_DEVICE_TYPE_OBJECT3
- absTime (float) – Absolute time in seconds poses refer to.
- latencyMarker (bool, optional) – Insert a marker for motion-to-photon latency calculation. Set this to
False if
getTrackingState()
was previously called and a latency marker was set there. The latency marker is set to the absolute time this function was called.
Returns: Return code (int) of the
OVR::ovr_GetDevicePoses
API call and list of tracked device poses (list ofLibOVRPoseState
). If a device cannot be tracked, the return code will beERROR_LOST_TRACKING
.Return type: Warning
If multiple devices were specified with deviceTypes, the return code will be
ERROR_LOST_TRACKING
if ANY of the devices lost tracking.Examples
Get HMD and touch controller poses:
deviceTypes = (TRACKED_DEVICE_TYPE_HMD, TRACKED_DEVICE_TYPE_LTOUCH,
    TRACKED_DEVICE_TYPE_RTOUCH)
result, devicePoses = getDevicePoses(deviceTypes, absTime)
headPose, leftHandPose, rightHandPose = devicePoses
- deviceTypes (list or tuple of int) –
-
psychxr.drivers.libovr.
calcEyePoses
(LibOVRPose headPose, originPose=None)¶ Calculate eye poses using a given head pose.
Eye poses are derived from the specified head pose, relative eye poses, and the scene tracking origin.
Calculated eye poses are stored and passed to the compositor when
endFrame()
is called unless LAYER_FLAG_HEAD_LOCKED
is set. You can access the computed poses via the getEyeRenderPose()
function. If using custom head poses, ensure head locking is enabled via setHeadLocked(True)
or that the LAYER_FLAG_HEAD_LOCKED
render layer flag is set.Parameters: - headPose (
LibOVRPose
) – Head pose. - originPose (
LibOVRPose
, optional) – Optional world origin pose to transform head pose. You can apply transformations to this pose to simulate movement through a scene.
Examples
Compute the eye poses from tracker data:
abs_time = getPredictedDisplayTime()
tracking_state, calibrated_origin = getTrackingState(abs_time, True)
headPoseState, status = tracking_state[TRACKED_DEVICE_TYPE_HMD]

# calculate head pose
hmd.calcEyePoses(headPoseState.pose)

# computed render poses appear here
renderPoseLeft, renderPoseRight = hmd.getEyeRenderPoses()
Using external data to set the head pose from a motion capture system:
# rigid body in the scene defining the scene origin rbHead = LibOVRPose(*headRb.posOri) calcEyePoses(rbHead)
Note that the latency of an external tracker may be larger than that of the built-in tracking. To get around this, enable forward prediction in your mocap software, set roughly equal on average to getPredictedDisplayTime() - mocapMidExposureTime, or time-integrate poses to the mid-frame time.
- headPose (
-
psychxr.drivers.libovr.
getHmdToEyePose
(int eye)¶ HMD to eye pose.
These are the prototype eye poses specified by LibOVR, defined only after
create()
is called. These poses are referenced to the HMD origin. Poses are transformed by callingcalcEyePoses()
, updating the values returned bygetEyeRenderPose()
The horizontal (x-axis) separation of the eye poses is determined by the configured lens spacing (slider adjustment). This spacing is supposed to correspond to the actual inter-ocular distance (IOD) of the user. You can get the IOD used for rendering by adding up the absolute values of the x-components of the eye poses, or by multiplying the value of
getEyeToNoseDist()
by two. Furthermore, the IOD values can be altered prior to calling calcEyePoses() to override the values specified by LibOVR.Parameters: eye (int) – Eye index. Use either
orEYE_RIGHT
.Returns: Copy of the HMD to eye pose. Return type: tuple (LibOVRPose, LibOVRPose) See also
setHmdToEyePose()
- Set the HMD to eye pose.
Examples
Get the HMD to eye poses:
leftPose = getHmdToEyePose(EYE_LEFT)
rightPose = getHmdToEyePose(EYE_RIGHT)
-
psychxr.drivers.libovr.
setHmdToEyePose
(int eye, LibOVRPose eyePose)¶ Set the HMD to eye pose.
This overwrites the values returned by LibOVR and will be used in successive calls of
calcEyePoses()
to compute eye render poses. Note that the poses store the view space translations, not the relative position in the scene.Parameters: - eye (int) – Eye index. Use either EYE_LEFT
or EYE_RIGHT. - eyePose (LibOVRPose) – HMD to eye pose to set
.See also
getHmdToEyePose()
- Get the current HMD to eye pose.
Examples
Set both HMD to eye poses:
eyePoses = [LibOVRPose((0.035, 0.0, 0.0)), LibOVRPose((-0.035, 0.0, 0.0))]
for eye, pose in enumerate(eyePoses):
    setHmdToEyePose(eye, pose)
-
psychxr.drivers.libovr.
getEyeRenderPose
(int eye)¶ Get eye render poses.
Poses are those computed by the last
calcEyePoses()
call. Returned objects are copies of the data stored internally by the session instance. These poses are used to derive the view matrix when rendering for each eye, and used for visibility culling.Parameters: eye (int) – Eye index. Use either EYE_LEFT
orEYE_RIGHT
.Returns: Copy of the eye render pose for the given eye. Return type: LibOVRPose See also
setEyeRenderPose()
- Set an eye’s render pose.
Examples
Get the eye render poses:
leftPose = getEyeRenderPose(EYE_LEFT)
rightPose = getEyeRenderPose(EYE_RIGHT)
Get the left and right view matrices:
eyeViewMatrices = []
for eye in range(EYE_COUNT):
    eyeViewMatrices.append(getEyeRenderPose(eye).asMatrix())
Same as above, but overwrites existing view matrices:
# identity 4x4 matrices
eyeViewMatrices = [
    numpy.identity(4, dtype=numpy.float32),
    numpy.identity(4, dtype=numpy.float32)]
for eye in range(EYE_COUNT):
    getEyeRenderPose(eye).asMatrix(eyeViewMatrices[eye])
-
psychxr.drivers.libovr.
setEyeRenderPose
(int eye, LibOVRPose eyePose)¶ Set eye render pose.
Setting the eye render pose will update the values returned by
getEyeRenderPose()
.Parameters: eye (int) – Eye index. Use either EYE_LEFT
orEYE_RIGHT
.See also
getEyeRenderPose()
- Get an eye’s render pose.
-
psychxr.drivers.libovr.
getEyeRenderViewport
(int eye, ndarray out=None)¶ Get the eye render viewport.
The viewport defines the region on the swap texture a given eye’s image is drawn to.
Parameters: - eye (int) – Eye index. Use either
EYE_LEFT
orEYE_RIGHT
. - out (ndarray, optional) – Optional NumPy array to place values. If None, this function will return a new array. Must be dtype=int and length 4.
Returns: Viewport rectangle [x, y, w, h].
Return type: ndarray
- eye (int) – Eye index. Use either
-
psychxr.drivers.libovr.
setEyeRenderViewport
(int eye, values)¶ Set the eye render viewport.
The viewport defines the region on the swap texture a given eye’s image is drawn to.
Parameters: - eye (int) – Eye index. Use either
EYE_LEFT
orEYE_RIGHT
. - values (array_like) – Viewport rectangle [x, y, w, h].
Examples
Setting the viewports for both eyes on a single swap chain buffer:
# Calculate the optimal eye buffer sizes for the FOVs, these will define
# the dimensions of the render target.
leftBufferSize = calcEyeBufferSize(EYE_LEFT)
rightBufferSize = calcEyeBufferSize(EYE_RIGHT)

# Define the viewports, which specify the region on the render target an
# eye's image will be drawn to and accessed from. Viewports are rectangles
# defined like [x, y, w, h]. The x-position of the rightViewport is offset
# by the width of the left viewport.
leftViewport = [0, 0, leftBufferSize[0], leftBufferSize[1]]
rightViewport = [leftBufferSize[0], 0,
    rightBufferSize[0], rightBufferSize[1]]

# set both viewports
setEyeRenderViewport(EYE_LEFT, leftViewport)
setEyeRenderViewport(EYE_RIGHT, rightViewport)
- eye (int) – Eye index. Use either
-
psychxr.drivers.libovr.
getEyeProjectionMatrix
(int eye, ndarray out=None)¶ Compute the projection matrix.
The projection matrix is computed by the runtime using the eye FOV parameters set with
setEyeRenderFov()
calls.Parameters: - eye (int) – Eye index. Use either
EYE_LEFT
orEYE_RIGHT
. - out (ndarray or None, optional) – Alternative matrix to write values to instead of returning a new one.
Returns: 4x4 projection matrix.
Return type: ndarray
Examples
Get the left and right projection matrices as a list:
eyeProjectionMatrices = []
for eye in range(EYE_COUNT):
    eyeProjectionMatrices.append(getEyeProjectionMatrix(eye))
Same as above, but overwrites existing view matrices:
# identity 4x4 matrices
eyeProjectionMatrices = [
    numpy.identity(4, dtype=numpy.float32),
    numpy.identity(4, dtype=numpy.float32)]
for eye in range(EYE_COUNT):
    getEyeProjectionMatrix(eye, out=eyeProjectionMatrices[eye])
Using eye projection matrices with PyOpenGL (fixed-function):
P = getEyeProjectionMatrix(eye)
glMatrixMode(GL.GL_PROJECTION)
glLoadTransposeMatrixf(P)
For Pyglet (which is the standard GL interface for PsychoPy), you need to convert the matrix to a ctypes pointer before passing it to glLoadTransposeMatrixf:
P = getEyeProjectionMatrix(eye)
P = P.ctypes.data_as(ctypes.POINTER(ctypes.c_float))
glMatrixMode(GL.GL_PROJECTION)
glLoadTransposeMatrixf(P)
If using fragment shaders, the matrix can be passed on to them as such:
P = getEyeProjectionMatrix(eye)
P = P.ctypes.data_as(ctypes.POINTER(ctypes.c_float))

# after the program was installed in the current rendering state via
# `glUseProgram` ...
loc = glGetUniformLocation(program, b"m_Projection")
glUniformMatrix4fv(loc, 1, GL_TRUE, P)  # `transpose` must be `True`
- eye (int) – Eye index. Use either
-
psychxr.drivers.libovr.
getEyeViewMatrix
(int eye, ndarray out=None)¶ Compute a view matrix for a specified eye.
View matrices are derived from the eye render poses calculated by the last
calcEyePoses()
call or updated by setEyeRenderPose()
.Parameters: Returns: 4x4 view matrix. Object out will be returned if specified.
Return type: ndarray
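Examples
Compute view matrices for both eyes after a calcEyePoses() call (a minimal sketch):
eyeViewMatrices = [getEyeViewMatrix(eye) for eye in range(EYE_COUNT)]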
-
psychxr.drivers.libovr.
getPredictedDisplayTime
(unsigned int frameIndex=0)¶ Get the predicted time a frame will be displayed.
Parameters: frameIndex (int) – Frame index. Returns: Absolute frame mid-point time for the given frame index in seconds. Return type: float
-
psychxr.drivers.libovr.
timeInSeconds
()¶ Absolute time in seconds.
Returns: Time in seconds. Return type: float
-
psychxr.drivers.libovr.
waitToBeginFrame
(unsigned int frameIndex=0)¶ Wait until a buffer is available so frame rendering can begin. Must be called before
beginFrame()
.Parameters: frameIndex (int) – The target frame index. Returns: Return code of the LibOVR API call OVR::ovr_WaitToBeginFrame
. ReturnsSUCCESS
if completed without errors. May returnERROR_DISPLAY_LOST
if the device was removed, rendering the current session invalid.Return type: int
-
psychxr.drivers.libovr.
beginFrame
(unsigned int frameIndex=0)¶ Begin rendering the frame. Must be called prior to drawing and
endFrame()
.Parameters: frameIndex (int) – The target frame index. Returns: Error code returned by OVR::ovr_BeginFrame
.Return type: int
-
psychxr.drivers.libovr.
commitTextureSwapChain
(int eye)¶ Commit changes to a given eye’s texture swap chain. When called, the runtime is notified that the texture is ready for use, and the swap chain index is incremented.
Parameters: eye (int) – Eye buffer index. Returns: Error code returned by API call OVR::ovr_CommitTextureSwapChain
. Will returnSUCCESS
if successful. Returns error codeERROR_TEXTURE_SWAP_CHAIN_FULL
if called too many times without callingendFrame()
.Return type: int Warning
No additional drawing operations are permitted once the texture is committed until the SDK dereferences it, making it available again.
-
psychxr.drivers.libovr.
endFrame
(unsigned int frameIndex=0)¶ Call when rendering a frame has completed. Buffers which have been committed are passed to the compositor for distortion.
Successful
waitToBeginFrame()
and beginFrame()
calls must precede calling endFrame()
.Parameters: frameIndex (int) – The target frame index. Returns: Error code returned by API call OVR::ovr_EndFrame and the absolute time in seconds OVR::ovr_EndFrame returned. Return type: tuple (int, float)
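Examples
The frame submission functions above are intended to be called in a fixed order each frame. Below is a minimal per-frame loop sketch using only functions documented in this module; drawing code is omitted, shouldQuit is application-defined, and the tracking state attribute access follows the earlier getTrackingState() example:
frameIndex = 0
while not shouldQuit:
    waitToBeginFrame(frameIndex)  # wait for a free buffer
    absTime = getPredictedDisplayTime(frameIndex)
    trackingState = getTrackingState(absTime, True)
    calcEyePoses(trackingState.headPose.pose)  # compute eye render poses
    beginFrame(frameIndex)
    for eye in range(EYE_COUNT):
        # ... bind the eye's swap chain buffer and draw the view here ...
        commitTextureSwapChain(eye)  # commit when done drawing
    result, absTime = endFrame(frameIndex)  # hand off to the compositor
    frameIndex += 1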
-
psychxr.drivers.libovr.
getTrackingOriginType
()¶ Get the current tracking origin type.
The tracking origin type specifies where the origin is placed when computing the pose of tracked objects (i.e. the head and touch controllers.) Valid values are
TRACKING_ORIGIN_EYE_LEVEL
andTRACKING_ORIGIN_FLOOR_LEVEL
.See also
setTrackingOriginType()
- Set the tracking origin type.
-
psychxr.drivers.libovr.
setTrackingOriginType
(int value)¶ Set the tracking origin type.
Specify the tracking origin to use when computing eye poses. Subsequent calls of
calcEyePoses()
will use the set tracking origin.Parameters: value (int) – Tracking origin type, must be either TRACKING_ORIGIN_FLOOR_LEVEL
orTRACKING_ORIGIN_EYE_LEVEL
.Returns: Result of the OVR::ovr_SetTrackingOriginType
LibOVR API call.Return type: int See also
getTrackingOriginType()
- Get the current tracking origin type.
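Examples
Use the floor as the reference for tracked poses (a minimal sketch):
result = setTrackingOriginType(TRACKING_ORIGIN_FLOOR_LEVEL)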
-
psychxr.drivers.libovr.
recenterTrackingOrigin
()¶ Recenter the tracking origin.
Returns: The result of the LibOVR API call OVR::ovr_RecenterTrackingOrigin
.Return type: int Examples
Recenter the tracking origin if requested by the session status:
result, sessionStatus = getSessionStatus()
if sessionStatus.shouldRecenter:
    recenterTrackingOrigin()
-
psychxr.drivers.libovr.
specifyTrackingOrigin
(LibOVRPose newOrigin)¶ Specify a new tracking origin.
Parameters: newOrigin (LibOVRPose) – New origin to use.
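Examples
Shift the tracking origin one meter forward of its current position; the pose value is illustrative (a sketch, constructing the pose from a position tuple as in earlier examples):
specifyTrackingOrigin(LibOVRPose((0.0, 0.0, -1.0)))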
-
psychxr.drivers.libovr.
clearShouldRecenterFlag
()¶ Clear the
LibOVRSessionStatus.shouldRecenter
flag.
-
psychxr.drivers.libovr.
getTrackerCount
()¶ Get the number of attached trackers.
Returns: Number of trackers reported by LibOVR. Return type: int Notes
- The Oculus Rift S uses inside-out tracking and therefore does not have external trackers. For compatibility, LibOVR will return a tracker count of 3.
-
psychxr.drivers.libovr.
getTrackerInfo
(int trackerIndex)¶ Get information about a given tracker.
Parameters: trackerIndex (int) – The index of the sensor to query. Valid values are between 0 and getTrackerCount()
- 1.Notes
- The Oculus Rift S uses inside-out tracking and therefore does not have external trackers. For compatibility, LibOVR will return dummy tracker objects.
-
psychxr.drivers.libovr.
getSessionStatus
()¶ Get the current session status.
Returns: Result of LibOVR API call OVR::ovr_GetSessionStatus
and aLibOVRSessionStatus
.Return type: tuple (int, LibOVRSessionStatus) Examples
Check if the display is visible to the user:
result, sessionStatus = getSessionStatus()
if sessionStatus.isVisible:
    # begin frame rendering ...
Quit if the user requests to through the Oculus overlay:
result, sessionStatus = getSessionStatus()
if sessionStatus.shouldQuit:
    # destroy any swap chains ...
    destroy()
    shutdown()
-
psychxr.drivers.libovr.
getPerfStats
()¶ Get detailed compositor frame statistics.
Returns: Frame statistics. Return type: LibOVRPerfStats Examples
Get the time spent by the application between
endFrame()
calls:result = updatePerfStats() if getFrameStatsCount() > 0: frameStats = getFrameStats(0) # only the most recent appTime = frameStats.appCpuElapsedTime
-
psychxr.drivers.libovr.
resetPerfStats
()¶ Reset frame performance statistics.
Calling this will reset frame statistics, which may be needed if the application loses focus (e.g., when the system UI is opened) and performance stats no longer apply to the application.
Returns: Error code returned by OVR::ovr_ResetPerfStats
.Return type: int
-
psychxr.drivers.libovr.
getLastErrorInfo
()¶ Get the last error code and information string reported by the API.
This function can be used when implementing custom error handlers for dealing with exceptions raised by LibOVR. You must call
getLastErrorInfo()
every time after any function which makes a LibOVR API call if you wish to catch all errors, since only the most recent error is returned.Returns: Tuple of the API call result and error string. If there was no API error, the function will return the tuple (0, ‘<unknown>’). Return type: tuple (int, str) Examples
Raise a Python exception if LibOVR reports an error:
result = create()
if failure(result):
    errorVal, errorMsg = getLastErrorInfo()
    raise RuntimeError(errorMsg)  # generic Python runtime error!
-
psychxr.drivers.libovr.
setBoundaryColor
(float red, float green, float blue)¶ Set the boundary color. Deprecated as of version 0.2.4. Do not use in new applications.
The boundary is drawn by the compositor which overlays the extents of the physical space where the user can safely move.
Parameters: Returns: Result of the LibOVR API call
OVR::ovr_SetBoundaryLookAndFeel
.Return type:
-
psychxr.drivers.libovr.
resetBoundaryColor
()¶ Reset the boundary color to system default. Deprecated as of version 0.2.4. Do not use in new applications.
Returns: Result of the LibOVR API call OVR::ovr_ResetBoundaryLookAndFeel
.Return type: int
-
psychxr.drivers.libovr.
getBoundaryVisible
()¶ Check if the Guardian boundary is visible.
The boundary is drawn by the compositor which overlays the extents of the physical space where the user can safely move.
Returns: Result of the LibOVR API call OVR::ovr_GetBoundaryVisible
and the boundary state.Return type: tuple (int, bool) Notes
- Since the boundary has a fade-in effect, the boundary might be reported as visible but difficult to actually see.
-
psychxr.drivers.libovr.
showBoundary
()¶ Show the boundary.
The boundary is drawn by the compositor which overlays the extents of the physical space where the user can safely move.
Returns: Result of LibOVR API call OVR::ovr_RequestBoundaryVisible
.Return type: int
-
psychxr.drivers.libovr.
hideBoundary
()¶ Hide the boundary.
Returns: Result of LibOVR API call OVR::ovr_RequestBoundaryVisible
.Return type: int
-
psychxr.drivers.libovr.
getBoundaryDimensions
(int boundaryType)¶ Get the dimensions of the boundary.
Parameters: boundaryType (int) – Boundary type, can be BOUNDARY_OUTER
orBOUNDARY_PLAY_AREA
.Returns: Result of the LibOVR API call OVR::ovr_GetBoundaryDimensions
and the dimensions of the boundary in meters [x, y, z].Return type: tuple (int, ndarray)
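Examples
Query the play area dimensions (a minimal sketch):
result, dims = getBoundaryDimensions(BOUNDARY_PLAY_AREA)
if success(result):
    x, y, z = dims  # extents in meters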
-
psychxr.drivers.libovr.
testBoundary
(int deviceBitmask, int boundaryType)¶ Test collision of tracked devices on boundary.
Parameters: - deviceBitmask (int) –
Devices to test. Multiple device identifiers can be combined together. Valid device IDs are:
TRACKED_DEVICE_TYPE_HMD
: The head or HMD.TRACKED_DEVICE_TYPE_LTOUCH
: Left touch controller or hand.TRACKED_DEVICE_TYPE_RTOUCH
: Right touch controller or hand.TRACKED_DEVICE_TYPE_TOUCH
: Both touch controllers.TRACKED_DEVICE_TYPE_OBJECT0
TRACKED_DEVICE_TYPE_OBJECT1
TRACKED_DEVICE_TYPE_OBJECT2
TRACKED_DEVICE_TYPE_OBJECT3
- boundaryType (int) – Boundary type, can be
BOUNDARY_OUTER
orBOUNDARY_PLAY_AREA
.
Returns: Result of the
OVR::ovr_TestBoundary
LibOVR API call and collision test results.Return type: - deviceBitmask (int) –
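Examples
Test whether the HMD is interacting with the outer boundary (a minimal sketch; the fields of the returned LibOVRBoundaryTestResult are not enumerated here, so the result is left for inspection):
result, testResult = testBoundary(TRACKED_DEVICE_TYPE_HMD, BOUNDARY_OUTER)
if success(result):
    # inspect `testResult` (a LibOVRBoundaryTestResult) for collision data
    ...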
-
psychxr.drivers.libovr.
getConnectedControllerTypes
()¶ Get connected controller types.
Returns: IDs of connected controller types. Possible values returned are: CONTROLLER_TYPE_XBOX
: XBox gamepad.CONTROLLER_TYPE_REMOTE
: Oculus Remote.CONTROLLER_TYPE_TOUCH
: Combined Touch controllers.CONTROLLER_TYPE_LTOUCH
: Left Touch controller.CONTROLLER_TYPE_RTOUCH
: Right Touch controller.CONTROLLER_TYPE_OBJECT0
: Object 0 controller.CONTROLLER_TYPE_OBJECT1
: Object 1 controller.CONTROLLER_TYPE_OBJECT2
: Object 2 controller.CONTROLLER_TYPE_OBJECT3
: Object 3 controller.
Return type: list of int See also
updateInputState()
- Poll a controller’s current state.
Examples
Check if the left touch controller is paired:
controllers = getConnectedControllerTypes()
hasLeftTouch = CONTROLLER_TYPE_LTOUCH in controllers
Update all connected controller states:
for controller in getConnectedControllerTypes():
    result, time = updateInputState(controller)
-
psychxr.drivers.libovr.
updateInputState
(int controller)¶ Refresh the input state of a controller.
Subsequent
getButton()
,getTouch()
,getThumbstickValues()
,getIndexTriggerValues()
, andgetHandTriggerValues()
calls using the same controller value will reflect the new state.Parameters: controller (int) – Controller name. Valid values are:
CONTROLLER_TYPE_XBOX
: XBox gamepad.CONTROLLER_TYPE_REMOTE
: Oculus Remote.CONTROLLER_TYPE_TOUCH
: Combined Touch controllers.CONTROLLER_TYPE_LTOUCH
: Left Touch controller.CONTROLLER_TYPE_RTOUCH
: Right Touch controller.CONTROLLER_TYPE_OBJECT0
: Object 0 controller.CONTROLLER_TYPE_OBJECT1
: Object 1 controller.CONTROLLER_TYPE_OBJECT2
: Object 2 controller.CONTROLLER_TYPE_OBJECT3
: Object 3 controller.
Returns: Result of the OVR::ovr_GetInputState
LibOVR API call and polling time in seconds.Return type: tuple (int, float) See also
getConnectedControllerTypes()
- Get a list of connected controllers.
getButton()
- Get button states.
getTouch()
- Get touches.
-
psychxr.drivers.libovr.
getButton
(int controller, int button, unicode testState=u'continuous')¶ Get a button state.
The controller to test is specified by its ID, defined as constants starting with
CONTROLLER_TYPE_*
. Buttons to test are specified using their ID, defined as constants starting withBUTTON_*
. Button IDs can be ORed together for testing multiple button states. The returned value represents the button state during the lastupdateInputState()
call for the specified controller.An optional trigger mode may be specified which defines the button’s activation criteria. By default, testState is ‘continuous’ will return the immediate state of the button. Using ‘rising’ (or ‘pressed’) will return True once when the button transitions to being pressed between subsequent
updateInputState()
calls, whereas ‘falling’ (and ‘released’) will return True once the button is released. IfupdateInputState()
was called only once, ‘rising’ and ‘falling’ will return False.Parameters: - controller (int) –
Controller name. Valid values are:
CONTROLLER_TYPE_XBOX
: XBox gamepad.CONTROLLER_TYPE_REMOTE
: Oculus Remote.CONTROLLER_TYPE_TOUCH
: Combined Touch controllers.CONTROLLER_TYPE_LTOUCH
: Left Touch controller.CONTROLLER_TYPE_RTOUCH
: Right Touch controller.CONTROLLER_TYPE_OBJECT0
: Object 0 controller.CONTROLLER_TYPE_OBJECT1
: Object 1 controller.CONTROLLER_TYPE_OBJECT2
: Object 2 controller.CONTROLLER_TYPE_OBJECT3
: Object 3 controller.
- button (int) –
Button to check. Values can be ORed together to test for multiple button presses. If a given controller does not have a particular button, False will always be returned. Valid button values are:
BUTTON_A
BUTTON_B
BUTTON_RTHUMB
BUTTON_RSHOULDER
BUTTON_X
BUTTON_Y
BUTTON_LTHUMB
BUTTON_LSHOULDER
BUTTON_UP
BUTTON_DOWN
BUTTON_LEFT
BUTTON_RIGHT
BUTTON_ENTER
BUTTON_BACK
BUTTON_VOLUP
BUTTON_VOLDOWN
BUTTON_HOME
BUTTON_PRIVATE
BUTTON_RMASK
BUTTON_LMASK
- testState (str) – State to test buttons for. Valid states are ‘rising’, ‘falling’, ‘continuous’, ‘pressed’, and ‘released’.
Returns: Result of the button press and the time in seconds it was polled.
Return type: See also
getTouch()
- Get touches.
Examples
Check if the ‘X’ button on the touch controllers was pressed:
isPressed = getButton(CONTROLLER_TYPE_TOUCH, BUTTON_X, 'pressed')
Test for multiple buttons (e.g. ‘X’ and ‘Y’) being released:
buttons = BUTTON_X | BUTTON_Y
controller = CONTROLLER_TYPE_TOUCH
isReleased = getButton(controller, buttons, 'released')
- controller (int) –
-
psychxr.drivers.libovr.
getTouch
(int controller, int touch, unicode testState=u'continuous')¶ Get a touch state.
The controller to test is specified by its ID, defined as constants starting with
CONTROLLER_TYPE_*
. Touches to test are specified using their ID, defined as constants starting withTOUCH_*
. Touch IDs can be ORed together for testing multiple touch states. The returned value represents the touch state during the lastupdateInputState()
call for the specified controller.An optional trigger mode may be specified which defines a touch’s activation criteria. By default, testState is ‘continuous’ will return the immediate state of the button. Using ‘rising’ (or ‘pressed’) will return
True
once when something is touched between subsequentupdateInputState()
calls, whereas ‘falling’ (and ‘released’) will returnTrue
once the touch is discontinued. IfupdateInputState()
was called only once, ‘rising’ and ‘falling’ will return False.Parameters: - controller (int) –
Controller name. Valid values are:
CONTROLLER_TYPE_XBOX
: XBox gamepad.CONTROLLER_TYPE_REMOTE
: Oculus Remote.CONTROLLER_TYPE_TOUCH
: Combined Touch controllers.CONTROLLER_TYPE_LTOUCH
: Left Touch controller.CONTROLLER_TYPE_RTOUCH
: Right Touch controller.CONTROLLER_TYPE_OBJECT0
: Object 0 controller.CONTROLLER_TYPE_OBJECT1
: Object 1 controller.CONTROLLER_TYPE_OBJECT2
: Object 2 controller.CONTROLLER_TYPE_OBJECT3
: Object 3 controller.
However, touches are only applicable for devices which support that feature.
- touch (int) –
Touch to check. Values can be ORed together to test for multiple touches. If a given controller does not have a particular touch,
False
will always be returned. Valid touch values are:TOUCH_A
TOUCH_B
TOUCH_RTHUMB
TOUCH_RSHOULDER
TOUCH_X
TOUCH_Y
TOUCH_LTHUMB
TOUCH_LSHOULDER
TOUCH_LINDEXTRIGGER
TOUCH_RINDEXTRIGGER
TOUCH_LTHUMBREST
TOUCH_RTHUMBREST
TOUCH_RINDEXPOINTING
TOUCH_RTHUMBUP
TOUCH_LINDEXPOINTING
TOUCH_LTHUMBUP
- testState (str) – State to test touches for. Valid states are ‘rising’, ‘falling’, ‘continuous’, ‘pressed’, and ‘released’.
Returns: Result of the touches and the time in seconds it was polled.
Return type: See also
getButton()
- Get a button state.
Notes
- Special ‘touches’
TOUCH_RINDEXPOINTING
,TOUCH_RTHUMBUP
,TOUCH_RTHUMBREST
,TOUCH_LINDEXPOINTING
, TOUCH_LTHUMBUP
, and TOUCH_LTHUMBREST
, can be used to recognise hand pose/gestures.
Examples
Check if the user is making a pointing gesture with their left index finger:
isPointing = getTouch(
    controller=CONTROLLER_TYPE_LTOUCH,
    touch=TOUCH_LINDEXPOINTING)
- controller (int) –
-
psychxr.drivers.libovr.
getThumbstickValues
(int controller, bool deadzone=False)¶ Get analog thumbstick values.
Get the values indicating the displacement of the controller’s analog thumbsticks. Returns a tuple (x, y) for each stick, giving the left-right and up-down displacement. Values range from -1 to 1.
Parameters: - controller (int) –
Controller name. Valid values are:
CONTROLLER_TYPE_XBOX
: XBox gamepad.CONTROLLER_TYPE_REMOTE
: Oculus Remote.CONTROLLER_TYPE_TOUCH
: Combined Touch controllers.CONTROLLER_TYPE_LTOUCH
: Left Touch controller.CONTROLLER_TYPE_RTOUCH
: Right Touch controller.CONTROLLER_TYPE_OBJECT0
: Object 0 controller.CONTROLLER_TYPE_OBJECT1
: Object 1 controller.CONTROLLER_TYPE_OBJECT2
: Object 2 controller.CONTROLLER_TYPE_OBJECT3
: Object 3 controller.
- deadzone (bool) – Apply a deadzone if True.
Returns: Thumbstick values.
Return type: Examples
Get the thumbstick values with deadzone for the touch controllers:
ovr.updateInputState(ovr.CONTROLLER_TYPE_TOUCH)  # get most recent input state
leftThumbStick, rightThumbStick = ovr.getThumbstickValues(
    ovr.CONTROLLER_TYPE_TOUCH, deadzone=True)
x, y = rightThumbStick  # left-right, up-down values for right stick
- controller (int) –
-
psychxr.drivers.libovr.
getIndexTriggerValues
(int controller, bool deadzone=False)¶ Get analog index trigger values.
Get the values indicating the displacement of the controller’s analog index triggers. Returns values for the left and right triggers. Values range from -1 to 1.
Parameters: controller (int) – Controller name. Valid values are:
CONTROLLER_TYPE_XBOX
: XBox gamepad.CONTROLLER_TYPE_REMOTE
: Oculus Remote.CONTROLLER_TYPE_TOUCH
: Combined Touch controllers.CONTROLLER_TYPE_LTOUCH
: Left Touch controller.CONTROLLER_TYPE_RTOUCH
: Right Touch controller.CONTROLLER_TYPE_OBJECT0
: Object 0 controller.CONTROLLER_TYPE_OBJECT1
: Object 1 controller.CONTROLLER_TYPE_OBJECT2
: Object 2 controller.CONTROLLER_TYPE_OBJECT3
: Object 3 controller.
Returns: Trigger values (left, right). Return type: tuple (float, float) See also
getThumbstickValues()
- Get thumbstick displacements.
getHandTriggerValues()
- Get hand trigger values.
Examples
Get the index trigger values for touch controllers (with deadzone):
leftVal, rightVal = getIndexTriggerValues(CONTROLLER_TYPE_TOUCH, deadzone=True)
Cast a ray from the controller when a trigger is pulled:
_, rightVal = getIndexTriggerValues(CONTROLLER_TYPE_TOUCH, deadzone=True)

# handPose of right hand from the last tracking state
if rightVal > 0.75:  # 75% threshold
    if handPose.raycastSphere(target):  # target is LibOVRPose
        print('Target hit!')
    else:
        print('Missed!')
-
psychxr.drivers.libovr.
getHandTriggerValues
(int controller, bool deadzone=False)¶ Get analog hand trigger values.
Get the values indicating the displacement of the controller’s analog hand triggers. Returns two values, for the left and right triggers. Values range from -1 to 1.
Parameters: controller (int) – Controller name. Valid values are:
CONTROLLER_TYPE_XBOX
: XBox gamepad.CONTROLLER_TYPE_REMOTE
: Oculus Remote.CONTROLLER_TYPE_TOUCH
: Combined Touch controllers.CONTROLLER_TYPE_LTOUCH
: Left Touch controller.CONTROLLER_TYPE_RTOUCH
: Right Touch controller.CONTROLLER_TYPE_OBJECT0
: Object 0 controller.CONTROLLER_TYPE_OBJECT1
: Object 1 controller.CONTROLLER_TYPE_OBJECT2
: Object 2 controller.CONTROLLER_TYPE_OBJECT3
: Object 3 controller.
Returns: Trigger values (left, right).
Return type: tuple (float, float)
See also
- getThumbstickValues() – Get thumbstick displacements.
- getIndexTriggerValues() – Get index trigger values.
Examples
Get the hand trigger values for touch controllers (with deadzone):
leftVal, rightVal = getHandTriggerValues(CONTROLLER_TYPE_TOUCH, deadzone=True)
Grip an object when a hand is near it. Simply set the pose of the object to match that of the hand when gripping within some distance of the object's origin. When the grip is released, the object retains the last pose it had before release. Here is a very basic example of object gripping:
_, rightVal = getHandTriggerValues(CONTROLLER_TYPE_TOUCH, deadzone=True)

# thing and handPose are LibOVRPoses, handPose is from the tracking state
distanceToHand = abs(handPose.distanceTo(thing.pos))
if rightVal > 0.75 and distanceToHand < 0.01:
    thing.posOri = handPose.posOri
psychxr.drivers.libovr.setControllerVibration(int controller, unicode frequency, float amplitude)¶
Vibrate a controller.
Vibration is constant at a fixed frequency and amplitude. Vibration lasts 2.5 seconds, so this function needs to be called more often than that for sustained vibration. Only controllers which support vibration can be used here.
Only two frequencies are permitted, 'low' and 'high'; however, amplitude can vary from 0.0 to 1.0. Specifying frequency='off' stops vibration.
Parameters:
- controller (int) – Controller name. Valid values are:
  - CONTROLLER_TYPE_XBOX : XBox gamepad.
  - CONTROLLER_TYPE_REMOTE : Oculus Remote.
  - CONTROLLER_TYPE_TOUCH : Combined Touch controllers.
  - CONTROLLER_TYPE_LTOUCH : Left Touch controller.
  - CONTROLLER_TYPE_RTOUCH : Right Touch controller.
  - CONTROLLER_TYPE_OBJECT0 : Object 0 controller.
  - CONTROLLER_TYPE_OBJECT1 : Object 1 controller.
  - CONTROLLER_TYPE_OBJECT2 : Object 2 controller.
  - CONTROLLER_TYPE_OBJECT3 : Object 3 controller.
- frequency (str) – Vibration frequency. Valid values are: ‘off’, ‘low’, or ‘high’.
- amplitude (float) – Vibration amplitude in the range [0.0, 1.0]. Values outside this range are clamped.
Returns: Return value of API call OVR::ovr_SetControllerVibration. Can return SUCCESS_DEVICE_UNAVAILABLE if no device is present.
Return type: int
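Examples
A minimal sketch of a brief vibration pulse (assuming a session has already been created with create() and a right Touch controller is connected; the one-second duration is illustrative):

import time
from psychxr.drivers.libovr import (
    setControllerVibration, CONTROLLER_TYPE_RTOUCH, success)

# start a low-frequency, half-amplitude vibration on the right Touch controller
result = setControllerVibration(CONTROLLER_TYPE_RTOUCH, 'low', 0.5)
if success(result):
    time.sleep(1.0)  # vibration would otherwise time out after 2.5 seconds
    setControllerVibration(CONTROLLER_TYPE_RTOUCH, 'off', 0.0)  # stop it early

For vibration sustained longer than 2.5 seconds, call this function again before the previous call times out.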
psychxr.drivers.libovr.submitControllerVibration(int controller, LibOVRHapticsBuffer buffer)¶
Submit a haptics buffer to Touch controllers.
Parameters:
- controller (int) – Controller name. Valid values are:
  - CONTROLLER_TYPE_TOUCH : Combined Touch controllers.
  - CONTROLLER_TYPE_LTOUCH : Left Touch controller.
  - CONTROLLER_TYPE_RTOUCH : Right Touch controller.
  - CONTROLLER_TYPE_OBJECT0 : Object 0 controller.
  - CONTROLLER_TYPE_OBJECT1 : Object 1 controller.
  - CONTROLLER_TYPE_OBJECT2 : Object 2 controller.
  - CONTROLLER_TYPE_OBJECT3 : Object 3 controller.
- buffer (LibOVRHapticsBuffer) – Haptics buffer to submit.
Returns: Return value of API call OVR::ovr_SubmitControllerVibration. Can return SUCCESS_DEVICE_UNAVAILABLE if no device is present.
Return type: int
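Examples
A sketch of buffered haptics, assuming LibOVRHapticsBuffer accepts a one-dimensional float32 array of amplitude samples in the range [0.0, 1.0] (the sample count of 160 is illustrative and must fit the haptics buffer limit):

import numpy as np
from psychxr.drivers.libovr import (
    LibOVRHapticsBuffer, submitControllerVibration,
    CONTROLLER_TYPE_RTOUCH, success)

# build a short amplitude envelope; here, one cycle of a raised sine wave
nSamples = 160  # assumed to be within the haptics buffer sample limit
t = np.linspace(0.0, 1.0, num=nSamples, dtype=np.float32)
samples = (0.5 * np.sin(2.0 * np.pi * t) + 0.5).astype(np.float32)

hapticsBuffer = LibOVRHapticsBuffer(samples)  # wrap the samples for submission
result = submitControllerVibration(CONTROLLER_TYPE_RTOUCH, hapticsBuffer)
if not success(result):
    print('Device unavailable, vibration not played.')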
psychxr.drivers.libovr.cullPose(int eye, LibOVRPose pose)¶
Test if a pose's bounding box or position falls outside of an eye's view frustum.
Poses can be assigned bounding boxes which enclose any 3D models associated with them. A model is not visible if all the corners of the bounding box fall outside the viewing frustum. Therefore any primitives (i.e. triangles) associated with the pose can be culled during rendering to reduce CPU/GPU workload.
If pose does not have a valid bounding box (LibOVRBounds) assigned to its bounds attribute, this function will test if the position of pose is outside the view frustum.
Parameters:
- eye (int) – Eye index. Use either EYE_LEFT or EYE_RIGHT.
- pose (LibOVRPose) – Pose to test.
Returns: True if the pose's bounding box is not visible to the given eye and should be culled during rendering.
Return type: bool
Examples
Check if a pose should be culled (needs to be done for each eye):
cullModel = cullPose(eye, pose)
if not cullModel:
    pass  # ... OpenGL calls to draw the model here ...
Notes
- Frustums used for testing are defined by the current render FOV for the eye (see: getEyeRenderFov() and setEyeRenderFov()).
- This function does not test if an object is occluded by another within the frustum. If an object is completely occluded, it will still be fully rendered, and nearer objects will be drawn on top of it. A trick to improve performance in this case is to use glDepthFunc(GL_LEQUAL) with glEnable(GL_DEPTH_TEST) and render objects from nearest to farthest from the head pose. This will reject fragment color calculations for occluded locations (see the sketch below).
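To make the per-eye requirement concrete, here is a sketch of a render loop that culls each pose against each eye's frustum before drawing, combined with the depth-test setup described above (scenePoses and drawModel are hypothetical user-defined names; the GL calls assume PyOpenGL):

from OpenGL import GL
from psychxr.drivers.libovr import cullPose, EYE_LEFT, EYE_RIGHT

def drawModel(pose):
    pass  # hypothetical user routine: issue OpenGL draw calls at this pose

scenePoses = []  # user-maintained list of LibOVRPose objects with bounds assigned

GL.glEnable(GL.GL_DEPTH_TEST)  # reject fragments at occluded locations
GL.glDepthFunc(GL.GL_LEQUAL)

for eye in (EYE_LEFT, EYE_RIGHT):
    # ... bind this eye's render target and set its view/projection here ...
    for pose in scenePoses:  # ideally sorted nearest to farthest from the head
        if not cullPose(eye, pose):  # draw only if not culled for this eye
            drawModel(pose)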