Bringing Vulkan to VR
Cass Everitt, Oculus
A Presentation in Two Acts
- Act One: The Graphics-API-Agnostic Design of the VrApi
- Act Two: The Vulkan-Samples atw Sample Family as Proving Grounds
Act One
The Graphics-API-Agnostic Design of the VrApi
The Graphics-API-Agnostic Design of the VrApi
- It can be natural to build graphics-API dependencies into a VR API
  - For some implementations, there will be only one graphics API
- VR APIs do need:
  - Access to image data
  - Access to sync primitives, to know when images are ready to use
- Initialization may also need platform-specific details
  - e.g. WIN32 will want some handles, Android will want some VM pointers, etc.
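The platform-specific initialization details mentioned above are often modeled as a tagged union, so one entry point can accept either WIN32 handles or Android VM pointers. The sketch below illustrates that shape; all of its names (`vrPlatformInit`, `vrValidatePlatformInit`, etc.) are invented for illustration and are not part of the VrApi:

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical sketch: what a graphics-API-agnostic VR API might accept at
 * initialization. All names here are invented for illustration. */
typedef enum { VR_PLATFORM_WIN32, VR_PLATFORM_ANDROID } vrPlatformType;

typedef struct
{
    vrPlatformType type;
    union
    {
        struct { void * hInstance; void * hWnd; } win32;                     /* WIN32 handles */
        struct { void * javaVm; void * jniEnv; void * activity; } android;   /* VM pointers */
    } u;
} vrPlatformInit;

/* The runtime dispatches on the tag and ignores fields for other platforms. */
static int vrValidatePlatformInit( const vrPlatformInit * init )
{
    switch ( init->type )
    {
        case VR_PLATFORM_WIN32:   return init->u.win32.hWnd != NULL;
        case VR_PLATFORM_ANDROID: return init->u.android.javaVm != NULL;
        default:                  return 0;
    }
}
```

The tagged union keeps the VR API's surface identical on every platform while still letting each platform hand over exactly the objects it needs.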
The Oculus Mobile VrApi
- Assumed the platform was Android in the early days
  - So JavaVM, JNIEnv, and Activity objects were visible
- Found it useful to run on other platforms too
  - Access to platform-specific dev environments and tools
  - Platform portability of some apps
- But the vestigial remnants of Android are still visible
  - And Android is still the main platform, of course
Initialization

// Utility function to default initialize the ovrInitParms.
ovrInitParms vrapi_DefaultInitParms( const ovrJava * java )
{
    ovrInitParms parms;
    memset( &parms, 0, sizeof( parms ) );
    parms.Type = VRAPI_STRUCTURE_TYPE_INIT_PARMS;
    parms.ProductVersion = VRAPI_PRODUCT_VERSION;
    parms.MajorVersion = VRAPI_MAJOR_VERSION;
    parms.MinorVersion = VRAPI_MINOR_VERSION;
    parms.PatchVersion = VRAPI_PATCH_VERSION;
    parms.GraphicsAPI = VRAPI_GRAPHICS_API_OPENGL_ES_2;
    parms.Java = *java;
    return parms;
}
API Choice at Initialization
- This makes sense for all apps we know about today
  - Precludes changing APIs without a full tear-down first
  - Or the app could use interop on its side, and use only one API to communicate with the VR API
  - In general, seems a reasonable trade-off
- The other thing that happens at initialization: negotiating the VrApi version
  - Driver needs to confirm its support for the version
  - Could fail if the app is newer than the driver (nearly impossible with our current driver distro)
  - Or if the app is very old, and the driver dropped support (also very unlikely, but possible)
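The version negotiation above can be sketched as a simple compatibility check. The policy below (matching major version, driver minor version at least the app's) is an illustrative assumption, not the actual VrApi rule, and `vrDriverSupports` is an invented name:

```c
#include <assert.h>

/* Hypothetical sketch of version negotiation: the app compiles against header
 * version constants, and the driver confirms support at initialization time.
 * The policy here is illustrative, not the actual VrApi implementation. */
typedef struct { int major; int minor; } vrVersion;

/* Accept the app if the major versions match and the app's minor version is
 * not newer than what the driver supports. */
static int vrDriverSupports( vrVersion driver, vrVersion app )
{
    if ( app.major != driver.major )
    {
        return 0;   /* app too old (support dropped) or too new for this driver */
    }
    return app.minor <= driver.minor;
}
```

Both failure modes from the slide fall out of this check: an app built against a newer minor version than the driver knows, or an app whose major version the driver no longer carries.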
Swap Chains
- Image abstraction came in the form of swap chains
- Opaque data type

ovrTextureSwapChain * vrapi_CreateTextureSwapChain2( ovrTextureType type, ovrTextureFormat format,
                                                     int width, int height, int levels, int bufferCount );
void vrapi_DestroyTextureSwapChain( ovrTextureSwapChain * chain );
int vrapi_GetTextureSwapChainLength( ovrTextureSwapChain * chain );
int vrapi_GetTextureSwapChainHandle( ovrTextureSwapChain * chain, int index );
Swap Chains (2)
- VR runtime allocates / deletes
- App can get a handle
  - Must build its own FBOs, etc., for rendering
- Note, this API is still OpenGL-centric (int handles)

typedef enum
{
    VRAPI_GRAPHICS_API_OPENGL_ES_2   = ( 0x10000 | 0x0200 ),
    VRAPI_GRAPHICS_API_OPENGL_ES_3   = ( 0x10000 | 0x0300 ),
    VRAPI_GRAPHICS_API_OPENGL_COMPAT = ( 0x20000 | 0x0100 ),
    VRAPI_GRAPHICS_API_OPENGL_CORE_3 = ( 0x20000 | 0x0300 ),
    VRAPI_GRAPHICS_API_OPENGL_CORE_4 = ( 0x20000 | 0x0400 )
} ovrGraphicsAPI;
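Reading the constants above, the enum appears to pack an API family in the high bits (0x10000 for OpenGL ES, 0x20000 for desktop OpenGL) and a version number in the low bits. The decoding helpers below are a sketch based only on those values; they are not VrApi functions:

```c
#include <assert.h>

/* The ovrGraphicsAPI values pack an API family in the high bits and a version
 * in the low bits. These helpers decode that layout; they are illustrative,
 * derived from the constants shown, not part of the VrApi. */
enum
{
    VRAPI_GRAPHICS_API_OPENGL_ES_2   = ( 0x10000 | 0x0200 ),
    VRAPI_GRAPHICS_API_OPENGL_ES_3   = ( 0x10000 | 0x0300 ),
    VRAPI_GRAPHICS_API_OPENGL_COMPAT = ( 0x20000 | 0x0100 ),
    VRAPI_GRAPHICS_API_OPENGL_CORE_3 = ( 0x20000 | 0x0300 ),
    VRAPI_GRAPHICS_API_OPENGL_CORE_4 = ( 0x20000 | 0x0400 )
};

/* Family lives in bits 16+, major version in bits 8-15. */
static int isOpenGLES( int api )      { return ( api & 0xF0000 ) == 0x10000; }
static int apiMajorVersion( int api ) { return ( api >> 8 ) & 0xFF; }
```

This kind of encoding lets one initialization parameter select both the graphics API and the minimum version in a single value.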
Fences
- Fences passed in to vrapi_SubmitFrame() are allowed to be zero
  - Implies that the runtime should create them
  - This is not an API-portable solution
  - In Vulkan, we wouldn't even know which queue to submit a fence to
  - The same problem exists in OpenGL / ES if multiple contexts are used to render
- Fences are not allocated by the runtime
  - This is not a portable or flexible solution
  - Fence allocation is more complex for out-of-process composition
  - As is image allocation; however, we already control image allocation!
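The contract being argued about above can be modeled on the CPU: whoever allocates the fence, the compositor may only sample an eye buffer once that frame's fence has signaled. This is a conceptual sketch with invented names (`vrFence`, `vrFrame`), standing in for a real VkFence or GLsync:

```c
#include <assert.h>
#include <stddef.h>

/* CPU-side model of fence-gated frame handoff: the app submits a frame
 * together with a fence that its graphics queue will signal on completion.
 * All names are hypothetical; vrFence stands in for VkFence / GLsync. */
typedef struct { int signaled; } vrFence;
typedef struct { int imageIndex; vrFence * ready; } vrFrame;

/* The compositor may only sample the image once its fence has signaled. */
static int compositorCanUse( const vrFrame * frame )
{
    return frame->ready != NULL && frame->ready->signaled;
}
```

The portability problem on the slide is exactly about who fills in `ready`: only the app knows which queue or context will eventually signal it, yet out-of-process composition complicates handing an app-created fence across a process boundary.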
VR Compositor
What is the VR Compositor?
- Worth taking a step back and talking about VR at a VERY high level
- App renders eye buffers
  - Stereo pair, perspective renders, matching the FOV / optics of the HMD
- App passes those images along with metadata and fences to the compositor
- Compositor applies lens distortion correction (warp)
- Compositor applies just-in-time tracking correction
- When there was only one layer, Oculus called the above the timewarp
- When we made the operation of the timewarp asynchronous to the app's synthesis of eye buffers, we called it asynchronous timewarp
- Today, we also have layers of composition in which all this happens, and we just refer to this whole subsystem as the compositor
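The just-in-time tracking correction can be reduced to a toy computation: re-project the eye buffer by however much the head rotated between render time and display time. Real timewarp uses full 3D orientations; this one-axis yaw version, with an invented function name, is only a sketch of the idea:

```c
#include <assert.h>
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Minimal model of just-in-time tracking correction: the compositor rotates
 * the rendered image by the yaw the head accumulated between render time and
 * display time. Real implementations use full 3D orientations; this 1-DOF
 * version is an illustrative sketch only. */
static double timewarpYawCorrection( double yawAtRender, double yawAtDisplay )
{
    double delta = yawAtDisplay - yawAtRender;
    /* Wrap into (-pi, pi] so the shortest rotation is applied. */
    while ( delta >   M_PI ) delta -= 2.0 * M_PI;
    while ( delta <= -M_PI ) delta += 2.0 * M_PI;
    return delta;
}
```

Because this correction is applied at the last possible moment before scan-out, perceived latency is governed by the warp's timing rather than the app's frame time.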
In-Process vs Out-of-Process Composition
- Once the app hands over its frame data in vrapi_SubmitFrame(), how the rest of the magic happens is implementation dependent
- In some systems, the compositor may live in the process with the app
  - This is the current Oculus Mobile implementation
- In other systems, the compositor may live in its own process
  - This is the current Oculus Rift implementation
- Different approaches have different trade-offs
- In the long term, out-of-process seems the likely direction
  - Compositor as a system service, and potentially a privileged process, is common for non-VR compositors
Summary
- VR relies on some kind of image synthesis
- But it is very loosely attached to the means of synthesis
  - As long as it can get images
  - And synchronization info on when it's safe to use them
- System-level implementation choices affect API design
Act Two
The Vulkan-Samples atw Sample Family as Proving Grounds
https://github.com/KhronosGroup/Vulkan-Samples
Problem
- Robust VR support on modern hardware has some unusual requirements
- The key one for graphics APIs is making sure that the compositor can render display-synchronously while the app renders display-asynchronously
  - Must be true under highly variable app rendering loads
- How can GPU vendors be certain their hardware is ready to be used for VR?
- Would be great to have a simple standalone program that could verify that a GPU / driver design is VR-capable
- This is essentially the origin story of the atw samples
The atw samples
- Originally written by Johannes van Waveren in OpenGL / ES
- Quickly realized it would be useful to have a Vulkan version as well
  - As a means of verifying that the Vulkan API had the minimum necessary support for VR
- And really, why stop at just two APIs? Internally we developed versions for D3D and Metal
- With these, we can understand whether there are missing basic features
  - Not just know the problem ourselves, but also share the code with relevant parties, to make it easy to fix the APIs if necessary
  - In extensions and/or future revisions
Additional Benefits
- Once you have such an app, ported to many graphics APIs, you try to keep the app logically the same on all of them
  - Which instructs VR API developers about how to write a VR API that is portable between graphics APIs
- Adding Vulkan support to the VrApi has provided good insights into having an API for VR that is truly graphics API agnostic
  - Also helps us understand how to make changes that are portable
- The same sample app on different platforms and graphics APIs is educational
- Keeping the app the same does mean keeping the abstraction used in all the samples consistent
  - This provides a Rosetta stone for understanding how numerous graphics APIs can be used to achieve the same goals
What do the atw samples do?
- Model the app + compositor in a single app and a single source file
- Compositor runs in a separate thread
  - With higher graphics priority, if possible
- Compositor does not support multiple layers
  - This will likely be added, but adds significant relevant testing space
- App rendering does a number of different things
  - But mostly a block of things being rendered
  - With varying geometric or fragment complexity, or display resolution
  - Stresses the GPU in different ways
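The scheduling idea the samples test can be shown deterministically: the compositor presents something every display refresh, re-warping the newest completed app frame, and simply reuses a stale frame when the app misses vsync. This single-threaded simulation (invented names, not the actual sample code, which really does use a separate higher-priority thread) captures that behavior:

```c
#include <assert.h>

/* Deterministic model of asynchronous timewarp scheduling: the compositor
 * runs once per vsync and always presents the newest app frame that has
 * completed, re-warping a stale frame when the app is slow. Illustrative
 * sketch only; the real atw samples use a separate compositor thread. */
#define NUM_VSYNCS 12

/* appFrameCost = vsync intervals the app needs per frame (>= 1).
 * Fills presented[] with the frame shown at each vsync and returns the
 * number of distinct app frames that reached the display. */
static int simulateAtw( int appFrameCost, int presented[NUM_VSYNCS] )
{
    int distinct = 0;
    int prev = -1;
    for ( int vsync = 1; vsync <= NUM_VSYNCS; vsync++ )
    {
        /* The newest app frame that has completed by this vsync. */
        int latestAppFrame = vsync / appFrameCost;
        /* The compositor never blocks on the app: it warps whatever is newest. */
        presented[vsync - 1] = latestAppFrame;
        if ( latestAppFrame != prev ) { distinct++; prev = latestAppFrame; }
    }
    return distinct;
}
```

The key property under variable app load: the compositor output cadence stays locked to the display (one present per vsync) even when the app delivers new frames at a third of that rate.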
How do they know if it worked?
- Log messages illustrate timing info
- Diagnostic ticker-tape graphs show timing
- Full source provided
  - So hardware vendors can make alterations to debug, etc.
- Meaning of diagnostic graphs described in source comments
Rosetta
All the atw samples have a similar class breakdown, e.g.:

ksDriverInstance
ksGpuQueueInfo
ksGpuDevice
ksGpuContext
ksGpuBuffer
ksGpuTexture
ksGpuGeometry
ksGpuRenderPass
ksGpuFramebuffer
ksGpuGraphicsProgram
ksGpuFence
ksGpuGraphicsCommand
ksGpuCommandBuffer
Abstractions
- As you might imagine, the abstractions look a bit different in each API
- But the core concepts have a good common basis
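One common way to give such abstractions a shared shape across APIs is a small table of function pointers per backend. The sketch below models a fence-like object with a trivial CPU-only backend where a real sample would wrap VkFence or GLsync; everything except the ks-style prefix is an invented illustration, not the actual sample code:

```c
#include <assert.h>
#include <stddef.h>

/* Sketch of a per-API backend behind one abstraction: a struct of function
 * pointers. The "null" backend stands in for a real Vulkan or OpenGL
 * implementation; all names here are hypothetical. */
typedef struct
{
    void (*signal)( void * fence );
    int  (*isSignaled)( const void * fence );
} ksFenceBackend;

/* Trivial CPU-only backend: the fence is just an int flag. A Vulkan backend
 * would call vkGetFenceStatus, an OpenGL one glClientWaitSync, etc. */
static void nullSignal( void * fence )           { *(int *)fence = 1; }
static int  nullIsSignaled( const void * fence ) { return *(const int *)fence; }

static const ksFenceBackend nullBackend = { nullSignal, nullIsSignaled };
```

The atw samples themselves take a different route (one source file per API with identically named types) but the effect is the same: app-level logic that never mentions a graphics API directly.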
Driver Instance (Vulkan vs. OpenGL, side by side)
The OpenGL version is trivial:

typedef struct { int dummy; } ksDriverInstance;
Texture (Vulkan vs. OpenGL, side-by-side code comparison)
Graphics Program (Vulkan vs. OpenGL, side-by-side code comparison)
Graphics Command (Vulkan vs. OpenGL, side-by-side code comparison)
Abstraction Comparison
- Side by side is handy
- These abstractions aren't perfect for all users
  - They're not complete; only what was needed for the atw samples
- But they provide a good first approximation for others
Review
- VR APIs don't have to be very tightly bound to graphics APIs
- The atw sample code is a treasure trove of working code
  - Benefits driver developers and hardware vendors by providing a must-work app
  - Benefits app developers by offering a really useful set of abstractions that cross graphics APIs
Questions? cass.everitt@oculus.com