Bringing Vulkan to VR. Cass Everitt, Oculus


A Presentation in Two Acts: the graphics-API-agnostic design of the VrApi, and the Vulkan-Samples atw sample family as proving grounds.

Act One: The Graphics-API-Agnostic Design of the VrApi

The Graphics-API-Agnostic Design of the VrApi
It can be natural to build graphics API dependencies into a VR API; for some implementations, there will be only one graphics API. What VR APIs do need:
- Access to image data
- Access to sync primitives, to know when images are ready to use
Initialization may also need platform-specific details, e.g. Win32 will want some handles, Android will want some VM pointers, etc.

The Oculus Mobile VrApi
In the early days it assumed the platform was Android, so JavaVM, JNIEnv, and Activity objects were visible in the API. We found it useful to run on other platforms too:
- Access to platform-specific dev environments and tools
- Platform portability of some apps
But the vestigial remnants of Android are still visible. And Android is still the main platform, of course.

Initialization

    // Utility function to default initialize the ovrInitParms.
    ovrInitParms vrapi_DefaultInitParms( const ovrJava * java )
    {
        ovrInitParms parms;
        memset( &parms, 0, sizeof( parms ) );
        parms.Type = VRAPI_STRUCTURE_TYPE_INIT_PARMS;
        parms.ProductVersion = VRAPI_PRODUCT_VERSION;
        parms.MajorVersion = VRAPI_MAJOR_VERSION;
        parms.MinorVersion = VRAPI_MINOR_VERSION;
        parms.PatchVersion = VRAPI_PATCH_VERSION;
        parms.GraphicsAPI = VRAPI_GRAPHICS_API_OPENGL_ES_2;
        parms.Java = *java;
        return parms;
    }

API Choice at Initialization
Choosing the graphics API at initialization makes sense for all apps we know about today. It precludes changing APIs without a full tear-down first; alternatively, the app can use interop on its side and use only one API to communicate with the VR API. In general, this seems a reasonable trade-off.
The other thing that happens at initialization is negotiating the VrApi version:
- The driver needs to confirm it supports the requested version
- This could fail if the app is newer than the driver (nearly impossible with our current driver distribution)
- Or if the app is very old and the driver dropped support (also very unlikely, but possible)

Swap Chains
The image abstraction came in the form of swap chains, an opaque data type:

    ovrTextureSwapChain * vrapi_CreateTextureSwapChain2( ovrTextureType type,
                                                         ovrTextureFormat format,
                                                         int width, int height,
                                                         int levels, int bufferCount );
    void vrapi_DestroyTextureSwapChain( ovrTextureSwapChain * chain );
    int vrapi_GetTextureSwapChainLength( ovrTextureSwapChain * chain );
    int vrapi_GetTextureSwapChainHandle( ovrTextureSwapChain * chain, int index );

Swap Chains (2)
The VR runtime allocates / deletes the images. The app can get a handle, but must build its own FBOs, etc. for rendering. Note that this API is still OpenGL-centric (int handles):

    typedef enum
    {
        VRAPI_GRAPHICS_API_OPENGL_ES_2   = ( 0x10000 | 0x0200 ),
        VRAPI_GRAPHICS_API_OPENGL_ES_3   = ( 0x10000 | 0x0300 ),
        VRAPI_GRAPHICS_API_OPENGL_COMPAT = ( 0x20000 | 0x0100 ),
        VRAPI_GRAPHICS_API_OPENGL_CORE_3 = ( 0x20000 | 0x0300 ),
        VRAPI_GRAPHICS_API_OPENGL_CORE_4 = ( 0x20000 | 0x0400 )
    } ovrGraphicsAPI;

Fences
Fences passed in to vrapi_SubmitFrame() are allowed to be zero, which implies that the runtime should create them. This is not an API-portable solution:
- In Vulkan, we wouldn't even know which queue to submit a fence to
- The same problem exists in OpenGL / ES if multiple contexts are used to render
So fences should not be allocated by the runtime; that is neither a portable nor a flexible solution. Fence allocation is also more complex for out-of-process composition. As is image allocation, however we already control image allocation!

VR Compositor

What is the VR Compositor?
It is worth taking a step back and talking about VR at a VERY high level:
- The app renders eye buffers: a stereo pair of perspective renders matching the FOV optics of the HMD
- The app passes those images, along with metadata and fences, to the compositor
- The compositor applies lens distortion correction (warp)
- The compositor applies just-in-time tracking correction
When there was only one layer, Oculus called the above the timewarp. When we made the operation of the timewarp asynchronous to the app's synthesis of eye buffers, we called it asynchronous timewarp. Today we also have layers of composition in which all this happens, and we just refer to this whole subsystem as the compositor.

In-Process vs Out-of-Process Composition
Once the app hands over its frame data in vrapi_SubmitFrame(), how the rest of the magic happens is implementation-dependent:
- In some systems, the compositor may live in the process with the app; this is the current Oculus Mobile implementation
- In other systems, the compositor may live in its own process; this is the current Oculus Rift implementation
Different approaches have different trade-offs. In the long term, out-of-process seems the likely direction: the compositor as a system service, and potentially a privileged process, is common for non-VR compositors.

Summary
VR relies on some kind of image synthesis, but it is very loosely attached to the means of synthesis, as long as it can get images and synchronization info on when it's safe to use them. System-level implementation choices affect API design.

Act Two: The Vulkan-Samples atw Sample Family as Proving Grounds
https://github.com/khronosgroup/vulkan-samples

Problem
Robust VR support on modern hardware has some unusual requirements. The key one for graphics APIs is making sure that the compositor can render display-synchronously while the app renders display-asynchronously, and this must hold under highly variable app rendering loads. How can GPU vendors be certain their hardware is ready to be used for VR? It would be great to have a simple standalone program that could verify that a GPU / driver design is VR-capable. This is essentially the origin story of the atw samples.

The atw Samples
Originally written by Johannes van Waveren in OpenGL / ES. We quickly realized a Vulkan version would be useful as well, as a means of verifying that the Vulkan API had the minimum necessary support for VR. And really, why stop at just two APIs? Internally we developed versions for D3D and Metal. With these, we can understand whether there are missing basic features, and not just know the problem ourselves but also share the code with the relevant parties, to make it easy to fix the APIs if necessary, in extensions and/or future revisions.

Additional Benefits
Once you have such an app, ported to many graphics APIs, you try to keep the app logically the same on all of them. This instructs VR API developers about how to write a VR API that is portable between graphics APIs; adding Vulkan support to the VrApi has provided good insights into having an API for VR that is truly graphics-API-agnostic. It also helps us understand how to make changes that are portable. The same sample app on different platforms and graphics APIs is educational. Keeping the app the same does mean keeping the abstraction used in all the samples consistent, and this provides a Rosetta stone for understanding how numerous graphics APIs can be used to achieve the same goals.

What do the atw samples do?
They model the app + compositor in a single application and a single source file:
- The compositor runs in a separate thread, with higher graphics priority if possible
- The compositor does not support multiple layers; this will likely be added, but it adds a significant amount of relevant testing space
- App rendering does a number of different things, but mostly a block of things being rendered, with varying geometric or fragment complexity or display resolution, to stress the GPU in different ways

How do they know if it worked?
- Log messages illustrate timing info
- Diagnostic ticker-tape graphs show timing
- Full source is provided, so hardware vendors can make alterations to debug, etc.
- The meaning of the diagnostic graphs is described in source comments

Rosetta
All the atw samples have a similar class breakdown, e.g.:

    ksDriverInstance
    ksGpuQueueInfo
    ksGpuDevice
    ksGpuContext
    ksGpuBuffer
    ksGpuTexture
    ksGpuGeometry
    ksGpuRenderPass
    ksGpuFramebuffer
    ksGpuGraphicsProgram
    ksGpuFence
    ksGpuGraphicsCommand
    ksGpuCommandBuffer

Abstractions As you might imagine, abstractions look a bit different in each API But the core concepts have a good common basis

Driver Instance: Vulkan vs OpenGL

    typedef struct
    {
        int dummy;
    } ksDriverInstance;

Texture: Vulkan vs OpenGL

Graphics Program: Vulkan vs OpenGL

Graphics Command: Vulkan vs OpenGL

Abstraction Comparison
Side by side is handy. These abstractions aren't perfect for all users; they're not complete, only what was needed for the atw samples. But they provide a good first approximation for others.

Review
VR APIs don't have to be very tightly bound to graphics APIs. The atw sample code is a treasure trove of working code: it benefits driver developers and hardware vendors by providing a must-work app, and it benefits app developers by offering a really useful set of abstractions that cross graphics APIs.

Questions? cass.everitt@oculus.com