Using Unity at Valve - Vision Summit 2016

-smash-


vision-summit-logo.png

Speaking at Vision Summit 2016, Joe Ludwig gave an overview of the SteamVR, OpenVR, and Unity VR APIs, and the power they bring to VR application developers. I've gone ahead and transcribed his presentation to provide a document for future reference. I think this is important because Joe outlines some of the long-term goals of SteamVR and OpenVR, both very important sets of standards for the VR ecosystem.

In the spoiler tag below, you can find a video of Joe Ludwig's presentation, as well as my transcript. Some of the information is new, some is a refresher, but most of it is technical jargon, so you have been warned. Enjoy!


At the beginning of his presentation, Joe Ludwig took some time to explain the differences between SteamVR and OpenVR.


steam-vr-logo.png

SteamVR is the name for the work that Valve is doing in VR. That includes the technology they've developed that's shipping with the HTC Vive, the APIs that are provided to application and hardware developers, and Steam itself running in VR.

SteamVR provides an in-application VR dashboard. Steam itself is included in this dashboard, and the Steam client uses public APIs to provide its overlay there. The set of operations that Steam provides is what you would expect: launch games, browse the store, buy games, chat with friends, etc. The dashboard also gives the user access to VR settings from inside the VR experience, as well as controls for the whole system, such as turning off controllers or exiting the VR system.

SteamVR also provides the render-model API, which gives access to high-quality models of whatever device the user is holding in their hand at the moment. This includes animation data for the device, so the model animates to match the current state of the controller, e.g. a button being pressed or a finger touching a point on the trackpad. All of this mesh and texture data is provided to the application so that it can recolor or light the controller model in an appropriate way.
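
To make this concrete, here's a minimal sketch of pulling a controller's model through the render-model API, using OpenVR's C++ header (openvr.h). The function name is made up for the example; the interface calls are the public IVRRenderModels ones:

```cpp
#include <openvr.h>
#include <cstdio>

// Minimal sketch: fetch the render model for a tracked device so the
// application can draw (and recolor or relight) it with its own renderer.
// Assumes vr::VR_Init() has already succeeded.
void LoadControllerModel( vr::TrackedDeviceIndex_t deviceIndex )
{
    // Ask the system which render model this device uses.
    char modelName[ vr::k_unMaxPropertyStringSize ];
    vr::VRSystem()->GetStringTrackedDeviceProperty( deviceIndex,
        vr::Prop_RenderModelName_String, modelName, sizeof( modelName ) );

    // Load the mesh asynchronously; a real app would retry next frame
    // instead of spinning in place like this.
    vr::RenderModel_t *pModel = nullptr;
    vr::EVRRenderModelError err;
    while ( ( err = vr::VRRenderModels()->LoadRenderModel_Async( modelName, &pModel ) )
            == vr::VRRenderModelError_Loading )
        ;
    if ( err != vr::VRRenderModelError_None )
        return;

    // pModel->rVertexData and rIndexData now hold the mesh; the diffuse
    // texture can be fetched the same way via LoadTexture_Async().
    printf( "Loaded %s: %u vertices\n", modelName, pModel->unVertexCount );
    vr::VRRenderModels()->FreeRenderModel( pModel );
}
```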

steamvr-1.jpg steamvr-2.jpg
steamvr-5.jpg
steamvr-3.jpg steamvr-4.jpg
SteamVR also provides a system that lets the user define the boundary of the obstacles that exist around them in their room - this is part of the Chaperone system. If a user gets too close to the boundary, its outline will appear in their HMD to warn them that they're about to collide with the real world. For headsets that have a forward-facing camera, such as the HTC Vive, Chaperone includes the camera view of what the user is about to run into. Inset within the Chaperone bounds is the safe-play area, which gives applications a guide for where to place objects the user will interact with. If your application puts something inside that blue rectangle, the user will be able to reach it. This all scales from a seated experience up to a full-room experience.
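
Applications can read that safe-play rectangle themselves through the Chaperone interface. A minimal sketch, assuming an OpenVR session is already initialized (the function name here is hypothetical):

```cpp
#include <openvr.h>
#include <cstdio>

// Minimal sketch: query the user's safe-play area (the blue rectangle)
// so the application knows where to place interactive objects.
void QueryPlayArea()
{
    float sizeX = 0.0f, sizeZ = 0.0f;
    if ( vr::VRChaperone()->GetPlayAreaSize( &sizeX, &sizeZ ) )
        printf( "Safe play area: %.2fm x %.2fm\n", sizeX, sizeZ );

    // The four corners of the rectangle (on the floor plane, in
    // standing-universe coordinates) are available too:
    vr::HmdQuad_t rect;
    if ( vr::VRChaperone()->GetPlayAreaRect( &rect ) )
    {
        // rect.vCorners[0..3] give the exact footprint.
    }
}
```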

steamvr-6.jpg steamvr-7.jpg
Another API available to applications is the notifications API. Right now Valve is using it for Steam toast alerts, but VR applications can use it for whatever they want. Notifications can appear wherever the user is, and the user can interact with them however the developer wants. There's also support for a VR keyboard that lets the user enter text with the controllers.
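
Here's roughly what posting one of those notifications looks like in C++ against openvr.h. Notifications hang off an overlay, so one is created first; the overlay key and name strings are made up for the example:

```cpp
#include <openvr.h>

// Minimal sketch: post a transient notification. Error handling is
// omitted for brevity.
void ShowHelloNotification()
{
    vr::VROverlayHandle_t overlay = vr::k_ulOverlayHandleInvalid;
    vr::VROverlay()->CreateOverlay( "example.notifications", "Example", &overlay );

    vr::VRNotificationId notificationId = 0;
    vr::VRNotifications()->CreateNotification(
        overlay,
        0,                                    // user value, passed back in events
        vr::EVRNotificationType_Transient,    // show once, then go away
        "Hello from the notifications API",
        vr::EVRNotificationStyle_Application,
        nullptr,                              // optional bitmap icon
        &notificationId );
}
```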

steamvr-9.jpg steamvr-10.jpg

In addition to these user-facing tools, SteamVR provides performance timing tools for developers. The graph below shows how much CPU and GPU time is being spent by each component of the system, which helps developers determine where the bottlenecks are.

steamvr-8.jpg

All of this is available inside every OpenVR application with no effort from the developer.
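
For developers who want the raw numbers behind that timing graph, the compositor also exposes them programmatically. A minimal sketch against openvr.h (the function name is made up):

```cpp
#include <openvr.h>
#include <cstdio>

// Minimal sketch: log per-frame GPU cost from the compositor's frame
// timing data. Assumes an active compositor session.
void LogFrameTiming()
{
    vr::Compositor_FrameTiming timing;
    timing.m_nSize = sizeof( vr::Compositor_FrameTiming ); // must be set before the call

    if ( vr::VRCompositor()->GetFrameTiming( &timing, 0 /* most recent frame */ ) )
    {
        printf( "render GPU: %.2fms  compositor GPU: %.2fms  dropped frames: %u\n",
                timing.m_flTotalRenderGpuMs,
                timing.m_flCompositorRenderGpuMs,
                timing.m_nNumDroppedFrames );
    }
}
```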

open-vr-logo.png

OpenVR is the pair of APIs that Valve provides for interacting with the VR system.

The first is the application API, used for developing VR applications. It provides object transforms, an interface to the compositor for sending textures to display on the HMD, up-to-date input state for the controllers, access to device models at runtime, access to the user's Chaperone configuration, and more. Supporting this API gives developers the flexibility to access not only current VR hardware in an abstract way, but also future hardware from new and existing manufacturers.
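
As a rough illustration of that application API, here's the core of a frame in C++ against openvr.h: block for poses, render each eye with your own engine, then hand a texture per eye to the compositor. The function and its parameters are stand-ins, and an OpenGL renderer is assumed:

```cpp
#include <openvr.h>

// Rough sketch of the per-frame loop an OpenVR application runs.
void RenderFrame( void *leftEyeTexture, void *rightEyeTexture )
{
    // Blocks until "running start", and returns up-to-date device transforms.
    vr::TrackedDevicePose_t poses[ vr::k_unMaxTrackedDeviceCount ];
    vr::VRCompositor()->WaitGetPoses( poses, vr::k_unMaxTrackedDeviceCount, nullptr, 0 );

    // ... render the scene for each eye using poses[ vr::k_unTrackedDeviceIndex_Hmd ] ...

    // Submit one texture per eye for display on the HMD.
    vr::Texture_t left  = { leftEyeTexture,  vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
    vr::Texture_t right = { rightEyeTexture, vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
    vr::VRCompositor()->Submit( vr::Eye_Left,  &left );
    vr::VRCompositor()->Submit( vr::Eye_Right, &right );
}
```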

The other OpenVR API is for devices - the driver API. Hardware developers use this API to add new devices to the set of things that work with OpenVR, and when they do, existing applications get immediate access to those devices. So, for example, as Joe Ludwig put it: if 100 OpenVR applications ship this year, and next year a hardware vendor releases an OpenVR driver for their new hardware, that hardware immediately works with all 100 of those applications - without application developers having to update their titles.

openvr-1.jpg
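
On the driver side, a vendor implements the ITrackedDeviceServerDriver interface from openvr_driver.h (along with a provider class that registers devices, omitted here). A skeletal sketch - ExampleTracker is a made-up name, and only the pose path is filled in:

```cpp
#include <openvr_driver.h>

// Skeletal sketch: implementing this interface lets SteamVR treat a new
// device like any other tracked device.
class ExampleTracker : public vr::ITrackedDeviceServerDriver
{
public:
    vr::EVRInitError Activate( uint32_t unObjectId ) override
    {
        m_objectId = unObjectId; // handle used when reporting pose updates
        return vr::VRInitError_None;
    }
    void Deactivate() override { m_objectId = vr::k_unTrackedDeviceIndexInvalid; }
    void EnterStandby() override {}
    void *GetComponent( const char * ) override { return nullptr; }
    void DebugRequest( const char *, char *pchResponse, uint32_t unSize ) override
    {
        if ( unSize > 0 ) pchResponse[0] = '\0';
    }
    vr::DriverPose_t GetPose() override
    {
        vr::DriverPose_t pose = {};
        pose.qWorldFromDriverRotation.w = 1; // identity transforms for the sketch
        pose.qDriverFromHeadRotation.w = 1;
        pose.qRotation.w = 1;
        pose.poseIsValid = true;
        pose.deviceIsConnected = true;
        pose.result = vr::TrackingResult_Running_OK;
        // ... fill pose.vecPosition / pose.qRotation from real sensor data ...
        return pose;
    }
private:
    uint32_t m_objectId = vr::k_unTrackedDeviceIndexInvalid;
};
```

Because applications only ever see the abstract device interface, a driver along these lines is all a vendor needs for existing titles to pick up their hardware.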

What makes OpenVR "Open" is the lack of barriers between application and hardware developers and their customers. There's no requirement to ship on Steam. There are no restrictions on the type of content that can be made. The only people involved in deciding what's good enough are the developers and customers.

So, in conclusion: OpenVR is the API, while SteamVR is the customer-facing name for the larger system that users actually install.


unity-logo.png

Many of the games being developed for VR use the OpenVR standard, and many of them do so through the Unity engine and the SteamVR plugin. Noticing the large number of developers using Unity, Valve wanted to gain some experience with it firsthand...

Valve used the plugin to develop the Secret Shop VR demo that debuted at The International 2015. Secret Shop pulls characters from Dota 2 into a 5-minute interactive story. The assets were brought in straight from Dota 2, and the demo was built up from there.

unity-2.jpg

Unity is also used for tools that are shipped as part of SteamVR, such as:
  • The Room Setup tool that the user runs to tell the system where the physical obstacles are in their environment.
  • Demo-transition content.
But why choose Unity? Joe says the main reason Valve is using Unity is the same reason other developers are using it: it's fast to get up and running, and it lets developers focus much more on content. Valve recognized the engine's popularity among developers and wanted firsthand experience with it. That's not to say Valve has abandoned its own in-house engine (obviously not), but Valve thinks that solving everyone's problems also solves its own.

Valve has run into some challenges with the SteamVR Unity plugin, all of which have to do with performance.

For one thing, traversing the game scene is slow. Because SteamVR is supported through a plugin, it doesn't have access to the VR-specific optimizations that Unity has added to the engine. With a plugin, you have to render the scene from two independent cameras, so the scene is traversed twice - effectively doubling that portion of the frame's cost. To fix this problem, Valve has a few things they'd like to do.

First, OpenVR is being added to the native VR API in Unity 5.4. Valve has been working with Unity on this, and it should be in the 5.4 beta in a few weeks. The integration will be free for all Unity developers. This means the SteamVR plugin is going to change: some of the work the plugin does in Unity 5.3 - specifically rendering and tracking - will move over to the native Unity VR API in 5.4. Features that the Unity VR API doesn't support will continue to live in the plugin, including controller input, overlays, and render models.

When you write an application against the Unity VR API, it selects the Oculus SDK, OpenVR, or a mobile or console VR SDK as appropriate. But if you're writing your application for a platform that isn't yet supported natively, you'll continue to go through the SteamVR plugin.

unity-3.jpg

unity-4.jpg

A new feature coming for Unity 5.4, either in the SteamVR plugin or in a new, separate plugin, is Enhanced Rendering. VR applications on the PC need to render at 90 frames per second - the native refresh rate of the HTC Vive and Oculus Rift - and it's important to hit that target so the user stays as comfortable as possible. That's hard to do, and it's where Enhanced Rendering comes in. Valve's Alex Vlachos described many advanced rendering techniques for VR in a GDC 2015 presentation, and some of them will be pulled into the SteamVR plugin for Unity. The techniques consist of shaders and scripts that improve rendering performance for VR applications.

Lighting, specifically dynamic lighting, is a big part of the Enhanced Rendering plugin. Level designers and artists want to include as many dynamic lights as possible because they increase the richness of a scene. But many dynamic lights come at a cost, and the usual answer to that cost is deferred rendering. Unfortunately, deferred rendering does not support MSAA, which is very important for VR experiences. So with Enhanced Rendering, Valve is taking a different path: dynamic lighting is still the goal, but instead of going deferred, they're providing better support for dynamic lights in Unity's forward renderer. The plugin supports up to 10 shadow-casting lights in a single draw call, up from the 4 that stock Unity supports. And because the plugin still uses the forward renderer, MSAA remains available.

The Enhanced Rendering plugin itself is easy to use. It adds a component to the camera properties that lets you control shadows, and it can hide the faster materials to make it easy to find the ones that haven't yet been switched over to the new model. There's also a new realtime-light component for setting lighting parameters, and finally a new material shader. The Enhanced Rendering plugin should arrive on the Unity Asset Store for free around the GDC 2016 timeframe, in early March.
 
Am I crazy, or did the dev speaking throughout that video sound a lot like Gabe Newell?

For some reason, at the beginning of the video I thought it was Gabe talking, and for the rest of the video I could only hear his voice as Gabe's.

Though that's not necessarily a bad thing...
 
Keeps the post from being too long on the front page. There's nothing actually spoilerish in it.

We used to have a pre-break function, but we don't have a very active web dev to implement something new.
 
I know this is not the place, but I hope we don't see Gabe Newell - or anyone else of remotely good standing in their customers' eyes - shaking hands with Unity CEO John Riccitiello.

Riccitiello, btw, has a salary of $800k. I don't understand why Unity let a wolf in; he isn't to be trusted.
Again, I know this is not the place, it just had to be mentioned. I don't think many people out there would shake that hand - they'd rather go for the smack.
 
Quote:
Keeps the post from being too long on the front page. There's nothing actually spoilerish in it.

We used to have a pre-break function, but we don't have a very active web dev to implement something new.
Sigh... maybe it's time for me to get involved again.

Great post by the way. I really appreciate the transcript as I much prefer to read this sort of thing rather than watching a video.
 