
Unity Debug Diaries: Camera Glitch in Unity VR


I have been doing a bunch of VR work lately, with the Gear VR in particular. The Gear is actually a nice piece of kit, and I can’t wait to get my hands on the Oculus Go coming out… soon. As of this post, the Gear VR has the largest market penetration of any VR headset. One bug I encountered that was tricky to nail down was an optical glitch when moving from one part of my app to the next. It looks like an uninitialized RenderTexture: different every time, with glitchy artifacts everywhere.

Turns out the culprit was render scale, now known as XRSettings.eyeTextureResolutionScale. ETRS (because I’m not typing it out every time) is great for getting "free" anti-aliasing on everything in your scene. It renders each eye at whatever scale ETRS is set to (e.g. 1.0, 1.2) and then downscales to 1 for the final blit. Since it’s a downscale (bilinear, I think), it’s best never to exceed 1.5 or things will start to look crunchy. If you just want to downscale, use XRSettings.renderViewportScale, which is way cheaper than ETRS. It only accepts values from 0 to 1, hence downscale. By doing this, you trade visual fidelity for performance.
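A minimal sketch of the two settings described above (the specific values here are just illustrative, not recommendations from the original post):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: supersample slightly for cheap anti-aliasing, or shrink the
// viewport to trade visual fidelity for performance.
public class EyeScaleExample : MonoBehaviour
{
    void Start()
    {
        // Render each eye at 1.2x resolution, then downscale (bilinear)
        // for the final blit. Above ~1.5 things start to look crunchy.
        XRSettings.eyeTextureResolutionScale = 1.2f;

        // Much cheaper than ETRS: reuses the same eye textures but
        // renders to a smaller viewport. Only accepts values in (0, 1].
        XRSettings.renderViewportScale = 0.8f;
    }
}
```

Note that changing eyeTextureResolutionScale reallocates the eye textures, while renderViewportScale does not, which is part of why it is so much cheaper.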

Back to the issue at hand: changing eyeTextureResolutionScale at runtime causes a new texture set to be allocated for each camera, and those textures are uninitialized. That means they have been allocated space in memory, but whatever previously occupied that memory is still there. This creates the weird memory artifacts that you see for an instant, before all cameras have rendered (the glitch).


Update

So to fix… I could not find an elegant solution. I tried a whole bunch of things, from calling GL.Clear on pre- and post-render to changing the cameras’ clear flags, forcing a render, and then restoring them. Nothing worked (note: this was on Gear VR specifically). What I ended up doing was using OVROverlay as a fade canvas that is permanently attached to my camera. So when I fade to black, I fade a 2x2 texture (because 1x1 does not work, as it’s not a power of 2?) using SetPixels32. When faded, nothing rendered in Unity is visible.
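The workaround above could be sketched roughly like this. This is a hypothetical reconstruction, not the post’s actual code: the class name, the coroutine, and wiring the texture in via OVROverlay’s `textures` array are all my assumptions.

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch of the fade workaround: an OVROverlay attached to
// the camera shows a tiny black texture whose alpha we animate, hiding
// the uninitialized eye textures while the scale change happens.
public class OverlayFader : MonoBehaviour
{
    public OVROverlay fadeOverlay; // assumed: permanently on the camera
    Texture2D fadeTex;

    void Awake()
    {
        // 2x2 rather than 1x1 (1x1 reportedly did not work on Gear VR).
        fadeTex = new Texture2D(2, 2, TextureFormat.RGBA32, false);
        SetAlpha(0f);
        fadeOverlay.textures[0] = fadeTex;
    }

    public IEnumerator FadeToBlack(float duration)
    {
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            SetAlpha(t / duration);
            yield return null;
        }
        SetAlpha(1f); // fully opaque: nothing rendered in Unity is visible
    }

    void SetAlpha(float a)
    {
        var black = new Color32(0, 0, 0, (byte)(a * 255f));
        fadeTex.SetPixels32(new[] { black, black, black, black });
        fadeTex.Apply(); // upload the change to the GPU
    }
}
```

With the overlay fully opaque, you can change eyeTextureResolutionScale, wait a frame or two for every camera to render into the new textures, and then fade back in.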

Right, now go make something cool!