20170318 - Custom VR Resolves


Continuing from the prior post, but with a new shadertoy...

Shadertoy
Direct Link to the 2nd Shadertoy
Change the #define MODE in the source to try the different options. I would have made it a slideshow, but the shader explodes.

Warning: this is a post about temporal effects, so actually running the shadertoy and trying the options is required to see what is being described. The stills below look quite different from the live version.

Unwarped
The unwarped view from last time.



Warped Nearest
#define MODE 0
NoAA and then warped with a nearest fetch. The "warp" isn't any specific VR warp, just a proxy operation which has a mix of under- and over-sampled regions. Enough to be able to compare techniques.
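As a rough illustration, a proxy warp plus nearest fetch might look like the following. This is a minimal sketch, not the shadertoy's actual code; warp() here is a hypothetical stand-in for a real HMD distortion.

// Hypothetical proxy warp: a barrel-like distortion around the center,
// giving a mix of under- and over-sampled regions.
vec2 warp(vec2 uv) {
    vec2 c = uv - 0.5;
    return 0.5 + c * (1.0 + 0.5 * dot(c, c));
}

// MODE 0: snap the warped coordinate to the nearest texel center.
vec4 warpNearest(sampler2D tex, vec2 uv, vec2 res) {
    vec2 p = (floor(warp(uv) * res) + 0.5) / res;
    return texture(tex, p);
}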



Warped Bilinear (Linear Filter)
#define MODE 1
NoAA and then warped with a bilinear fetch (in a linear colorspace; it would look smoother, but incorrect, if filtered in a perceptual space).
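A minimal sketch of a manual bilinear fetch done correctly in linear colorspace, reusing the proxy warp() from the sketch above. The gamma 2.0 conversion is a cheap stand-in for proper sRGB, not what the shadertoy necessarily does.

// Linearize a single texel (perceptual -> linear, gamma 2.0 approximation).
vec3 fetchLinear(sampler2D tex, vec2 p, vec2 res) {
    vec3 c = texture(tex, (p + 0.5) / res).rgb;
    return c * c;
}

// MODE 1: bilinear weights applied to linearized taps, then converted
// back to a perceptual space for display.
vec3 warpBilinear(sampler2D tex, vec2 uv, vec2 res) {
    vec2 p = warp(uv) * res - 0.5;
    vec2 i = floor(p);
    vec2 f = p - i;
    vec3 c0 = mix(fetchLinear(tex, i,              res),
                  fetchLinear(tex, i + vec2(1, 0), res), f.x);
    vec3 c1 = mix(fetchLinear(tex, i + vec2(0, 1), res),
                  fetchLinear(tex, i + vec2(1, 1), res), f.x);
    return sqrt(mix(c0, c1, f.y));
}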



Standard PC VR Practice
#define MODE 2
4xMSAA (or 4xSGSSAA if super-sampled) box resolve, then warped with a bilinear fetch. This might look different from standard practice because the resolve here is linear. Skipping the chromatic aberration to make it easier to see resolve quality. Shows the typical problems with the standard practice.

The MSAA box resolve is a poor quality lowpass operation, so all the high-frequency data is destroyed. Then a poor bilinear tap to warp. Impossible to get good pixel quality. In motion, on the mega-sized pixels of VR HMDs, the artifacts are instantly immersion breaking for me.
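For reference, the box resolve is nothing more than an equal-weight average of the MSAA samples. This is a desktop GLSL sketch (Shadertoy has no MSAA surfaces) and the names are illustrative.

uniform sampler2DMS msaaTex;

// Equal weights over the 4 samples: a box filter, i.e. a weak lowpass
// that throws away most of the high-frequency information.
vec4 boxResolve(ivec2 p) {
    vec4 c = vec4(0.0);
    for (int s = 0; s < 4; s++)
        c += texelFetch(msaaTex, p, s);
    return c * 0.25;
}

// The compositor then warps this resolved image with one bilinear tap.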



App Owned Warp
#define MODE 3
Showing what is possible with standard triangle rendering if the app owned the warping operation, instead of the common PC practice of being forced to submit the pre-warp (not yet warped) image. 4xMSAA (or 4xSGSSAA if super-sampled) gaussian resolve and warp in one operation. The shader isn't showing an optimal implementation by any means, but it does show what results are possible. Tuned to use a little grain and a kernel which doesn't fully remove the sample pattern on under-sampling.

Notice in the live version, in motion, that the image does not feel locked to pixels. Pixel quality is good, and the eye can resolve an edge to much higher spatial precision than with the common practice.
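A minimal sketch of the one-pass resolve+warp idea, under the same assumptions as before: desktop GLSL, the standard 4xMSAA sample positions, the proxy warp() from the first sketch, and an illustrative kernel width and grain amount. Not the shadertoy's implementation.

uniform sampler2DMS msaaTex;

// Standard 4xMSAA (rotated grid) sample offsets, in pixels.
const vec2 kOff[4] = vec2[4](vec2(-0.125, -0.375), vec2( 0.375, -0.125),
                             vec2(-0.375,  0.125), vec2( 0.125,  0.375));

vec3 resolveWarp(vec2 uv, vec2 res) {
    vec2 p = warp(uv) * res; // warped position in source pixels
    ivec2 base = ivec2(floor(p - 0.5));
    vec3 sum = vec3(0.0);
    float wSum = 0.0;
    // Gaussian-weight the individual MSAA samples of a 2x2 pixel
    // footprint around the warped position: resolve and warp in one pass.
    for (int y = 0; y < 2; y++)
    for (int x = 0; x < 2; x++) {
        ivec2 q = base + ivec2(x, y);
        for (int s = 0; s < 4; s++) {
            vec2 d = (vec2(q) + 0.5 + kOff[s]) - p;
            float w = exp(-2.0 * dot(d, d));
            sum += texelFetch(msaaTex, q, s).rgb * w;
            wSum += w;
        }
    }
    vec3 c = sum / wSum;
    // A little grain to break up the remaining sample-pattern structure.
    float g = fract(sin(dot(p, vec2(12.9898, 78.233))) * 43758.5453);
    return c + (g - 0.5) * (4.0 / 255.0);
}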


And unwarped for comparison (to see how much under and over-sampling is happening).




Focusing on Quality
Ironically, PSVR's lower resolution screen has slightly more pixels per second at 120Hz than VIVE at 90Hz, and typically PC VR uses over 4x the GPU power of a PS4. I find this amazing, and a testament to what can be done when people focus on quality at the systems level. See Higher Res Without Sacrificing Quality for a taste of what is possible.
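Back-of-envelope, assuming the usual panel specs (PSVR at 1920x1080, VIVE at 2160x1200):

PSVR: 1920 x 1080 x 120 Hz = ~249 Mpix/s
VIVE: 2160 x 1200 x  90 Hz = ~233 Mpix/s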

In a world where everyone is so focused on hitting VR problems with inefficient tools like multi-view, Simon Hall and Joe Milner-Moore just show exactly how well it can be done with something better, and something which is fully backwards compatible and portable.

Mixing the kind of custom resolves I'm describing with the stencil masking in the above presentation is possible. I prefer to keep the standard 4xMSAA pattern instead of using programmable sample positions to move back to a 2x2 grid. Also, with the technique I'm describing, it can be best to do the RGB resolve+warp to the position for the green channel, without attempting to add in chromatic aberration (an optimization to reduce the filter cost), then apply aberration to the red and blue channels in a second pass after the resolve.
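A minimal sketch of that two-pass split, with illustrative names and an illustrative radial aberration model. Pass 1 is the resolve+warp targeting green's warped position (as sketched earlier); pass 2 is the cheap per-channel offset.

// Pass 2: read the already resolved+warped image and offset only the
// red and blue channels; green stays put since the resolve already
// targeted the green channel's warped position.
uniform sampler2D resolvedTex;

vec3 applyAberration(vec2 uv) {
    vec2 c = uv - 0.5;
    vec2 uvR = 0.5 + c * 1.005; // hypothetical per-channel radial scales
    vec2 uvB = 0.5 + c * 0.995;
    return vec3(texture(resolvedTex, uvR).r,
                texture(resolvedTex, uv ).g,
                texture(resolvedTex, uvB).b);
}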