20120101 - Games vs Film

2012 through 2014 could be a very interesting time for real-time graphics if game developers realize that even at native 1920x1080 without MSAA they are already effectively super-sampling compared to movies. In case this is not obvious, go look at some direct screen grabs from Blu-rays. For example, here is a crop, then the full direct 1080p screen grab, from Ironman (click images in this post for full size),

You won't find any pixel-width detail in a film, and anything anywhere near a pixel in width loses contrast.

Where are the forum flame fests about how the director "added blur" by not getting the in-focus circle of confusion smaller than a pixel, or about how the textures used in the Ironman CG shots lack detail?

Compare this to a possibly doctored "perfect" driver SSAA override screen grab from Skyrim posted on NeoGAF (again, click images for full size),

The industry status quo is to push ultra high display resolution, ultra high texture resolution, and ultra sharpness.

IMO a more interesting next-generation metric is: can an engine on an ultra-high-end PC rendering at 720p look as real as a DVD-quality movie? Note, a high-end PC at 720p and 30Hz has a budget of upwards of a few thousand texture fetches and upwards of 100,000 flops per pixel per frame.
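As a rough sanity check on that budget, here is the arithmetic spelled out. The GPU throughput figures below are my own assumptions for a high-end card of the era, not numbers from this post:

```python
# Rough per-pixel, per-frame budget at 720p / 30Hz on a high-end PC GPU.
# The GPU throughput numbers are assumptions (roughly a 2012 flagship).
GPU_FLOPS  = 3.0e12   # assumed ~3 TFLOP/s of shader ALU
GPU_TEXELS = 130.0e9  # assumed ~130 Gtexel/s of filtered texture fetch
WIDTH, HEIGHT, HZ = 1280, 720, 30

pixels_per_second = WIDTH * HEIGHT * HZ            # 27,648,000
flops_per_pixel   = GPU_FLOPS  / pixels_per_second
fetches_per_pixel = GPU_TEXELS / pixels_per_second

print(f"{flops_per_pixel:,.0f} flops/pixel/frame")     # ~108,507
print(f"{fetches_per_pixel:,.0f} fetches/pixel/frame") # ~4,702
```

Which lands right on "upwards of 100,000 flops" and "a few thousand texture fetches" per pixel per frame.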


Another interesting comparison, one which provides a numerical answer, is to down-sample shots from BR movies with un-filtered decimation and see at what point edge gradients start to match high-quality MSAA or CSAA shots from games. For instance, with the same Ironman shot, one can use a nearest-neighbor down-sample to 33% of the original size and get something with edge gradients visually similar to those of high-quality AA in games,

And the full size un-filtered down-sample,

In practice, it seems as if somewhere between a 50% and a 33% nearest-neighbor down-sample is typically required to make BR movies match the sharpness of the high-quality AA found in games. Intuitively this makes sense, as 50% is the natural lower bound: detail under 2 pixels in width will alias under motion. Conversely, it seems as if 1080p BR movies typically have only somewhere between 350p and 540p of effective "game resolution". Or, in game AA terms, BR movies get an extra 2x2 or 3x3 OGSSAA, but instead of being resolved down to that 350p-to-540p resolution, they get a DOF "blur" applied at 1080p...
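The un-filtered decimation above can be sketched in a few lines of NumPy: keeping every 2nd or 3rd pixel via strided slicing gives the 50% and 33% nearest-neighbor down-samples (the zero-filled `frame` is a stand-in for an actual 1080p screen grab):

```python
import numpy as np

def decimate(frame, step):
    """Un-filtered nearest-neighbor down-sample: keep every step-th
    pixel and drop the rest (no filtering, so no blur is added)."""
    return frame[::step, ::step]

# Stand-in for a 1080p Blu-ray screen grab (real use: load the PNG).
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)

half  = decimate(frame, 2)  # 50% -> 540x960, the "540p" bound
third = decimate(frame, 3)  # 33% -> 360x640, roughly the "350p" bound
print(half.shape[:2], third.shape[:2])  # (540, 960) (360, 640)
```

Note the 33% decimation yields 360 rows, which is where the "somewhere around 350p" effective-resolution figure comes from.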