Personal Opinion and Reply
The video makes it clear that vendors doing HDR demos should not artificially cripple the SDR/LDR display,
because even the non-technical press will hold the vendor accountable.
I'm happy to see people thinking for themselves and challenging what they see.
If you are a vendor, do yourself a favor and take the time to do it right and use color calibrated displays.
It is also clear that the industry is doing a horrible job explaining the importance of ambient room lighting level on HDR content.
Based on the video, it is clear that the game has a different exposure bias for the SDR and HDR grades.
The SDR grade's exposure bias is targeted at a brighter room ambient level,
and the HDR grade's exposure bias is targeted for a much darker room.
Clearly when the room gets bright enough, the mid-level of the HDR grade will feel "too dark".
The only way to resolve these kinds of issues is for the game to have an exposure level control in the video settings for both SDR and HDR,
then for the reviewer to set the exposure so the mid-levels match on both SDR and HDR output,
and so that the mid-level is appropriate for the viewing condition.
Once mid-level is matching on the displays, then one can make some more objective comparisons of shadows and highlights.
Ultimately the burden is on the industry to teach gamers how to properly adjust in-game exposure controls to get the right experience.
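The mid-level matching described above can be sketched in a few lines. This is a minimal illustration, assuming linear scene values and bias numbers chosen purely for the example (they are not values from the game in question):

```python
def apply_exposure(linear_value, exposure_stops):
    """Scale a linear scene value by an exposure bias given in stops
    (one stop doubles or halves the light)."""
    return linear_value * (2.0 ** exposure_stops)

# Hypothetical example: mid-grey (18% reflectance) through two different
# exposure biases, standing in for separate SDR and HDR grades.
mid_grey = 0.18
sdr_bias = 0.0    # assumed bias targeted at a brighter room
hdr_bias = -1.0   # assumed bias targeted at a darker room

sdr_mid = apply_exposure(mid_grey, sdr_bias)   # 0.18
hdr_mid = apply_exposure(mid_grey, hdr_bias)   # 0.09

# To compare the two outputs fairly, the reviewer would raise the HDR
# exposure by +1 stop so the mid-levels match:
assert abs(apply_exposure(mid_grey, hdr_bias + 1.0) - sdr_mid) < 1e-12
```

Once the mid-levels agree, any remaining difference between the two outputs is in the shadows and highlights, which is the comparison that actually matters.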
Another problem with "HDR" is accidental false advertising, because so few people actually understand color.
The reality is that LCD ANSI contrast for consumer panels has not changed much: panels are still typically around 10 stops.
The industry already had been selling at least one 12-stop ANSI contrast panel in an "LDR" display before "HDR" was marketed.
If some new "HDR" displays ship with 12-stop panels, that really isn't anything new,
and you don't need HDR10 or anything else to use the 12 stops of dynamic range that the panel offers.
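The relationship between contrast ratio and stops is just a base-2 logarithm. A quick sketch, using round example luminance numbers rather than measurements of any specific panel:

```python
import math

def contrast_stops(white_nits, black_nits):
    """Dynamic range in photographic stops: log2 of the contrast ratio."""
    return math.log2(white_nits / black_nits)

# A typical ~1000:1 ANSI-contrast LCD is about 10 stops:
print(round(contrast_stops(1000.0, 1.0), 1))   # ~10.0
# A 4096:1 panel would be exactly 12 stops:
print(round(contrast_stops(4096.0, 1.0), 1))   # 12.0
```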
High-quality in-app temporal dithering with standard sRGB or Gamma 2.2 at 8 bits per pixel is more than good enough to remove visible banding on a 12-stop panel.
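The dithering idea is simple: add noise on the order of one quantization step before rounding, and vary the noise per frame so the error averages out over time. A minimal sketch with uniform noise (real implementations often prefer triangular or blue noise, and would run on the GPU rather than in NumPy):

```python
import numpy as np

def quantize_with_dither(value, bits=8, rng=None):
    """Quantize a [0,1] signal to the given bit depth, adding uniform
    noise of one quantization step before rounding. Changing the noise
    each frame makes this temporal dithering."""
    rng = np.random.default_rng() if rng is None else rng
    levels = (1 << bits) - 1
    noise = rng.uniform(-0.5, 0.5, size=np.shape(value))
    return np.clip(np.round(value * levels + noise), 0, levels) / levels

# A shallow gradient that would visibly band at 8 bits:
grad = np.linspace(0.10, 0.11, 1024)

# Averaged over many "frames", the dithered output approaches the
# original signal, which is why the banding disappears over time:
frames = np.mean([quantize_with_dither(grad) for _ in range(256)], axis=0)
```

Uniform 1-LSB dither before rounding makes the expected output equal the input, so the only residual is zero-mean noise that the eye integrates away across frames.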
Even "HDR" wide gamut isn't anything new; the industry has been selling AdobeRGB LCD panels targeted at print professionals for a long time.
The shift to DCI-P3 instead of AdobeRGB does not substantially change the area of the gamut.
And ultimately, if the game is mastered for Rec709 (sRGB and standard HDTV) primaries, then a wide-gamut display does not matter.
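The gamut-area claim is easy to check from the published CIE 1931 xy chromaticities of the primaries, using the shoelace formula for a triangle. Note that area in xy space is only a rough comparison metric, but it is enough to show how close the two gamuts are:

```python
def triangle_area(primaries):
    """Shoelace area of a gamut triangle from the (x, y) chromaticity
    coordinates of its R, G, B primaries."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

# CIE 1931 xy primaries:
adobe_rgb = [(0.6400, 0.3300), (0.2100, 0.7100), (0.1500, 0.0600)]
dci_p3    = [(0.6800, 0.3200), (0.2650, 0.6900), (0.1500, 0.0600)]
rec709    = [(0.6400, 0.3300), (0.3000, 0.6000), (0.1500, 0.0600)]

print(triangle_area(adobe_rgb))  # ≈ 0.1512
print(triangle_area(dci_p3))     # ≈ 0.1520
print(triangle_area(rec709))     # ≈ 0.1120
```

AdobeRGB and DCI-P3 differ in area by well under one percent; they trade coverage in greens for coverage in reds, but neither is meaningfully "wider" than the other.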
The real advantage of future "HDR" displays is support for something like FreeSync2, which enables the game to query the monitor's characteristics
and then adapt its output to the contrast and gamut of the display.
This push for better control over color is what is actually important for future displays.
Hopefully at some point displays will pick up ambient sensors, so games can do some amount of automatic exposure adjustment.
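One plausible shape for that automatic adjustment: map the ambient reading to an exposure bias in stops. Everything here is an illustrative assumption (the reference level, the strength factor, and the function name are all hypothetical, not from any standard or API):

```python
import math

def ambient_exposure_bias(ambient_lux, reference_lux=50.0, strength=0.5):
    """Hypothetical mapping from an ambient-light-sensor reading to an
    exposure bias in stops: raise exposure as the room gets brighter
    than some reference level. Both reference_lux and strength are
    made-up illustrative values, not standard ones."""
    return strength * math.log2(max(ambient_lux, 1e-3) / reference_lux)

print(ambient_exposure_bias(50.0))    # 0.0 stops at the reference level
print(ambient_exposure_bias(200.0))   # +1.0 stop in a brighter room
```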
The real danger IMO of current "HDR" displays is features like "local dimming", which create massive uniformity artifacts,
and in-display tonemappers, which are effectively random transfer functions that vary from display to display and make it harder for a developer
to get content correctly displayed for the user.
There is one great prospect for LCD HDR displays: Panasonic's per-pixel backlight control, which would eliminate "local dimming" artifacts.
I have yet to see one of these displays, so I don't actually know how good they are.
Also not sure how long it will take for these to reach consumer-level displays ...