The HDR myth & suggested setup notes for the AppleTV X
I am not a fan of HDR. I can kinda live with Dolby Vision, but HDR has a LOT of technical issues that can only be compensated for, not fixed.
Both HDR and Dolby Vision are perceptually lossy compression schemes.
I would suggest calibrators do one calibration for the ATVX in 4K SDR and a second calibration for 4K Dolby Vision, and avoid 4K HDR if possible. I highly recommend 4K SDR.
My fierce dislike of HDR comes from a deep understanding of its technical aspects. I have been doing best-in-class calibrations and have taken many best-of-show awards for my pictures going back 30 years. I am a SMPTE, SID and SPIE member. I am deeply into the science of displays. I have participated on steering committees for some of these standards and regularly attend SMPTE and SID conferences on display science. I have a deep understanding of HDR and Dolby Vision. I have post-production clients with ATVXs who do current top series and film work.
I will try to make the HDR debacle as simple as I can, as it is a complex thing many do not fully understand, even people in the industry. This is a simplified description of what is going on.
An HDR display is a theoretical device that does not exist; it calls for a range of brightness our current science cannot build. I am not sure you would want one, as the spec implies a screen with highlights literally as bright as the sun and dark scenes as dim as moonless nights. Can you imagine the transition from a dark scene to a bright one? Zero doubt people would turn down the brightness, reducing the HDR display to SDR again.
You have an SDR display. SDR is defined as less than a 1,000,000:1 measured contrast ratio. Not a manufacturer-claimed contrast ratio, but a real standards-based measurement done in a controlled test by a qualified third party. SMPTE standards going back 50 years, and well-established perceptual science, have worked out a maximum brightness that is comfortable for human vision. It is nowhere near a sunlight-bright HDR screen.
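To put that contrast number in more familiar terms, contrast ratio converts to photographic stops with a simple log2. A minimal Python sketch (the 1,000,000:1 figure is the SDR boundary above; the peak and black levels further down are made-up illustration numbers, not measurements of any real display):

import math

def stops(contrast_ratio: float) -> float:
    # Dynamic range in photographic stops for a given contrast ratio
    return math.log2(contrast_ratio)

def contrast(peak_nits: float, black_nits: float) -> float:
    # Measured contrast ratio from peak white and black luminance
    return peak_nits / black_nits

print(round(stops(1_000_000), 1))       # the 1,000,000:1 SDR boundary is ~19.9 stops
print(round(contrast(100.0, 0.0005)))   # e.g. 100 nit peak, 0.0005 nit black -> 200000:1

Even that hypothetical 200,000:1 display is still SDR by the definition above.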
The eye has dynamic limits. If half the screen is super bright and the other half dark, you can't see the dark half as well, so there are limits to how much contrast ratio is useful. HDR/Dolby Vision is a compression scheme to bring a wider range of brightness into the consumer living room without using more bandwidth (bits per second), by using perceptual coding and lossy compression. HDR/Dolby Vision discard whatever the modeling says the average person can't see under normal viewing conditions in order to push a brighter range of picture, just like audio compression does for things like MP3. HDR/DV are a lossy luminance compression scheme that promises a theoretically brighter white and darker black on a display that science has yet to invent.
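To make "perceptual coding" concrete, here is a minimal Python sketch of the SMPTE ST 2084 (PQ) curve that HDR10 and Dolby Vision are built on. The constants come from the published standard; treating the 10-bit codes as full range (0-1023) is a simplification for illustration, since real video signals use a narrower legal range:

def pq_eotf(signal: float) -> float:
    # SMPTE ST 2084 (PQ) EOTF: normalized signal (0..1) -> luminance in nits
    m1 = 2610 / 16384          # 0.1593...
    m2 = 2523 / 4096 * 128     # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32      # 18.8515625
    c3 = 2392 / 4096 * 32      # 18.6875
    e = signal ** (1 / m2)
    y = max(e - c1, 0.0) / (c2 - c3 * e)
    return 10000.0 * y ** (1 / m1)

# Roughly half the code range sits at or below ~100 nits (an SDR-ish level);
# the other half is spent reaching for 10,000 nits no consumer display can do.
for code in (0, 256, 512, 520, 768, 1023):   # 10-bit code values, full range
    print(code, round(pq_eotf(code / 1023), 1), "nits")

Code 520 of 1023 lands at about 100 nits, so roughly half of the 10-bit steps cover the brightness range an SDR picture already uses; the rest are spread over highlights your display cannot reproduce.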
I don't want HDR; it is not a good thing. It was pushed by the Consumer Electronics Association manufacturers as a way to sell TVs, receivers, even cables. It really stands for "High Dynamic Revenue".
Your TV/projector is SDR. Its contrast ratio is not HDR. It is an SDR display. Full stop.
When they shoot content for HDR they capture SDR plus "metadata" about what is bright and dark in each scene. Since HDR DOES NOT HAVE MORE BITS, they spread the bits they do have across a wider range and use perceptual coding to compress what would normally take 14 or 16 bits to cover. They then store metadata describing where those bits could go if the display were actually HDR. So HDR starts off by chopping the luminance (brightness) up, pixel by pixel and scene by scene, into disconnected chunks and doing perceptual compression based on a "standard" human: it puts some bits here and some there and discards data the "average" person might not pick up on under "normal viewing conditions". HDR does not add any new resolution to luminance. There are no more steps between black and white; it just spreads them out in chunks and compresses.
The SDR that is captured, though, is very accurate, and 10 bits is plenty for the eye, with TONS of science and decades of research behind it. So an HDR program is SDR plus HDR metadata.
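To illustrate what that metadata typically contains, here is a minimal sketch of the static metadata an HDR10 stream carries alongside the picture (mastering-display info per SMPTE ST 2086 plus the MaxCLL/MaxFALL content light levels). The numeric values below are made-up examples, not from any real title:

from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    # SMPTE ST 2086 mastering display info
    max_mastering_luminance_nits: float   # peak of the monitor the content was graded on
    min_mastering_luminance_nits: float   # black level of that monitor
    # Content light levels (CTA-861)
    max_cll_nits: int    # brightest single pixel anywhere in the program
    max_fall_nits: int   # highest frame-average light level in the program

# Made-up example values, for illustration only
meta = HDR10StaticMetadata(
    max_mastering_luminance_nits=1000.0,
    min_mastering_luminance_nits=0.0001,
    max_cll_nits=950,
    max_fall_nits=320,
)

A display's tone mapper reads values like these to decide how hard to compress the top of the brightness range; Dolby Vision adds dynamic per-scene metadata on top of the same idea.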
Capture and post production are more complicated, because how do you set the right HDR brightness and see the picture when a true HDR display does not exist for use in post? Only tone-mapped SDR displays can be used. No one has ever seen an actual HDR picture. They just hope and guess.
On the display side: if you had a true HDR display, the bits would simply land in the right place in brightness, no processing required. In fact the perceptual compression might be OK. But no one has an HDR display, and science does not know how to make one.
So, on the consumer display side, the incoming SDR plus HDR metadata decodes into an HDR set of bits. This chunky blocking of compressed brightness bits must be reduced to fit the SDR display's brightness range. That is tone mapping. Tone mapping is, at best, a guess by each manufacturer at how to handle this reduction of contrast ratio and what to do with all these blocks, somehow remapping and reconstructing a continuous black-to-white range that works on your display device. Color also needs tweaking whenever you change the brightness of a pixel. Of course, what was tossed out in perceptual coding is gone forever. There is no standard for tone mapping. Every manufacturer does it differently; no two pictures are alike. The math is highly complex because it varies scene by scene, area by area, pixel by pixel. Lumagen has made a LOT of money off HDR doing this math, and they do it really well, by the way. But it is by no means perfect, and some material was already lost in the perceptual coding. It is also different from what the director saw, because every HDR display has a different set of tone maps. A director/DP watches one display with one set of tone maps, and a consumer sees another. Tone maps are literally hand-tweaked. Some TVs use AI, and every time you run a scene through, it looks different. Dark areas just never work well because of bit starvation: the crude, incomplete perceptual compression, and then the undoing of it on a real TV, does not model dark well, and banding and noise result.
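Since there is no standard, every manufacturer rolls its own curve. Here is a minimal sketch of one of the simplest possible approaches, a Reinhard-style rolloff, chosen purely for illustration; real tone mappers (the BT.2390 reference curve, a Lumagen, a TV's built-in processing) are far more elaborate and scene-adaptive, which is exactly why no two pictures match:

def simple_tone_map(scene_nits: float, display_peak_nits: float = 300.0) -> float:
    # Reinhard-style rolloff: dark values pass through nearly unchanged,
    # while arbitrarily bright input is squeezed asymptotically toward
    # the display's real peak.
    return scene_nits / (1.0 + scene_nits / display_peak_nits)

# Illustration only: where a few scene values land on a ~300 nit display
for nits in (1, 50, 100, 500, 1000, 4000, 10000):
    print(f"{nits:>6} nits in -> {simple_tone_map(nits):6.1f} nits out")

Note the cost: even a 100 nit midtone gets pulled down to 75 nits so the curve has somewhere to put the highlights. A different curve makes a different trade, which is part of why the same scene looks different on every brand.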
At times, very rarely, the tone mapping can give more bits to a section or a scene. So under IDEAL conditions, with a perfect calibration, with the right movie from the right app, with the moon in the right phase, parts of a scene can look better than the SDR. Maybe. This is rare and comes at a cost to the rest of the scene versus just doing SDR. Since SDR is the master, you really won't get any better than the SDR; HDR will just have darker dark scenes and brighter bright scenes. Personally, I do not want the sun in an outdoor scene to be as bright as the real thing; I don't need a sunburn from watching beach scenes. Tone mapping is fitting a square peg into a round hole with a hammer, and every manufacturer has a different hammer, with every peg and hole changing moment by moment.
BUT you can simply discard the HDR metadata, use the original SDR, skip a Lumagen, and feed that to the display directly. Set the Apple TV X to 4K SDR, turn off "Match Content - Dynamic Range", and leave "Match Content - Frame Rate" on. You then get the uncompressed, unaltered SDR that the director saw and intended, and it is just stunning. Math/compression-induced banding and noise are gone. This native SDR matches your SDR display; there is no need to process the image with tons of math. I have done a LOT of work on this using all the high-end projectors and flat panels, and the best picture comes from SDR fed directly into the display with a short HDMI cable. This also makes technical sense. I realize, though, that a lot of people with a Lumagen use it for HDMI switching, so it is hard to just pull it or work around it. Any device between the ATVX and the display/sound processor will degrade the picture. So try to keep the path pure: the HDMI signal coming out of the ATVX has VERY low jitter, VERY low noise, and a lab-grade clock. A good path is AppleTV X > Sound processor > Display.
Pass this email to your calibrator if they are of the mind that HDR is better. Many people somehow think HDR is better. It's not. So I would have your calibrator do one calibration with the Apple TV set to 4K HDR and another with it set to 4K SDR, and you can judge for yourself.
It is important for a calibrator to use test patterns played from the ATVX itself, from places like YouTube or their own uploaded videos or images. The ATVX can play media off a local UPnP server, so a calibrator could bring a small server like a QNAP and run calibration material off it right on the AppleTV X, using apps like Plex or VLC. Different apps use different codecs, though, so it is also important to look at real content on apps like Paramount+ and Disney, for example. YouTube can deliver high-bitrate material, and I have some evaluation clips on YouTube that are good for this kind of use.
IMPORTANT NOTES AND SETTINGS ON VARIOUS TVs
Some TVs, like LGs, have horrendous settings for SDR. To me these seem intentional, to make SDR look bad and HDR/Dolby Vision look good. So on some TVs it might be best to use 4K HDR to get the best picture, because of design choices that are hard to circumvent through the simple settings. Dolby Vision is ALWAYS the better setting versus HDR. Avoid HDR.
Sony OLED panels, and both standard and laser Sony projectors, are best on 4K SDR. Sony knows what they are doing. JVC projectors also seem best on 4K SDR.
No matter what, you should try out the settings and see what looks best. You should get a professional calibration if possible, but the calibrator needs to read the above, as they might force you into HDR and a lesser picture while claiming it's best.
HDMI CABLES MATTER TO PICTURE AND SOUND QUALITY. Experiment and see what works best in your system.
HOOK UP THE ATVX via Ethernet if you can. Wireless is not the best way, as it generates a lot of RF noise.
I am also told power and Ethernet cables matter; these can affect the RF environment.