BROADCAST TECHNOLOGY 2018
Incorporating HDR/WCG
in the broadcast workflow
BY A. RAZA
High dynamic range (HDR)
and wide colour gamut (WCG)
have made a big splash on the
broadcasting world. Numerous
manufacturers have lined up
to show the spectacular colours
with their cameras, monitors and
projectors. The challenge, then, is
to incorporate these new standards
into broadcast workflows.
Before we go forward, let us
understand what HDR and WCG are.
SMPTE defines HDR as a system specified
and designed for capturing, processing
and reproducing a scene — conveying
the full range of perceptible shadow and
highlight detail, with sufficient precision
and acceptable artifacts, including
sufficient separation of diffuse white and
specular highlights.
HDR is specified and designed for
capturing, processing and reproducing
scene imagery, with increased shadow and
highlight detail beyond the capabilities
of current standard dynamic range (SDR)
video and cinema systems.
Human vision has a wide latitude
for scene brightness, and has multiple
adaptation mechanisms that provide an
automatic ‘gain’ to the visual system.
The brightness range that people can
see is much greater than the available
simultaneous contrast range of current
displays. HDR systems are intended
to present more perceptible details
in shadows and highlights, thus
better matching human visual system
capabilities under the various viewing
conditions typically found in consumer
environments.
In particular, HDR allows
distinguishing bright details in highlights
that are often compressed in traditional
video systems, including allowing
separation of colour details in diffuse
near-white colours, and in strongly
chromatic parts of the image.
SMPTE defines WCG as a chromaticity
gamut significantly larger than the
chromaticity gamut defined by
Recommendation ITU-R BT.709.
Before we move forward and discuss
the workflow issues related to HDR and
WCG, let me mention that there are at
least three standards of HDR:
• Dolby has developed a standard known as Dolby Vision.
• HLG (Hybrid Log-Gamma) has been developed by the BBC and NHK.
• HDR10 is the standard for Blu-ray.
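As an illustration of how these standards differ at the signal level, the HLG opto-electrical transfer function from ITU-R BT.2100 can be sketched in a few lines (Python is used here purely for illustration):

```python
import math

# HLG OETF constants from ITU-R BT.2100
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # ~0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalised scene light E (0..1) to an HLG signal value E' (0..1)."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # square-root segment (camera-like)
    return A * math.log(12 * e - B) + C  # logarithmic segment for highlights

print(round(hlg_oetf(1 / 12), 3))  # 0.5 at the knee between the two segments
print(round(hlg_oetf(1.0), 3))     # 1.0 at nominal peak
```

The square-root lower segment is what keeps HLG reasonably compatible with legacy SDR gamma displays, while the logarithmic upper segment preserves highlight detail; Dolby Vision and HDR10 instead use the PQ transfer function (SMPTE ST 2084).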
The first problem is: which HDR
standard will broadcasters use?
The second problem is a lack of related
equipment that can handle HDR content,
such as reference monitors, consumer
displays, and video links capable of the higher
bandwidth required to transfer HDR
content.
The third problem concerns interfaces.
For the most part, HDR/WCG signals
are compatible with HDTV and 4K/Ultra
HD (UHD) TV 10-bit and 12-bit interfaces,
and can be carried over existing
3Gbps-12Gbps links. If, however, high frame rate
(HFR) signals such as 100Hz and 120Hz
are to become part of an implementation,
new interfaces and infrastructure will be
required. The coexistence of HDR signals
and differing display colourimetry can put
new demands on system interoperation.
Displays, image processors, and up/down
colour converters will all need to detect
the HDR encoding and colourimetry in
use to correctly process and display the
signal.
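The detection described above usually keys off the colour description code points carried in the bitstream (ITU-T H.273, the values also carried in the HEVC VUI). A minimal sketch, listing only a small subset of the defined values:

```python
# Code points from ITU-T H.273 (colour_primaries / transfer_characteristics);
# only a handful of the defined values are shown here.
PRIMARIES = {1: "BT.709", 9: "BT.2020 (WCG)"}
TRANSFER = {1: "BT.709 (SDR)", 16: "PQ (SMPTE ST 2084)", 18: "HLG (ARIB STD-B67)"}

def classify_signal(colour_primaries: int, transfer_characteristics: int) -> str:
    """Turn signalled code points into a label a workflow could branch on."""
    primaries = PRIMARIES.get(colour_primaries, "unknown primaries")
    transfer = TRANSFER.get(transfer_characteristics, "unknown transfer")
    return f"{transfer} / {primaries}"

print(classify_signal(9, 18))  # HLG (ARIB STD-B67) / BT.2020 (WCG)
print(classify_signal(1, 1))   # BT.709 (SDR) / BT.709
```

A converter or display that cannot resolve these code points has no reliable way to choose the correct processing path, which is exactly the interoperability risk the paragraph above describes.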
HDR and WCG are emerging
technologies that are still undergoing
much development, and there are
various approaches to creating,
transporting, distributing and displaying
HDR/WCG content. This is an implementation
challenge for broadcast workflows that
are complex in nature, highly automated
and expensive to build. Broadcast
networks rely on standards to ensure
interoperability and to build cost-effective
workflows.
HDR/WCG with frame rates
limited to a max of 50/60Hz can be
accommodated by existing multi-link
1.5Gbps, or multi-link 3Gbps interfaces, or
10Gbps optical links. HDR/WCG signals
will require that displays be changed to
correctly display the images. The use of
frame rates beyond 60Hz, combined with
pixel matrices at 4K/UHD and 8K, will
require building new infrastructures.
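The arithmetic behind those interface limits is straightforward. A rough payload calculation (active video only, 4:2:2 sampling assumed, blanking and audio ignored):

```python
def raw_video_gbps(width: int, height: int, fps: float, bit_depth: int,
                   samples_per_pixel: int = 2) -> float:
    """Approximate uncompressed video payload in Gbps.
    samples_per_pixel=2 models 4:2:2 (one luma + one chroma sample per pixel)."""
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

# 4K/UHD, 10-bit, 60 Hz: ~10 Gbps, within reach of 12G or multi-link 3G
print(round(raw_video_gbps(3840, 2160, 60, 10), 2))   # 9.95
# The same signal at 120 Hz roughly doubles, exceeding a single 12G link
print(round(raw_video_gbps(3840, 2160, 120, 10), 2))  # 19.91
```

This is why 50/60Hz UHD can ride on today's multi-link and 12G-class plant, while HFR at the same pixel matrix cannot.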
It is expected that systems in a CER
(central equipment room) or control
room that switch, record, measure,
display, process overlay graphics or
play back HDR/WCG content will need
upgrading or replacement to support new
features. Multiple output signals for HDR
and SDR may be created automatically
from an HDR signal and these all need to
be monitored.
Existing interface metadata tables need
to be adjusted to reflect the addition of
HDR/WCG content types, and messaging
protocols need to be extended to cover
these new content types. Broadcast
workflows for terrestrial, satellite, cable
and IP distribution rely heavily on
automated processing system workflows.
To enable HDR/WCG processing, as
well as conversion between HDR/WCG
and traditional SDR content in these
workflows, dynamic-, scene- or frame-
based metadata may be needed.
There is uncertainty on how such
metadata can be bound to content
and transported through automated
workflows in a persistent manner.
Processing and conversion systems
such as video mixers, encoding systems,
and graphics systems might delete the
metadata. Other processing systems might
alter the image content in a way that the
associated metadata no longer reflects the
image content.
Metadata would need to be updated
to reflect the new image parameters, as
well as a history of how the image was
altered.
There are numerous problems
related to production
Displays capable of showing the entire
captured image might not initially be
available to production staff. Postium,
Eizo and some other manufacturers have
recently announced HDR displays, which
may solve the problem.
Both HDR and SDR monitoring
systems and processing equipment are
required on set and in the studio to
measure and view the full signal range
that is recorded or transmitted.
New lighting systems may be required
to take advantage of the greater dynamic
range of HDR and the wider gamut of
WCG; lighting in the studio and on set
will face more flexible and more artistic
requirements.
The characteristics of the image
dynamic range must be preserved, yet
not all of the range coming from a camera
can be seen on monitors. Additional
metadata created at capture may need
to be defined, describing the viewing
equipment and conditions, and methods
must be created to deliver it to
downstream users.
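A hedged sketch of what binding metadata to content together with a processing history might look like; every field name here is purely illustrative, not taken from any standard:

```python
from dataclasses import dataclass, field

@dataclass
class HdrMetadata:
    """Illustrative content-bound HDR metadata with an audit trail."""
    transfer: str = "HLG"
    primaries: str = "BT.2020"
    peak_luminance_nits: int = 1000
    history: list = field(default_factory=list)

    def record(self, step: str) -> None:
        # Each processing step is logged so downstream systems can judge
        # whether the metadata still matches the picture.
        self.history.append(step)

md = HdrMetadata()
md.record("mixer: SDR graphics keyed over HDR video")
md.record("converter: peak trimmed to 1000 nits")
print(md.history)
```

The open question raised above is precisely whether such a record survives real mixers, encoders and graphics systems, many of which strip or ignore ancillary metadata today.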
If a large number of audio channels are
present in the video, it is possible that the
interface may not be able to carry all the
channels. This could be the case when
dealing with HFR signals being converted
to lower frame rates — there may be lip
sync issues or other audio issues.
Real-time conversion will also be a
problem. Many broadcasters convert
content and broadcast in multiple content
types. It is possible that HDR and WCG
content will be delivered in a variety of
HDR/WCG combinations of colourimetry,
peak luminance, maximum dynamic
range and transfer function. It is further
expected that some content will be
conveyed in both HDR and SDR versions
and with different colour spaces.
These different HDR/WCG content
types may need to be converted to
conform to an in-house specification
to allow seamless processing and
distribution of content, and to conform to
content delivery/transmission standards.
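To make the conversion problem concrete, here is a deliberately simplified highlight roll-off of the kind an HDR-to-SDR down-converter must perform; real products use standardised curves and 3D LUTs rather than this toy Reinhard-style operator:

```python
def roll_off(linear_luminance: float) -> float:
    """Toy Reinhard-style compression: input luminance is normalised so that
    1.0 = SDR reference white; output is squeezed into the 0..1 SDR range."""
    return linear_luminance / (1.0 + linear_luminance)

# A specular highlight at 4x reference white keeps some gradation (0.8)
# instead of hard-clipping to 1.0 as a naive SDR conversion would.
print(round(roll_off(4.0), 2))  # 0.8
```

Every choice of curve trades mid-tone fidelity against highlight detail, which is why an in-house conversion specification matters: two different converters in the same chain will otherwise produce visibly different SDR outputs from the same HDR source.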
Graphic overlays, tickers and logos
will have to be accommodated in such
a way that HDR presentations, when
converted to SDR, produce acceptable
results without complex conversions.
Ingest, storage and playout systems
may need upgrades or replacement to
support HDR/WCG file formats, codecs
and metadata. Media asset management
systems may need updates to support
storage, processing and distribution of
metadata about HDR/WCG content, Web
service messages and user interfaces. It is
expected that at least 10-bit representation
will generally be required for support
of HDR/WCG content in codecs, signal
paths, file formats and in applications, as
well as metadata to flag the presence of
such content.
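The 10-bit requirement ties back to standard narrow-range ("video range") quantisation, in which 10-bit luma runs from code 64 (black) to 940 (nominal peak), per the scaling defined in BT.2100:

```python
def to_10bit_narrow(e: float) -> int:
    """Quantise a normalised signal value E' (0..1) to a 10-bit
    narrow-range code value: D = round((219*E' + 16) * 4)."""
    return round((219 * e + 16) * 4)

print(to_10bit_narrow(0.0))  # 64  (black)
print(to_10bit_narrow(1.0))  # 940 (nominal peak)
```

With only 8 bits the steeper HDR transfer functions produce visible banding in smooth gradients, which is why 10-bit is treated as the practical floor throughout the chain.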
Most file formats, such as MXF, are
already 10-bit capable. Interfaces for
broadcast playout
will need to be upgraded to allow
signalling for HDR/WCG content and, if
applicable, the synchronised transport of
content-dependent metadata.
Conclusion
While HDR and WCG are great
enhancements to image quality and
will provide a huge advantage to the
broadcaster, they need to be implemented
only after the entire process has been
thought through carefully. Implementing
only 4K or UHD (higher resolution)
without HDR and WCG is obviously not
the way forward. We are sure that as
HDR and WCG become a necessary part
of the broadcasting infrastructure, the
gaps will be filled and things will become
easier.

A. Raza is Founder and CEO of Whiteway Systems, a systems integrator.