Does 4K Need High Dynamic Range to Succeed?

As mentioned in my previous blog before the summer break, high dynamic range (HDR) has loomed large on the agenda as the missing piece of the UHDTV armory needed to trigger commercial success for the format. Certainly the recent DVB-EBU trials in Munich showed impressive results, but how this technology will be realized, what production workflow can be used (particularly for live event coverage) and whether it can be launched in cost-effective TVs are questions yet to be answered.

Screens are evolving and prices are tumbling to levels likely to attract buyers, who will no doubt feel they are purchasing enhanced web streaming and HD upscaling now, as well as investing in a future-proof TV that will be able to decode and display 4K. By and large this is true, with most of the 2014 crop of screens supporting the latest version of the HDMI spec and an HEVC decoder capable of operating with the limited number of 4K streaming movie services available.

Buyer beware, though, for two reasons. First, there is still legacy functionality on the current "latest products" that will only become apparent when a 4K streaming service is applied to the screen. More importantly, we are still in the midst of a phased introduction of UHDTV, which could leave a purchase made now quickly lacking the latest wow-factor feature for 4K. HDR is one such feature, but there are likely to be more, an unfortunate aspect of companies drip-feeding features to unsuspecting buyers. The lack of genuine 4K sources disguises this fact from many early adopters, as does the limited number of viewers with sufficient broadband bandwidth to sign up for Netflix's 4K streaming service.

Should you not have 15-20 Mbps broadband connectivity, 1080p would be the fallback option, but is this really second best? Many of the HDR demonstrations were made in 1080p, and as those of you who visited the Harmonic booth at IBC last year and NAB this past spring will know, I have long been an advocate of 1080p transmission of 4K-sourced and -displayed material at the bit rates recommended for 4K streaming services. It may seem counterintuitive at first that 1080p should be preferred over native 4K transmission, but it is a credible stance. Of the few 4K streamed services available at the moment, many are in fact 1080p or showing 2K digital cinema content. Even though the bit rates recommended for 4K are high, they may not be enough to prove 4K's supremacy over 1080p for demanding sports content. This factor will only become apparent when early adopters switch over from 4K streaming of movies to the much-debated rollout of live UHDTV broadcasts.
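Some back-of-the-envelope arithmetic illustrates why the recommended 4K bit rates may struggle with demanding content: the same stream spread over four times as many pixels leaves a quarter of the data per pixel. The figures below (15 Mbps, 50 fps) are assumptions for illustration only, not recommended service parameters.

```python
# Illustrative bits-per-pixel arithmetic for a fixed streaming bitrate.
# Bitrate and frame rate are hypothetical example values.

def bits_per_pixel(bitrate_bps, width, height, fps):
    """Average compressed bits available per pixel per frame."""
    return bitrate_bps / (width * height * fps)

# Same 15 Mbps budget applied to 2160p and 1080p at 50 fps.
uhd = bits_per_pixel(15e6, 3840, 2160, 50)
hd = bits_per_pixel(15e6, 1920, 1080, 50)

print(f"2160p: {uhd:.3f} bits/pixel, 1080p: {hd:.3f} bits/pixel")
```

With four times the pixel count, the 2160p stream has exactly a quarter of the per-pixel budget, which is why fast-moving sports content is the stress case.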

What is certain is that HDR will not be factored into the current crop of screens, which are clearly targeted only at 4K streaming services. Broadcast formats and specifications are still in a state of flux, and, more worryingly, there appears to be no effort to merge TV and cinema needs from a workflow perspective, leaving 1080p as the safest interim choice until UHDTV is fully sorted.

-Ian Trow, Senior Director, Emerging Technology and Strategy

10 Reasons Why Virtualization Will Happen in Broadcast


  1. Virtualization is now mainstream in data centers with up to 70% adoption. Broadcast architectures are now dominated by IT infrastructure so further rationalization within this sector will mean embracing a technology that is already dominant in the IT and enterprise sector.
  2. Video processing is heavily reliant on CPU and storage, which has traditionally meant bespoke products and solutions were required to address the processing, bandwidth and storage requirements. Server performance has now reached levels that can handle all but the most demanding video applications.
  3. Traditional broadcast headends have been architected around technology implementations in products that result in a non-optimal partitioning of functionality.
  4. Redundancy in broadcast has all been about replicating functionality to achieve a high degree of resilience to failure within a system. This has led to significant overprovisioning which could be avoided through a more dynamic approach to resource allocation in the event of failure.
  5. Technology refresh in the broadcast domain is seldom about a “like for like” replacement these days, so a separation between the functionality implemented and the base hardware is logical to provide flexibility.
  6. Agility is now a key aspect in today’s media with scheduling and evolving playout platforms dictating a more flexible approach to deployments in order to adapt to media changes and content streaming needs.
  7. While undoubtedly the ability of servers to absorb functionality previously only available in dedicated hardware is a major initial draw to virtualization, the medium to long-term appeal is to layer a range of media functionality dynamically across virtual machines without leading to network or storage contention.
  8. Social media, audience preference and targeted advertising are all examples of analytical data that are essential complements to any form of programming today. These back-office functions are already heavily virtualized, mining vast data sets and transforming unstructured data into the essential drivers behind programming.
  9. Running multiple functions on virtual machines reduces port counts, interconnections, rack space and power.
  10. Bespoke broadcast infrastructure is expensive to support when compared to more commonplace IT/enterprise installations. Virtualization promotes moves to enable media specialists to concentrate on media content rather than the background infrastructure.
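The overprovisioning argument in point 4 can be made concrete with some illustrative arithmetic. The channel and spare counts below are hypothetical, not sizing guidance:

```python
# Illustrative redundancy arithmetic: traditional 1+1 duplication versus a
# virtualized N+M spare pool. Channel/spare counts are hypothetical.

def overprovision_ratio(channels, spares):
    """Fraction of extra hardware provisioned purely for failover."""
    return spares / channels

# Traditional broadcast redundancy: one hot standby per channel (1+1).
traditional = overprovision_ratio(20, 20)

# Virtualized pool: a couple of spare VMs shared across all channels,
# spun up on whichever host has capacity when a failure occurs.
virtualized = overprovision_ratio(20, 2)

print(f"1+1: {traditional:.0%} overprovisioned, N+M pool: {virtualized:.0%}")
```

Under these assumed figures, duplication means 100% extra hardware, while a shared spare pool brings that down to 10% for the same single-failure coverage.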

Interested in Knowing More?

I’ll be hosting a Virtualizing Video webinar on Thursday August 21st at 12 p.m. EDT in conjunction with TV Technology. Details on the free webinar are available here.

- Ian Trow, Senior Director, Emerging Technology and Strategy

4K in Context

My job presents some great last-minute opportunities to view the video market from different perspectives. Having just recovered from NAB 2014, I was dispatched to New York for Streaming Media East.

While both shows were on the 4K bandwagon, attendees at each were keen to dig deeper into different key issues. At NAB, I was quizzed on color space and high dynamic range. Streaming Media East was a contrast, though, with most questions concerning whether HEVC is delivering on its performance promises compared with H.264.

So what are the takeaways? On color space, it came as a surprise to most at NAB that the vast majority of 4K/UHD screens are still locked into the gamut defined for HD. Basically: buyer beware for those contemplating an early-adopter 4K screen purchase.

High dynamic range (HDR) was understood to be a requirement for 4K to deliver the necessary wow factor over HD, but what baffled the Vegas crowd was exactly how this is to be achieved. At a superficial level all seemed good at NAB, and even at CES for that matter, with screen manufacturers keen to claim conformance to UHD color space specifications, but the really critical question is how to convey the correct color mapping to the latest screens. Capability means nothing if you can't access it.

Basically, a screen that can support an expanded color space is useless unless there exists a mechanism to provide the correct color mapping to unlock that extended capability. The industry is struggling to work out how to deliver color mapping to screens with differing capabilities. I'll write more on this subject in June, after attending the DVB/EBU event on HDR in Munich. That seems a bit late to me, though, with 4K screens already enticing those with deep pockets to upgrade, only for them to be disappointed when the real deal arrives in time for mass adoption.

No such concerns were worrying those at Streaming Media East this week, who were far more pragmatic and wished to understand which players are viable on commonly available platforms. 1080p capability seems to be the limit for most player-based platforms at the moment, although we have yet to hear from the gaming fraternity about what the latest crop of consoles can support; this is more of a commercial question than a technical one, I think. Bitrates, encoding turnaround times and delivering broadcast quality in a world obsessed with net neutrality seemed to be the order of the day. So much so that I'll dedicate next week's blog to unraveling the myriad of technical details disclosed in New York.

- Ian Trow, Senior Director, Emerging Technology and Strategy

ProMedia Live Powers OTT Delivery of One of the World’s Most Prestigious Classical Music Competitions

The 14th Arthur Rubinstein International Piano Master Competition is being telecast live over YouTube, thanks to Harmonic’s ProMedia Live real-time multiscreen transcoder. ProMedia Live is being used to power the live transmission of the competition, which kicked off on May 13th with the Opening Gala Concert at the Tel Aviv Museum of Art, and continues until May 29th. With pianists coming from around the globe to compete, this event boasts a large following, making the YouTube live broadcast of the recitals a vital part of the competition.

The competition began in 1974 to unite the name and the artistic legacy of Arthur Rubinstein with the cultural life of Israel. Conceived in the spirit of this legendary pianist, the Competition is an important international forum for presenting talented, aspiring young pianists and fostering their artistic careers. For more information on the competition, visit www.arims.org.il/competition2014/.

Future Video Strategies for Network Delivery – An Exclusive NAB Show Breakout Session

In conjunction with Integrated Media Technologies, Harmonic invites you to an exclusive NAB 2014 update on video solutions for increasing production capabilities, improving video quality and achieving ultra-efficient video delivery.

A delicate balance between the worlds of video and network engineering is required if service providers are to deliver a quality of experience (QoE) to match scheduled linear broadcasting. So, which video parameters are the most relevant, and which new technologies and standards will shift the balance in favor of IP-based network delivery? This presentation aims to brief attendees on the key issues influencing how broadcasters, content aggregators and Internet service providers adapt to the new opportunities for video delivery.

Guest speaker Ian Trow, Senior Director, Emerging Technology & Strategy, has over 20 years of systems and design experience in high-definition and MPEG video products.

Date: Tuesday, April 8, 2014

Time: 3:00pm

Location: The Renaissance Hotel, 2nd Floor Conference rooms

3400 Paradise Road

Las Vegas, NV 89169

Please RSVP as space is limited.

Colorimetry: How Does It Relate to the Success of Ultra HD?

The initial justification for a move towards 4K was largely made on the basis of improved resolution. However, frame rate has long been the parameter experts recommend as giving the biggest performance improvement over HD. A third factor critical to the success of Ultra HD is colorimetry, the science of color perception. While the relative importance of these parameters is debatable, what is not up for debate is that colorimetry, being made up of many interrelated issues, is sure to be the most complicated parameter to implement if it is adopted for Ultra HD.

As it stands, the Rec. 709 color space (the ITU-R specification for HDTV) is being used as the basis for Ultra HD, at least until the industry decides how to handle Rec. 2020 (the ITU-R specification for UHDTV). To the casual observer it may seem strange that Rec. 709 is being considered at all for Ultra HD when Rec. 2020 has been around since 2012 and is implemented in both the latest version of HDMI (the one we are not supposed to call 2.0) and the latest MPEG compression spec (HEVC, or H.265, depending on your perspective). The lack of adoption has nothing to do with the transfer characteristics of Rec. 2020: at 10 bits per sample, its non-linear transfer function is identical to that deployed for Rec. 709. This largely explains why Rec. 709 can be applied to Ultra HD, so why consider Rec. 2020 at all? To answer this we have to return to the aims of Ultra HD, namely to improve on HD and deliver the wow factor associated with color rendition in the cinema. In addition to addressing frame rate and resolution, Ultra HD aims to close the gap between cinema and television by improving the dynamic range of Ultra HD screens to match the capability of cinema.
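For reference, a minimal sketch of that shared non-linear transfer function (the camera-side OETF) as published in Rec. 709. The 12-bit variant of Rec. 2020 refines the constants slightly (alpha = 1.0993, beta = 0.0181), but the curve shape is the same:

```python
# Minimal sketch of the non-linear transfer function (OETF) shared by
# Rec. 709 and 10-bit Rec. 2020: a linear segment near black joined to
# a 0.45-exponent power-law segment.

def oetf(l, alpha=1.099, beta=0.018):
    """Map linear scene light l in [0, 1] to a non-linear signal value."""
    if l < beta:
        return 4.5 * l                       # linear segment near black
    return alpha * l ** 0.45 - (alpha - 1)   # power-law segment

print(oetf(0.0), oetf(0.5), oetf(1.0))
```

Note how peak white maps to exactly 1.0 (alpha - (alpha - 1)), and the linear segment avoids the infinite slope a pure power law would have at black.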

So how do we do this? One approach is to define a wider color space, and Rec. 2020 offers a vast improvement over Rec. 709 in this respect. Another is to extend the number of bits per sample; Rec. 2020 supports both 10 and 12 bits per sample. Let's take each of these approaches in turn. A wider color space has obvious benefits, especially when Rec. 709 compares so poorly with the color space supported by film. Put simply, Rec. 709 does not have the scope to support the color space extension hoped for by the industry, or indeed possible with the latest generation of Ultra HD screens. Extending the color space does come with problems, though. For Rec. 709, a single color space conversion is adopted for reasons of compatibility with HD screens, but this is far from ideal in a world where Ultra HD screens may be marketed and priced according to dynamic range capability. If Ultra HD screens are to have varying dynamic range capabilities, multiple color space conversions need to be supported according to screen and application type, which greatly increases complexity and adversely impacts the viability of adoption in media workflows. High dynamic range systems aim to overcome the need for multiple color space conversions by retaining a native color space and applying only a tone mapping appropriate to the target screen. Herein lies the difficulty in implementing Rec. 2020: what native color space should be adopted, and how should the tone-mapping metadata be handled?
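To make the color space conversion step concrete, here is a minimal sketch of one such mapping: placing linear-light Rec. 709 RGB into the Rec. 2020 container, using the 3x3 matrix published in ITU-R BT.2087. Note that this operates on linear light, i.e. after undoing the transfer function:

```python
# Sketch of a single color space conversion: linear Rec. 709 RGB into
# Rec. 2020 primaries, using the ITU-R BT.2087 matrix. Values outside
# the Rec. 709 gamut can never be produced this way, which is why a
# conversion cannot substitute for natively wide-gamut content.

M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_rec2020(rgb):
    """Convert a linear-light Rec. 709 RGB triple to Rec. 2020 primaries."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in M_709_TO_2020)

# White maps to white: each matrix row sums to 1.
print(rec709_to_rec2020((1.0, 1.0, 1.0)))
```

Pure Rec. 709 red, for example, lands at roughly (0.63, 0.07, 0.02) in Rec. 2020, well inside the wider gamut, illustrating how much headroom the larger container leaves unused by converted HD content.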

Lastly, what sample bit depth is really required for Ultra HD? I'd speculate that 10 bits per sample can more than adequately exploit the enhanced color space offered by Rec. 2020 for Ultra HD distribution, provided of course that at least 12 bits per sample are used further up the production workflow. This is a contentious position, though, so I will use my next blog to explain my reasoning.
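For context on what those bit depths mean in practice, Rec. 2020 narrow-range video places black and nominal peak at scaled versions of the familiar 8-bit levels (16 and 235). A small sketch of the resulting code ranges:

```python
# Narrow-range code values for Rec. 2020 video: the 8-bit levels
# (black = 16, nominal peak = 235) are left-shifted by the extra bits.

def code_levels(bits):
    """Return (black, nominal peak, usable steps) for a given bit depth."""
    black = 16 << (bits - 8)   # 64 at 10-bit, 256 at 12-bit
    peak = 235 << (bits - 8)   # 940 at 10-bit, 3760 at 12-bit
    return black, peak, peak - black

print(code_levels(10))  # (64, 940, 876)
print(code_levels(12))  # (256, 3760, 3504)
```

So 12 bits offer four times as many quantization steps between black and peak as 10 bits, which matters most in production, where repeated processing accumulates rounding error, and less so in final distribution.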

- Ian Trow, Senior Director, Emerging Technology and Strategy