Increasing consumer demand for 4K displays and content is driving adoption in both acquisition and post-production across every film and television segment.
To create more immersive and engaging viewing experiences, content creators are capturing and generating programs with higher resolution and frame rates, as well as higher dynamic range, greater color depth, and wider color gamut. Together, these enhancements enable creation of richer, more captivating content, but also require production facilities to employ faster, more expansive storage environments.
The Shift Toward 4K
Though fewer than half of media facilities had some 4K running through them at the close of 2016, many are now moving toward full 4K adoption. The move to capturing at 4K resolution or greater has two major drivers. First, when content is captured at higher-than-delivery resolution, editors can zoom into and crop 6K or 8K material while still delivering full-quality 4K. Second, editing at higher resolutions and archiving both the original footage and the completed content makes that material more valuable for future content creation and re-monetization opportunities.
Whether ramping up production to 4K today or preparing for the future, content creators will need to deal not only with a dramatic rise in data volumes, but also with a corresponding increase in the bandwidth and processing power needed to work effectively with higher-resolution content. While the massive size and quantity of files present their own challenges, the variety of formats – different types of HD, 2K, 4K, and soon 8K – used in higher-resolution workflows likewise presents significant hurdles for storage infrastructure. By closely examining key capabilities of workflow storage, such as performance, scalability, cost, and network protocol, as well as the data formats and workstation operating systems to be used, a facility can determine an optimal storage infrastructure for higher-resolution content creation, delivery, and archive.
Capacity and Performance in the 4K Realm
Mastering in uncompressed 4K format consumes up to 10 times as much capacity per hour as compressed 4K. Thus, preparation of a full-length feature film in 4K can easily generate hundreds of terabytes of data. The data in the figure not only shows the enormous jump in capacity involved in moving from HD to 4K UHD formats but also highlights the very different requirements of working with compressed and uncompressed media. As the table shows, multiple ongoing projects can demand an underlying storage environment capable of scaling from dozens of terabytes to petabyte levels. Once a facility attaches numbers to elements such as data rate, capacity, and stream count, it can set about implementing a storage infrastructure that best meets its needs. More specifically, it can identify the right combination of drive arrays, networking, and file management in a system that delivers the high reliability, continuous scalability, and availability fundamental to efficient operations and future growth.
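The arithmetic behind such estimates is simple to sketch. The frame geometry, frame rate, and shooting ratio in the Python snippet below are illustrative assumptions (uncompressed 10-bit 4K DPX at 24 fps), not figures taken from the article's table:

```python
# Back-of-envelope capacity estimate for an uncompressed 4K project.
# Frame size assumes 10-bit RGB DPX at 4096x2160, where each pixel
# packs into a 32-bit word -- an illustrative assumption only.
FRAME_BYTES = 4096 * 2160 * 4   # ~35 MB per frame
FPS = 24

def capacity_tb(minutes, shoot_ratio=1.0):
    """Storage (TB) for `minutes` of finished content, captured at
    `shoot_ratio` times its final running length."""
    total_bytes = FRAME_BYTES * FPS * 60 * minutes * shoot_ratio
    return total_bytes / 1e12

# A 100-minute feature shot at a 10:1 ratio lands around 51 TB of
# camera originals, before versions, renders, and backups.
print(round(capacity_tb(100, shoot_ratio=10), 1))
```

With multiple such projects in flight plus masters and deliverables, the petabyte-scale figures cited above follow quickly.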
High-resolution workflows can require tremendous gigabyte-per-second performance, and these speeds are delivered by the aggregate performance of the underlying drives. Two basic drive types are typically used to support media workflows: spinning hard-disk drives (HDDs) and flash-based solid-state drives (SSDs). HDDs provide sustained bandwidth ranging from roughly 75 to 150 MB/s, while SSDs deliver up to 1900 MB/s.
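To see how aggregate drive performance translates into array sizing, a hedged back-of-envelope calculation helps; the per-drive rates and the 70% efficiency derate below are assumptions for illustration, not measured or vendor-guaranteed values:

```python
import math

# Rough sizing: how many drives must be aggregated for a target
# throughput? Per-drive rates in MB/s are assumed sustained figures.
HDD_MBPS = 150      # fast HDD, sequential (assumption)
SSD_MBPS = 1900     # NVMe-class SSD (assumption)

def drives_needed(target_gbps, per_drive_mbps, efficiency=0.7):
    """Drives required to sustain `target_gbps` GB/s, derating raw
    drive bandwidth by `efficiency` for RAID and controller overhead."""
    target_mbps = target_gbps * 1000
    return math.ceil(target_mbps / (per_drive_mbps * efficiency))

# Four uncompressed 4K streams at ~0.85 GB/s each (3.4 GB/s total):
print(drives_needed(4 * 0.85, HDD_MBPS))  # dozens of HDDs
print(drives_needed(4 * 0.85, SSD_MBPS))  # a handful of SSDs
```

The gap between the two answers is why drive choice, not just controller choice, dominates array design.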
While flash-based SSDs outperform HDDs in both random and sequential access, their random performance is orders of magnitude higher, making them ideal for workflows that require high numbers of compressed streams. HDDs have proved to be an especially good solution for accommodating the high data volumes typical of uncompressed 4K workflows – and for enabling sequential playback of uncompressed frames.
The Cost of Performance
While the speed of SSDs is impressive, it comes at a cost. A minute of uncompressed 4K requires about 50 GB, and the highest-capacity enterprise SSDs accommodate up to 2 TB – about 40 minutes of content per drive. Given these constraints, SSD prices must continue to drop before these drives become compelling, in per-gigabyte terms, for many 4K storage infrastructures. That said, many advanced workflow storage platforms let facilities take advantage of both drive types and automate data migration between higher-performing SSDs and more affordable HDDs.
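The 40-minutes-per-drive figure follows directly from the numbers above; a trivial helper makes the arithmetic explicit (the 10 TB HDD used for comparison is an illustrative assumption):

```python
# Minutes of uncompressed 4K per drive, using the ~50 GB/minute
# figure cited above. Capacities are given in TB.
GB_PER_MIN = 50

def minutes_per_drive(capacity_tb):
    return capacity_tb * 1000 / GB_PER_MIN

print(minutes_per_drive(2))    # 2 TB SSD  -> 40.0 minutes
print(minutes_per_drive(10))   # 10 TB HDD -> 200.0 minutes
```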
Data Rates versus Stream Counts
Not all HDDs are created equal; media facilities must delve a bit further into performance characteristics to select the right drive for their storage infrastructure. Although it would seem reasonable to assume that the higher data rates of 4K streams have the greatest impact on storage arrays, results from Quantum testing clearly show that high numbers of compressed 4K streams have just as much impact on array performance.
In the case of HDDs, supporting many concurrent compressed streams forces the drive heads to seek back and forth among the streams, increasing latency and undermining performance. So even when the cumulative bandwidth of the individual compressed streams is well below the theoretical maximum performance of an array's controller, the latency caused by reading high numbers of streams can quickly overwhelm HDD-based arrays. As a result, arrays with faster 10,000 RPM 2.5-inch HDDs are better suited to high compressed-stream counts than arrays with slower 7,200 RPM 3.5-inch HDDs. In contrast, the much higher data rates of uncompressed streams exhaust the throughput capacity of the storage controller long before drive latency becomes a factor.
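These two failure modes can be captured in a crude capacity-planning model. Every constant here (the controller ceiling, the number of streams one HDD can serve before seek latency dominates) is an illustrative assumption, not a Quantum test result:

```python
# Which resource does a workload exhaust first? Compressed streams
# are limited by per-drive seek behavior; uncompressed streams by
# controller throughput. All constants are illustrative assumptions.
CONTROLLER_GBPS = 6.0   # assumed array controller ceiling
STREAMS_PER_HDD = 4     # assumed streams one HDD serves comfortably

def bottleneck(streams, stream_mbps, hdd_count):
    throughput_ok = streams * stream_mbps / 1000 <= CONTROLLER_GBPS
    latency_ok = streams <= STREAMS_PER_HDD * hdd_count
    if not throughput_ok:
        return "controller throughput"
    if not latency_ok:
        return "drive latency"
    return "none"

# 60 compressed streams at 50 MB/s on a 12-drive array: only 3 GB/s
# in aggregate, yet seek latency is the limit.
print(bottleneck(60, 50, 12))    # -> drive latency
# 8 uncompressed streams at 850 MB/s: 6.8 GB/s saturates the controller.
print(bottleneck(8, 850, 12))    # -> controller throughput
```

The model is deliberately simplistic, but it mirrors the test finding: stream count and data rate stress different parts of the array.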
Taking Optimization a Step Further
The network connections attached to the storage infrastructure likewise have a direct impact on performance. By providing fixed, predictable bandwidth between workstations and shared storage, Fibre Channel storage area network (SAN) connections suit post-production processes that demand fast, reliable access. More affordable Ethernet-based network-attached storage (NAS) connections carry additional protocol-processing overhead but are appropriate for many other operations largely unaffected by small delays.
More and more facilities moving into 4K are using a combination of the two, unified by a single workflow storage platform, to optimize workstation and client connectivity across all parts of the production workflow. Taking optimization a step further, facilities are creating and implementing best practices for system-level settings such as stripe groups, LUN sizes, cache settings, and inode stripe width. For storage systems that support this level of tuning, such settings can have a major positive impact on overall performance.
A flexible, robust file system can bring tremendous efficiencies to 4K workflows. For one, it can enable multiple Windows, Linux, and Mac workstations to access shared storage over a Fibre Channel network, allowing them to read from and write to the same storage volume at the same time without network delays. This capability yields the immediate, shared access to media files that is central to effective collaboration.
In high-resolution workflows, it is essential to maximize the available space in primary storage and move everything other than work-in-progress content to lower-cost tiers of near-line or offline storage. When storage extends across these multiple tiers – including cloud storage – the file system can present files uniformly to both applications and users and automate policy-based movement of content to support the workflow.
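Policy-based movement of this kind usually reduces to simple age rules. Here is a minimal sketch, with hypothetical tier names and thresholds (real workflow storage platforms expose far richer policies):

```python
import time

# A minimal tiering policy: files untouched for N days move off
# primary storage. The 30/180-day thresholds and tier names are
# assumptions for illustration, not any product's defaults.
DAY = 86400

def choose_tier(last_access_epoch, now=None):
    now = now if now is not None else time.time()
    idle_days = (now - last_access_epoch) / DAY
    if idle_days < 30:
        return "primary"    # work-in-progress stays on fast disk
    if idle_days < 180:
        return "nearline"   # dense HDD or object tier
    return "archive"        # tape or cloud cold storage

now = time.time()
print(choose_tier(now - 5 * DAY, now))     # -> primary
print(choose_tier(now - 90 * DAY, now))    # -> nearline
print(choose_tier(now - 400 * DAY, now))   # -> archive
```

Because the file system presents every tier under one namespace, such a policy can run continuously without editors ever noticing where a file physically lives.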
A facility can choose storage with the optimal drive types and configuration, network, and file system to create and deliver 4K content with confidence. With a solid storage infrastructure in place, the transition to 4K/UHD content production can go smoothly, allowing content creators to focus on the beauty these richer, higher-resolution formats bring.
Based on an article by Dave Frederick, Senior Director, Media & Entertainment at Quantum