New Data Management Techniques and Technologies to Accelerate Live Broadcast Events


Event Time

Originally Aired - Tuesday, April 16   |   3:20 PM - 3:40 PM PT

Event Location

Pass Required: Core Education Collection Pass


Today’s live event broadcast workflows present unique challenges for storage and data management. Multi-camera environments need high-performance storage to keep up with the bandwidth of streaming capture, which can total anywhere from 30 TB to hundreds of terabytes per event. That data must be captured in real time, stored, and moved between locations throughout the post-production and delivery process. Factor in ever-increasing capture resolutions and frame rates, infrastructure and bandwidth that vary from venue to venue, and security concerns driving calls for encryption, and it is no surprise that many businesses in the industry find data management increasingly difficult, especially when production timelines are rigid and often short. In short, a new kind of data storage solution is long overdue.
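As a rough illustration of the capture bandwidth and per-event data volume involved, the sketch below estimates both from camera count, per-camera bitrate, and event length. The camera count, bitrate, and event duration are illustrative assumptions, not figures from this paper.

```python
# Illustrative estimate of sustained write bandwidth and per-event data volume
# for a multi-camera live capture rig. All figures below are assumptions for
# illustration, not measurements from this paper.

def capture_estimate(num_cameras: int, bitrate_gbps: float, hours: float):
    """Return (required write bandwidth in GB/s, total data captured in TB)."""
    aggregate_gbps = num_cameras * bitrate_gbps      # combined camera bitrate
    write_gb_per_s = aggregate_gbps / 8              # bits per second -> bytes per second
    total_tb = write_gb_per_s * hours * 3600 / 1000  # GB over the event -> TB
    return write_gb_per_s, total_tb

# Example: 12 cameras at ~1.5 Gb/s each (e.g., high-bitrate 4K) over a 6-hour event.
bandwidth, volume = capture_estimate(num_cameras=12, bitrate_gbps=1.5, hours=6)
print(f"Sustained write bandwidth: {bandwidth:.2f} GB/s")  # ~2.25 GB/s
print(f"Data captured per event:   {volume:.1f} TB")       # ~48.6 TB
```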

To address this need, broadcast teams are turning to Data Transfer as a Service (DTaaS) to support live production environments.

Competitive DTaaS offerings let customers pay for high-capacity (90+ terabyte), high-performance storage arrays that meet these data challenges head on, enabling users to store data anywhere while paying only for the hardware each project needs, and to physically transport that data securely to the post-production destination of their choice. An on-demand, consumption-as-a-service model not only simplifies device management and prevents unnecessary on-set IT costs; it also gives video production teams the flexibility to adjust the number of devices they deploy for a given project as storage needs change. By sidestepping the headaches of owning data storage infrastructure outright, such as maintenance fees and technology upgrades, DTaaS-based strategies give users the freedom to expand their production, accelerate their timeline, and even reallocate their budget without having to worry about their data.

This technical paper will explore how DTaaS accelerates live capture workflows, resulting in cost savings and streamlined content collaboration.

Preliminary research and application results include:

  • DAS performance can enable in-field, direct editing and transcoding when logistics prevent the standard workflow
  • Cloud import services quickly and securely move dailies from camera to the cloud of choice for collaborative post-production workflows
  • Secure, encrypted, rugged solutions ensure customers won’t lose or leak content during transport, for greater peace of mind
  • Transfer content in days, not weeks: high-performance storage arrays on-set can save 12-15 hours per week of overtime costs across multiple departments (see the transfer-time sketch after this list)
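To illustrate the "days, not weeks" claim, the sketch below compares uploading a large capture set over a typical venue uplink with physically shipping a storage array. The data volume, link speed, utilization, and shipping time are assumptions for illustration only, not results from this paper.

```python
# Rough comparison of moving a large capture set over a venue network uplink
# versus physically transporting a storage array. All inputs are illustrative
# assumptions, not figures from this paper.

def upload_days(data_tb: float, uplink_gbps: float, utilization: float = 0.7) -> float:
    """Days to push data_tb terabytes over an uplink_gbps link at the given utilization."""
    seconds = (data_tb * 1e12 * 8) / (uplink_gbps * 1e9 * utilization)
    return seconds / 86400

data_tb = 100.0            # assumed multi-day event capture volume
network_days = upload_days(data_tb, uplink_gbps=1.0)
shipping_days = 2          # assumed courier time for a shipped array

print(f"Upload over a 1 Gb/s venue uplink: {network_days:.1f} days")  # ~13.2 days
print(f"Physical transport of the array:   {shipping_days} days")
```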

In an industry as fast-paced as broadcast streaming, data storage capacity, security, and mobility should be the least of studios’ and production companies’ concerns. With DTaaS, IT professionals can trust that their enterprise-level video data sets will get where they need to go, right on time.


Presented as part of:

Striving for Efficiency in Video Technology


Speakers

Jonathan Bauder
Product Manager
Seagate Technology