AI-Enabled Horizons: Pioneering Multilingual Content Integrity in Broadcasting

Event Time

Originally Aired - Monday, April 15   |   3:30 PM - 4:00 PM PT

Event Location

Pass Required: Core Education Collection Pass


In the dynamic world of global broadcasting, delivering multilingual content with precision is both a strategic necessity and a revenue enabler. This panel discussion will explore the transformative impact of Artificial Intelligence (AI) and Machine Learning (ML) on advanced audio and video monitoring and distribution, particularly in live news and sports.

The session will address the challenges of delivering accurate language translations and subtitles, and the complexity of multi-language distribution. We will demonstrate how AI and ML can automate and refine monitoring processes, ushering in a new era of content assurance. This presents a compelling “blue ocean” opportunity for broadcasters, both for internal content management and for external audience expansion.

Our panel will discuss AI technologies such as real-time language detection and adaptive speech models, fine-tuned on domain-specific datasets for continuous performance improvement. These automatic speech recognition and natural language processing technologies are constantly evolving, and they play an important role in broadcasters' content distribution strategies as audiences become increasingly global and diverse.
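To make "real-time language detection" concrete: production systems use trained classifiers on audio or text, but the core idea can be illustrated with a toy stopword-overlap heuristic. Everything below (the stopword sets, the function name) is illustrative, not drawn from any panelist's system.

```python
# Toy language detector: scores text against small per-language stopword sets.
# Real broadcast monitoring would use trained n-gram or neural classifiers;
# this sketch only illustrates the classification idea.

STOPWORDS = {
    "en": {"the", "and", "is", "of", "to", "in", "that", "it"},
    "es": {"el", "la", "y", "de", "que", "en", "los", "es"},
    "fr": {"le", "la", "et", "de", "que", "les", "est", "un"},
}

def detect_language(text: str) -> str:
    """Return the language code whose stopword set best matches the text."""
    words = set(text.lower().split())
    scores = {lang: len(words & sw) for lang, sw in STOPWORDS.items()}
    return max(scores, key=scores.get)

print(detect_language("the quality of the feed is good"))   # -> en
print(detect_language("la calidad de la señal es buena"))   # -> es
```

In a live pipeline this decision would run continuously on short windows of transcribed audio, so a channel that switches languages mid-broadcast can be flagged or rerouted automatically.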

We will introduce the concept of 'Software Watching Television': an AI-driven monitoring paradigm that interprets content with human-like understanding. This includes AI's ability to auto-generate subtitles, transcribe live content, and assess caption and translation quality. The panel will explore how this integrates with deep probing, monitoring, and visualization systems to ensure content delivery accuracy and end-user satisfaction.
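One common way to "assess caption quality" is word error rate (WER): the word-level edit distance between a reference transcript and the delivered captions, normalized by reference length. The panel does not specify its metric; this is a minimal, self-contained sketch of the standard WER computation.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference length."""
    ref = reference.lower().split()
    hyp = hypothesis.lower().split()
    # Standard dynamic-programming edit distance over word sequences.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

# One dropped word out of five reference words -> WER of 0.2.
print(word_error_rate("breaking news from the studio",
                      "breaking news from studio"))  # -> 0.2
```

A monitoring system could compute this continuously against a trusted ASR transcript and raise an alert when caption quality on any language track drops below a threshold.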

A live demonstration will showcase an AI-powered monitoring system, highlighting its adaptability, scalability, and impact on viewer experience.

The discussion will also cover AI integration with media asset management systems, showing how enriched metadata extraction from transcriptions and analyses can improve content indexing and retrieval, thus enhancing broadcaster efficiency and user experience.
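The retrieval benefit described above comes from indexing transcript text so assets can be found by what was said in them. As a hedged illustration (the asset IDs and transcripts below are invented, and real media asset management systems use full search engines), here is the simplest form of that idea, an inverted index:

```python
from collections import defaultdict

def build_index(transcripts: dict[str, str]) -> dict[str, set[str]]:
    """Map each word to the set of asset IDs whose transcript contains it."""
    index = defaultdict(set)
    for asset_id, text in transcripts.items():
        for word in text.lower().split():
            index[word].add(asset_id)
    return index

# Hypothetical transcript snippets keyed by asset ID.
transcripts = {
    "clip-001": "goal scored in the final minute",
    "clip-002": "weather update for the coastal region",
}
index = build_index(transcripts)
print(index["goal"])   # -> {'clip-001'}
```

Enriching each entry with speaker, language, and timestamp metadata from the AI analyses is what turns this basic lookup into the faster indexing and retrieval the panel describes.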

To conclude, the panel will provide insights into the strategic advantages of AI in broadcasting, preparing attendees to navigate the future of worldwide, multilingual content delivery with confidence.


Paul Briscoe, Chief Architect, TAG Video Systems


Kyle Suess, CTO and Co-Founder, Amira Labs

Stefan Cardenas, CEO and Co-Founder, Amira Labs

Scott Olson, System Engineer, Warner Bros. Discovery

TBD, User Engineer

Presented as part of:

Examining AI in Media Use Cases
