Workflows within the broadcasting industry are moving towards file-based content handling. In recent years there have been significant changes in the industry with the introduction of technologies such as MPEG-2 and H.264. In addition, there is now a multitude of delivery platforms, such as traditional TV, DTH, IPTV, mobile TV and VOD, each of which imposes its own set of requirements on the content structure. These technologies bring their own challenges, which a broadcaster has to deal with on a daily basis. Broadcasters already dealing with the new equipment and infrastructure introduced by digital workflows can easily become overwhelmed with content management issues in an attempt to ensure a rich experience for their viewers. In a typical broadcasting environment, content is received from multiple sources in differing formats. This variation in formats adds further complexity and requires special attention before the content can be played out from a broadcaster’s facility.
As a first step, this content needs to be ingested and validated before it can be processed further. The content can then either be moved to archive or processed further, depending on the broadcaster’s schedule. Whether the content is prepared for playout or stored in archive for later use, both procedures involve intermediate steps which are error-prone and can introduce problems into the content.
Until recently, broadcasters relied on home-grown scripts and manual verification to validate content, but with the increased complexity and volume of content these are no longer practical solutions. Manual verification imposes the following constraints:
- Manual testing is error-prone; audio/visual errors can be missed through something as simple as a blink or a momentary loss of concentration on the part of the tester.
- The errors found vary with the skill and experience of individual testers.
- Human beings cannot easily inspect the various parameters of a stream, such as compression format, aspect ratio or GOP length. Checking every file for every parameter is tedious, and the tester may still miss some of them (a simple parameter-inspection sketch follows this list).
- It is difficult for a human being to maintain a consistent level of testing over time.
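To illustrate the kind of parameter inspection that is tedious to perform by hand, the following minimal sketch pulls codec, profile, level and aspect ratio out of a file using ffprobe (assumed to be installed and on the PATH). The filename is hypothetical and the checks are deliberately simplified.

```python
import json
import subprocess

def probe_stream_parameters(path):
    """Extract basic video parameters with ffprobe (must be on PATH)."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json", "-show_streams", path],
        capture_output=True, text=True, check=True,
    ).stdout
    for stream in json.loads(out).get("streams", []):
        if stream.get("codec_type") == "video":
            return {
                "codec": stream.get("codec_name"),          # e.g. "mpeg2video", "h264"
                "profile": stream.get("profile"),           # e.g. "Main", "High"
                "level": stream.get("level"),
                "aspect_ratio": stream.get("display_aspect_ratio"),
            }
    return None

print(probe_stream_parameters("ingested_clip.ts"))  # hypothetical file name
```

Even this small script checks every file identically and never tires, which is precisely what manual inspection cannot guarantee.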
There is an urgent need to automate the verification process so that content can be validated at various stages in the broadcasting environment. The figure below shows a typical broadcasting setup:
The broadcasting setup can be further divided into smaller workflows, each of which caters to specific needs. The following sections describe the three main workflows and the corresponding verification needs. The workflows also indicate the points at which Pulsar™ could be used to perform verification.
Ingest to Playout
Broadcasters typically receive content from multiple sources. This workflow describes the high-level steps from ingest to content playout in the broadcaster’s network. As a first step, the received content is ingested in a high-quality format. The content can then be repurposed based on the delivery platform used by the broadcaster. Pulsar can be deployed after ingest, after repurposing and before playout to detect quality issues. Post-ingest, Pulsar can detect faults which may have occurred during ingest or which were already present in the original content, such as black frames, frozen frames or silence. Pulsar can also detect faults introduced during repurposing, such as blockiness and codec compliance errors. Verification could also cover the values of specific audio/video parameters such as aspect ratio and Profile/Level.
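As a rough illustration of how such post-ingest checks might be automated, the sketch below drives ffmpeg’s standard blackdetect and silencedetect filters and collects their findings from the log output; the filename and thresholds are hypothetical and a production system would need far more sophisticated analysis.

```python
import re
import subprocess

def detect_black_and_silence(path, black_secs=1.0, silence_db=-60, silence_secs=2.0):
    """Run ffmpeg's blackdetect/silencedetect filters and collect their log lines."""
    result = subprocess.run(
        ["ffmpeg", "-hide_banner", "-i", path,
         "-vf", f"blackdetect=d={black_secs}",
         "-af", f"silencedetect=n={silence_db}dB:d={silence_secs}",
         "-f", "null", "-"],
        capture_output=True, text=True,
    )
    # Both filters report findings on stderr, e.g.
    #   black_start:10.01 black_end:12.04 black_duration:2.03
    #   silence_start: 30.5 ... silence_end: 33.1
    return [line for line in result.stderr.splitlines()
            if re.search(r"black_(start|end)|silence_(start|end)", line)]

for finding in detect_black_and_silence("post_ingest_clip.mxf"):  # hypothetical file
    print(finding)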
Ingest to Archive
Broadcasters typically receive content that may not be played out for some time, due to regulatory reasons or program schedules. This content needs to be archived for playback at a later date. Early detection of errors is particularly important for content that is to be archived, since broadcasters no longer have access to the master copy once it goes to archive. If an error is detected while the content is being retrieved from archive, broadcasters have no way of correcting it, as the original content is no longer available.
In this workflow, the content is ingested and moved to short-term storage, and then on to long-term storage depending on the playout schedule. The archiving process is typically controlled by specialized archiving systems. Certain audio/video errors may be introduced during the archiving process, both while moving to short-term and to long-term storage; typical examples include black frames, colored frames, silence and mute audio. Pulsar can monitor the content at both stages to ensure that any errors are reported immediately, so that the user can take appropriate action.
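One simple safeguard at these stages, complementary to full audio/video analysis, is to confirm that the file itself was not altered by a storage move. The sketch below is a minimal illustration of that idea only; the paths and reporting callback are hypothetical, and it would sit alongside, not replace, the quality checks described above.

```python
import hashlib
from pathlib import Path

def sha256(path):
    """Checksum used to confirm a file survived an archive move intact."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_after_move(source: Path, destination: Path, report):
    """Compare checksums after a short-term -> long-term move; report mismatches."""
    if sha256(source) != sha256(destination):
        report(f"Content altered while archiving: {source} -> {destination}")

# Hypothetical folder layout for the two archive stages.
verify_after_move(Path("short_term/clip.mxf"), Path("long_term/clip.mxf"), print)
```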
Archive to Playout
In this workflow, the content is retrieved from archive and played out, with specialized archiving systems used to manage the process. The content goes through multiple stages, many of which may introduce errors. If the content is moved to short-term storage, errors may be introduced at that stage. If the content requires repurposing into a specified format, faults related to quality or format compliance may again be introduced. Some of the issues requiring monitoring include blockiness, codec conformance and parameter value validation. Pulsar can monitor the content at different stages in this workflow and analyze it using customized testing rules. The following figure shows a typical workflow from Archive to Playout:
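A customized testing rule of the parameter-validation kind could be as simple as comparing probed values against an expected delivery profile. The sketch below is illustrative only; the rule set, field names and values are hypothetical, and the probed values would in practice come from a helper like the ffprobe sketch shown earlier.

```python
# Hypothetical rule set: expected parameter values for one delivery profile.
PLAYOUT_RULES = {
    "codec": "h264",
    "profile": "High",
    "aspect_ratio": "16:9",
}

def validate_parameters(actual: dict, rules: dict):
    """Return a list of human-readable violations of the rule set."""
    return [
        f"{name}: expected {expected!r}, found {actual.get(name)!r}"
        for name, expected in rules.items()
        if actual.get(name) != expected
    ]

# Values as they might come back from a probe of a repurposed file.
violations = validate_parameters(
    {"codec": "mpeg2video", "profile": "Main", "aspect_ratio": "16:9"},
    PLAYOUT_RULES,
)
for v in violations:
    print("FAIL:", v)
```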
To facilitate efficient verification, the verification system should have the following capabilities:
- Allow broadcasters to define their own sets of rules/profiles/templates that can be applied to a particular verification process. Different verification rules can be used at different points in the workflow depending on the QC process required (a simplified sketch follows this list).
- A sophisticated yet easy-to-use application, with a clear and uncluttered reporting structure customised to highlight only the checks required.
- Ability to automatically and simultaneously pick up the content from multiple locations corresponding to different stages in the workflow.
- A feature set spanning a wide variety of quality and parameter checks for audio and video streams.
- Highly reliable and robust system that can be used for 24×7 operations in any broadcasting environment.
- Faster-than-real-time performance to maximise throughput. Coupled with the ability to ‘gang’ systems in master/slave configurations, this will allow broadcasters to cope with the growing volume of content.
- Rich reporting capabilities, including reports and alerts, so that appropriate actions can be taken in a timely manner.
- Ability to co-exist with the existing workflow system, providing a uniform experience to users.
- Web-based multi-user interface for remote and local management.
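Two of these capabilities, user-defined rule profiles and automatic pick-up of content from multiple locations, are sketched below in a deliberately simplified, hypothetical form: each watched folder corresponds to a workflow stage and carries its own rule profile, and newly arrived files are handed to a verifier. The folder paths, profile names and polling approach are all assumptions for illustration.

```python
import time
from pathlib import Path

# Hypothetical mapping of workflow stages (watch folders) to rule profiles.
WATCH_FOLDERS = {
    Path("/media/post_ingest"): "ingest_qc_profile",
    Path("/media/pre_playout"): "playout_qc_profile",
}

def poll_for_new_content(seen, verify):
    """One polling pass: hand any unseen file to the verifier with its stage profile."""
    for folder, profile in WATCH_FOLDERS.items():
        for item in folder.glob("*.mxf"):
            if item not in seen:
                seen.add(item)
                verify(item, profile)

seen = set()
while True:
    poll_for_new_content(seen, lambda f, p: print(f"Verifying {f} with {p}"))
    time.sleep(30)  # simple polling interval; real systems often use filesystem events
```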
Pulsar can be used at various stages in the workflow, and the resulting gains in workflow efficiency are noteworthy.