
The appropriate metadata approach to content

Vinita Bhatia asked industry experts – Ben Davenport, Director of Marketing at Dalet; Paul Thompson, Director of Strategic Solutions at Avid; and Julian Fernandez-Campon, Business Solution Director at Tedial – about effective business practices for applying a structured metadata approach to content. Here is what they had to say:

Why is content preservation garnering great interest in the broadcasting and media industry?
Ben Davenport: Firstly, much of our legacy content forms part of a rich tapestry that documents our history. Secondly, this content can often be monetised – providing incremental revenue streams with a much smaller investment than generating new content. This is particularly the case as emerging distribution platforms such as OTT reach increasingly niche markets.
Paul Thompson: An efficient and profitable media business can be characterized as having all of its content available to the whole organization and being able to maximize the return on each asset. Content is probably only thought of as ‘preserved’ if it was originally ingested from an old archive somewhere. Successful media businesses will automatically enrich with metadata, index and store the content that is being produced today – all within the enterprise workflow.
Anyone in the business should be able to search once from their web browser and immediately find the content they are looking for from any content repository within the organization – perhaps to re-purpose for a different platform and capture more value from that asset. If an asset cannot be easily located then it is worthless, regardless of what it contains!
Julian Fernandez-Campon: The demand for content has increased drastically, requiring support for multiple distribution platforms and multiple territories, and leading to a slew of new competitive operations and technologies. New platforms mean that media companies can monetise their content across different media. Increasing the content portfolio to make the OTT platform more attractive is key to winning new subscribers, even by recovering quality legacy content that cannot currently be found easily. This forces today’s broadcasters to optimize their ‘Media Factory’ operation. The most cost-effective and reliable way to do this is to deploy a flexible MAM system with tested and trusted BI modules to monitor, manage and deeply understand the complete operation.

While most MAM systems have tools to manage new content, can they efficiently manage legacy archives that existed before this system was deployed?

Ben Davenport: In order to monetise content, you need to know what content you have. As business models and drivers evolve, so too does the metadata required to feed that business. It is therefore highly likely that legacy content will require ‘updating’ in any technology or business migration. Furthermore, it is likely that there will be a human element in providing these updates. The key to doing this successfully is the smart management of processes (both human and automated) using standardised models (e.g. BPMN 2.0) that enable orchestration software to make intelligent decisions in highly automated workflows.
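By way of illustration only – not a description of any vendor's implementation – the sketch below shows the kind of decision an orchestration layer might take in a metadata-update workflow, applying high-confidence automated updates directly and routing the rest to a human task. The threshold, task names and fields are assumptions for the example; in practice the process itself would be modelled in a standard such as BPMN 2.0 and executed by a workflow engine.

```python
# Minimal sketch of an orchestration decision in a metadata-update workflow.
# The confidence threshold and task names are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class MetadataUpdate:
    asset_id: str
    field: str
    proposed_value: str
    confidence: float  # confidence of the automated extractor, 0.0-1.0


def route_update(update: MetadataUpdate, auto_threshold: float = 0.9) -> str:
    """Decide whether an update can be applied automatically or needs a person."""
    if update.confidence >= auto_threshold:
        return "service_task:apply_metadata"   # fully automated path
    return "user_task:review_metadata"         # route to a human operator


print(route_update(MetadataUpdate("A123", "title", "Evening News 1987-05-01", 0.72)))
# -> user_task:review_metadata
```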
Paul Thompson: Most media businesses have multiple content databases, each typically used by a single department. There are also legacy archives that may be difficult to manage even for the department that owns the database. With a legacy database, one must understand how the content has been indexed and what metadata is available. In some cases, the content will need richer metadata added to make it more ‘searchable’ – a task that can be done as part of a migration strategy.
Once the index and metadata are understood, a decision can be made to:
• Leave the database untouched and simply add an indexing agent that pushes to a central index. This makes the content searchable but may require a manual process to retrieve the asset (a minimal sketch of such an agent follows this list)
• Integrate the database with the asset management solution, which makes workflows more automated
• Migrate the database to new infrastructure that is already integrated
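To illustrate the first option, here is a minimal sketch of an indexing agent that reads a legacy catalogue database and pushes lightweight metadata documents to a central index so the content becomes searchable. The table layout, field names and index endpoint are assumptions for the example, not features of any particular MAM product.

```python
# Sketch of an indexing agent: read a legacy catalogue and push records to a
# central index so the content becomes searchable without touching the archive.
# The schema (tape_id, title, keywords) and the index URL are hypothetical.
import json
import sqlite3
import urllib.request

LEGACY_DB = "legacy_catalogue.db"           # assumed legacy database file
INDEX_URL = "http://search.example/index"   # assumed central index endpoint


def export_records(db_path: str):
    """Yield minimal, searchable metadata documents from the legacy catalogue."""
    conn = sqlite3.connect(db_path)
    for tape_id, title, keywords in conn.execute(
        "SELECT tape_id, title, keywords FROM assets"
    ):
        yield {"id": tape_id, "title": title, "keywords": keywords, "source": "legacy"}
    conn.close()


def push_to_index(doc: dict) -> None:
    """POST one document to the central index (asset retrieval may remain manual)."""
    req = urllib.request.Request(
        INDEX_URL,
        data=json.dumps(doc).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    for record in export_records(LEGACY_DB):
        push_to_index(record)
```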
Julian Fernandez-Campon: Managing brand-new content can be relatively easy, as the metadata is consistent and there is no inheritance from the ‘tape world’. Managing legacy content throws up inherent issues across many scenarios – multiple clips on a tape, clips spanning multiple tapes, metadata locked in legacy databases, and so on – even before considering media processing tasks such as restoration, trimming and segmentation. For that reason, a next-generation MAM needs to provide mechanisms for content relationships, support multiple repositories, unify the data model into a standard, and supply the workflows to drive the content digitisation process.
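One way to picture those content relationships is a small, unified data model in which a clip references one or more tape segments – covering part of a tape, a whole tape or several tapes. The sketch below is illustrative only; the class and field names are assumptions, not a vendor schema.

```python
# Illustrative unified data model for legacy content: a clip may cover part of
# one tape or span several tapes, so clips reference tape segments explicitly.
# Class and field names are hypothetical, not a specific MAM schema.
from dataclasses import dataclass, field
from typing import List


@dataclass
class TapeSegment:
    tape_id: str
    tc_in: str    # timecode in, e.g. "00:10:00:00"
    tc_out: str   # timecode out


@dataclass
class Clip:
    clip_id: str
    title: str
    segments: List[TapeSegment] = field(default_factory=list)  # ordered parts
    metadata: dict = field(default_factory=dict)                # merged legacy fields


# Multiple clips on one tape, and one clip spanning two tapes:
news_item = Clip("C001", "Election night report",
                 [TapeSegment("TAPE_042", "00:00:30:00", "00:04:10:00")])
documentary = Clip("C002", "Harbour documentary (full)",
                   [TapeSegment("TAPE_042", "00:04:10:00", "01:00:00:00"),
                    TapeSegment("TAPE_043", "00:00:00:00", "00:35:00:00")])
```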

Most companies still subscribe to the traditional MAM approach that involves bulk migration and conversion of archived assets. Is there a better alternative?
Ben Davenport: The obvious disadvantage to the bulk migration and conversion of assets is the processing cycles required and the possibility of degrading video quality through generational loss. However, these disadvantages are nearly always outweighed by clear advantages.
In addition to the business drivers mentioned above, there are two further significant reasons why migrating archived assets is usually beneficial. First is the testing dilemma. If you have a combined number of ‘x’ metadata schemas/file formats on the input to your workflows from ingest or archive, and then ‘y’ output schemas/formats, you have a test matrix of size x*y.
If, instead, you normalise your metadata and file formats through migration, your test matrix is of size x+y (where x or y is greater than 2) – and therefore, with the number of distribution formats increasing, nearly always much smaller. This results in less opportunity for error and a far more robust system.
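To put illustrative numbers on this: with five input schemas/formats and eight output schemas/formats, testing every input against every output means 5 × 8 = 40 combinations, whereas normalising everything through a single house format leaves only 5 + 8 = 13 paths to test.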
The second is perhaps a little more subtle but important to effective business planning: normalising your content enables you to calculate a ‘cost per unit’ – a key factor in the equation for calculating ROI.
Additionally, it is very often the case that technical innovation may provide a compelling reason to convert an archive – e.g. new encoding algorithms that yield far greater storage efficiency.
Paul Thompson: The business will need to make a commercial decision as to the potential value locked within the assets in the legacy archive. Bulk migration is fine if that is what the business wants to do. However, depending on the quantity of physical assets, this can be a daunting task that becomes very expensive.
A better approach is to decide:
• Which archives do I want to make searchable only? This is cheap, but inefficient.
• Which archives do I want to make searchable and give the MAM access to automate future workflows? This maximizes efficiency.
• Should I integrate the MAM to the old storage hardware?
• Should I migrate assets to new hardware that is already integrated with the MAM?
We recommend an approach that balances investment with the value that will be released from the assets held within the legacy archive.
Julian Fernandez-Campon: A clear disadvantage of bulk migration is poor content classification and accessibility. It can be suitable for preservation, but only after a prior selection to decide which content to preserve. The best approach is to combine bulk migration with legacy database migration, define a proper strategy and include the maximum metadata to ensure the digitised content is referenced and accessible.

Unstructured or structured metadata – which is better when it comes to sourcing legacy content more easily from a MAM system?
Ben Davenport: Structured metadata – and, even more so, extensible structured metadata – will always be required and is the more accurate and efficient option for cataloguing and browsing content. However, unstructured metadata, when combined with fast, intelligent search, can also be highly effective. Ultimately, a combination of both will provide the best user experience and efficiencies.
Paul Thompson: The primary question is how to make the target asset as ‘findable’ as possible by anyone within the business who might release more value from it. It is therefore a sensible business practice to mandate a structured metadata approach to content. This is especially important for content that is being produced today.
For archive content, the amount of metadata available can range from TapeIDs in an Excel spreadsheet all the way up to a fully structured, comprehensive metadata schema.
In most cases, a minimum metadata standard is set for any content that is made searchable by the MAM. This gives the best balance between cost and efficiency – creating the basic metadata has a cost, but it makes the overall business more efficient by allowing the asset to be located by a search.
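As a rough sketch of what enforcing such a minimum standard might look like – the required fields below are an assumption for the example, not an industry-defined list – an asset could be gated before it is exposed to search:

```python
# Sketch of a minimum-metadata gate: an asset only becomes searchable in the
# MAM once it carries a small set of required fields. The field list is an
# illustrative assumption, not a fixed standard.
REQUIRED_FIELDS = ("asset_id", "title", "duration", "rights_holder")


def meets_minimum_standard(metadata: dict) -> bool:
    """Return True if every required field is present and non-empty."""
    return all(metadata.get(f) for f in REQUIRED_FIELDS)


asset = {"asset_id": "TAPE_042_C001", "title": "Election night report",
         "duration": "00:03:40", "rights_holder": ""}
print(meets_minimum_standard(asset))  # False: rights_holder is still empty
```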
Media companies also need to define a common taxonomy across the entire organization. In conclusion, it is important for the business to be aware of how value is being released from the content across its lifecycle.
Julian Fernandez-Campon: Ideally, structured metadata is the final goal, but this is not usually achievable due to differing criteria and the complexity of unstructured source metadata. For that reason, efficient indexing and the definition of different repositories and schemas are key for content discovery in new MAMs. In addition, a good method is to extend the traditional ‘search’ approach to a new ‘search and navigate’ approach through collections or hierarchical structures to simplify content discovery.
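As a rough illustration of the ‘search and navigate’ idea, the sketch below groups assets into hierarchical collections that can be searched with free text and then browsed for context. The structure, names and records are assumptions for the example, not a specific MAM feature.

```python
# Sketch of 'search and navigate': a free-text search narrows the asset pool,
# then hierarchical collections let users browse the matches in context.
# Collection names and asset records are illustrative.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Collection:
    name: str
    assets: List[dict] = field(default_factory=list)
    children: List["Collection"] = field(default_factory=list)

    def search(self, term: str) -> List[dict]:
        """Recursively collect assets whose title matches the search term."""
        hits = [a for a in self.assets if term.lower() in a["title"].lower()]
        for child in self.children:
            hits.extend(child.search(term))
        return hits


archive = Collection("News", children=[
    Collection("1980s", assets=[{"id": "A1", "title": "Election night 1987"}]),
    Collection("1990s", assets=[{"id": "A2", "title": "Harbour opening 1994"}]),
])
print(archive.search("election"))  # then navigate into "News > 1980s" for context
```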
