The International Press Telecommunications Council (IPTC) has released the Video Metadata Hub Recommendation (VMHub), a comprehensive solution for video metadata management that allows metadata to be exchanged across multiple existing standards.
The VMHub supports various technical solutions with the key goal of storing and exchanging metadata in a safe and reliable way, with a universal metadata schema.
“Users of videos of different standards told IPTC they need a common ground in metadata for efficient workflows,” said Michael Steidl, managing director of IPTC, at IPTC’s Autumn Meeting in Berlin, during a day devoted to video. “This is what we deliver now with the Video Metadata Hub.”
The diversity of video technologies has made standardisation challenging, as there are many different approaches to embedding metadata and rights information. There are also many different metadata schemas for video, many of them somewhat limited.
“Organisations and individuals can benefit from implementing the VMHub because it helps to streamline workflows, with guidelines for organising metadata of videos from different sources and standards in a common way,” said Steidl, who is also the lead of IPTC’s Video Metadata Working Group.
The VMHub also supports workflows, metadata exchange, and search across existing standards, and provides mappings to Apple QuickTime, PBCore, MPEG-7, Schema.org, and IPTC’s NewsML-G2.
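To illustrate the idea of a hub schema with per-standard mappings, here is a minimal sketch in Python. The property names and mapping targets are invented for illustration and are not taken from the actual VMHub mapping tables:

```python
# Hypothetical sketch of hub-style property mapping. All property
# names below are illustrative only, not the real VMHub mappings.

MAPPINGS = {
    "title": {
        "schema.org": "name",
        "pbcore": "pbcoreTitle",
    },
    "dateCreated": {
        "schema.org": "dateCreated",
        "pbcore": "pbcoreAssetDate",
    },
}

def translate(metadata, target):
    """Rename hub properties to a target standard's property names."""
    return {
        MAPPINGS[prop][target]: value
        for prop, value in metadata.items()
        if prop in MAPPINGS and target in MAPPINGS[prop]
    }

hub_record = {"title": "Harvest report", "dateCreated": "2016-10-21"}
print(translate(hub_record, "schema.org"))
# {'name': 'Harvest report', 'dateCreated': '2016-10-21'}
```

The point of the hub design is that each standard needs only one mapping to and from the hub schema, rather than pairwise mappings between every standard.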
“The Hub also supports organisations switching from an ‘old’ to a ‘new’ standard by providing a stable metadata schema and gives the ability to search across videos from different standards,” Steidl said.
IPTC’s Video Metadata Working Group – which consists of delegates from news organisations, system vendors and experts in the metadata field – collaborated for two years to review technical elements, rights and administrative information, and metadata terms for describing audio-visual content, to ensure IPTC’s VMHub was a comprehensive solution for video metadata management.
Documentation & Specification
- Specification, technical implementation, and mappings to Apple QuickTime, MPEG-7 (ISO 15938-5), IPTC’s NewsML-G2, PBCore and Schema.org.
- The recommendation documents are available at www.iptc.org/std/videometadatahub/recommendation/1.0. They include specifications of the metadata schema’s properties and their technical implementation: in EBU Core for stand-alone documents, and in XMP for embedded metadata.
IPTC is looking for software developers to design, develop, document and test EXTRA, an open source rules-based classification engine for news. First preference will be given to applications received by 21st October 2016, and review will continue until the positions are filled.
“Classification” means assigning one or more categories to the text of a news document. Rules-based classifiers use a set of Boolean rules, rather than machine-learning or statistical techniques, to determine which categories to apply.
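As a sketch of what rules-based classification means in practice, the example below builds Boolean rules from simple keyword tests. The rule language shown here is invented for illustration; EXTRA’s actual rule syntax is defined by the project:

```python
# Minimal sketch of a rules-based classifier. Rules are Boolean
# combinations of keyword tests; the categories and rules below
# are hypothetical examples, not EXTRA's rule language.

def contains(term):
    """Return a rule that fires when `term` appears in the text."""
    return lambda text: term in text.lower()

def all_of(*rules):
    """Boolean AND over rules."""
    return lambda text: all(r(text) for r in rules)

def any_of(*rules):
    """Boolean OR over rules."""
    return lambda text: any(r(text) for r in rules)

# Hypothetical category rules
RULES = {
    "economy": any_of(contains("inflation"), contains("interest rate")),
    "sport": all_of(contains("match"), any_of(contains("goal"), contains("score"))),
}

def classify(text, rules=RULES):
    """Assign every category whose rule evaluates True for the text."""
    return [cat for cat, rule in rules.items() if rule(text)]

print(classify("The central bank raised the interest rate to curb inflation."))
# ['economy']
```

Unlike a statistical classifier, every assignment here is fully explainable: a category applies exactly when its rule evaluates to true, which is what makes rule sets attractive for editorially controlled news taxonomies.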
EXTRA is the EXTraction Rules Apparatus, a multilingual open-source platform for rules-based classification of news content. IPTC was awarded a grant of €50,000 for the entire project from the first round of Google’s Digital News Initiative Innovation Fund, to build and freely distribute the initial version of EXTRA.
We are working with news providers to supply sets of news documents and with linguists to write rules to classify the documents. IPTC is looking for qualified developers to create the rules engine to accurately and efficiently categorize the documents using the rules.
Please consult this page for more information and to let us know if you’re interested in being considered.