Mark Sandler
Making Metadata Work, 23 June 2014

Semantic Media – Problem Area
• Domains: TV productions, music / radio productions, film productions, photo productions
• Consumer: How to find relevant content in large media collections?
• Producer: How to monetize, and how to counter piracy?
(Source of images: Google)

Navigation in Content Collections: Previous Approaches
• Automatic annotations are often not as detailed and robust as needed
  Reason: automatic methods have no access to knowledge that is only available during production, so at best they perform partial reverse engineering
• User interfaces are not as rich as needed
  Reason: the metadata does not incorporate relevant external information

Semantic Media – Concept 1: Annotation as Part of the Production Workflow
• Employing knowledge of the production process leads to simpler and hence more robust (automatic/assisted) metadata generation procedures
• Integrating additional information that is usually discarded after production allows for richer annotations
• The resulting novel workflow systems facilitate automation and assist content producers as well as consumers throughout the content life-cycle

Semantic Media – Concept 1: Example
• Metadata: Where was this picture taken? What is in it? What is the weather like?
(Source of image: Wikipedia)

Semantic Media – Concept 1: Example
• Metadata: Who are the actors (in this episode)? What are the story lines? Find the scene with crying.

Semantic Media – Concept 2: Incorporating Global Knowledge Using Linked Data Technology
• Managing and exposing enhanced metadata using semantic web and linked data technology unites various sources of information and thus improves the user experience with richer interfaces

Semantic Media – Concept 2: Example
• BBC Music website + structured Wikipedia data = improved user experience
• More about this later… (a short query sketch follows below)

Catfishsmooth: Linked Data Demo
• Originally by Kurt Jacobsen
• See also http://musicweb.academiccharts.com

Linked Open Data cloud (September 2011)
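To make Concept 2 concrete, here is a minimal sketch of pulling structured Wikipedia data for a single artist from DBpedia's public SPARQL endpoint, roughly the kind of external information that enriches a BBC Music-style artist page. This is an illustration only, not the BBC's or Catfishsmooth's actual implementation; the endpoint URL, the SPARQLWrapper library and the example resource dbr:The_Beatles are assumptions made for this sketch.

    # Minimal illustrative sketch: fetch structured Wikipedia data (via DBpedia)
    # for one artist. Endpoint, library and example URI are assumptions, not the
    # pipeline actually used by BBC Music or the Catfishsmooth demo.
    # Requires: pip install SPARQLWrapper
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")  # public DBpedia endpoint
    sparql.setReturnFormat(JSON)
    sparql.setQuery("""
        PREFIX dbr:  <http://dbpedia.org/resource/>
        PREFIX dbo:  <http://dbpedia.org/ontology/>
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        SELECT ?abstract ?genreLabel WHERE {
            dbr:The_Beatles dbo:abstract ?abstract ;
                            dbo:genre    ?genre .
            ?genre rdfs:label ?genreLabel .
            FILTER (lang(?abstract) = "en" && lang(?genreLabel) = "en")
        }
    """)

    bindings = sparql.query().convert()["results"]["bindings"]
    if bindings:
        print(bindings[0]["abstract"]["value"][:200])   # start of the English abstract
    for row in bindings:
        print(row["genreLabel"]["value"])               # genre labels from the infobox

The point is simply that a single shared URI is enough to join an editorial page with community-maintained facts such as genres and abstracts.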
Goals of the Semantic Media Project
• Creating a forum for researchers and developers
• Encouraging interdisciplinary research that brings together specialists from across the entire ICT sector
• Sparking new collaborations between researchers (including industry partners) by funding mini-projects, student exchanges and internships
• Encouraging leading researchers to develop roadmaps guiding the direction of future research efforts and grant applications
• Encouraging substantial grant applications: UK & EU

Funding – Opportunities and Examples
• Exchange of students across working groups, and internships / placements
• Construction of ontologies appropriate for 3D+t content description (sound, video, objects)
• Capture of motion information on a film/TV set to obtain scene-descriptive metadata to associate with the primary media stream (i.e. video)
• Fusion of metadata from disparate sources to build a composite metadata stream associated with a single media stream, propagating through the value chain from producer to consumer, e.g.:
  • metadata from several musical instruments to create a composite harmony stream
  • motion metadata streams from several actors in a scene to create a composite action stream
  • combining rights-related metadata (e.g. using the MPEG Value Chain Ontology [9]) with user-generated and other tags downstream from creation
• Application of temporal logic to (time-structured) media metadata streams [8]
• Use of capture-at-source metadata to enhance the production workflow
• Ethnographic studies of metadata-enhanced production tools to assess their fitness for purpose

Large-Scale Capture of Producer-Defined Musical Semantics
Aims
• Capture the semantics behind parameters in audio production software
• Map low-level parameters to high-level concepts (timbre, 'bright', 'warm')
• Create infrastructure to semantically annotate produced music (for metadata-based retrieval and research purposes)
Technology
• Develop several audio plugins which capture and output parameter settings using semantic web data structures
• Analyse audio and map parameters to perceptual entities
Project Partners: Birmingham City University, Queen Mary University of London, Birmingham Conservatoire

SemanticNews: Enriching the Publishing of News Stories
Aims
• Contextualise broadcast news and the discussion around it by identifying concepts and linking them to additional information available as linked open data
• Demonstrator running at the BBC using data from the BBC 'Question Time' programme
Technology
• Named entity recognition in BBC subtitles, BBC programme data and surrounding Twitter discussions
• Linking to external authorities (DBpedia)
• Visualisation
Project Partners: University of Southampton, University of Sheffield, BBC

Semantic Linking of Information, Content and Metadata for Early Music (SLICKMEM)
Aims
• Link data and metadata from several information sources about early music:
  • Early Music Online (JISC project)
  • Electronic Corpus of Lute Music (AHRC)
  • external sources (e.g. DBpedia)
• Create a unifying ontology for all available data
• Extract music features from scanned score data to support content-based search
• Link musically similar sections using a similarity ontology
Project Partners: Goldsmiths College, City University, BBC, Oxford e-Research Centre

Tawny Overtone
Aims
• Overtone: a fully programmable music composition and synthesis environment
• Tawny-OWL: a programmable, interactive environment for the definition of Semantic Web data schemes (ontologies)
• Goal: integrate Tawny-OWL and Overtone to generate ontologies appropriate for capturing the semantics behind an Overtone 'music programme'
Project Partners: University of Newcastle, University of Manchester

Second Screen – A Fingerprinting-Driven Semantic Music Recommendation Service
Aims
• Use fingerprinting technology to identify a music recording off the air using a smartphone/tablet
• Use the ID to retrieve a wide range of artist metadata from multiple internet data sources
• Provide an interface to discover more information about the song, artist, related artists and genres
Project Partners: Queen Mary University of London, MPEG

Ongoing Projects
Computational Analysis of the Live Music Archive (CALMA)
• Content-based analysis (tempo, key, etc.) of freely available music content and publication of the results as linked data (see the sketch below)
• Project Partners: University of Manchester, Queen Mary University of London, Oxford e-Research Centre, The Internet Archive
MUSIC – Metadata Used in Semantic Indexes and Charts
• Merging the Academic Charts Online music metadata service with linked open data services
• Project Partners: University of Northampton, Queen Mary University of London, Academic Rights Press
WhatTheySaid
• Automatic generation of timelines from speech data, summarizing the main concepts and statements made
• Project Partners: University of Southampton, University College London, BBC
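To picture how CALMA-style results end up as linked data, the following minimal sketch uses the rdflib library to describe one recording's tempo and key as RDF and serialize it as Turtle. It is not CALMA's actual pipeline: the example.org namespaces, the property names ex:tempo and ex:key and the recording identifier are placeholders invented here; a real deployment would reuse published audio-feature vocabularies and stable recording URIs.

    # Minimal illustrative sketch: describe one recording's tempo and key as RDF
    # and serialize it as Turtle. Namespaces, property names and the recording
    # identifier are placeholders; this is not CALMA's actual pipeline or vocabulary.
    # Requires: pip install rdflib
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF, RDFS, XSD

    EX = Namespace("http://example.org/vocab/")        # hypothetical feature vocabulary
    REC = Namespace("http://example.org/recordings/")  # hypothetical recording URIs

    g = Graph()
    g.bind("ex", EX)

    recording = REC["live-show-1977-05-08-track01"]    # made-up identifier
    g.add((recording, RDF.type, EX.Recording))
    g.add((recording, RDFS.label, Literal("Example live recording, track 1", lang="en")))
    g.add((recording, EX.tempo, Literal("126.0", datatype=XSD.float)))  # beats per minute
    g.add((recording, EX.key, Literal("D major")))
    # Point consumers at the source collection (the Internet Archive's Live Music Archive).
    g.add((recording, RDFS.seeAlso, URIRef("https://archive.org/details/etree")))

    print(g.serialize(format="turtle"))

Publishing the features against stable URIs is what later allows them to be joined with artist, venue or chart data held elsewhere in the linked open data cloud.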
Upcoming Projects
Semantic Linking of BBC Radio Programme Data and Early Music (SLoBR)
• Building a live demonstrator at the BBC that enriches/contextualizes BBC Radio 3 programme data with EMO/ECOLM information
• Project Partners: Oxford e-Research Centre, BBC, Goldsmiths College, City University
POWkist – Visualising Cultural Heritage Linked Datasets
• Enriched visualization of digitized cultural heritage data (prisoner-of-war diaries) by integrating linked open data
• Project Partners: University of Aberdeen, Northumbria University
An Argument Workbench – Extracting Structured Arguments from Social Media
• Extraction and semantic representation of discussion threads and arguments from comments on articles and news
• Project Partners: University of Aberdeen, University of Sheffield, DebateGraph, Dot.rural Digital Economy Hub

Getting Involved
• Join our mailing list for announcements and discussions
• Have an idea for a feasibility study? Put it on our idea wiki
• Help organize meetings (perhaps focused on a specific subfield)
• Help document the research landscape by contributing to the landscape wiki
• Participate in future meetings, sandpits and tutorials, as well as in collaborative grant and paper submissions
• Help identify people who might be interested in this network and invite them (or tell us)
• Check our website: semanticmedia.org.uk and contact [email protected] (Sebastian Ewert)