US20090307258A1 - Multimedia distribution and playback systems and methods using enhanced metadata structures - Google Patents


Info

Publication number
US20090307258A1
Authority
US
United States
Prior art keywords
metadata
content
media file
track
file
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/480,251
Inventor
Shaiwal Priyadarshi
Kourosh Soroushian
Jason Braness
Loren Kirkby
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sonic IP LLC
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US12/480,251
Assigned to DIVX, INC. reassignment DIVX, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BRANESS, JASON, KIRKBY, LOREN, PRIYADARSHI, SHAIWAL, SOROUSHIAN, KOUROSH
Publication of US20090307258A1
Assigned to JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT reassignment JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT SECURITY AGREEMENT Assignors: ALL MEDIA GUIDE, LLC, DIVX, LLC, SONIC SOLUTIONS LLC
Assigned to DIVX, LLC reassignment DIVX, LLC CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: DIVX, INC.
Assigned to SONIC SOLUTIONS LLC, DIVX, LLC, ALL MEDIA GUIDE, LLC reassignment SONIC SOLUTIONS LLC RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT
Assigned to SONIC IP, INC. reassignment SONIC IP, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DIVX, LLC
Assigned to DIVX, LLC reassignment DIVX, LLC RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT
Assigned to ALL MEDIA GUIDE, LLC, DIVX, LLC, SONIC SOLUTIONS LLC reassignment ALL MEDIA GUIDE, LLC PATENT RELEASE Assignors: JPMORGAN CHASE BANK N.A., AS COLLATERAL AGENT

Classifications

    • H04N 21/00 — Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/8586 — Linking data to content, e.g. by linking a URL to a video object or creating a hotspot, by using a URL
    • H04N 21/234318 — Reformatting of video elementary streams for distribution or compliance with end-user requests or end-user device requirements, by decomposing into objects, e.g. MPEG-4 objects
    • H04N 21/23614 — Multiplexing of additional data and video streams
    • H04N 21/43074 — Synchronising the rendering of additional data with content streams on the same device, e.g. of EPG data or an interactive icon with a TV program
    • H04N 21/4348 — Demultiplexing of additional data and video streams
    • H04N 21/8126 — Monomedia components involving additional data, e.g. news, sports, stocks, weather forecasts
    • H04N 21/84 — Generation or processing of descriptive data, e.g. content descriptors
    • H04N 21/8405 — Descriptive data represented by keywords
    • H04N 21/8543 — Content authoring using a description language, e.g. MHEG, XML
    • H04N 21/8547 — Content authoring involving timestamps for synchronizing content

Definitions

  • Typical multimedia container formats offer practical and efficient methods of encapsulating standard multimedia data types such as audio, video and subtitles. The same efficiency, however, does not typically extend to metadata, especially in most consumer targeted multimedia container formats. Often the descriptive and interactive metadata associated with content is collectively placed in a distinct section of the same file, or stored in secondary files using proprietary formats. To date, practical implementations of metadata have been limited to simple descriptions of the video title, rarely extending to any direct associations with the actual scenes in the video. Moreover, in systems where secondary metadata files are employed, many challenges come to light when delivery occurs over the Internet due to factors such as the re-naming and re-grouping of files by caches between the publisher and the consumer.
  • the requirements for a metadata system that can be applied to multimedia files are complex, as the files may include a combination of video, audio and subtitle tracks.
  • some multimedia formats, such as DVD, require the playback of the video presentation to follow an authored path, such as the display of copyright notices, trailers, chapter menus, etc.
  • digital video distribution and playback systems and methods that provide an enriched and versatile metadata structure are provided.
  • a method of playing back metadata content stored in a media file comprises providing a media file to a playback device.
  • the media file has at least one metadata object and an association with content data in which the metadata object references at least one facet of the content data.
  • the method further comprises decoding the content data by the playback device, displaying content on a display screen from the decoded content data, and decoding the at least one metadata object based on the displayed content by the playback device.
  • a system for playback of a media file comprises a media server and a client processor.
  • the media server is configured to locate media files with each media file having an immutable global identifier.
  • the client processor is in network communication with the media server and is configured to send requests for a media file to the media server.
  • the media server is also configured to locate and transmit the requested media file based on the global identifier and the client processor further comprises a playback engine configured to decode a metadata track within the transmitted media file.
  • the metadata track refers to content data in the transmitted media file.
  • a method of creating a media file having metadata information comprises supplying a source of metadata information to an encoder; supplying a source of content to the encoder; generating a metadata object from the supplied metadata information by the encoder, the generated metadata object referencing at least one portion of the supplied content and integrating the metadata object with the supplied content to form a media file by the encoder.
  • a single unique identifier is used to refer to a repeatedly referenced metadata object.
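The creation method above can be sketched in a few lines. This is a minimal, illustrative model, not the patent's actual file format: the names `MetadataObject`, `encode_media_file`, and the dictionary layout are assumptions chosen for clarity.

```python
import uuid
from dataclasses import dataclass

@dataclass
class MetadataObject:
    tags: dict          # tag/value pairs describing a facet of the content
    content_ref: tuple  # (start_ms, end_ms) portion of the content referenced

def encode_media_file(content_packets, metadata_objects):
    """Integrate metadata objects with content into a single file structure.

    Each object is stored once in a table keyed by a UID, so a repeatedly
    referenced object never needs to be re-declared.
    """
    return {
        "guid": str(uuid.uuid4()),  # immutable identifier for this file
        "metadata_table": dict(enumerate(metadata_objects)),
        "packets": content_packets,
    }

media = encode_media_file(
    content_packets=[{"t": 0, "data": b"frame0"}],
    metadata_objects=[MetadataObject({"title": "Demo"}, (0, 1000))],
)
```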
  • FIG. 1 is a semi-schematic diagram of networked and local-file playback systems in accordance with embodiments of the invention.
  • FIG. 2 is a graphical representation of metadata structure within a multimedia file in accordance with an embodiment of the invention.
  • FIG. 3 is a graphical representation of metadata table in accordance with an embodiment of the invention.
  • FIG. 4 is a graphical representation of metadata object in accordance with an embodiment of the invention.
  • FIG. 5 is a graphical representation of metadata track header in accordance with an embodiment of the invention.
  • FIG. 6 is a graphical representation of metadata track entry in accordance with an embodiment of the invention.
  • FIG. 7 is a graphical representation of metadata object in accordance with an embodiment of the invention.
  • FIG. 8 is a graphical representation of metadata track entry relative to a display and a video track entry in accordance with an embodiment of the invention.
  • FIG. 9 is a graphical representation of a metadata track entry relative to a display and a video track entry in accordance with an embodiment of the invention.
  • FIG. 10 is a graphical representation of metadata structure within a multimedia file in accordance with an embodiment of the invention.
  • FIG. 11 is a flowchart of a process encoding a multimedia file to include metadata information in accordance with an embodiment of the invention.
  • FIG. 12 is a flowchart of a process decoding a multimedia file having metadata information in accordance with an embodiment of the invention.
  • multimedia metadata systems and methods are provided that enable associations to be maintained by immutable logical properties that remain robust to changes to the mutable logical properties of the data, such as the file names and paths.
  • the systems and methods allow description of audio and video, as well as subtitles and any other types of presentation data tracks within a file.
  • the content can be contained in a single file, or be distributed across a multi-segment range of files.
  • many embodiments of the invention support both the DVD experience of authored presentations, as well as the Internet-based dynamic and collaborative experience.
  • the metadata system maintains the same level of experience across PC and embedded platforms, regardless of whether the player has a live Internet connection or not.
  • the metadata can be associated with the content, regardless of whether the metadata is stored in-file with the content, or in another file.
  • the metadata can describe a variety of entities and properties of the content, including the entire file, each feature in the file, each chapter of the features, segments of those chapters, and even spatial regions of the video.
  • Metadata frameworks in accordance with embodiments of the invention utilize three items of support from the containers that incorporate them.
  • First, the ability to store a Globally Unique Identifier (GUID) or Universally Unique Identifier (UUID) for the file.
  • Second, the ability to store a table of metadata tags, values and UIDs using common and new data types; and third, the ability to store a new type of multimedia track, with a non-standard data type, a “metadata bit stream track”.
  • Some of the items, e.g., the first and third, can be optional items to be used in more advanced cases or devices.
  • Use of one or more metadata bit stream tracks enables metadata to be available in close proximity within the file to the content that the metadata describes, as well as delivering the metadata only when/if it is needed.
  • the metadata table enables the efficient, singular, storage of metadata multiply referenced by metadata tags contained in the Media File, including those in any contained metadata bit stream track.
  • the use of a GUID allows the content of a Media File, including any metadata and metadata bit stream tracks, to change without breaking references made to it from other, previously authored, Media Files.
  • the metadata format system extends the scope of metadata tags (or just “tags”) from traditional coarse-grain descriptors to medium and fine-grained descriptors of sub-sections of the file and individual video frames.
  • the system introduces some new data types for the tag values that enable the demarcation and outlining of spatial regions of interest and support linking with internal and external objects.
  • the system also increases the robustness of, and options for, content distribution across the Internet.
  • the system can be utilized by applications to enable them to, for example, regulate playback of content in many ways, allow annotation of the content by groups of viewers, and/or redirect playback to remote content, such as adverts or special-features, hosted on web-sites.
  • the metadata structure is largely in-line with that of MPEG-7, and the data-types are similar to those defined in the SMPTE Metadata Dictionary. As a result, it is straightforward to provide translation services between these professional audiovisual metadata standards and the simpler consumer-oriented format.
  • the playback system 10 includes media servers 12 and metadata servers 13 connected to a LAN (e.g., a home network) or a WAN (e.g., the Internet) 14 .
  • Media files are stored on the media servers 12 and metadata resource databases are stored on the metadata servers 13 ; both can be accessed by devices configured with a client application.
  • devices that access media files on the media servers and metadata on the metadata servers include a personal computer 16 , a consumer electronics device such as a Media File Player 18 connected to a visualization device such as a television 20 , and a portable device such as a personal digital assistant 22 or a mobile phone handset.
  • the devices and the servers 12 can communicate over a LAN 14 that is connected to the WAN 24 via a gateway 26 .
  • the servers 12 , 13 and the devices communicate over a WAN (such as the Internet).
  • the Media File Player 18 is directly connected to a LAN and can directly access a WAN. In some embodiments, the Media File Player is not directly connected to a network and plays files that have been copied onto Optical Disks 17 , USB thumb-drives 19 or other direct-access physical media.
  • the software that copied the media files to the physical media (e.g., running on a computer 16 ) copies the media files from the media servers and the metadata from the metadata servers to the physical media.
  • the copying software can translate metadata values that refer to online resources to the location of the resource on the local media.
  • the devices are configured with client applications that read portions of media files from the media servers 12 or physical media 17 , 19 for playing.
  • the client application can be implemented in software, in firmware, in hardware or in a combination of the above.
  • the device plays media from downloaded media files.
  • the device provides one or more outputs that enable another device to play the media.
  • the media file includes an index
  • a device configured with a client application in accordance with an embodiment of the invention can use the index to determine the location of various portions of the media. Therefore, the index can be used to provide a user with “trick play” functions.
  • the device uses the index to determine the portion or portions of the media file that are required in order to execute the “trick play” function and reads those portions from the server or physical media.
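The index-driven “trick play” lookup above can be sketched as a simple search: map a seek time to the byte offset of the nearest preceding index point, then fetch only that portion of the file. The `(time, offset)` pair layout is an assumption for illustration; real container indexes carry more fields.

```python
import bisect

def locate(index, seek_time_ms):
    """Byte offset of the last indexed point at or before seek_time_ms.

    `index` is a list of (presentation_time_ms, byte_offset) pairs,
    sorted by time, as might be parsed from a media file's index section.
    """
    times = [t for t, _ in index]
    i = bisect.bisect_right(times, seek_time_ms) - 1
    return index[max(i, 0)][1]

index = [(0, 0), (10_000, 1_048_576), (20_000, 2_097_152)]
offset = locate(index, 15_000)  # seek to 15 s lands on the 10 s index point
```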
  • the client application requests portions of the media file from media servers using a transport protocol that allows for downloading of specific byte ranges within the media file.
  • One such protocol is the HTTP 1.1 protocol published by The Internet Society or BitTorrent available from www.bittorrent.org.
  • other protocols and/or mechanisms can be used to obtain specific portions of the media file from the media server.
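With HTTP/1.1, requesting a specific byte range of a media file comes down to setting a `Range` header. A minimal sketch using Python's standard library (the URL is a placeholder):

```python
import urllib.request

def range_request(url, start, end):
    """Build an HTTP/1.1 request for bytes [start, end] of a remote media file."""
    req = urllib.request.Request(url)
    req.add_header("Range", f"bytes={start}-{end}")
    return req

req = range_request("http://mediaserver.example/movie.mkv", 0, 1023)
# A server honoring range requests replies "206 Partial Content" with
# only the requested bytes, so the client can fetch just the index or
# just the portion needed for a trick-play operation.
```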
  • a media track is provided for the incorporation of metadata throughout the duration of the content.
  • metadata tracks are different from video, audio and subtitle tracks in typical media files.
  • the metadata track refers to the other tracks and thus is only relevant through the use of the other tracks.
  • the metadata track would appear as pieces of detached information without the context of the other tracks.
  • the metadata track can refer to other information, i.e., metadata information. This, for example, allows the information to be stored in and/or extracted from another source or location within the media file, or referenced multiple times but stored in only one or a few locations.
  • FIG. 2 illustrates the metadata structure within the context of a typical multimedia file.
  • GUIDs 21 are used to identify files, with a new GUID created for every new file (ideally, even for a copy of a file, since a copy is a new file).
  • the GUID should be regarded as an immutable property of that file. Even if the file's contents change, its GUID should remain the same. This rule allows a “main title” to be authored with a reference to a “placeholder” file via the GUID contained in the placeholder file; the “placeholder” file's contents could change to accommodate a new movie trailer or advertisement, or other form of content.
  • When the “main title” makes a reference to the “placeholder” file's GUID, it receives the latest video encoded within the “placeholder”; hence the viewer's copy of the “main title” need never change physically, though it always links to “current” content.
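The placeholder indirection described above amounts to a level of GUID-keyed lookup. A toy sketch, where the `catalog` dictionary stands in for a media server and all file names are invented:

```python
import uuid

catalog = {}  # stand-in for a media server's GUID -> content lookup

placeholder_guid = str(uuid.uuid4())
catalog[placeholder_guid] = "trailer_spring.bits"

# The main title stores only the GUID, never a file name or path.
main_title = {"links": [placeholder_guid]}

# The publisher later replaces the placeholder's contents;
# the GUID is immutable, so the main title is untouched.
catalog[placeholder_guid] = "trailer_fall.bits"

def resolve(title):
    """Follow a title's GUID links to whatever content is current."""
    return [catalog[g] for g in title["links"]]
```

Because the reference is by GUID rather than by mutable properties such as file name or path, renaming or re-grouping by intermediate caches does not break the link.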
  • Track headers 23 for audio, video, subtitles, etc., along with a metadata track header 30 , follow the GUID 21 entry.
  • a metadata table 25 and/or metadata table 29 follows the metadata track header 30 .
  • Metadata references in accordance with this invention can refer to metadata objects previously stored within one or more of the metadata tables. In this way, frequently referenced metadata objects can be referred to efficiently through the single metadata object residing in a table, rather than repeating the declaration of the object whenever it is needed. As metadata can change quite frequently, a metadata table is written into the file at a position where changes to the table's size do not require the entire multimedia file to be remuxed and rewritten.
  • the first metadata table 25 is a mid-file option and has a reserved area 27 that provides an open area to allow growth of the first metadata table if needed.
  • the second metadata table 29 is an end-of-file option that allows the growth of the metadata table to be indefinite or unrestricted.
  • the tables could be stored as the very last element in a file, or could be embedded within a file with an immediately following reserved area (R) into which the table could grow.
  • there may not be a reserved area (R) in which case changes to the size of the metadata table may require a re-write of the entire file.
  • One embodiment of a metadata table, shown in FIG. 3 , sets forth a set of metadata tags and values, a metadata object (MO) 33 , assigned to a unique identifier (UID) 31 that is unique within the set of all metadata UIDs in the file.
  • FIG. 3 is discussed further below.
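The UID-keyed table can be modeled as a dictionary: objects are declared once, and track entries refer to them by UID. A minimal sketch; the tag names, UID values, and field layout are illustrative, not from the patent.

```python
# UID -> metadata object; each object is declared exactly once.
metadata_table = {
    0x01: {"tag": "car.make", "value": "Aston Martin"},
    0x02: {"tag": "car.website", "value": "http://example.com/car"},
}

# Track entries carry UID references instead of full re-declarations.
track_entries = [
    {"time_ms": 559_000, "uid_refs": [0x01, 0x02]},
    {"time_ms": 1_200_000, "uid_refs": [0x01]},  # reuses the same object
]

def objects_for(entry):
    """Resolve an entry's UID references against the table."""
    return [metadata_table[u] for u in entry["uid_refs"]]
```

Changing a property of a multiply-referenced object then means editing one table entry, not every point of reference.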
  • Time-ordered bit stream packets [audio, video, subtitle and metadata] 28 are usually located in files that also contain metadata tags and metadata tables.
  • the packets or portions thereof could be in separate files, e.g., a video file and a metadata file, automatically linked to each other through the use of the GUIDs or manually linked through user specification.
  • a first GUID could reference the metadata file
  • a second GUID could reference the video file.
  • the metadata file would include the second GUID to link it to the video file.
  • the video file would include the first GUID to link it to the metadata file.
  • a multimedia presentation can be a collection of files linked by GUIDs.
  • the playback device can retrieve all the associated files as desired to playback the intended multimedia presentation.
  • the GUID is not included or not used when all the presentation components, e.g., the metadata and audiovisual data, are placed in the same file.
  • the bit stream data in (B) is differentiated by a track ID, and each track's data type is defined by a Track Header 23 .
  • Audio, video and subtitle tracks are well-defined data types for multimedia files, as are the track headers 23 for these data types.
  • the metadata track (metadata packets within the bit stream data 28 ) places the definition or references of the metadata objects close to the associated audio, video and subtitle packets. For example, if a car appears on screen at nine minutes and nineteen seconds into a presentation and metadata tags that detail the car's make, model and associated web-site are to be presented to the viewer at the same time, then the metadata tags are placed in a packet in the (B) bit stream element, physically near the audio/video packets with the same presentation time.
  • the metadata could have been placed in a separate entity unrelated to the nearest bit stream (B) element.
  • In typical cases, such a list of metadata objects and presentation times would be analyzed before playback, and the objects retained in memory until it was time for them to be displayed.
  • Such an alternative scheme could, therefore, slow start-up performance and may require a potentially large amount of memory to be available in a playback system.
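The in-stream placement described above is essentially a time-ordered merge: each track's packets, already sorted by presentation time, are interleaved into one bit stream so a metadata packet sits next to the AV packets it describes. A sketch under assumed packet layout (the `t`/`track` field names are invented):

```python
import heapq

def interleave(*tracks):
    """Merge per-track packet lists (each sorted by presentation time) into a
    single bit stream, so metadata packets land beside the AV packets they
    describe instead of in a detached list that must be pre-scanned."""
    return list(heapq.merge(*tracks, key=lambda p: p["t"]))

video = [{"t": 0, "track": "video"}, {"t": 559_000, "track": "video"}]
meta = [{"t": 559_000, "track": "metadata"}]  # e.g. the car tags at 9:19

stream = interleave(video, meta)
```

A player reading the stream sequentially then encounters each metadata packet just as it is needed, avoiding the up-front scan and memory cost of the detached-list scheme.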
  • a unique ID 31 (UID, as opposed to a Globally Unique ID) allows the referencing of the defined set of metadata 33 at a future point by using only the UID, rather than by an entire declaration (re-declaration) of the tags and values.
  • the metadata table as such provides the definition of a UID-based look-up table of metadata tags and values; a minimum size for a file's metadata, by defining multiple referenced metadata objects (MO) only one time; and a minimally invasive manner to change the properties of an object that is referenced at multiple points.
  • one embodiment of a metadata object (MO) is shown in FIG. 4 .
  • the MO may be overloaded with multiple tags 41 and values 42 , with each value of a different data type.
  • the metadata object can have one or more [tag, value] pairs and each tag can have one or more associated values.
  • This ability to associate multiple tags and values gives the metadata system a wide range of options to describe aspects of the audiovisual presentation. As a result, versatility is provided by allowing systems that are able to understand, handle and display complex metadata to utilize all the metadata information, whereas other systems that may only have implemented the mandatory, basic tags and data types (native to their format) are still able to utilize the basic metadata information.
  • tags defined by possibly overloaded values allow metadata to be defined in a scalable manner, where all players will be able to retrieve and offer at least one value for a tag, and more complex and capable players will be able to offer many different values for a tag.
  • multimedia files can be authored with complex applications described through rich metadata to be utilized by high-end players, and these same files will still function to an approximate degree on much simpler players, such as low-cost CE devices.
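Overloaded tag values can be sketched as a list of typed alternatives per tag, with each player taking the first value it can handle. The type names and tag layout below are illustrative assumptions, not the patent's wire format.

```python
# One tag overloaded with the same information at several data types.
title_tag = {
    "tag": "chapter.title",
    "values": [
        {"type": "utf8", "data": "The Chase"},
        {"type": "jpeg_image", "data": b"<rendered title card>"},
    ],
}

def best_value(tag, supported_types):
    """Return the first value whose data type this player can handle."""
    for v in tag["values"]:
        if v["type"] in supported_types:
            return v["data"]
    return None  # nothing usable: skip the tag rather than fail

basic = best_value(title_tag, {"utf8"})  # a low-cost CE player still gets text
```

The same authored file thus degrades gracefully: a rich player may render the image value, while a basic one falls back to the text.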
  • the metadata track header (M) 30 is a data type for packets in the bit stream 28 , and is illustrated in FIG. 5 .
  • each metadata track header also lists the UIDs 53 from the metadata table that the track's entries reference.
  • the metadata track header only provides its type and a list of unique identifier UIDs referenced by the track's entries. This allows a playback system to locate and store only the metadata objects from the metadata table that would be required to display that particular track fully.
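That selective loading is a straightforward filter over the table. A sketch with invented object names and header fields:

```python
metadata_table = {1: "director-bio", 2: "car-info", 3: "actress-commentary"}

# A metadata track header lists only the UIDs its entries will reference.
track_header = {"type": "metadata", "uid_list": [2]}

def preload(table, header):
    """Load just the objects this track needs, not the whole table."""
    return {u: table[u] for u in header["uid_list"]}
```

A player enabling only one commentary track therefore keeps one object in memory, not the full table.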
  • the metadata objects can be declared in-stream in the track entry, if desired by, for example, the publisher, thereby bypassing the metadata tables.
  • the metadata track entry provides a presentation time 61 , duration 63 and one or more metadata objects 65 or associated unique identifiers 67 , (i.e., references to predefined metadata objects).
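Given entries carrying a presentation time and duration, a player can select which metadata to render at the current clock with a simple window test. Field names below are illustrative:

```python
def active_entries(entries, now_ms):
    """Entries whose [time, time + duration) window covers the playback clock."""
    return [e for e in entries
            if e["time_ms"] <= now_ms < e["time_ms"] + e["duration_ms"]]

entries = [
    {"time_ms": 0, "duration_ms": 5_000, "uid_refs": [1]},
    {"time_ms": 4_000, "duration_ms": 3_000, "uid_refs": [2]},
]
```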
  • specific tracks can be written that contain metadata for different purposes, e.g. a track could incorporate all the director's annotated, multimedia commentary presentation regarding the shooting of a movie, while another track could contain the lead actress's multimedia presentation on her character with links to further details on web-sites.
  • Viewers would have the option of enabling or disabling as many tracks as they choose in order to see exactly the types of data in which they are interested. As such, commentary from each viewer of the movie could be incorporated into separate tracks and during playback, only the current viewer's selected user tracks (for example, the viewer's friends' tracks) could be enabled and rendered for the viewer.
  • the metadata types utilized by the metadata structure are typically found in a standard container format, e.g., the MKV format.
  • the incorporation of standard data types enables devices with limited capabilities to simply decode the standard data and more advanced devices to access the full array of metadata, when the advanced data types are incorporated.
  • the metadata types are thus recognized by the advanced “rich featured” applications and devices and thereby allow such applications to access the metadata information and present the metadata information appropriately.
  • Non-advanced applications and devices may not recognize the metadata types and simply ignore the information. In other words, if the device recognizes that there is metadata information that it is not capable of presenting, the device skips the metadata information instead of attempting to display it, which may cause the application to shut down or stop the playback of the content.
  • Metadata systems in accordance with a number of embodiments of the invention incorporate standard data types, such as 32-bit integers and UTF-8 encoded text strings.
  • the metadata systems can also introduce some extra data types that can provide support for rich applications. These data types enable a richer variety of metadata to be contained within a file and can include portable formats for images, sounds, URIs, formatted text and Region of Interest (ROI) demarcations.
  • the richer metadata is stored using standards based formats that are commonly supported in consumer electronics devices.
  • widely supported compression formats can be utilized, such as JPEG for images and MP3 for sound. Both formats are well established on practically all DVD, BD and PC platforms and are available from many vendors for many other platforms. Since both formats allow storage of data in the compressed domain, they offer a simple yet scalable solution for storing images and sounds, even when applied to data they were not originally designed to represent.
  • the JPEG standard is primarily targeted at compressing “natural” images, although in the context of the present invention its use is extended to allow storage of CGI images, that are typically better suited to storage in the GIF format, for example.
  • the onus is on the playback device to correctly retrieve all elements of the file needed to playback the content.
  • this can burden the player with a significant amount of initial processing, such as repetitively searching through lists, before it can playback the content.
  • the metadata structure in accordance with various embodiments such as using a single unique identifier to refer to a repeatedly referenced metadata object, can significantly reduce the processing burden on the player.
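The single-identifier scheme can be sketched as a small metadata table: one copy of each repeatedly referenced object is stored, and every reference carries only the compact identifier. The UID derivation used here (a truncated content hash) is an assumption for this sketch, not a requirement of the format.

```python
import hashlib

class MetadataTable:
    """Stores a single copy of each metadata object, keyed by a unique identifier.

    Track entries then carry only the small identifier; the player resolves it
    with one dictionary lookup instead of storing and re-parsing the object at
    every place it is referenced.
    """
    def __init__(self):
        self._objects = {}

    def intern(self, payload: bytes) -> str:
        uid = hashlib.sha1(payload).hexdigest()[:16]   # illustrative UID scheme
        self._objects.setdefault(uid, payload)         # second copy is never stored
        return uid

    def resolve(self, uid: str) -> bytes:
        return self._objects[uid]

table = MetadataTable()
logo = b"...large JPEG bytes referenced by many track entries..."
uid1 = table.intern(logo)
uid2 = table.intern(logo)        # referenced again: same UID, no duplicate storage
print(uid1 == uid2, len(table._objects))
```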
  • the incorporation of URIs 71 into the metadata structure in accordance with an embodiment of the invention is shown.
  • the incorporated URIs e.g., addresses and/or client arguments, allow for the invocation of local or remotely stored dynamic objects.
  • the URIAddress of “http://commons.wikimedia.org/wiki/Matt_Damon” could be defined in an “Actor” tag for any scene in which the Hollywood actor Matt Damon appears.
  • the contents of the referenced page can be created dynamically, based on the server serving the page.
  • although the tag value may be set once and engraved into master discs, its invocation on a player would allow the most up-to-date information to be displayed, directly off the Internet.
  • the tag could be overloaded with another value of HTML type; this value would contain a copy of the HTML that was current at the time of writing.
  • the URI information in one embodiment indicates server and client components.
  • a playback device can interpret the location or resources used to display or provide the metadata information.
  • the URI can indicate a remote address (e.g., a server address) or a local address (e.g., a client address).
  • a selection of an actor or another indicator can cause the playback device to seek the information remotely (e.g., via the Internet) if indicated by a server address by the metadata information or locally (e.g., via a local drive or local area network) if indicated by a local address.
  • a secondary string can also be defined that contains parameters and values that are intended to be processed by the playback device rather than the remote server.
  • the metadata tag “URIClientArgs” contains the set of client-side parameters and values.
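The split between the server-directed address and the client-side parameters can be sketched as follows. The tag names `URIAddress` and `URIClientArgs` come from the description above; the dictionary layout and the scheme-based local/remote test are assumptions of this sketch.

```python
from urllib.parse import urlparse, parse_qs

def plan_uri_invocation(tags: dict):
    """Decide how to act on URI metadata.

    URIAddress names the resource; an http(s) scheme indicates a remote
    fetch over the Internet, while a file scheme indicates a local resource.
    URIClientArgs carries parameters intended for the playback device itself
    and is never forwarded to the remote server.
    """
    address = tags["URIAddress"]
    scheme = urlparse(address).scheme
    location = "remote" if scheme in ("http", "https") else "local"
    client_args = parse_qs(tags.get("URIClientArgs", ""))
    return {"location": location, "address": address, "client_args": client_args}

plan = plan_uri_invocation({
    "URIAddress": "http://commons.wikimedia.org/wiki/Matt_Damon",
    "URIClientArgs": "overlay=sidebar&autoclose=30",   # hypothetical client parameters
})
print(plan["location"], plan["client_args"])
```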
  • formatted text can be stored as HTML V4.01 (or a more recent version), which specifies a complex language for the markup of text and images.
  • this complexity can also be an inhibitor to wide-scale support on CE devices.
  • CE devices do not have to process all aspects of HTML V4.01.
  • a CE device could reduce its implementation complexity by rendering text using a single embedded compacted font, as described in the US Patent Application entitled Systems and Methods for font file optimization for multimedia files, filed on Jun. 6, 2009, thereby reducing the need for the device to have access to as many fonts as a typical web-browser. This method could be used when encoding any text as metadata for any tag.
  • the playback device could limit native support for images to the JPEG format rather than all image formats supported by typical web-browsers.
  • This allows a more complex “metadata server” to translate complex HTML into simplified versions where all the potentially non-supported features of HTML can be translated into a combination of JPEG and “simplified” HTML.
  • This scheme guarantees that practically any translated page is viewable on any platform that incorporates a metadata player that embodies aspects of this patent application.
  • metadata services can take complex HTML directly off the Internet and translate it into a simplified version for inclusion within the metadata value, offering a much richer mechanism for tag evaluation.
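One way such a "metadata server" might simplify HTML is by stripping it down to a small tag set a constrained device can render. The allowed tag set below is an assumption of this sketch; a real service would also rasterize unsupported constructs (tables, CSS layouts) into JPEG images, as described above. Note that this sketch also drops attributes for brevity.

```python
from html.parser import HTMLParser

class Simplifier(HTMLParser):
    """Reduce HTML to a small whitelist of tags a simple CE device can render."""
    ALLOWED = {"p", "b", "i", "br", "img", "a"}   # assumed device capability
    SKIP = {"script", "style"}                    # content dropped entirely

    def __init__(self):
        super().__init__()
        self.out = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip += 1
        elif tag in self.ALLOWED:
            self.out.append(f"<{tag}>")           # attributes dropped in this sketch

    def handle_endtag(self, tag):
        if tag in self.SKIP:
            self._skip -= 1
        elif tag in self.ALLOWED and tag not in ("br", "img"):
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if not self._skip:
            self.out.append(data)

def simplify(html: str) -> str:
    s = Simplifier()
    s.feed(html)
    return "".join(s.out)

print(simplify('<div class="x"><p>Hello <b>world</b></p><script>bad()</script></div>'))
```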
  • Metadata structures in accordance with a number of embodiments of the invention also include a set of data types for the definition of Regions of Interest (ROI), which can be used to visually highlight objects in the decoded video track and connect those objects with metadata tags.
  • This “visual tagging” allows the development of many rich applications that can link the visual objects in the content with static or dynamic values. For example, in movies where the supporting roles are played by lesser-known actors, the scenes with the supporting actors could have an “ROI” line drawn around each actor's face, with a URI defined to connect the outline back to that actor's page on a suitable web-site, or local file-system resource (see discussion above with respect to using metadata to link to dynamic objects).
  • in FIG. 8, an example using a basic rectangle shape 81 is shown, and in FIG. 9, a more complex shape 91.
  • the first data type 82 is that of a “Bounding Area,” which is intended to define a shape, e.g., a rectangle that fully encloses the object to be connected with a metadata tag.
  • This object is simple in its definition and is intended to be simple to implement. It can have one or more tags associated with it, and like all other metadata tags, each tag could be overloaded with multiple values.
  • the ability to decode a basic ROI shape is supported on a large number and variety of devices and provides a baseline level of support. Therefore, media files that define ROIs can first define the ROI using the basic shape and can also more precisely define the ROI using more complex shapes supported by a smaller subset of devices.
  • a variety of increasingly complex shapes are also provided to allow the drawing of more complex and accurate outlines of arbitrarily shaped objects.
  • These extended and optional types include but are not limited to shapes that can define: rectangles, ellipses, Bezier curves and multi-segment polygons.
  • In one embodiment, each of these objects is defined using the minimum set of variables required. For example, a rectangle is defined by the coordinates of its opposite corners; an ellipse by its center and a corner of its major and minor axes; and a quadratic Bezier curve by its two end-points and the control-point.
  • the metadata structure allows each playback device to implement as many of these shapes as it can, using whatever algorithm is best suited to the platform.
  • This set of one mandatory and multiple optionally implemented data types for ROI definitions allows a very high degree of portability of the object demarcation system. For example, very simplistic players that cannot implement any of the “higher” drawing primitives can still provide useful functionality through the “Bounding Area” object, by drawing a simple square or circle in the center of the Bounding Area. By extension, a complex player could utilize further complex shapes to accurately trace the outline of a vehicle and link that object back to the web-site for that car or, if the device was personalized, to a dealer local to the viewer, as shown for example in FIG. 9.
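The fallback behavior described above can be sketched as a shape-selection routine: each ROI carries the mandatory bounding area plus any optional, more complex shapes, and a player picks the most precise one it supports. The shape names and tuple layouts are illustrative assumptions.

```python
def drawable_roi(roi_shapes, supported):
    """Pick the most precise ROI shape this player can draw.

    The list is walked from most to least complex; the bounding area is the
    guaranteed fallback because every conforming player must support it.
    """
    preference = ["polygon", "bezier", "ellipse", "rectangle", "bounding_area"]
    for kind in preference:
        if kind in roi_shapes and kind in supported:
            return kind, roi_shapes[kind]
    # Mandatory fallback: draw something useful inside the bounding area.
    return "bounding_area", roi_shapes["bounding_area"]

roi = {
    "bounding_area": (10, 10, 120, 80),  # rectangle fully enclosing the object
    "polygon": [(12, 14), (110, 12), (118, 76), (15, 70)],  # precise outline
}
print(drawable_roi(roi, supported={"bounding_area"}))             # simplistic player
print(drawable_roi(roi, supported={"bounding_area", "polygon"}))  # rich player
```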
  • a set of identifiers or tags are provided to mark and record the data utilized to implement such applications.
  • the tags provide an indication of the metadata information associated with the tag and thereby allow the playback device to decode and utilize the metadata information through established rules defined for the information.
  • the tag also eases searching and filtering to locate specific metadata information.
  • a COMMENT tag allows the association of a Unicode text string with a media file object, or a time-period, or an object in the presentation.
  • a DISPLAY_ORIGIN tag indicates a rectangular 4:3 aspect ratio crop to be applied to a 16:9 (or 2.35:1, or other aspect ratio) video when displaying on a 4:3 display.
  • DISPLAY_SETTINGS is a data structure that can be used to alter display characteristics during playback.
  • DIVX_UID is a data structure that can be used to uniquely identify the file, each video track or each audio track.
  • GLOBAL_REFERENCE is a data structure for the recording of GPS (or other) coordinates, height, direction, time, etc.
  • OBJECT is a data structure for the description of a non-living entity in a scene.
  • RATING is used to indicate the MPAA (or equivalent) rating assigned to an entire title, scene or frame of data. It can also be assigned globally to a track and individually to specific metadata objects; i.e. its purpose is contextual.
  • RECORDER_SETTINGS can be used to store values from an electronic recording device such as a video camera, DVR, etc.; typically the data is an EXIF table.
  • SIGNIFICANCE is used to indicate the relevance of scenes to the overall story-line of a title.
  • VIEWED_SEGMENTS allows the tracking of the time-periods that have been watched; each tag contains a counter also, indicating how many times that portion has been watched.
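The VIEWED_SEGMENTS tag with its per-period counter can be sketched as follows. The `(start, end, count)` tuple layout is an assumption, and this minimal version only increments the counter for exact repeats of a period rather than merging overlapping spans.

```python
def record_viewing(segments, start, end):
    """Update a VIEWED_SEGMENTS-style list after the viewer watches [start, end).

    Each entry pairs a watched time-period with a counter indicating how many
    times that portion has been watched, as the tag description above states.
    """
    for i, (s, e, n) in enumerate(segments):
        if (s, e) == (start, end):
            segments[i] = (s, e, n + 1)   # same period watched again
            return segments
    segments.append((start, end, 1))      # newly watched period
    return segments

viewed = []
record_viewing(viewed, 0, 600)       # first ten minutes (times in seconds)
record_viewing(viewed, 0, 600)       # watched a second time
record_viewing(viewed, 3600, 3900)   # a later scene
print(viewed)
```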
  • tags are descriptions associated with other objects, e.g., editions, chapters, or tracks. As such, they do not have an identity of their own and, therefore, there is no way to independently refer to the value of a tag.
  • a tag can be defined with an identity, e.g., a TagUID, and a reference made to an identified tag through a reference to the tag's identity, e.g., TagUIDReference.
  • the modification of a multiply-dispersed tag can be reduced to a single modification.
  • a metadata track incorporates one or more references to an identified tag
  • the list of such references is placed within the metadata track's header, e.g., inside a TracksTagsList table.
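The TagUID/TagUIDReference mechanism can be sketched as a lookup: a tag is defined once with an identity, dispersed references carry only the identity, and a single edit to the definition updates every reference. The dictionary field names mirror the description above but the layout itself is an assumption.

```python
def resolve_tag(tag, tags_by_uid):
    """Follow a TagUIDReference chain to the identified tag's value."""
    while "TagUIDReference" in tag:
        tag = tags_by_uid[tag["TagUIDReference"]]
    return tag["value"]

# One identified tag definition, shared by any number of references.
tags_by_uid = {101: {"TagUID": 101, "name": "TITLE", "value": "Vacation 2008"}}
chapter_tag = {"TagUIDReference": 101}   # dispersed reference, no local copy
print(resolve_tag(chapter_tag, tags_by_uid))

# Modifying a multiply-dispersed tag reduces to a single modification:
tags_by_uid[101]["value"] = "Vacation 2008 (Director's Cut)"
print(resolve_tag(chapter_tag, tags_by_uid))
```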
  • the tag “TITLE” will be extended with “/JPEG” and the TagBinary value will hold the binary data of the image.
  • the extensions used in one embodiment closely match (if not exactly match) the file extensions given to files of the same type. This allows for the development of advanced metadata viewers that can utilize a system's native data type handlers, by invoking the handler based on the TagName extension.
  • the metadata structure allows different versions of content to be authored into a single file, with each version being selectable by the user.
  • This functionality is most analogous to a multi-DVD set containing individual DVDs of the “Studio Cut,” “Director's Cut” and “Unrated” versions of a movie.
  • each version of the content contains many scenes in common with the other versions and some scenes that are unique to each version; the difference between each version is only in the set and order of scenes.
  • This feature can also be used by people wishing to make, publish and exchange “community versions” of content—a feature that became very popular with the HD-DVD user-base.
  • Each “community version” could be encoded by a small amount of metadata, rather than megabytes of bit stream data. Such an efficient way of recording each user's version makes the exchange of these versions feasible by email and web-site downloads.
  • Dynamic versions of content can be presented by the Media File player based on metadata present in the Media File. Some different ways of creating dynamic versions are listed below. However, each method of creating a dynamic version still requires that the selected version be timed correctly so that a viewer can determine some basic time-related aspects of the version they choose, such as the total playback time for their version and the current playback position in that version.
  • To allow accurate timing information to be generated from a dynamic version, many embodiments utilize the following clarifications for time-related variables, which are applicable to the structures of the MKV file format, as well as other file containers that utilize similar file segmentation methodologies.
  • Chapter within the current Segment (no external references defined): the time-codes define the beginning and end times of this Chapter relative to the start of the highest-priority video track they are associated with. The end time-code must be larger than the beginning.
  • External Segment, defined by ChapterSegmentUID (ChapterSegmentChapterUID should not be defined): the time-codes define the beginning and end times of this Chapter relative to the start of the highest-priority video track they are associated with in the identified Segment. If both time-codes are 0, then the defined Chapter is redirecting to the entire length of the external Segment. Otherwise, the end time-code must be larger than the beginning and they encode the portion of the Segment to be played.
  • External Edition, defined by ChapterSegmentEditionUID (ChapterSegmentUID must also be defined): the time-codes define the beginning and end times of this Chapter relative to the start of the highest-priority video track they are associated with in the identified Edition of the identified Segment. Both begin and end time-codes are ignored, and the defined Chapter is redirecting to the entire length of the external Edition.
  • External Chapter, defined by ChapterSegmentChapterUID (ChapterSegmentUID and ChapterSegmentEditionUID must also be defined): the time-codes define the beginning and end times of this Chapter relative to the start of the highest-priority video track they are associated with in the identified Chapter of the Edition of the external Segment. If both time-codes are 0, then the defined Chapter is redirecting to the entire length of the external Chapter. Otherwise, the end time-code must be larger than the beginning and they encode the portion of the Chapter to be played.
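Once any Segment/Edition/Chapter redirection has been resolved to concrete time-codes, the viewer-visible timing of a dynamic version reduces to simple arithmetic over the chapter spans. This sketch assumes each span is a `(begin_ms, end_ms)` pair with end greater than begin, per the rules above.

```python
def version_timing(chapters, current_chapter, offset_ms):
    """Compute total duration and current position for a dynamic version.

    `chapters` is the ordered list of (begin_ms, end_ms) spans the version
    plays; `offset_ms` is the playhead offset within the current chapter.
    """
    durations = [end - begin for begin, end in chapters]
    total = sum(durations)
    position = sum(durations[:current_chapter]) + offset_ms
    return total, position

# A "short version" playing three scenes out of a longer title:
chapters = [(0, 60_000), (300_000, 420_000), (900_000, 960_000)]
total, pos = version_timing(chapters, current_chapter=1, offset_ms=30_000)
print(total, pos)   # length and playhead within the version's own timeline
```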
  • playback could also be controlled by metadata track entries.
  • the movie can be “intelligently” reduced in total duration by displaying only the most important scenes.
  • the metadata “RATING” tag could be used to store the age-appropriateness of each scene in the content, and only the scenes at or below a certain appropriateness would be shown (a feature that would allow many families to watch a new movie together).
  • any metadata tag could be used to view “dynamic versions” of one or more pieces of content.
  • the filter for the dynamic version could be based on whether a scene contains the viewer's favorite actor, or the GPS location of the shoot, etc.
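A dynamic version built from a per-scene tag can be sketched as a filter over the scene list. The RATING ordering below follows the MPAA scale; the scene tuple layout is an assumption, and the same pattern would apply to SIGNIFICANCE, favorite-actor, or GPS-based filters.

```python
RATINGS = ["G", "PG", "PG-13", "R", "NC-17"]   # mildest first

def family_cut(scenes, max_rating):
    """Keep only scenes whose RATING tag is at or below a chosen level.

    Each scene pairs its time span with its per-scene RATING value, so a
    family can watch a version containing only age-appropriate scenes.
    """
    limit = RATINGS.index(max_rating)
    return [span for span, rating in scenes if RATINGS.index(rating) <= limit]

scenes = [
    ((0, 120), "G"),
    ((120, 300), "R"),     # skipped in the family version
    ((300, 420), "PG"),
]
print(family_cut(scenes, "PG"))
```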
  • Playback redirection allows the ability to create different “content collections” from a set of content. For example, if a user has home-videos titled “Vacation 2006,” “Vacation 2007” and “Vacation 2008,” and each of these have chapters titled “Day 1,” “Day 2,” etc.—then a single “Day 1” collection could be created that redirects playback to each “Day 1” chapter of each year's vacation video. This redirection file would be small, and if another collection, say “Day 2”, was required, then it could be created, without having to do any re-encoding of the original titles. Another example of this is to simply utilize the redirection to achieve the DVD experience of trailers, movie, and extras. And yet another example would be to utilize redirection to link to adverts that are dynamically set by a service provider.
  • User-entered comments in various embodiments could be stored in a metadata track unique to a user.
  • Various properties of the user could also be entered as metadata associated with that user's metadata track header, such as the user's age, GPS location, native language, etc., as well as properties related to the comments, such as age-appropriateness, language, etc.
  • These metadata values could then be used as the basis of filters at playback time to ensure that the viewer only viewed comments in languages they've enabled, and only if the comments' age-appropriate flags are less than or equal to the viewer's age, etc.
  • One or more applications can be launched or started from the identification or decoding of metadata information.
  • the metadata information can specify a specific application or file type utilized by a specific or default application that is launched upon activation and decoding of the metadata information, such as the data type extension of a tag name.
  • a playback device in one embodiment can be more compact and/or less complex because it does not require the application, or other similar applications, to be integrated into the playback device.
  • embedding advertising information into media files can also be provided, where the advertising is related to objects within the video track, or words in the audio or subtitle track.
  • This placement of the adverts could become the payment mechanism for the content, assuming that viewers are required to view the “advertising” metadata track.
  • three brands are outlined, highlighted and annotated with ROI metadata that provides direct links to those brands' Internet properties.
  • a method of creating a media file including multimedia content and associated metadata information is provided.
  • From a metadata source 111, metadata objects are extracted 112 to generate a metadata table 113.
  • the metadata source is provided via a user interface such as a video authoring application or system.
  • the metadata source is provided through a formatted file, such as an XML file.
  • the metadata objects that are extracted from the source are objects that are instantiated multiple times and/or are referenced multiple times.
  • the metadata table 113 stores a single copy of the metadata object and associates the copy to a universal identifier.
  • Global metadata objects 115 are also extracted 114 from the metadata source 111 .
  • the global metadata objects describe general or global metadata entities, such as an entire file, title, chapter, etc.
  • the metadata track header 117 is created or populated 116 .
  • the metadata track includes a list of universal identifiers.
  • the universal identifiers correspond to the associated metadata objects that will be called for in each metadata track.
  • the metadata track(s) are prepared for multiplexing with audio, video and subtitle bit streams 118 .
  • the metadata track(s) 119 include the universal identifier along with the associated metadata object.
  • Each metadata track is coupled with a global universal identifier 110 and the content source 120 to create a target or complete media file 121 .
  • the global universal identifier is written first in the complete media file 122 .
  • Metadata objects and content elements follow 122 .
  • the container's natively supported metadata tags are utilized.
  • the media file 121 is stored to be accessible later based on a user request and/or sent immediately or shortly thereafter to satisfy a previous or pending user request.
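The assembly order described above (global unique identifier written first, with metadata objects and content elements following) can be sketched as a simple writer. The section framing below, a name plus a length prefix, is an invention of this sketch for illustration; an actual implementation would use the container's own element syntax.

```python
def assemble_media_file(guid, global_objects, track_headers, metadata_table, av_payload):
    """Lay out the sections of the target media file in the order described.

    The GUID leads the file so a reader can identify it before parsing
    anything else; metadata objects and content elements follow.
    """
    sections = [
        ("GUID", guid),
        ("GLOBAL_METADATA", global_objects),
        ("TRACK_HEADERS", track_headers),    # includes each track's UID list
        ("METADATA_TABLE", metadata_table),  # single copies of shared objects
        ("CONTENT", av_payload),             # muxed audio/video/subtitle/metadata tracks
    ]
    out = bytearray()
    for name, payload in sections:
        out += name.encode() + b":" + len(payload).to_bytes(4, "big") + payload
    return bytes(out)

f = assemble_media_file(b"0123456789abcdef", b"{title...}", b"[uids...]",
                        b"{objects...}", b"<bitstreams...>")
print(f.startswith(b"GUID:"))   # the GUID section leads the file
```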
  • the media file 130 includes content and metadata information.
  • Metadata information in one embodiment includes global metadata objects 131 , a metadata track header 132 , metadata table(s) 133 and/or a metadata track 134 . From the metadata information, the appropriate universal identifiers and metadata object(s) are created to be played at the appropriate time 135 . In one embodiment, from the global metadata objects, global tags are read and the metadata objects and universal identifiers 140 are extracted based on the title to be played 135 .
  • the metadata track header 132 is read for each metadata track to be rendered 136 .
  • a list of universal identifiers 141 is extracted.
  • the associated metadata object 142 is read 137 from the metadata table 133 to generate one or more metadata objects.
  • Playback of the content is started 138 and additional metadata objects 143 are extracted.
  • the metadata objects 142 and 143 are rendered and/or displayed 139 .
  • the displayed metadata objects are triggered or result from user interaction with the displayed content via a user playback interface. Also, based on the time and position of the main video track, associated metadata objects relative to the main video can also be displayed. The process continues until the user stops playback or otherwise terminates the playback and/or decoding of the media file.
  • the global universal identifier 110 may also be utilized in the decoding or playback process of FIG. 12 .
  • The GUID, for example, is used to locate metadata objects that would be found in another media file. Because it does not rely on filename conventions, which vary widely, the GUID provides a constant and reliable indicator for locating the desired metadata object.
  • the playback device or engine would search through its local content library for the referenced media file. If the referenced media file is not found, the playback device can request the file from a media server.
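This local-then-server resolution can be sketched as follows. Both stores are modelled as plain dictionaries for illustration; a real player would scan its content library and issue a network request to the media server.

```python
def locate_by_guid(guid, local_library, media_server):
    """Resolve a cross-file metadata reference by GUID rather than filename.

    File names change as files are renamed or re-grouped by caches, so the
    reference carries the target file's immutable GUID. The player first
    searches its local content library and only then asks a media server.
    """
    path = local_library.get(guid)
    if path is not None:
        return "local", path
    remote = media_server.get(guid)       # stands in for a network request
    if remote is not None:
        return "server", remote
    raise LookupError(f"no media file with GUID {guid!r}")

local = {"a1b2": "/media/vacation_2008.mkv"}              # hypothetical library
server = {"c3d4": "http://media.example.com/files/c3d4"}  # hypothetical server
print(locate_by_guid("a1b2", local, server))
print(locate_by_guid("c3d4", local, server))
```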

Abstract

Metadata systems and methods are provided that enhance the playback features of multimedia files. A metadata structure is used that includes metadata tags and objects to allow access to various data typically not available to most playback devices.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Patent Application Nos. 61/059,547, filed on Jun. 6, 2008, and 61/109,476, filed on Oct. 29, 2008, the entire disclosures of which are hereby incorporated by reference as if set in full herein.
  • BACKGROUND
  • Typical multimedia container formats offer practical and efficient methods of encapsulating standard multimedia data types such as audio, video and subtitles. The same efficiency, however, does not typically extend to metadata, especially in most consumer targeted multimedia container formats. Often the descriptive and interactive metadata associated with content is collectively placed in a distinct section of the same file, or stored in secondary files using proprietary formats. To date, practical implementations of metadata have been limited to simple descriptions of the video title, rarely extending to any direct associations with the actual scenes in the video. Moreover, in systems where secondary metadata files are employed, many challenges come to light when delivery occurs over the Internet due to factors such as the re-naming and re-grouping of files by caches between the publisher and the consumer.
  • In addition, to support the demands of Internet based video services, more and more metadata are being amassed in disparate systems to drive those services. The weakness with the methods currently employed by many of these Internet services is that the rich-experiences are only available through the hosted on-line service which, therefore, must be accessed through a web-browser. If the content can be downloaded from the provider's web-site, typically all of the metadata that enables the rich-experience cannot be. This has the effect of tying the content and the viewers to a PC-based experience rather than a home theater one, even when the home theater is the desired viewing environment. This limitation is a barrier for the wide-scale adoption of Internet distribution of TV, movies and other forms of multimedia by commercial content distribution networks. For large content providers and their customers to participate in an Internet based content distribution system, the signature experience of each provider must be able to migrate with the content to the viewer's home theater, in-car entertainment system and/or their mobile phone just as easily and vividly as it is viewable through their PC's web-browsers—regardless of whether the playback environment has an immediate, active connection to the Internet.
  • The requirements for a metadata system that can be applied to multimedia files are complex as the files may include a combination of video, audio and subtitle tracks. Furthermore, some multimedia formats, such as DVD, require the playback of the video presentation to follow an authored path, such as the displaying of copyright notices, trailers, chapter menus, etc. In the model of physical distribution of DVDs and Blu-ray Discs (BDs), direct associations between the authored presentation order and the multimedia files is maintained by the physical properties of the disc.
  • SUMMARY
  • Generally, digital video distribution and playback systems and methods that provide an enriched and versatile metadata structure are provided.
  • In one embodiment, a method of playing back metadata content stored in a media file comprises providing a media file to a playback device. The media file has at least one metadata object and an association with content data in which the metadata object references at least one facet of the content data. The method further comprises decoding the content data by the playback device, displaying content on a display screen from the decoded content data, and decoding the at least one metadata object based on the displayed content by the playback device.
  • In another embodiment, a system for playback of a media file comprises a media server and a client processor. The media server is configured to locate media files with each media file having an immutable global identifier. The client processor is in network communication with the media server and is configured to send requests for a media file to the media server. The media server is also configured to locate and transmit the requested media file based on the global identifier and the client processor further comprises a playback engine configured to decode a metadata track within the transmitted media file. The metadata track refers to content data in the transmitted media file.
  • In yet another embodiment, a method of creating a media file having metadata information comprises supplying a source of metadata information to an encoder; supplying a source of content to the encoder; generating a metadata object from the supplied metadata information by the encoder, the generated metadata object referencing at least one portion of the supplied content and integrating the metadata object with the supplied content to form a media file by the encoder.
  • In one embodiment, a single unique identifier is used to refer to a repeatedly referenced metadata object.
  • The above-mentioned and other features of this invention and the manner of obtaining and using them will become more apparent, and will be best understood, by reference to the following description, taken in conjunction with the accompanying drawings. The drawings depict only typical embodiments of the invention and do not therefore limit its scope.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a semi-schematic diagram of networked and local-file playback systems in accordance with embodiments of the invention.
  • FIG. 2 is a graphical representation of metadata structure within a multimedia file in accordance with an embodiment of the invention.
  • FIG. 3 is a graphical representation of metadata table in accordance with an embodiment of the invention.
  • FIG. 4 is a graphical representation of metadata object in accordance with an embodiment of the invention.
  • FIG. 5 is a graphical representation of metadata track header in accordance with an embodiment of the invention.
  • FIG. 6 is a graphical representation of metadata track entry in accordance with an embodiment of the invention.
  • FIG. 7 is a graphical representation of metadata object in accordance with an embodiment of the invention.
  • FIG. 8 is a graphical representation of metadata track entry relative to a display and a video track entry in accordance with an embodiment of the invention.
  • FIG. 9 is a graphical representation of a metadata track entry relative to a display and a video track entry in accordance with an embodiment of the invention.
  • FIG. 10 is a graphical representation of metadata structure within a multimedia file in accordance with an embodiment of the invention.
  • FIG. 11 is a flowchart of a process encoding a multimedia file to include metadata information in accordance with an embodiment of the invention.
  • FIG. 12 is a flowchart of a process decoding a multimedia file having metadata information in accordance with an embodiment of the invention.
  • DETAILED DESCRIPTION
  • Generally, a rich metadata structure for multimedia files is provided that increases the scope of metadata tags and fundamentally enhances the capabilities of media-managers and players on both personal computer and consumer electronic (CE) platforms. In one embodiment, multimedia metadata systems and methods are provided that enable associations to be maintained by immutable logical properties that remain robust to changes to the mutable logical properties of the data, such as the file names and paths. The systems and methods allow description of audio and video, as well as subtitles and any other types of presentation data tracks within a file. The content can be contained in a single file, or be distributed across a multi-segment range of files. In addition, many embodiments of the invention support both the DVD experience of authored presentations, as well as the Internet-based dynamic and collaborative experience. In several embodiments, the metadata system maintains the same level of experience across PC and embedded platforms, regardless of whether the player has a live Internet connection or not. In a number of embodiments, the metadata can be associated with the content, regardless of whether the metadata is stored in-file with the content, or in another file. In addition, the metadata can describe a variety of entities and properties of the content including the entire file, each feature in the file, each chapter of the features, segments of those chapters and even spatial regions of the video.
  • Metadata frameworks in accordance with embodiments of the invention utilize three items of support from the containers that incorporate them. First, the ability to store a Globally Unique Identifier (GUID) or Universally Unique Identifier (UUID) for the file. Second, the ability to store a table of metadata tags, values and UIDs using common and new data types. And lastly, the ability to store a new type of multimedia track, with a non-standard data type, a “metadata bit stream track.” It should however be appreciated that one or more of the items, e.g., the first and third items, can be optional items to be used in more advanced cases or devices. Use of one or more metadata bit stream tracks enables metadata to be available in close proximity within the file to the content that the metadata describes, as well as delivering the metadata only when/if it is needed.
  • The metadata table enables the efficient, singular, storage of metadata multiply referenced by metadata tags contained in the Media File, including those in any contained metadata bit stream track. The use of a GUID allows the content of a Media File, including any metadata and metadata bit stream tracks, to change without breaking references made to it from other, previously authored, Media Files.
  • The metadata format system extends the scope of metadata tags (or just "tags") from traditional coarse-grain descriptors to medium and fine-grained descriptors of sub-sections of the file and individual video frames. The system introduces new data types for the tag values that enable the demarcation and outlining of spatial regions of interest and support linking with internal and external objects. The system also increases the robustness of, and options for, content distribution across the Internet. The system can be utilized by applications to enable them to, for example, regulate playback of content in many ways, allow annotation of the content by groups of viewers, and/or redirect playback to remote content, such as advertisements or special features, hosted on web-sites. These extensive enhancements to conventional metadata tagging open up options for application functionality and enable the creation of a wide set of portable, rich, commercial and non-commercial content-based services.
  • In one embodiment, the metadata structure is largely in line with that of MPEG-7, and the data types are similar to those defined in the SMPTE Metadata Dictionary. As a result, it is straightforward to provide translation services between these professional audiovisual metadata standards and the simpler, consumer-oriented format.
  • Referring now to FIG. 1, playback systems in accordance with embodiments of the invention are shown. The playback system 10 includes media servers 12 and metadata servers 13 connected to a LAN (e.g., a home network) 14 or a WAN (e.g., the Internet) 24. Media files are stored on the media servers 12, metadata resource databases are stored on the metadata servers 13, and both can be accessed by devices configured with a client application. In the illustrated embodiment, the devices that access media files on the media servers and metadata on the metadata servers include a personal computer 16, a consumer electronics device such as a Media File Player 18 connected to a visualization device such as a television 20, and a portable device such as a personal digital assistant 22 or a mobile phone handset. The devices and the servers 12, 13 can communicate over a LAN 14 that is connected to the WAN 24 via a gateway 26. In other embodiments, the servers 12, 13 and the devices communicate directly over a WAN (such as the Internet).
  • In some embodiments, the Media File Player 18 is directly connected to a LAN and can directly access a WAN. In some embodiments, the Media File Player is not directly connected to a network and plays files that have been copied onto Optical Disks 17, USB thumb-drives 19 or other direct-access physical media. In such embodiments, the software that copied the media files to the physical media (e.g., running on a computer 16) copies the media files from the media servers and the metadata from the metadata servers to the physical media. When copying the metadata from the metadata servers, the copying software can translate metadata values that refer to online resources into the location of the resource on the local media.
  • The devices are configured with client applications that read portions of media files from the media servers 12 or physical media 17, 19 for playing. The client application can be implemented in software, in firmware, in hardware or in a combination of the above. In many embodiments, the device plays media from downloaded media files. In several embodiments, the device provides one or more outputs that enable another device to play the media. When the media file includes an index, a device configured with a client application in accordance with an embodiment of the invention can use the index to determine the location of various portions of the media. Therefore, the index can be used to provide a user with “trick play” functions. When a user provides a “trick play” instruction, the device uses the index to determine the portion or portions of the media file that are required in order to execute the “trick play” function and reads those portions from the server or physical media.
  • In a number of embodiments, the client application requests portions of the media file from media servers using a transport protocol that allows for the downloading of specific byte ranges within the media file. One such protocol is the HTTP 1.1 protocol published by The Internet Society; another is BitTorrent, available from www.bittorrent.org. In other embodiments, other protocols and/or mechanisms can be used to obtain specific portions of the media file from the media server.
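As a minimal illustrative sketch (not part of the disclosure itself), the byte-range request described above can be expressed as an ordinary HTTP/1.1 Range header; the byte offsets here are hypothetical values a player might derive from a media file's index.

```python
def range_header(first_byte, last_byte):
    """Build an HTTP/1.1 Range request header for an inclusive byte span."""
    return {"Range": "bytes=%d-%d" % (first_byte, last_byte)}

# A player seeking to a portion of the media file occupying bytes
# 1048576..2097151 would issue a GET with this header and expect a
# 206 Partial Content response carrying only that span.
hdr = range_header(1048576, 2097151)
print(hdr["Range"])  # bytes=1048576-2097151
```

The same idea applies to any transport that supports partial retrieval; only the header syntax shown is specific to HTTP 1.1.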
  • Incorporation of Metadata within Media Files
  • In several embodiments, a media track for the incorporation of metadata throughout the duration of the content is provided. With metadata tracks embedded in a file, many options open up for the utility of these tracks. It should be noted that a metadata track is different from the video, audio and subtitle tracks in typical media files. For example, the metadata track refers to the other tracks and thus is only relevant through the use of the other tracks; in other words, without the context of the other tracks, the metadata track would appear as pieces of detached information. Additionally, the metadata track can refer to other information, i.e., metadata information. This, for example, allows the information to be stored in and/or extracted from another source or location within the media file, or referenced multiple times while being stored in a single location or just a few locations. These differences and other such distinguishing and additional features are described in greater detail below.
  • FIG. 2 illustrates the metadata structure within the context of a typical multimedia file. GUIDs 21 are used to identify files, with a new GUID created for every new file (ideally, even for a copy of a file, since a copy is a new file). Once defined at file-creation time, the GUID should be regarded as an immutable property of that file: even if the file's contents change, its GUID should remain the same. This rule allows a "main title" to be authored with a reference to a "placeholder" file via the GUID contained in the placeholder file; the "placeholder" file's contents could change to accommodate a new movie trailer, advertisement or other form of content. Whenever the main title makes a reference to the "placeholder" file's GUID, it would receive the latest video encoded within the "placeholder"; hence the viewer's copy of the "main title" need never change physically, though it would always link to "current" content.
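The "placeholder" behaviour above can be sketched as a simple GUID registry. This is an illustrative model only: the registry, file names and `resolve` helper are hypothetical, standing in for whatever mechanism a player uses to locate the file bearing a given GUID.

```python
import uuid

registry = {}  # GUID -> current contents of the file bearing that GUID

placeholder_guid = uuid.uuid4()              # minted once, never changed
registry[placeholder_guid] = "trailer_spring.bits"

def resolve(guid):
    """Return whatever content currently sits behind an immutable GUID."""
    return registry[guid]

# The main title is authored once, recording only the placeholder's GUID.
main_title_reference = placeholder_guid

# Later, the placeholder's contents are replaced; its GUID is untouched,
# so the previously authored reference still resolves, now to new content.
registry[placeholder_guid] = "advert_autumn.bits"
print(resolve(main_title_reference))  # advert_autumn.bits
```

The key property is that the reference stored in the main title never needs rewriting; only the mapping from GUID to content changes.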
  • Track headers 23 for audio, video, subtitles, etc., along with a metadata track header 30, follow the GUID 21 entry. In one embodiment, a metadata table 25 and/or metadata table 29 follows the metadata track header 30. Metadata references in accordance with this invention can refer to metadata objects previously stored within one or more of the metadata tables. In this way, frequently referenced metadata objects can be referred to efficiently through the single metadata object residing in a table, rather than by repeating the declaration of the object whenever it is needed. As metadata can change quite frequently, a metadata table is written into the file at a position where changes to the table's size do not require the entire multimedia file to be remuxed and rewritten. For example, the first metadata table 25 is a mid-file option and has a reserved area 27 that provides an open area to allow growth of the first metadata table if needed. The second metadata table 29 is an end-of-file option that allows the growth of the metadata table to be indefinite or unrestricted. Thus, as shown in FIG. 2, the tables could be stored as the very last element in a file, or could be embedded within a file with an immediately following reserved area (R) into which the table could grow. In some embodiments, it can be useful to utilize both areas for a distributed table. In some embodiments there may not be a reserved area (R), in which case changes to the size of the metadata table may require a rewrite of the entire file. One embodiment of a metadata table is shown in FIG. 3 and sets forth a set of metadata tags and values, a metadata object (MO) 33, assigned to a unique identifier (UID) 31 that is unique within the set of all metadata UIDs in the file. FIG. 3 is discussed further below.
  • Time-ordered bit stream packets [audio, video, subtitle and metadata] 28 are usually located in files that also contain metadata tags and metadata tables. However, it should be appreciated that the packets or portions thereof could be in separate files, e.g., a video file and a metadata file, linked to each other automatically through the use of GUIDs or manually through user specification. For example, a first GUID could reference the metadata file and a second GUID could reference the video file. Within the metadata file, the second GUID would be included to link the video file to the metadata file. Likewise, the video file would include the first GUID to link it to the metadata file. As such, a multimedia presentation can be a collection of files linked by GUIDs. Additionally, by linking the files with GUIDs, if a playback device first attempts to play back the metadata file when it intended to play back the video file, the playback device can retrieve all the associated files as desired to play back the intended multimedia presentation. In one embodiment, the GUID is not included or not used when all the presentation components, e.g., the metadata and audiovisual data, are placed in the same file.
  • Typically, the bit stream data in (B) is differentiated by a track ID, and each track's data type is defined by a Track Header 23. Audio, video and subtitle tracks are well-defined data types for multimedia files, as are the track headers 23 for these data types. The metadata track (metadata packets within the bit stream data 28) places the definitions or references of the metadata objects close to the associated audio, video and subtitle packets. For example, if a car appears on screen at nine minutes and nineteen seconds into a presentation, and metadata tags that detail the car's make, model and associated web-site are to be presented to the viewer at the same time, then the metadata tags are placed in a packet in the (B) bit stream element, physically near the audio/video packets with the same presentation time. Alternatively, the metadata could have been placed in a separate entity unrelated to the nearest bit stream (B) element. In such cases, which are typical, the list of metadata objects and presentation times would have to be analyzed before playback, and the objects retained in memory until it was time for them to be displayed. Such an alternative scheme could, therefore, slow start-up performance and may require a potentially large amount of memory to be available in the playback system.
  • Metadata Tables
  • Referring again to FIG. 3, a unique ID 31 (UID, as opposed to a Globally Unique ID) allows the referencing of the defined set of metadata 33 at a future point by using only the UID, rather than an entire re-declaration of the tags and values. The metadata table as such provides: the definition of a UID-based look-up table of metadata tags and values; a minimum size for a file's metadata, by defining multiply referenced metadata objects (MO) only one time; and a minimally invasive manner of changing the properties of an object that is referenced at multiple points.
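The look-up behaviour of the metadata table can be modelled as a mapping from UID to metadata object. The UIDs, tag names and values below are illustrative stand-ins, but the sketch shows the two properties claimed above: a multiply referenced MO is stored once, and changing it is a single-point edit.

```python
# UID -> metadata object (MO); each MO is a set of tag/value pairs.
metadata_table = {
    0x01: {"OBJECT": "1967 Shelby GT500", "URIAddress": "http://example.com/car"},
    0x02: {"RATING": "PG-13"},
}

def deref(uid):
    """Resolve a UID reference to the single table-resident MO."""
    return metadata_table[uid]

# Three scenes that all show the same car carry only the UID, not the MO:
scene_refs = [0x01, 0x01, 0x01]
objects = [deref(uid) for uid in scene_refs]

# Updating the car's description is one modification, seen by every reference.
metadata_table[0x01]["OBJECT"] = "1968 Shelby GT500"
print(deref(0x01)["OBJECT"])  # 1968 Shelby GT500
```

Every entry of `objects` is the same single stored object, which is exactly how the table keeps the file's metadata footprint minimal.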
  • Metadata Objects
  • In FIG. 4, one embodiment of a metadata object (MO) is shown. The MO may be overloaded with multiple tags 41 and values 42, with each value of a different data type. As such, the metadata object can have one or more [tag, value] pairs, and each tag can have one or more associated values. This ability to associate multiple tags and values gives the metadata systems a wide range of options to describe aspects of the audiovisual presentation. As a result, versatility is provided by allowing systems that are able to understand, handle and display complex metadata to utilize all the metadata information, while other systems that may only have implemented the mandatory, basic tags and data types (native to their format) are still able to utilize the basic metadata information. As such, tags defined by possibly overloaded values allow metadata to be defined in a scalable manner, where all players will be able to retrieve and offer at least one value for a tag, and more complex and capable players will be able to offer many different values for a tag. Hence, multimedia files can be authored with complex applications described through rich metadata to be utilized by high-end players, and these same files will still function to an approximate degree on much simpler players, such as low-cost CE devices.
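The scalable, overloaded-value behaviour can be sketched as follows. The data-type names (`URI`, `JPEG`, `UTF8`) and the actor value are illustrative, not normative; the point is that a player selects the richest value it supports and every player gets at least one usable value.

```python
# One tag, overloaded with several values of different data types,
# ordered richest-first for this sketch.
actor_tag = ("ACTOR", [
    ("URI",  "http://example.com/matt_damon"),  # richest: connected players
    ("JPEG", b"\xff\xd8..."),                   # richer: players with image support
    ("UTF8", "Matt Damon"),                     # basic: every player handles text
])

def best_value(tag, supported_types):
    """Return the richest value (listed first) whose type the player supports."""
    _name, values = tag
    for dtype, value in values:
        if dtype in supported_types:
            return value
    return None

# A low-cost CE device that only understands text still gets a usable value:
print(best_value(actor_tag, {"UTF8"}))  # Matt Damon
```

A fully featured player passing all three type names would instead receive the URI value, illustrating the "approximate degree" of function on simpler players.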
  • Metadata Tracks
  • In one embodiment, the metadata track header (M) 30 is a data type for packets in the bit stream 28, and is illustrated in FIG. 5. In addition to the various implementation-specific definitions required of headers 51 (such as a "Type" parameter), each metadata track header also lists the UIDs 53 from the metadata table that the track's entries reference. At a minimum, the metadata track header provides only its type and a list of the unique identifiers (UIDs) referenced by the track's entries. This allows a playback system to locate and store only the metadata objects from the metadata table that would be required to display that particular track fully. In many embodiments, the metadata objects can instead be declared in-stream in the track entry, if desired by, for example, the publisher, thereby bypassing the metadata tables.
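The preloading opportunity the header creates can be sketched directly: because the header enumerates the UIDs its entries will reference, a player can retain in memory exactly the needed objects and no more. The field names and UID values here are illustrative.

```python
metadata_table = {
    0x01: {"OBJECT": "car"},
    0x02: {"ACTOR": "Matt Damon"},
    0x03: {"RATING": "R"},
}

# Hypothetical track header: its type plus the table UIDs its entries reference.
track_header = {"type": "director_commentary", "uids": [0x02, 0x03]}

def preload(table, header):
    """Retain only the metadata objects this track actually needs."""
    return {uid: table[uid] for uid in header["uids"]}

cache = preload(metadata_table, track_header)
print(sorted(cache))  # [2, 3]
```

The MO under UID 0x01 is never loaded for this track, which is the memory saving the header's UID list is meant to enable.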
  • Metadata Contained within Media Files
  • Portions of metadata that are interleaved with the audio and video data (i.e., bit stream 28) in accordance with an embodiment of the invention are shown in FIG. 6. The metadata track entry provides a presentation time 61, a duration 63 and one or more metadata objects 65 or associated unique identifiers 67 (i.e., references to predefined metadata objects). In one example, specific tracks can be written that contain metadata for different purposes; e.g., one track could incorporate the director's annotated multimedia commentary regarding the shooting of a movie, while another track could contain the lead actress's multimedia presentation on her character, with links to further details on web-sites. Viewers would have the option of enabling or disabling as many tracks as they choose in order to see exactly the types of data in which they are interested. As such, commentary from each viewer of the movie could be incorporated into separate tracks, and during playback, only the current viewer's selected user tracks (for example, the viewer's friends' tracks) could be enabled and rendered for the viewer.
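A metadata track entry, as described, carries a presentation time, a duration and either in-line MOs or UID references. The following sketch (times, payloads and the helper function are illustrative) shows how a player decides which entries are displayable at a given presentation time.

```python
# (presentation_time_s, duration_s, payload); payload is either an
# in-line metadata object (dict) or a UID reference into the table (int).
entries = [
    (559.0, 12.0, {"OBJECT": "car", "URIAddress": "http://example.com/car"}),
    (600.0,  5.0, 0x01),  # UID reference to a table-resident MO
]

def active_entries(t):
    """Payloads of the metadata entries whose display window covers time t."""
    return [p for (start, dur, p) in entries if start <= t < start + dur]

# At nine minutes nineteen seconds (559 s), the car's tags become displayable:
print(active_entries(559.5))
```

Because the entries are interleaved near the audio/video packets with the same presentation time, this check can be done as packets arrive, without the pre-scan and large memory retention described for the alternative scheme.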
  • Data Types for Rich Tags and Rich Experiences
  • Typing metadata values in a highly portable manner can result in metadata that is compatible with many different applications and devices. In various embodiments, the metadata types utilized by the metadata structure are those typically found in a standard container format, e.g., the MKV format. The incorporation of standard data types enables devices with limited capabilities to simply decode the standard data, and more advanced devices to access the full array of metadata when the advanced data types are incorporated. The metadata types are thus recognized by the advanced, "rich featured" applications and devices, allowing such applications to access the metadata information and present it appropriately. Non-advanced applications and devices may not recognize the metadata types and can ignore the information. In other words, if the device recognizes that there is metadata information that it lacks the capability to present, the device skips the metadata information, instead of attempting to display metadata information that might cause the application to shut down or stop the playback of the content.
  • Accordingly, metadata systems in accordance with a number of embodiments of the invention incorporate standard data types, such as 32-bit integers and UTF-8 encoded text strings. The metadata systems can also introduce some extra data types that can provide support for rich applications. These data types enable a richer variety of metadata to be contained within a file and can include portable formats for images, sounds, URIs, formatted text and Region of Interest (ROI) demarcations.
  • Additionally, in a number of embodiments, the richer metadata is stored using standards-based formats that are commonly supported in consumer electronics devices. For images and sounds, widely supported compression formats can be utilized, such as JPEG for images and MP3 for sound. Both of these formats are well established in practically all DVD, BD and PC platforms and are available from many vendors for many other platforms. Since both formats allow storage of data in the compressed domain, they offer a simple yet scalable solution for storing images and sounds, despite their general utilization for data that they may not have originally been designed to represent. For example, the JPEG standard is primarily targeted at compressing "natural" images, although in the context of the present invention its use is extended to allow storage of CGI images, which are typically better suited to storage in, for example, the GIF format.
  • It should be appreciated that some multimedia formats, e.g., MKV, have practically no requirement to order the data in the file according to any scheme, allowing files to be created in the way that best suits the file writer. The onus, however, is on the playback device to correctly retrieve all elements of the file needed to play back the content. This can burden the player with a significant amount of initial processing before it can play back the content, such as repetitively searching through lists. As previously described, and described below in greater detail, the metadata structure in accordance with various embodiments, such as the use of a single unique identifier to refer to a repeatedly referenced metadata object, can significantly reduce the processing burden on the player.
  • Embedding Dynamic Objects Using Metadata
  • Referring now to FIG. 7, the incorporation of URIs 71 into the metadata structure in accordance with an embodiment of the invention is shown. The incorporated URIs, e.g., addresses and/or client arguments, allow for the invocation of locally or remotely stored dynamic objects. For example, the URIAddress of "http://commons.wikimedia.org/wiki/Matt_Damon" could be defined in an "Actor" tag for any scene in which the Hollywood actor Matt Damon appears. The contents of the referenced page can be created dynamically by the server serving the page. Hence, although the tag value may be set once and engraved into master discs, its invocation on a player would allow the most up-to-date information to be displayed, directly off the Internet. To support playback devices that do not have an Internet connection, or where the connection is temporarily unavailable (e.g., during a flight), the tag could be overloaded with another value of HTML type; this value would contain a copy of the HTML that was current at the time of writing.
  • The URI information in one embodiment indicates server and client components. In this manner, a playback device can determine the location of, or resources used to display or provide, the metadata information. As such, a remote address, e.g., a server address, can be provided in the URI to indicate a remote server and/or database that hosts the associated object. Likewise, a local address, e.g., a client address, can be provided to indicate a local client and/or database from which to retrieve the requested information. For example, a selection of an actor or another indicator can cause the playback device to seek the information remotely (e.g., via the Internet) if the metadata information indicates a server address, or locally (e.g., via a local drive or local area network) if it indicates a local address.
  • Furthermore, to facilitate specifying client-side processing rules, a secondary string can also be defined that contains parameters and values intended to be processed by the playback device rather than the remote server. In some embodiments, the metadata tag "URIClientArgs" contains the set of client-side parameters and values.
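The split between the server-bound address and the client-side parameter string can be sketched as below. The parameter names (`window`, `pause_playback`) are hypothetical examples of client-side processing rules; only the "URIClientArgs" tag name comes from the description above.

```python
from urllib.parse import parse_qs

def split_uri_metadata(uri_address, uri_client_args):
    """Separate what is fetched remotely from what the player itself interprets."""
    server_target = uri_address                # sent to the remote server as-is
    client_params = parse_qs(uri_client_args)  # processed by the playback device
    return server_target, client_params

target, params = split_uri_metadata(
    "http://commons.wikimedia.org/wiki/Matt_Damon",
    "window=overlay&pause_playback=1",
)
print(params["pause_playback"])  # ['1']
```

In this sketch the player would fetch `target` over the network while applying the `client_params` locally, e.g., pausing playback and rendering the page in an overlay window.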
  • Conversion of HTML for Display on Simple Devices
  • Formatted text of the referenced page is typically supported through HTML V4.01 (or a more recent version), which specifies a complex language for the markup of text and images. However, this complexity can also be an inhibitor to wide-scale support on CE devices. Hence, to simplify the implementation, CE devices do not have to process all aspects of HTML V4.01 in order to make full use of the metadata information. For example, a CE device could reduce its implementation complexity by rendering text using a single embedded compacted font, as described in the US Patent Application entitled "Systems and Methods for Font File Optimization for Multimedia Files," filed on Jun. 6, 2009, thereby reducing the need for the device to have access to as many fonts as a typical web browser; this method could be used when encoding any text as metadata for any tag. Furthermore, the playback device could limit native support for images to the JPEG format rather than all image formats supported by typical web browsers. This allows a more complex "metadata server" to translate complex HTML into simplified versions, where all the potentially non-supported features of HTML can be translated into a combination of JPEG and "simplified" HTML. This scheme guarantees that practically any translated page is viewable on any platform that incorporates a metadata player embodying aspects of this patent application. Hence, metadata services can take complex HTML directly off the Internet and translate it into a simplified version for inclusion within the metadata value, offering a much richer mechanism for tag evaluation.
  • Regions of Interest
  • Metadata structures in accordance with a number of embodiments of the invention also include a set of data types for the definition of Regions of Interest (ROI), which can be used to visually highlight objects in the decoded video track and connect those objects with metadata tags. This "visual tagging" allows the development of many rich applications that can link the visual objects in the content with static or dynamic values. For example, in movies where the supporting roles are played by lesser-known actors, the scenes with the supporting actors could have an "ROI" line drawn around each actor's face, with a URI defined to connect the outline back to that actor's page on a suitable web-site or local file-system resource (see the discussion above with respect to using metadata to link to dynamic objects). In FIG. 8, an example using a basic rectangle shape 81 is shown, and in FIG. 9, a more complex shape 91.
  • To implement ROIs in a portable manner, one mandatory data type and several optional types are defined. The first data type 82 is that of a “Bounding Area,” which is intended to define a shape, e.g., a rectangle that fully encloses the object to be connected with a metadata tag. This object is simple in its definition and is intended to be simple to implement. It can have one or more tags associated with it, and like all other metadata tags, each tag could be overloaded with multiple values. In a number of embodiments, the ability to decode a basic ROI shape is supported on a large number and variety of devices and provides a baseline level of support. Therefore, media files that define ROIs can first define the ROI using the basic shape and can also more precisely define the ROI using more complex shapes supported by a smaller subset of devices.
  • A variety of increasingly complex shapes are also provided to allow the drawing of more complex and accurate outlines of arbitrarily shaped objects. These extended and optional types include, but are not limited to, shapes that can define rectangles, ellipses, Bezier curves and multi-segment polygons. Each of these objects is defined, in one embodiment, using the minimum set of variables required. For example, a rectangle is defined by the coordinates of its opposite corners; an ellipse by its center and the extents of its major and minor axes; and a quadratic Bezier curve by its two end-points and its control point. The metadata structure allows each playback device to implement as many of these shapes as it can, using whatever algorithm is best suited to the platform.
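The mandatory/optional split can be sketched as follows: every ROI carries a simple "Bounding Area," and may carry optional richer shapes; a player renders the richest shape it implements and otherwise falls back to the bounding area. The shape encodings and field names below are illustrative, chosen to match the minimum-variable definitions above.

```python
roi = {
    # Mandatory: a rectangle fully enclosing the object, given by opposite corners.
    "bounding_area": {"corners": ((120, 80), (360, 300))},
    # Optional richer outlines, listed most-precise first.
    "optional_shapes": [
        {"type": "bezier",  "ends": ((120, 190), (360, 190)), "control": (240, 60)},
        {"type": "ellipse", "center": (240, 190), "axes": (120, 110)},
    ],
}

def shape_to_draw(roi, implemented):
    """Pick the first optional shape the player implements, else the bounding area."""
    for shape in roi["optional_shapes"]:
        if shape["type"] in implemented:
            return shape
    return {"type": "rect", **roi["bounding_area"]}

# A simplistic player implementing no optional shapes still has something to draw:
print(shape_to_draw(roi, implemented=set())["type"])  # rect
```

A player implementing ellipses would instead receive the ellipse outline, so precision degrades gracefully with device capability.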
  • A variety of different algorithms for drawing each of these shapes on PCs and embedded processors have been explored to determine the feasibility of including these complex shapes in the metadata structure. Results have shown that even low-end 200 MHz RISC processors are capable of drawing thousands of lines per second without hardware assist, and this result translates immediately to each of the complex shapes that can be drawn as a series of straight lines.
  • This set of one mandatory and multiple optionally implemented data types for ROI definitions allows for a very high degree of portability of the object demarcation system. For example, very simplistic players that cannot implement any of the "higher" drawing primitives can still provide very useful functionality through the "Bounding Area" object, by drawing a simple square or circle in the center of the Bounding Area. By extension, a complex player could utilize any of the further complex shapes to accurately trace the outline of a vehicle and link that object back to the web-site for that car, or, if the device is personalized, to a dealer local to the viewer, as shown for example in FIG. 9.
  • Tags for Rich Applications
  • In various embodiments, to enable the creation of other dynamic and rich applications, a set of identifiers or tags is provided to mark and record the data utilized to implement such applications. The tags provide an indication of the metadata information associated with the tag and thereby allow the playback device to decode and utilize the metadata information through established rules defined for that information. The tags also ease searching and filtering to locate specific metadata information.
  • The following are examples of these tags:
    COMMENT allows the association of a Unicode text string with a media file object, a time-period, or an object in the presentation.
    DISPLAY_ORIGIN indicates a rectangular 4:3 aspect ratio crop to be applied to a 16:9 (or 2.35:1, or other aspect ratio) video when displaying on a 4:3 display.
    DISPLAY_SETTINGS is a data structure that can be used to alter display characteristics during playback.
    DIVX_UID is a data structure that can be used to uniquely identify the file, each video track or each audio track.
    GLOBAL_REFERENCE is a data structure for the recording of GPS (or other) coordinates, height, direction, time, etc.
    OBJECT is a data structure for the description of a non-living entity in a scene.
    RATING is used to indicate the MPAA (or equivalent) rating assigned to an entire title, scene or frame of data. It can also be assigned globally to a track and individually to specific metadata objects; i.e., its purpose is contextual.
    RECORDER_SETTINGS can be used to store values from an electronic recording device such as a video camera, DVR, etc.; typically the data is an EXIF table.
    SIGNIFICANCE is used to indicate the relevance of scenes to the overall story-line of a title.
    VIEWED_SEGMENTS allows the tracking of the time-periods that have been watched; each tag also contains a counter indicating how many times that portion has been watched.
  • In some multimedia formats, tags are descriptions associated with other objects, e.g., editions, chapters, or tracks. As such, they do not have an identity of their own and, therefore, there is no way to independently refer to the value of a tag. In one embodiment, however, a tag can be defined with an identity, e.g., a TagUID, and a reference can be made to an identified tag through its identity, e.g., a TagUIDReference. These extensions allow a single object to be defined once and then referenced as needed from other metadata definitions. The extensions are also useful in that file size can be reduced by using references to handle metadata that is dispersed at multiple points throughout a file. Also, the modification of a multiply-dispersed tag is reduced to a single modification. In one embodiment, where a metadata track incorporates one or more references to an identified tag, the list of such references is placed within the metadata track's header, e.g., inside a TracksTagsList table.
  • In one embodiment, instead of extending the list of defined data types for a file format, existing tag names are extended to include the definition of the actual data type of the value, which is stored inside a structure encoded into the file using one of the format's natively supported data types. For example, when TagName="TITLE", it can be assumed that the data type of the tag value will be a string and that the TagString element is present with a UTF-8 encoding of the content's title. However, the base TagName can be extended by adding a forward-slash "/" and then a set of characters that uniquely specify the data type of the value. For instance, if the cover art for a title is to be stored as metadata, and the cover art is in the JPEG format, then the tag "TITLE" will be extended with "/JPEG" and the TagBinary value will hold the binary data of the image. Also, the extensions used in one embodiment closely match (if not exactly match) the file extensions given to files of the same type. This allows for the development of advanced metadata viewers that can utilize a system's native data type handlers, by invoking the handler based on the TagName extension.
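The TagName extension rule lends itself to a one-line parser. In this sketch, the "STRING" default for a bare name is an illustrative convention standing in for the assumption above that an unextended tag carries a UTF-8 string.

```python
def parse_tag_name(tag_name):
    """Split an extended tag name into (base, data-type suffix).

    'TITLE/JPEG' -> ('TITLE', 'JPEG'); a bare 'TITLE' implies a string value.
    """
    base, sep, ext = tag_name.partition("/")
    return base, (ext if sep else "STRING")

print(parse_tag_name("TITLE"))       # ('TITLE', 'STRING')
print(parse_tag_name("TITLE/JPEG"))  # ('TITLE', 'JPEG')
```

Because the suffix closely matches common file extensions, a viewer could dispatch the value to the platform's native handler for that extension, e.g., handing "/JPEG" values to the system image decoder.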
  • Applications Enabled by Rich Metadata
  • The following descriptions of rich applications provide some exemplary use-cases for the metadata structure. These are a few example applications; many more uses are derivable from the correct and full utilization of the metadata structure provided.
  • Authored Versions
  • The metadata structure allows different versions of content to be authored into a single file, with each version being selectable by the user. This functionality is most analogous to a multi-DVD set containing individual DVDs of the “Studio Cut,” “Director's Cut” and “Unrated” versions of a movie. In such cases, each version of the content contains many scenes in common with the other versions and some scenes that are unique to each version; the difference between each version is only in the set and order of scenes. This feature can also be used by people wishing to make, publish and exchange “community versions” of content—a feature that became very popular with the HD-DVD user-base. Each “community version” could be encoded by a small amount of metadata, rather than megabytes of bit stream data. Such an efficient way of recording each user's version makes the exchange of these versions feasible by email and web-site downloads.
  • Dynamic Versions
  • Dynamic versions of content can be presented by the Media File player based on metadata present in the Media File. Some different ways of creating dynamic versions are listed below. However, each method of creating a dynamic version still requires that the selected version be timed correctly so that a viewer can determine some basic time-related aspects of the version they choose, such as the total playback time for their version and the current playback position in that version.
  • To allow accurate timing information to be generated from a dynamic version many embodiments utilize the following clarifications for time-related variables, which are applicable to the structures of the MKV file format, as well as other file containers that utilize similar file segmentation methodologies.
  • Interpretation of ChapterTime* values, according to the timed-Segment being described:
    none (same Segment): The time-codes define the beginning and end times of this Chapter relative to the start of the highest-priority video track they are associated with. The end time-code must be larger than the beginning.
    external Segment, defined by ChapterSegmentUID [ChapterSegmentChapterUID should not be defined]: The time-codes define the beginning and end times of this Chapter relative to the start of the highest-priority video track they are associated with in the identified Segment. If both time-codes are 0, then the defined Chapter is redirecting to the entire length of the external Segment. Otherwise, the end time-code must be larger than the beginning, and they encode the portion of the Segment to be played.
    external Edition, defined by ChapterSegmentEditionUID [ChapterSegmentUID must also be defined]: The time-codes define the beginning and end times of this Chapter relative to the start of the highest-priority video track they are associated with in the identified Edition of the identified Segment. Both begin and end time-codes are ignored, and the defined Chapter is redirecting to the entire length of the external Edition.
    external Chapter, defined by ChapterSegmentChapterUID [ChapterSegmentUID and ChapterSegmentEditionUID must also be defined]: The time-codes define the beginning and end times of this Chapter relative to the start of the highest-priority video track they are associated with in the identified Chapter of the Edition of the external Segment. If both time-codes are 0, then the defined Chapter is redirecting to the entire length of the external Chapter. Otherwise, the end time-code must be larger than the beginning, and they encode the portion of the Chapter to be played.
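The interpretation rules above can be sketched as a small resolver. The dictionary keys (`segment_uid`, `segment_edition_uid`, `segment_chapter_uid`) are illustrative stand-ins for the corresponding chapter elements, not the actual MKV element names.

```python
def resolve_chapter(ch):
    """Return (target, begin_ms, end_ms) per the ChapterTime* rules above.

    target is one of 'self', 'segment', 'edition', 'chapter'; a (None, None)
    time pair means "play the entire length of the target".
    """
    begin, end = ch["time_begin"], ch["time_end"]
    if "segment_chapter_uid" in ch:        # external Chapter (all UIDs defined)
        if begin == 0 and end == 0:
            return ("chapter", None, None)  # whole external Chapter
        assert end > begin
        return ("chapter", begin, end)      # portion of the Chapter
    if "segment_edition_uid" in ch:        # external Edition
        return ("edition", None, None)      # time-codes are ignored
    if "segment_uid" in ch:                # external Segment
        if begin == 0 and end == 0:
            return ("segment", None, None)  # whole external Segment
        assert end > begin
        return ("segment", begin, end)      # portion of the Segment
    assert end > begin                      # same-Segment chapter
    return ("self", begin, end)
```

The checks run from the most specific identifier (Chapter) down to none, mirroring the rows of the table.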
  • In another embodiment, as shown in FIG. 10, playback could also be controlled by metadata track entries. For example, where the “SIGNIFICANCE” tag has been used to mark the importance of each scene in a movie, the movie can be “intelligently” reduced in total duration by displaying only the most important scenes. Hence, a single encoding of a 3-hour movie could be viewed within 2.5 hours or 1.5 hours, depending on the amount of time the viewer has to watch the movie (for example, during a short flight). Similarly, the metadata “RATING” tag could be used to store the age-appropriateness of each scene in the content, and only the scenes at or below a certain appropriateness would be shown (a feature that would allow many families to watch a new movie together). Practically any metadata tag could be used to view “dynamic versions” of one or more pieces of content. The filter for the dynamic version (or dynamic mash-up) could be based on whether a scene contains the viewer's favorite actor, the GPS location of the shoot, etc.
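A time-budgeted version driven by a per-scene "SIGNIFICANCE" tag could be selected as sketched below. The greedy selection strategy and the scene record fields are illustrative assumptions; the patent does not prescribe a particular selection algorithm.

```python
def dynamic_version(scenes, target_minutes):
    """Keep the most significant scenes, in original playback order,
    whose total duration fits the viewer's available time."""
    chosen, total = set(), 0
    # Consider scenes from most to least significant.
    for idx, scene in sorted(enumerate(scenes),
                             key=lambda pair: -pair[1]["significance"]):
        if total + scene["minutes"] <= target_minutes:
            chosen.add(idx)
            total += scene["minutes"]
    # Re-emit the kept scenes in their original order.
    return [s for i, s in enumerate(scenes) if i in chosen]
```

The same filter shape applies to a "RATING" tag (keep scenes at or below a threshold) or to any other per-scene tag.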
  • Playback Redirection
  • Playback redirection allows different “content collections” to be created from a set of content. For example, if a user has home videos titled “Vacation 2006,” “Vacation 2007” and “Vacation 2008,” and each of these has chapters titled “Day 1,” “Day 2,” etc., then a single “Day 1” collection could be created that redirects playback to each “Day 1” chapter of each year's vacation video. This redirection file would be small, and if another collection, say “Day 2,” were required, it could be created without re-encoding the original titles. Another example is to use redirection to recreate the DVD experience of trailers, movie and extras. Yet another example is to use redirection to link to adverts that are set dynamically by a service provider.
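A redirection "collection" of the kind described above might look like the sketch below. The JSON serialization and the `segment_guid`/`chapter` field names are illustrative assumptions; the point is only that a collection is a handful of references, not re-encoded content.

```python
import json

def build_collection(title, segment_guids, chapter_name):
    """Build a redirection collection pointing at one named chapter in
    each source file -- the original titles are never re-encoded."""
    return {"title": title,
            "entries": [{"segment_guid": g, "chapter": chapter_name}
                        for g in segment_guids]}

day1 = build_collection("Day 1", ["vac-2006", "vac-2007", "vac-2008"], "Day 1")
payload = json.dumps(day1)  # small enough to share by email or download
```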
  • User-Entered Comments
  • User-entered comments in various embodiments could be stored in a metadata track unique to a user. Various properties of the user could also be entered as metadata associated with that user's metadata track header, such as the user's age, GPS location, native language, etc., as well as properties related to the comments, such as age-appropriateness, language, etc. These metadata values could then be used as the basis of filters at playback time to ensure that the viewer only viewed comments in languages they've enabled, and only if the comments' age-appropriate flags are less than or equal to the viewer's age, etc.
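A playback-time filter over per-user comment tracks could be sketched as follows; the `language` and `age_rating` field names stand in for the track-header and comment properties described above and are illustrative assumptions.

```python
def visible_comments(tracks, viewer_age, enabled_languages):
    """Return comment texts the viewer is allowed to see: the track's
    language must be enabled, and each comment's age-appropriateness
    flag must be at or below the viewer's age."""
    out = []
    for track in tracks:
        header = track["header"]
        if header["language"] not in enabled_languages:
            continue  # viewer has not enabled this language
        for comment in track["comments"]:
            if comment.get("age_rating", 0) <= viewer_age:
                out.append(comment["text"])
    return out
```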
  • Advanced Media Management & Search
  • Media managers usually collate metadata about the content they manage and store that information in their own databases. However, this can lead to a non-portable experience for users who increasingly move their content between devices. The metadata structure provided lends itself well to keeping the user's experience portable by allowing incorporation of experience data [such as “COMMENT” and “VIEWED_SEGMENTS”] into the file. Furthermore, given its ease of implementation, even lower-end devices should be able to update the appropriate metadata fields in order to maintain experiential data.
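The kind of experiential update a lower-end device would need to perform is modest; for instance, maintaining a "VIEWED_SEGMENTS"-style field might amount to an interval merge like the sketch below. The interval representation is an illustrative assumption, not a format defined by the patent.

```python
def record_viewed(viewed, start_ms, end_ms):
    """Add a newly watched (start, end) interval to a list of viewed
    intervals, merging any overlapping or touching spans."""
    intervals = sorted(viewed + [(start_ms, end_ms)])
    merged = [intervals[0]]
    for start, end in intervals[1:]:
        last_start, last_end = merged[-1]
        if start <= last_end:                       # overlap: extend
            merged[-1] = (last_start, max(last_end, end))
        else:                                       # gap: new interval
            merged.append((start, end))
    return merged
```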
  • Launching Other Applications
  • One or more applications can be launched or started from the identification or decoding of metadata information. The metadata information can specify a specific application, or a file type handled by a specific or default application, that is launched upon activation and decoding of the metadata information, such as via the data-type extension of a tag name. In one embodiment, a playback device can be more compact and/or less complex because the application and other similar applications need not be integrated into the playback device.
  • Integrated Advertising
  • Referring back to FIG. 9, embedding advertising information into media files can also be provided, where the advertising is related to objects within the video track, or to words in the audio or subtitle track. This placement of the adverts could become the payment method for the content, assuming that viewers are required to view the “advertising” metadata track. As shown for example in FIG. 9, three brands are outlined, highlighted and annotated with ROI metadata that provides direct links to those brands' Internet properties.
  • Creating Media Files Including Metadata Tracks
  • Referring now to FIG. 11, a method of creating a media file including multimedia content and associated metadata information is provided. Utilizing a metadata source 111, metadata objects are extracted 112 to generate a metadata table 113. In one embodiment, the metadata source is provided via a user interface such as a video authoring application or system. In other embodiments, the metadata source is provided through a formatted file, such as an XML file. The metadata objects extracted from the source are those that are instantiated multiple times and/or referenced multiple times. The metadata table 113 stores a single copy of each such metadata object and associates the copy with a universal identifier.
  • Global metadata objects 115 are also extracted 114 from the metadata source 111. The global metadata objects describe general or global metadata entities, such as an entire file, title, chapter, etc. Utilizing the metadata table 113, the metadata track header 117 is created or populated 116. In one particular embodiment, the metadata track includes a list of universal identifiers. The universal identifiers correspond to the associated metadata objects that will be called for in each metadata track. The metadata track(s) are prepared for multiplexing with audio, video and subtitle bit streams 118. The metadata track(s) 119 include the universal identifier along with the associated metadata object.
  • Each metadata track is coupled with a global universal identifier 110 and the content source 120 to create a target or complete media file 121. In one embodiment, the global universal identifier is written first in the complete media file 122. Metadata objects and content elements follow 122. As previously noted, to maintain at least portability, the container's natively supported metadata tags are utilized. The media file 121 is stored for later access based on a user request, and/or sent immediately or shortly thereafter to satisfy a previous or pending user request.
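The extraction step of FIG. 11 — storing one copy of each repeated metadata object under a universal identifier that tracks then reference — can be sketched as below. Using a content hash as the UID is an illustrative assumption; the patent does not specify how identifiers are generated.

```python
import hashlib
import json

def build_metadata_table(objects):
    """Deduplicate metadata objects into a table keyed by UID.

    Returns (table, uids): table maps each UID to a single stored copy,
    and uids lists, in order, the UID each input object resolves to, so
    a metadata track can reference UIDs instead of repeating objects.
    """
    table, uids = {}, []
    for obj in objects:
        # Canonical serialization so equal objects hash identically.
        key = hashlib.sha1(
            json.dumps(obj, sort_keys=True).encode("utf-8")).hexdigest()[:8]
        table.setdefault(key, obj)   # store a single copy per UID
        uids.append(key)
    return table, uids
```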
  • Decoding Media Files Including Metadata Tracks
  • In FIG. 12, a method of decoding the media file including multimedia content and associated metadata information is shown. The media file 130 includes content and metadata information. Metadata information in one embodiment includes global metadata objects 131, a metadata track header 132, metadata table(s) 133 and/or a metadata track 134. From the metadata information, the appropriate universal identifiers and metadata object(s) are extracted so they can be rendered at the appropriate time 135. In one embodiment, global tags are read from the global metadata objects, and the metadata objects and universal identifiers 140 are extracted based on the title to be played 135.
  • The metadata track header 132 is read for each metadata track to be rendered 136. In one embodiment, a list of universal identifiers 141 is extracted. Similarly, for each universal identifier requiring evaluation, the associated metadata object 142 is read 137 from the metadata table 133 to generate one or more metadata objects. Playback of the content is started 138 and additional metadata objects 143 are extracted. The metadata objects 142 and 143 are rendered and/or displayed 139. In one embodiment, the displayed metadata objects are triggered or result from user interaction with the displayed content via a user playback interface. Also, based on the time and position of the main video track, associated metadata objects relative to the main video can also be displayed. The process continues until the user stops playback or otherwise terminates the playback and/or decoding of the media file.
  • Referring again to FIG. 11, the global universal identifier 110 may also be utilized in the decoding or playback process of FIG. 12. The GUID, for example, is used to locate metadata objects found in another media file. Because filename conventions vary widely, the GUID provides a constant, reliable indicator for locating the desired metadata object. In one embodiment, if the metadata track references the GUID of another media file, the playback device or engine searches its local content library for the referenced media file. If the referenced media file is not found, the playback device can request the file from a media server.
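The GUID lookup with server fallback described above amounts to a two-step resolution; the library and server interfaces below are hypothetical stand-ins for whatever storage and network layers a playback device provides.

```python
def resolve_media(guid, local_library, media_server):
    """Return the media file for a GUID, preferring local content.

    local_library: mapping from GUID to locally stored media.
    media_server: object with a fetch(guid) method for remote requests.
    """
    media = local_library.get(guid)
    if media is not None:
        return media                    # found in the local content library
    return media_server.fetch(guid)     # otherwise request from the server
```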
  • While the above description contains many specific embodiments of the invention, these should not be construed as limitations on the scope of the invention, but rather as an example of one embodiment thereof. Accordingly, the scope of the invention should be determined not by the embodiments illustrated, but by the appended claims and their equivalents.

Claims (28)

1. A method of playing back metadata content stored in a media file, comprising:
providing a media file to a playback device, the media file having at least one metadata object and an association with content data, the metadata object referencing at least one facet of the content data;
decoding the content data by the playback device;
displaying content on a display screen from the decoded content data; and
decoding the at least one metadata object based on the displayed content by the playback device.
2. The method of claim 1 wherein the at least one metadata object has at least one metadata tag associated with at least one metadata value.
3. The method of claim 2 wherein the media file further comprises at least one metadata table, the at least one metadata table having at least one identifier and at least one metadata object.
4. The method of claim 3 wherein the at least one metadata table is positioned at the end of the media file.
5. The method of claim 3 wherein the at least one metadata table includes a reserved space allowing the at least one metadata table to grow without having to re-calculate and re-write all the elements of the file.
6. The method of claim 1 wherein the media file further comprises at least one metadata track incorporated throughout the content data and the track includes a metadata object, or a reference to a metadata object in a metadata table.
7. The method of claim 2 wherein the media file further comprises a global identifier that is static for the media file.
8. The method of claim 2 wherein the content data comprises video, audio, or subtitle frames and the at least one metadata tag references at least one facet of at least one video, audio or subtitle frame and further comprising rendering text using a single embedded compacted font to match text for subtitle frames or the at least one metadata tag.
9. The method of claim 2 wherein the at least one metadata tag includes an extension providing a further description of the tag.
10. The method of claim 6 wherein the at least one metadata track is associated with at least one content track and at least one metadata track header.
11. The method of claim 10 further comprising decoding the at least one metadata track based on a user playback instruction.
12. The method of claim 2 further comprising assigning the at least one metadata tag to a portion of the content data by associating at least one identifier to the portion of the content data and the at least one metadata tag.
13. The method of claim 1 wherein the at least one metadata object describes a region of interest relative to the displayed content.
14. The method of claim 2 wherein the at least one metadata value refers to remote content and further comprising converting the remote content to a localized content.
15. The method of claim 2 further comprising launching other applications based on the decoded metadata object.
16. The method of claim 2 further comprising extracting the at least one metadata object prior to displaying content and controlling the displaying of content based on the extracted at least one metadata object.
17. A system for playback of a media file, comprising:
a media server configured to locate media files, each media file having an immutable global identifier; and
a client processor in network communication with the media server and configured to send requests for a media file to the media server, the media server configured to locate and transmit the requested media file based on the global identifier and the client processor further comprises a playback engine configured to decode a metadata track within the transmitted media file, the metadata track referring to content data in the transmitted media file.
18. The system of claim 17 wherein the playback engine decodes the metadata track by accessing a metadata table having an identifier and metadata objects.
19. The system of claim 17 wherein the playback engine displays metadata information based on the decoded metadata track.
20. A method of creating a media file having metadata information, the method comprising:
supplying a source of metadata information to an encoder;
supplying a source of content to an encoder;
generating a metadata object from the supplied metadata information by the encoder, the generated metadata object referencing at least one portion of the supplied content; and
integrating the metadata object with the supplied content to form a media file by the encoder.
21. The method of claim 20 wherein the content comprises video, audio and subtitle information.
22. The method of claim 20 further comprising generating a metadata table referencing the metadata object generated from the supplied metadata information.
23. The method of claim 20 wherein the metadata object comprises a presentation time and a duration.
24. The method of claim 20 further comprising extracting multiple referenced metadata objects within the metadata information source and storing a copy of the referenced metadata object in a metadata table.
25. The method of claim 24 further comprising extracting global metadata objects within the metadata information source and storing a copy of each global metadata object along with an associated identifier.
26. The method of claim 25 further comprising generating a metadata track and metadata track header including the associated identifiers.
27. The method of claim 20 further comprising integrating the metadata table into the media file.
28. The method of claim 27 further comprising integrating a global unique identifier into the media file.
US12/480,251 2008-06-06 2009-06-08 Multimedia distribution and playback systems and methods using enhanced metadata structures Abandoned US20090307258A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/480,251 US20090307258A1 (en) 2008-06-06 2009-06-08 Multimedia distribution and playback systems and methods using enhanced metadata structures

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US5954708P 2008-06-06 2008-06-06
US10947608P 2008-10-29 2008-10-29
US12/480,251 US20090307258A1 (en) 2008-06-06 2009-06-08 Multimedia distribution and playback systems and methods using enhanced metadata structures

Publications (1)

Publication Number Publication Date
US20090307258A1 true US20090307258A1 (en) 2009-12-10

Family

ID=41398577

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/480,276 Active 2031-08-17 US8289338B2 (en) 2008-06-06 2009-06-08 Systems and methods for font file optimization for multimedia files
US12/480,251 Abandoned US20090307258A1 (en) 2008-06-06 2009-06-08 Multimedia distribution and playback systems and methods using enhanced metadata structures


Country Status (6)

Country Link
US (2) US8289338B2 (en)
EP (2) EP2257931A4 (en)
JP (1) JP2011523309A (en)
KR (2) KR20110014995A (en)
CN (1) CN102037494A (en)
WO (2) WO2009149442A1 (en)



Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5434678A (en) * 1993-01-11 1995-07-18 Abecassis; Max Seamless transmission of non-sequential video segments
US6912726B1 (en) * 1997-04-02 2005-06-28 International Business Machines Corporation Method and apparatus for integrating hyperlinks in video
US20080092168A1 (en) * 1999-03-29 2008-04-17 Logan James D Audio and video program recording, editing and playback systems using metadata
US7024424B1 (en) * 2001-05-30 2006-04-04 Microsoft Corporation Auto playlist generator
US20070208771A1 (en) * 2002-05-30 2007-09-06 Microsoft Corporation Auto playlist generation with multiple seed songs
US20040267390A1 (en) * 2003-01-02 2004-12-30 Yaacov Ben-Yaacov Portable music player and transmitter
US7191193B2 (en) * 2003-01-02 2007-03-13 Catch Media Automatic digital music library builder
US20050080743A1 (en) * 2003-10-08 2005-04-14 Ostrover Lewis S. Electronic media player with metadata based control and method of operating the same
US20070192352A1 (en) * 2005-12-21 2007-08-16 Levy Kenneth L Content Metadata Directory Services
US7462772B2 (en) * 2006-01-13 2008-12-09 Salter Hal C Music composition system and method
US20080066100A1 (en) * 2006-09-11 2008-03-13 Apple Computer, Inc. Enhancing media system metadata
US7865927B2 (en) * 2006-09-11 2011-01-04 Apple Inc. Enhancing media system metadata
US20080077952A1 (en) * 2006-09-25 2008-03-27 St Jean Randy Dynamic Association of Advertisements and Digital Video Content, and Overlay of Advertisements on Content
US20090047993A1 (en) * 2007-08-14 2009-02-19 Vasa Yojak H Method of using music metadata to save music listening preferences
US20090044686A1 (en) * 2007-08-14 2009-02-19 Vasa Yojak H System and method of using metadata to incorporate music into non-music applications
US20090297118A1 (en) * 2008-06-03 2009-12-03 Google Inc. Web-based system for generation of interactive games based on digital videos
US20100111360A1 (en) * 2008-10-30 2010-05-06 Frederic Sigal Method of providing a frame-based object redirection overlay for a video stream
US8254671B1 (en) * 2009-05-14 2012-08-28 Adobe Systems Incorporated System and method for shot boundary detection in video clips
US20110016120A1 (en) * 2009-07-15 2011-01-20 Apple Inc. Performance metadata for media

Cited By (145)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11735227B2 (en) 2003-12-08 2023-08-22 Divx, Llc Multimedia distribution system
US11159746B2 (en) 2003-12-08 2021-10-26 Divx, Llc Multimedia distribution system for multimedia files with packed frames
US11297263B2 (en) 2003-12-08 2022-04-05 Divx, Llc Multimedia distribution system for multimedia files with packed frames
US11017816B2 (en) 2003-12-08 2021-05-25 Divx, Llc Multimedia distribution system
US11355159B2 (en) 2003-12-08 2022-06-07 Divx, Llc Multimedia distribution system
US10032485B2 (en) 2003-12-08 2018-07-24 Divx, Llc Multimedia distribution system
US11012641B2 (en) 2003-12-08 2021-05-18 Divx, Llc Multimedia distribution system for multimedia files with interleaved media chunks of varying types
US9369687B2 (en) 2003-12-08 2016-06-14 Sonic Ip, Inc. Multimedia distribution system for multimedia files with interleaved media chunks of varying types
US10257443B2 (en) 2003-12-08 2019-04-09 Divx, Llc Multimedia distribution system for multimedia files with interleaved media chunks of varying types
US11509839B2 (en) 2003-12-08 2022-11-22 Divx, Llc Multimedia distribution system for multimedia files with packed frames
US11735228B2 (en) 2003-12-08 2023-08-22 Divx, Llc Multimedia distribution system
US11886545B2 (en) 2006-03-14 2024-01-30 Divx, Llc Federated digital rights management scheme including trusted systems
US10878065B2 (en) 2006-03-14 2020-12-29 Divx, Llc Federated digital rights management scheme including trusted systems
US9798863B2 (en) 2006-03-14 2017-10-24 Sonic Ip, Inc. Federated digital rights management scheme including trusted systems
US9184920B2 (en) 2006-03-14 2015-11-10 Sonic Ip, Inc. Federated digital rights management scheme including trusted systems
US11495266B2 (en) 2007-11-16 2022-11-08 Divx, Llc Systems and methods for playing back multimedia files incorporating reduced index structures
US10902883B2 (en) 2007-11-16 2021-01-26 Divx, Llc Systems and methods for playing back multimedia files incorporating reduced index structures
US8997161B2 (en) 2008-01-02 2015-03-31 Sonic Ip, Inc. Application enhancement tracks
US8782713B2 (en) 2008-09-16 2014-07-15 Intel Corporation Systems and methods for encoding multimedia content
US20100158099A1 (en) * 2008-09-16 2010-06-24 Realnetworks, Inc. Systems and methods for video/multimedia rendering, composition, and user interactivity
US9235917B2 (en) 2008-09-16 2016-01-12 Intel Corporation Systems and methods for video/multimedia rendering, composition, and user-interactivity
US10210907B2 (en) 2008-09-16 2019-02-19 Intel Corporation Systems and methods for adding content to video/multimedia based on metadata
US9870801B2 (en) 2008-09-16 2018-01-16 Intel Corporation Systems and methods for encoding multimedia content
US8948250B2 (en) 2008-09-16 2015-02-03 Intel Corporation Systems and methods for video/multimedia rendering, composition, and user-interactivity
US8363716B2 (en) * 2008-09-16 2013-01-29 Intel Corporation Systems and methods for video/multimedia rendering, composition, and user interactivity
US20100161562A1 (en) * 2008-12-12 2010-06-24 Sachin Karajagi Method and apparatus for preventing duplicate or redundant data from storing in media player
US10437896B2 (en) 2009-01-07 2019-10-08 Divx, Llc Singular, collective, and automated creation of a media guide for online content
US9201922B2 (en) 2009-01-07 2015-12-01 Sonic Ip, Inc. Singular, collective and automated creation of a media guide for online content
US9672286B2 (en) 2009-01-07 2017-06-06 Sonic Ip, Inc. Singular, collective and automated creation of a media guide for online content
US20100241668A1 (en) * 2009-03-17 2010-09-23 Microsoft Corporation Local Computer Account Management at Domain Level
US8209598B1 (en) * 2009-08-24 2012-06-26 Adobe Systems Incorporated Exporting electronic documents from rich internet applications
US11102553B2 (en) 2009-12-04 2021-08-24 Divx, Llc Systems and methods for secure playback of encrypted elementary bitstreams
US9706259B2 (en) 2009-12-04 2017-07-11 Sonic Ip, Inc. Elementary bitstream cryptographic material transport systems and methods
US10484749B2 (en) 2009-12-04 2019-11-19 Divx, Llc Systems and methods for secure playback of encrypted elementary bitstreams
US10212486B2 (en) 2009-12-04 2019-02-19 Divx, Llc Elementary bitstream cryptographic material transport systems and methods
US9124773B2 (en) 2009-12-04 2015-09-01 Sonic Ip, Inc. Elementary bitstream cryptographic material transport systems and methods
US20110179060A1 (en) * 2010-01-19 2011-07-21 Microsoft Corporation Automatic Context Discovery
US20110179061A1 (en) * 2010-01-19 2011-07-21 Microsoft Corporation Extraction and Publication of Reusable Organizational Knowledge
US20110179045A1 (en) * 2010-01-19 2011-07-21 Microsoft Corporation Template-Based Management and Organization of Events and Projects
US20110179049A1 (en) * 2010-01-19 2011-07-21 Microsoft Corporation Automatic Aggregation Across Data Stores and Content Types
WO2011090882A3 (en) * 2010-01-19 2011-11-17 Microsoft Corporation Extraction and publication of reusable organizational knowledge
US20110209045A1 (en) * 2010-02-23 2011-08-25 Microsoft Corporation Web-Based Visual Representation of a Structured Data Solution
US9852384B2 (en) 2010-02-23 2017-12-26 Microsoft Technology Licensing, Llc Web-based visual representation of a structured data solution
US10699312B2 (en) 2010-08-31 2020-06-30 Cbs Interactive Inc. Platform for serving online content
US9953349B2 (en) * 2010-08-31 2018-04-24 Cbs Interactive Inc. Platform for serving online content
US20150112820A1 (en) * 2010-08-31 2015-04-23 Cbs Interactive Inc. Platform for serving online content
US11638033B2 (en) 2011-01-05 2023-04-25 Divx, Llc Systems and methods for performing adaptive bitrate streaming
US8649669B2 (en) * 2011-01-05 2014-02-11 Sonic Ip, Inc. Systems and methods for performing smooth visual search of media encoded for adaptive bitrate streaming via hypertext transfer protocol using trick play streams
US9247312B2 (en) 2011-01-05 2016-01-26 Sonic Ip, Inc. Systems and methods for encoding source media in matroska container files for adaptive bitrate streaming using hypertext transfer protocol
US9025659B2 (en) 2011-01-05 2015-05-05 Sonic Ip, Inc. Systems and methods for encoding media including subtitles for adaptive bitrate streaming
US20120170915A1 (en) * 2011-01-05 2012-07-05 Rovi Technologies Corporation Systems and methods for performing smooth visual search of media encoded for adaptive bitrate streaming via hypertext transfer protocol using trick play streams
US10382785B2 (en) 2011-01-05 2019-08-13 Divx, Llc Systems and methods of encoding trick play streams for use in adaptive streaming
US10368096B2 (en) 2011-01-05 2019-07-30 Divx, Llc Adaptive streaming systems and methods for performing trick play
US9210481B2 (en) 2011-01-05 2015-12-08 Sonic Ip, Inc. Systems and methods for performing smooth visual search of media encoded for adaptive bitrate streaming via hypertext transfer protocol using trick play streams
US8914534B2 (en) 2011-01-05 2014-12-16 Sonic Ip, Inc. Systems and methods for adaptive bitrate streaming of media stored in matroska container files using hypertext transfer protocol
WO2012094181A2 (en) * 2011-01-05 2012-07-12 Divx, Llc. Systems and methods for adaptive bitrate streaming of media including subtitles
WO2012094181A3 (en) * 2011-01-05 2014-04-10 Divx, Llc. Systems and methods for adaptive bitrate streaming of media including subtitles
US8977986B2 (en) 2011-01-05 2015-03-10 Advanced Micro Devices, Inc. Control panel and ring interface for computing systems
US9883204B2 (en) 2011-01-05 2018-01-30 Sonic Ip, Inc. Systems and methods for encoding source media in matroska container files for adaptive bitrate streaming using hypertext transfer protocol
US20120311445A1 (en) * 2011-06-05 2012-12-06 Museami, Inc. Enhanced media recordings and playback
US11611785B2 (en) 2011-08-30 2023-03-21 Divx, Llc Systems and methods for encoding and streaming video encoded using a plurality of maximum bitrate levels
US10931982B2 (en) 2011-08-30 2021-02-23 Divx, Llc Systems and methods for encoding and streaming video encoded using a plurality of maximum bitrate levels
US11457054B2 (en) 2011-08-30 2022-09-27 Divx, Llc Selection of resolutions for seamless resolution switching of multimedia content
US10708587B2 (en) 2011-08-30 2020-07-07 Divx, Llc Systems and methods for encoding alternative streams of video for playback on playback devices having predetermined display aspect ratios and network connection maximum data rates
US10856020B2 (en) 2011-09-01 2020-12-01 Divx, Llc Systems and methods for distributing content using a common set of encryption keys
US10244272B2 (en) 2011-09-01 2019-03-26 Divx, Llc Systems and methods for playing back alternative streams of protected content protected using common cryptographic information
US9247311B2 (en) 2011-09-01 2016-01-26 Sonic Ip, Inc. Systems and methods for playing back alternative streams of protected content protected using common cryptographic information
US10341698B2 (en) 2011-09-01 2019-07-02 Divx, Llc Systems and methods for distributing content using a common set of encryption keys
US11178435B2 (en) 2011-09-01 2021-11-16 Divx, Llc Systems and methods for saving encoded media streamed using adaptive bitrate streaming
US8918636B2 (en) 2011-09-01 2014-12-23 Sonic Ip, Inc. Systems and methods for protecting alternative streams in adaptive bitrate streaming systems
US10687095B2 (en) 2011-09-01 2020-06-16 Divx, Llc Systems and methods for saving encoded media streamed using adaptive bitrate streaming
US11683542B2 (en) 2011-09-01 2023-06-20 Divx, Llc Systems and methods for distributing content using a common set of encryption keys
US10225588B2 (en) 2011-09-01 2019-03-05 Divx, Llc Playback devices and methods for playing back alternative streams of content protected using a common set of cryptographic keys
US8909922B2 (en) 2011-09-01 2014-12-09 Sonic Ip, Inc. Systems and methods for playing back alternative streams of protected content protected using common cryptographic information
US9621522B2 (en) 2011-09-01 2017-04-11 Sonic Ip, Inc. Systems and methods for playing back alternative streams of protected content protected using common cryptographic information
US8856218B1 (en) * 2011-12-13 2014-10-07 Google Inc. Modified media download with index adjustment
US8918908B2 (en) 2012-01-06 2014-12-23 Sonic Ip, Inc. Systems and methods for accessing digital content using electronic tickets and ticket tokens
US10289811B2 (en) 2012-01-06 2019-05-14 Divx, Llc Systems and methods for enabling playback of digital content using status associable electronic tickets and ticket tokens representing grant of access rights
US11526582B2 (en) 2012-01-06 2022-12-13 Divx, Llc Systems and methods for enabling playback of digital content using status associable electronic tickets and ticket tokens representing grant of access rights
US9626490B2 (en) 2012-01-06 2017-04-18 Sonic Ip, Inc. Systems and methods for enabling playback of digital content using electronic tickets and ticket tokens representing grant of access rights
US9224246B2 (en) * 2012-03-23 2015-12-29 Samsung Electronics Co., Ltd. Method and apparatus for processing media file for augmented reality service
US20130249900A1 (en) * 2012-03-23 2013-09-26 Kyonggi University Industry & Academia Cooperation Foundation Method and apparatus for processing media file for augmented reality service
US20130282715A1 (en) * 2012-04-20 2013-10-24 Samsung Electronics Co., Ltd. Method and apparatus of providing media file for augmented reality service
US9872069B1 (en) * 2012-06-21 2018-01-16 Google Llc Goal-based video analytics
US9197685B2 (en) 2012-06-28 2015-11-24 Sonic Ip, Inc. Systems and methods for fast video startup using trick play streams
US9143812B2 (en) 2012-06-29 2015-09-22 Sonic Ip, Inc. Adaptive streaming of multimedia
US10452715B2 (en) 2012-06-30 2019-10-22 Divx, Llc Systems and methods for compressing geotagged video
US10591984B2 (en) 2012-07-18 2020-03-17 Verimatrix, Inc. Systems and methods for rapid content switching to provide a linear TV experience using streaming content distribution
US20140072223A1 (en) * 2012-09-13 2014-03-13 Koepics, Sl Embedding Media Content Within Image Files And Presenting Embedded Media In Conjunction With An Associated Image
US8914836B2 (en) 2012-09-28 2014-12-16 Sonic Ip, Inc. Systems, methods, and computer program products for load adaptive streaming
US8997254B2 (en) 2012-09-28 2015-03-31 Sonic Ip, Inc. Systems and methods for fast startup streaming of encrypted multimedia content
US20140181158A1 (en) * 2012-12-21 2014-06-26 William Herz Media file system with associated metadata
US9313510B2 (en) 2012-12-31 2016-04-12 Sonic Ip, Inc. Use of objective quality measures of streamed content to reduce streaming bandwidth
US9264475B2 (en) 2012-12-31 2016-02-16 Sonic Ip, Inc. Use of objective quality measures of streamed content to reduce streaming bandwidth
US9191457B2 (en) 2012-12-31 2015-11-17 Sonic Ip, Inc. Systems, methods, and media for controlling delivery of content
US10225299B2 (en) 2012-12-31 2019-03-05 Divx, Llc Systems, methods, and media for controlling delivery of content
US10805368B2 (en) 2012-12-31 2020-10-13 Divx, Llc Systems, methods, and media for controlling delivery of content
USRE48761E1 (en) 2012-12-31 2021-09-28 Divx, Llc Use of objective quality measures of streamed content to reduce streaming bandwidth
US11438394B2 (en) 2012-12-31 2022-09-06 Divx, Llc Systems, methods, and media for controlling delivery of content
US11785066B2 (en) 2012-12-31 2023-10-10 Divx, Llc Systems, methods, and media for controlling delivery of content
US20140244600A1 (en) * 2013-02-25 2014-08-28 Apple Inc Managing duplicate media items
US10264255B2 (en) 2013-03-15 2019-04-16 Divx, Llc Systems, methods, and media for transcoding video data
US10397292B2 (en) 2013-03-15 2019-08-27 Divx, Llc Systems, methods, and media for delivery of content
US10715806B2 (en) 2013-03-15 2020-07-14 Divx, Llc Systems, methods, and media for transcoding video data
US11849112B2 (en) 2013-03-15 2023-12-19 Divx, Llc Systems, methods, and media for distributed transcoding video data
US9906785B2 (en) 2013-03-15 2018-02-27 Sonic Ip, Inc. Systems, methods, and media for transcoding video data according to encoding parameters indicated by received metadata
US9344517B2 (en) 2013-03-28 2016-05-17 Sonic Ip, Inc. Downloading and adaptive streaming of multimedia content to a device with cache assist
US20140344730A1 (en) * 2013-05-15 2014-11-20 Samsung Electronics Co., Ltd. Method and apparatus for reproducing content
US9094737B2 (en) 2013-05-30 2015-07-28 Sonic Ip, Inc. Network video streaming with trick play based on separate trick play files
US11470405B2 (en) 2013-05-30 2022-10-11 Divx, Llc Network video streaming with trick play based on separate trick play files
US9712890B2 (en) 2013-05-30 2017-07-18 Sonic Ip, Inc. Network video streaming with trick play based on separate trick play files
US10462537B2 (en) 2013-05-30 2019-10-29 Divx, Llc Network video streaming with trick play based on separate trick play files
US9247317B2 (en) 2013-05-30 2016-01-26 Sonic Ip, Inc. Content streaming with client device trick play index
US9967305B2 (en) 2013-06-28 2018-05-08 Divx, Llc Systems, methods, and media for streaming media content
US9343112B2 (en) 2013-10-31 2016-05-17 Sonic Ip, Inc. Systems and methods for supplementing content from a server
KR102149530B1 (en) 2014-01-10 2020-09-01 한국전자통신연구원 Method and apparatus to provide language translation service for mpeg user description
KR20150083971A (en) * 2014-01-10 2015-07-21 한국전자통신연구원 Method and apparatus to provide language translation service for mpeg user description
US20150199335A1 (en) * 2014-01-10 2015-07-16 Electronics And Telecommunications Research Institute Method and apparatus for representing user language characteristics in mpeg user description system
US11711552B2 (en) 2014-04-05 2023-07-25 Divx, Llc Systems and methods for encoding and playing back video at different frame rates using enhancement layers
US10321168B2 (en) 2014-04-05 2019-06-11 Divx, Llc Systems and methods for encoding and playing back video at different frame rates using enhancement layers
US10893305B2 (en) 2014-04-05 2021-01-12 Divx, Llc Systems and methods for encoding and playing back video at different frame rates using enhancement layers
US9866878B2 (en) 2014-04-05 2018-01-09 Sonic Ip, Inc. Systems and methods for encoding and playing back video at different frame rates using enhancement layers
US20230067389A1 (en) * 2014-09-12 2023-03-02 Sony Group Corporation Transmission device, transmission method, reception device, and a reception method
US20160150294A1 (en) * 2014-11-20 2016-05-26 Adobe Systems Incorporated Video Content Metadata for Enhanced Video Experiences
CN105893458A (en) * 2015-02-12 2016-08-24 哈曼国际工业有限公司 Media content playback system and method
US20160239508A1 (en) * 2015-02-12 2016-08-18 Harman International Industries, Incorporated Media content playback system and method
US9794618B2 (en) 2015-02-12 2017-10-17 Harman International Industries, Incorporated Media content playback system and method
US9860658B2 (en) 2015-02-12 2018-01-02 Harman International Industries, Incorporated Media content playback system and method
US20170257664A1 (en) * 2016-02-02 2017-09-07 Telefonaktiebolaget Lm Ericsson (Publ) Method, electronic device, and system for immersive experience
US10452874B2 (en) * 2016-03-04 2019-10-22 Disney Enterprises, Inc. System and method for identifying and tagging assets within an AV file
US20170256289A1 (en) * 2016-03-04 2017-09-07 Disney Enterprises, Inc. Systems and methods for automating identification and display of video data sets
US10915715B2 (en) 2016-03-04 2021-02-09 Disney Enterprises, Inc. System and method for identifying and tagging assets within an AV file
US10721285B2 (en) 2016-03-30 2020-07-21 Divx, Llc Systems and methods for quick start-up of playback
US11483609B2 (en) 2016-06-15 2022-10-25 Divx, Llc Systems and methods for encoding video content
US10148989B2 (en) 2016-06-15 2018-12-04 Divx, Llc Systems and methods for encoding video content
US10595070B2 (en) 2016-06-15 2020-03-17 Divx, Llc Systems and methods for encoding video content
US11729451B2 (en) 2016-06-15 2023-08-15 Divx, Llc Systems and methods for encoding video content
US20180225289A1 (en) * 2017-02-06 2018-08-09 Autochips Inc. Audio/video file playback method and audio/video file playback apparatus
US11343300B2 (en) 2017-02-17 2022-05-24 Divx, Llc Systems and methods for adaptive switching between multiple content delivery networks during adaptive bitrate streaming
US10498795B2 (en) 2017-02-17 2019-12-03 Divx, Llc Systems and methods for adaptive switching between multiple content delivery networks during adaptive bitrate streaming
US20230300349A1 (en) * 2018-04-05 2023-09-21 Canon Kabushiki Kaisha Method and apparatus for encapsulating images or sequences of images with proprietary information in a file
US10887646B2 (en) * 2018-08-17 2021-01-05 Kiswe Mobile Inc. Live streaming with multiple remote commentators
US20200059687A1 (en) * 2018-08-17 2020-02-20 Kiswe Mobile Inc. Live streaming with multiple remote commentators
US11810335B2 (en) 2021-02-16 2023-11-07 International Business Machines Corporation Metadata for embedded binary data in video containers
WO2023167849A1 (en) * 2022-03-02 2023-09-07 Streaming Global, Inc. Content delivery platform

Also Published As

Publication number Publication date
EP2304587A4 (en) 2012-07-11
EP2304587A1 (en) 2011-04-06
KR20110056476A (en) 2011-05-30
WO2009149442A1 (en) 2009-12-10
EP2257931A4 (en) 2011-03-16
CN102037494A (en) 2011-04-27
EP2257931A1 (en) 2010-12-08
WO2009149440A1 (en) 2009-12-10
US20090303241A1 (en) 2009-12-10
KR20110014995A (en) 2011-02-14
JP2011523309A (en) 2011-08-04
US8289338B2 (en) 2012-10-16

Similar Documents

Publication Publication Date Title
US20090307258A1 (en) Multimedia distribution and playback systems and methods using enhanced metadata structures
US9918134B2 (en) Method and system for content delivery
US9620172B2 (en) Systems and methods for converting interactive multimedia content authored for distribution via a physical medium for electronic distribution
US20040220926A1 (en) Personalization services for entities from multiple sources
US20040220791A1 (en) Personalization services for entities from multiple sources
EP2249256A1 (en) Method and device for providing content metadata and method and device for restricting access rights to contents
KR20060122662A (en) Storage medium including application for providing meta data, apparatus for providing meta data and method therefor
US7716248B2 (en) Method and system to enable dynamic modification of metadata in content
EP1709550A2 (en) Personalization services for entities from multiple sources
US20180091867A1 (en) Video content replay
US20050071368A1 (en) Apparatus and method for displaying multimedia data combined with text data and recording medium containing a program for performing the same method
US20220150557A1 (en) Method, device, and computer program for signaling available portions of encapsulated media content
Priyadarshi et al. Rich Metadata Description for Interactivity and Dynamic User-Generated Information
Gibbon et al. Video Data Sources and Applications
WO2009045051A2 (en) Method for providing initial behavior of multimedia application format content and system therefor
De Vos Usability improvements on a Metadata Server for Video on Demand based on Free Software
Pereira et al. Future MPEG developments
Vicars-Harris Advanced Systems Format
Jin et al. Storage format for personalized broadcasting content consumption

Legal Events

Date Code Title Description
AS Assignment

Owner name: DIVX, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PRIYADARSHI, SHAIWAL;SOROUSHIAN, KOUROSH;BRANESS, JASON;AND OTHERS;REEL/FRAME:023021/0295;SIGNING DATES FROM 20090721 TO 20090728

AS Assignment

Owner name: JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT, NE

Free format text: SECURITY AGREEMENT;ASSIGNORS:ALL MEDIA GUIDE, LLC;DIVX, LLC;SONIC SOLUTIONS LLC;REEL/FRAME:026026/0111

Effective date: 20110325

AS Assignment

Owner name: DIVX, LLC, CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:DIVX, INC.;REEL/FRAME:028965/0830

Effective date: 20101007

AS Assignment

Owner name: DIVX, LLC, CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:030591/0534

Effective date: 20130607

Owner name: ALL MEDIA GUIDE, LLC, CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:030591/0534

Effective date: 20130607

Owner name: SONIC SOLUTIONS LLC, CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:030591/0534

Effective date: 20130607

AS Assignment

Owner name: SONIC IP, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DIVX, LLC;REEL/FRAME:031713/0032

Effective date: 20131121

AS Assignment

Owner name: DIVX, LLC, CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:032645/0559

Effective date: 20140331

AS Assignment

Owner name: DIVX, LLC, CALIFORNIA

Free format text: PATENT RELEASE;ASSIGNOR:JPMORGAN CHASE BANK N.A., AS COLLATERAL AGENT;REEL/FRAME:033378/0685

Effective date: 20140702

Owner name: SONIC SOLUTIONS LLC, CALIFORNIA

Free format text: PATENT RELEASE;ASSIGNOR:JPMORGAN CHASE BANK N.A., AS COLLATERAL AGENT;REEL/FRAME:033378/0685

Effective date: 20140702

Owner name: ALL MEDIA GUIDE, LLC, CALIFORNIA

Free format text: PATENT RELEASE;ASSIGNOR:JPMORGAN CHASE BANK N.A., AS COLLATERAL AGENT;REEL/FRAME:033378/0685

Effective date: 20140702

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION