EP1884115A4 - Method and apparatus for synchronizing data service with video service in digital multimedia broadcasting - Google Patents

Method and apparatus for synchronizing data service with video service in digital multimedia broadcasting

Info

Publication number
EP1884115A4
EP1884115A4 (application EP06768652A)
Authority
EP
European Patent Office
Prior art keywords
data
trigger
time stamp
time
recited
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP06768652A
Other languages
German (de)
French (fr)
Other versions
EP1884115A1 (en)
Inventor
Gwang-Soon Lee
Kyu-Tae Yang
Bong-Ho Lee
Young-Kwon Hahm
Chung-Hyun Ahn
Soo-In Lee
Do-Hyung Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Electronics and Telecommunications Research Institute ETRI
Alticast Corp
Original Assignee
Electronics and Telecommunications Research Institute ETRI
Alticast Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Electronics and Telecommunications Research Institute ETRI, Alticast Corp filed Critical Electronics and Telecommunications Research Institute ETRI
Publication of EP1884115A1 publication Critical patent/EP1884115A1/en
Publication of EP1884115A4 publication Critical patent/EP1884115A4/en
Withdrawn legal-status Critical Current

Links

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/242 Synchronization processes, e.g. processing of PCR [Program Clock References]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/24 Systems for the transmission of television signals using pulse code modulation
    • H04N7/52 Systems for transmission of a pulse code modulated video signal with one or more other pulse code modulated signals, e.g. an audio signal or a synchronizing signal
    • H04N7/54 Systems for transmission of a pulse code modulated video signal with one or more other pulse code modulated signals, e.g. an audio signal or a synchronizing signal the signals being synchronous
    • H04N7/56 Synchronising systems therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04H BROADCAST COMMUNICATION
    • H04H20/00 Arrangements for broadcast or for distribution combined with broadcast
    • H04H20/28 Arrangements for simultaneous broadcast of plural pieces of information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04H BROADCAST COMMUNICATION
    • H04H60/00 Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
    • H04H60/02 Arrangements for generating broadcast information; Arrangements for generating broadcast-related information with a direct linking to broadcast information or to broadcast space-time; Arrangements for simultaneous generation of broadcast information and broadcast-related information
    • H04H60/06 Arrangements for scheduling broadcast services or broadcast-related services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236 Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/23614 Multiplexing of additional data and video streams
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434 Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4348 Demultiplexing of additional data and video streams
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems

Definitions

  • the present invention relates to digital multimedia broadcasting (DMB); and more particularly, to a method and apparatus for synchronizing data transmitted based on the Eureka-147 with audio and video (AV) data transmitted after being coded and multiplexed into Moving Picture Experts Group (MPEG) 4 or MPEG-2.
  • a synchronized data service model provides a means for an application program executed in a terrestrial Digital Multimedia Broadcasting (DMB) middleware to make a performance in synchronization with other media, such as a DMB video service.
  • a video service means an audio/video (AV) service provided based on "Digital Audio Broadcasting (DAB); DMB video service; User Application Specification" (ETSI TS 102 428).
  • Fig. 1 is an exemplary block diagram showing a conventional terrestrial DMB transmitting system.
  • a DMB AV encoder 110 encodes the audio and video signals based on a DMB video transmission and reception interface standard to thereby create AV stream.
  • This process includes an MPEG-4 AV encoding procedure and a multiplexing procedure into MPEG-2 transport stream (TS).
  • the data signal source 130 generates diverse data unrelated to the AV data
  • the data encoder 140 encodes the diverse data generated in the data signal source 130 to create data packets.
  • the AV stream and the data packets outputted from the DMB AV encoder 110 and the data encoder 140 are multiplexed into frames of the Ensemble Transport Interface (ETI), i.e., ETI frames, in an ensemble multiplexer 150.
  • ETI frames go through Coded Orthogonal Frequency Division Multiplexing (COFDM) encoding in a DMB transmitter 160 to be outputted in the form of radio frequency (RF) signals.
  • the ETI frames are basically composed of fast information channel (FIC) data and main service channel (MSC) data.
  • the FIC data and the MSC data are generated in an FIC unit 151 and an MSC unit 152 of the ensemble multiplexer 150, respectively.
  • the FIC is an information channel for fast access to multiplexing information and service information in a Eureka-147 system
  • the MSC is a channel for multiplexing service components each corresponding to each service according to a multiplexing structure set up through the FIC.
  • Basic audio data, AV stream and diverse additional data are multiplexed and transmitted through the MSC.
  • absolute time is added to FIG type 0 extension 10 (FIG 0/10) in the form of Coordinated Universal Time (UTC), and this provides a reference time based on which the MSC data are decoded and presented.
  • the UTC absolute time information may be added as a standard for timing information when data are encoded.
  • MPEG-4 and/or MPEG-2 systems use a clock reference and time stamps to synchronize AV data transmitted over elementary stream (ES) and transmit timing information.
  • a receiving terminal uses a decoding time stamp (DTS) to define a decoding time point of each access unit in a decoding buffer, and uses a composition time stamp (CTS) to accurately define a composition time point of each composition unit (CU).
  • An object clock reference (OCR) is used to transmit a time mark of a given stream to an ES decoder.
  • An OCR value corresponds to an object time base (OTB) value at a time when a transmitting terminal generates an OCR time stamp.
  • OCR values are included in an SL packet header and transmitted.
  • an MPEG-2 system uses a program clock reference (PCR) and a presentation time stamp (PTS), and an MPEG-4 system uses an object clock reference (OCR), a composition time stamp (CTS), and a decoding time stamp (DTS).
  • the MPEG-4 system is synchronized with the MPEG-2 system by mapping SL packets of MPEG-4 to Packetized Elementary Stream (PES) of MPEG-2 at a ratio of 1 : 1.
  • a PES header includes PTS of the MPEG-2 only when an SL packet header of MPEG-4 includes an OCR. Otherwise, the PTS of the MPEG-2 is not used.
  • object time base (OTB) that defines a time stamp for MPEG-4 data stream is engaged with a system time clock of MPEG-2.
  • the data encoder 140 of Fig. 1 transmits timing information with reference to UTC absolute time information transmitted over an FIC channel when it encodes data based on the Multimedia Object Transfer (MOT) protocol, which is a data transmission specification based on the DMB system specification, i.e., the Eureka-147.
  • the data encoder 140 objectizes data on a file or directory basis and then packetizes the objects.
  • a time stamp, which is timing information on the time when one data object is decoded and presented, is added to the header of the data object in the form of UTC.
  • However, this method has a problem in that it is hard to exactly synchronize data with DMB AV stream.
Disclosure: Technical Problem
  • It is, therefore, an object of the present invention to provide a method and apparatus for synchronizing data with audio/video (AV) data in Digital Multimedia Broadcasting (DMB).
  • a method for providing data synchronized with audio/video (AV) data in digital multimedia broadcasting, including the steps of: a) receiving an AV time stamp for the AV data; b) calculating a time stamp of the data (hereinafter, a data time stamp), which is information on a time point when the data are to be presented in a user terminal, based on the AV time stamp; c) generating sync metadata including the calculated data time stamp; and d) encoding the sync metadata and transmitting the encoded sync metadata.
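Step b) above amounts to offsetting the received AV time stamp by the intended presentation delay of the data. A minimal sketch, assuming a 90 kHz MPEG time base; the class and method names are illustrative, not part of any standard:

```java
// Minimal sketch of step b): deriving a data time stamp from an AV time stamp.
// Assumes a 90 kHz MPEG time base; names are illustrative only.
public class DataTimeStampCalculator {
    public static final long CLOCK_HZ = 90_000L; // resolution of CTS/PTS time stamps

    // avTimeStamp: CTS of the video scene the data must accompany (90 kHz ticks)
    // offsetSeconds: how long after that scene the data are to be presented
    public static long dataTimeStamp(long avTimeStamp, double offsetSeconds) {
        return avTimeStamp + Math.round(offsetSeconds * CLOCK_HZ);
    }
}
```

The resulting value would be placed into the sync metadata of step c).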
  • a method for providing data synchronized with AV data in DMB, including the steps of: a) separating received signals into AV stream and data packets; b) receiving system reference time information from an AV stream decoder for decoding the AV stream; c) acquiring a data time stamp from sync metadata included in the data packets; and d) comparing the data time stamp with the system reference time, and decoding and presenting a data object file at a time point when the data time stamp coincides with the system reference time.
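The comparison in step d) can be sketched as a small scheduler that releases data objects once the decoder's system reference time (e.g. OCR) has reached their data time stamp; a hedged sketch with illustrative names:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of step d) at the receiver: data objects are released for decoding and
// presentation once the system reference time (e.g. OCR) reaches their data
// time stamp. Names are illustrative, not taken from the standard.
public class DataSyncScheduler {
    public static List<Long> due(List<Long> pendingDataTimeStamps, long systemReferenceTime) {
        List<Long> ready = new ArrayList<>();
        for (long ts : pendingDataTimeStamps) {
            if (ts <= systemReferenceTime) {
                ready.add(ts); // this data object's presentation time has arrived
            }
        }
        return ready;
    }
}
```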
  • the present invention can synchronize data transmitted based on the Eureka-147 with video based on Moving Picture Experts Group (MPEG) 4 and MPEG-2.
  • Fig. 1 is an exemplary block diagram showing a conventional terrestrial Digital Multimedia Broadcasting (DMB) transmitting system
  • Fig. 2 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with an embodiment of the present invention
  • Fig. 3 is a view describing a method of calculating a time point when data are to be presented based on a composition time stamp (CTS) value;
  • Fig. 4 is a view showing a structure of a data carousel shown in Fig. 3
  • Fig. 5 is a view showing a structure of sync metadata shown in Fig. 3;
  • Fig. 6 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with another embodiment of the present invention
  • Fig. 7 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with yet another embodiment of the present invention
  • Fig. 8 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with still another embodiment of the present invention.
  • Fig. 9 is a block diagram showing a receiving system capable of providing data synchronized with video data in accordance with an embodiment of the present invention.
  • Fig. 2 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with an embodiment of the present invention.
  • the terrestrial DMB transmitting system includes an audio/video (AV) signal source 200 for generating AV contents, a Digital Multimedia Broadcasting (DMB) AV encoder 210 for encoding the generated AV contents into AV stream based on a terrestrial DMB standard, a data server 230 for providing diverse data services, an ensemble multiplexer 240 for multiplexing the generated AV stream and data packets into ensembles, a DMB transmitter 250 for performing Orthogonal Frequency Division Multiplexing (OFDM) encoding and radio frequency (RF) transmission, and a Network Time Protocol (NTP) server 220 for synchronizing the above constituent elements temporally.
  • the data server 230 is composed of a data signal source 231, a data management and controlling unit 233 for managing and controlling the data signal source 231 and a data encoder 235, and the data encoder 235 for encoding the generated data based on diverse DMB data transmission and reception standards.
  • the DMB AV encoder 210 encodes the AV data according to a terrestrial DMB video standard based on MPEG-4 and MPEG-2.
  • the DMB AV encoder 210 inserts an Object Clock Reference (OCR) and a Composition Time Stamp (CTS) at the MPEG-4 layer, and inserts a Program Clock Reference (PCR) and a Presentation Time Stamp (PTS) at the MPEG-2 layer.
  • the DMB AV encoder 210 supplies Composition Time Stamp (CTS) of an initial period of a program to the data server 230.
  • To provide additional data based on the Eureka-147, the data signal source 231 generates and stores diverse detailed data by collecting and authoring Java-based application data, texts related to the application data, images, moving pictures and the like.
  • the additional data are encoded in the data encoder 235 and transmitted under the control and management of the data management and controlling unit 233.
  • the NTP server 220 temporally synchronizes the AV signal source 200, the AV encoder 210, and the data server 230.
  • the data management and controlling unit 233 manages the time points when the data from the data signal source 231 are inserted.
  • the calculated synchronization information is directly transmitted to the data encoder 235 or becomes metadata for synchronization between video and the data and transmitted to the data encoder 235.
  • Fig. 3 describes a method of calculating a time point when data are to be presented based on a time stamp value, e.g., a composition time stamp (CTS) value.
  • time information (V(b) ) for a scene where data synchronized with video data are to be added can be exactly extracted in advance.
  • a time stamp which is restoration time of each scene at the user terminal, such as CTS, is added to the header of an SL packet.
  • data files are broadcasted in the form of data carousels prior to the restoration time in the user terminal. Then, the user terminal downloads the data carousels and performs data restoration at the predetermined time point.
  • the data are a downloadable application program and/or data related thereto. The data may be stored in a non-volatile memory in advance.
  • the data management and controlling unit 233 generates sync metadata to present data synchronized with a particular scene, and it transmits the generated sync metadata based on an appropriate data protocol.
  • Fig. 4 is a view showing a structure of a data carousel shown in Fig. 3
  • Fig. 5 is a view showing a structure of sync metadata shown in Fig. 3.
  • the sync metadata may be composed of an identifier, a trigger time, which is a video time stamp, e.g., CTS, a related data indicator, and data.
  • the identifier identifies data
  • the trigger time includes a data decoding time, a data restoration time, and a data extermination time in a user terminal, and it is calculated in advance based on the time stamp of the video, which will be referred to as a video time stamp, e.g., CTS.
  • the related data indicator indicates data which are synchronized with a particular scene and presented in connection with an application program executed in the user terminal.
  • the data include information instantly needed.
  • the sync metadata are added prior to a particular video restoration time (V(b) ) .
  • the trigger time (Ts(b)) that constitutes the sync metadata should be estimated prior to the video restoration time (V(b)).
  • the trigger time (Ts(b)) can be calculated based on the known video restoration time (V(b)) and the video time stamp information of an initial period of the program, which includes video time information and time stamp information and is inputted from the DMB AV encoder 210.
  • Since the video time stamp information of the initial period of the program is temporally synchronized among all devices by the NTP server, it can be easily extracted.
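The estimation above is a linear extrapolation from the NTP-aligned initial period: a hedged sketch, assuming a 90 kHz time base, with all names illustrative:

```java
// Sketch of the trigger-time estimation: given the NTP-aligned wall-clock time
// and video time stamp of the programme's initial period, the trigger time
// Ts(b) for a scene with known restoration time V(b) follows by linear
// extrapolation. A 90 kHz time base is assumed; names are illustrative.
public class TriggerTimeEstimator {
    private static final long CLOCK_HZ = 90_000L;

    public static long triggerTime(double vInitialSeconds, long ctsInitial, double vTargetSeconds) {
        return ctsInitial + Math.round((vTargetSeconds - vInitialSeconds) * CLOCK_HZ);
    }
}
```

For example, a scene 10 s into the programme, relative to an initial period starting at CTS 0, maps to a trigger time of 900 000 ticks.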
  • Alternatively, the video time stamp information may be directly added to the header of a data object repeatedly transmitted by a data carousel, without using the sync metadata, to thereby synchronize the restoration time of the data object with the video.
  • the sync metadata will be described in detail with reference to examples.
  • A trigger time, which is synchronized with the video and indicates the time when a particular event is executed in an application program, should be generated, and the generated trigger time should be transmitted from a data server of a transmitting part to a terrestrial DMB middleware within a short time.
  • the transmitting part should transmit the trigger time and data to be executed or an indicator of the data to execute an event synchronized with video.
  • a message composed of the trigger time and data to be executed or the indicator of the data is referred to as a trigger packet.
  • the trigger packet is an example of the sync metadata.
  • the trigger packet should be scheduled and transmitted from the data server of the transmitting part prior to a predetermined synchronized trigger time to make the application program execute the event. When it reaches the trigger time, the trigger packet is transmitted repeatedly to make the application program perform a predetermined action indicated by the event at a time synchronized with the video.
  • the terrestrial DMB uses a Transparent Data Channel (TDC) packet mode, which has a small overhead and a short waiting time because it does not use a data group.
  • the TDC packet mode is based on ETSI TS 101 759, i.e., Digital Audio Broadcasting (DAB); DAB Data Broadcasting Transparent Data Channel (TDC) standard, to transmit the trigger packet.
  • the trigger packet may be transmitted by using other transmission protocols.
  • Table 1 shows the format of the trigger packet.
  • Triggerld is an identifier for identifying a trigger in the application program
  • TriggerTime indicates a time point when an event is generated.
  • a video time stamp such as CTS is used at around the event generation time point to provide a link service.
  • a video service providing device and the data server need to cooperate to use the CTS at the trigger time.
  • privateDataByte indicates the data that the application program needs to execute an event at the trigger time.
  • the privateDataByte may be composed of a related data indicator and data, which is shown in Fig.
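Since Table 1 itself is not reproduced above, the following is only a hedged sketch of a trigger packet inferred from the field names in the text (TriggerId, TriggerTime, privateDataByte); the field types and widths are assumptions:

```java
// Hedged sketch of a trigger packet, inferred from the field names in the
// text; field types and widths are assumptions, not the Table 1 definition.
public class TriggerPacket {
    public final int triggerId;          // identifies the trigger within the application
    public final long triggerTime;       // video time stamp (e.g. CTS) at which the event fires
    public final byte[] privateDataByte; // related-data indicator and/or inline data

    public TriggerPacket(int triggerId, long triggerTime, byte[] privateDataByte) {
        this.triggerId = triggerId;
        this.triggerTime = triggerTime;
        this.privateDataByte = privateDataByte;
    }
}
```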
API Model
  • A terrestrial DMB middleware defines a trigger interface in a dmb.io package, which is defined for data reception through a DMB data channel, to provide a synchronized data service.
  • the API model extends javax.microedition.Datagram (CLDC 1.1 (JSR 139), http://java.sun.com/products/cldc/index.jsp) and is a datagram including event information. A Trigger is used to transmit sync signals to other media.
  • A trigger is linked with its ID and the time information indicating when the event indicated by the trigger is to be executed. Although a plurality of triggers may be received over a broadcasting network, triggers having the same ID are all treated as the same trigger.
  • a trigger may be transmitted several times to increase the reception probability, or to update the time indicated by the trigger, because that time may be changed due to discontinuity in system time clocks.
  • If the time indicated by a trigger has already passed, the trigger is ignored.
  • the ID of the trigger whose time is past and process is completed may be reused by another trigger.
  • A trigger that indicates a time yet to come has a doItNow( ) value of false.
  • a trigger having a false doItNow( ) is transmitted to the application program only once, even though triggers of the same ID are transmitted several times. Only when the trigger time is changed in the middle may triggers of the same ID be transmitted to the application program several times.
  • a trigger whose doItNow( ) is true is transmitted to the application program.
  • the application program executes an operation indicated by the trigger instantly. After the execution, even though a trigger of the same ID is transmitted, the trigger is treated as a different one from the trigger whose doItNow( ) was true.
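The delivery rules above can be sketched as a small dispatcher: a trigger whose doItNow( ) is false is handed to the application only once per (ID, trigger time) pair, while a doItNow( ) trigger is always delivered for immediate execution. Names are illustrative, not the middleware API:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the delivery rules: retransmissions of a non-doItNow trigger are
// ignored unless the trigger time was changed; doItNow triggers are always
// delivered for immediate execution. Names are illustrative.
public class TriggerDispatcher {
    private final Map<Integer, Long> delivered = new HashMap<>();

    // Returns true if the trigger should be handed to the application program.
    public boolean accept(int id, long triggerTime, boolean doItNow) {
        if (doItNow) {
            return true; // execute the indicated operation immediately
        }
        Long previous = delivered.put(id, triggerTime);
        return previous == null || previous != triggerTime; // new ID or changed time
    }
}
```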
  • PrivateData transmitted in the form of trigger packets are read by using a method of Datagram, which is a superclass of the Trigger.
  • Table 2 defines API for a synchronized service.
  • Table 2: public interface dmb.io.Trigger
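The body of Table 2 is not reproduced above, so the following is only a hedged sketch of what the dmb.io.Trigger interface could look like, limited to the notions actually named in the text (trigger ID, trigger time, doItNow); the exact method set and signatures are assumptions:

```java
// Hedged sketch of the Trigger interface named in Table 2; the method set and
// signatures are assumptions derived from the surrounding text.
public interface Trigger {
    int getId();            // identifier shared by retransmissions of the same trigger
    long getTriggerTime();  // video time stamp (e.g. CTS) at which the event is executed
    boolean doItNow();      // true: the indicated operation is executed immediately
}
```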
  • Fig. 6 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with another embodiment of the present invention.
  • the drawing shows an example of a system where a data server 430 uses a time stamp extracting unit 437, when a DMB AV encoder 410 cannot directly output time information of a video source and the time stamp information thereof to the data server 430.
  • the time stamp extracting unit 437 extracts a video time stamp, e.g., the CTS of the video, from the AV stream outputted from the DMB AV encoder 410, and the extracted time stamp is inputted to the data management and controlling unit 433.
  • Sync data may be serviced more easily when AV data are encoded in advance and stored in the form of stream, which is shown in Figs. 7 and 8.
  • time of a video to which data are to be added and a time stamp thereof can be acquired in advance.
  • Fig. 7 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing video with data in real-time in accordance with yet another embodiment of the present invention.
  • AV stream is stored in an MPEG-2 file or a Forward Error Correction (FEC)-added file thereof.
  • a DMB transmitting system includes an AV signal source 500, an AV encoder 510 for encoding AV signals based on a terrestrial DMB standard, a storage 560 for storing the AV signals encoded in the form of stream, a data server 530 for generating sync metadata to provide data synchronized with AV data by using an AV time stamp which is supplied from the storage 560, and multiplexers 520, 540 and 550 for multiplexing the output signals of the AV encoder 510 and the output signals of the data server 530.
  • video time information of a video to which data are to be added and data time stamp information for data restoration are inputted to the data server 530 in advance.
  • the data management and controlling unit 533 determines when to add the data and generates sync metadata based on the time to add the data.
  • stream switching occurs in a switcher.
  • the switcher 520 performs re-stamping, which is a process for guaranteeing continuity of time stamp.
  • the time stamp pre-added to a stream file is re-established in the switcher 520 to be in continuum with the time stamp of the AV stream outputted from the DMB AV encoder 510.
  • a predetermined value is added to the time stamp.
  • the data server 530 should receive the time stamp of the AV stream re-established in the switcher 520 and information on the time point when the switching has occurred in order to reflect the re-establishment.
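The re-stamping described above amounts to shifting every time stamp in the pre-encoded stream file by a fixed offset so that it continues the live stream's time base. A hedged sketch; placing the file's first stamp one frame period after the last live stamp is an assumption for illustration, and all names are illustrative:

```java
// Sketch of re-stamping at the switcher: time stamps in the pre-encoded stream
// file are shifted by a fixed offset so they continue the live stream's time
// base. The one-frame gap is an assumption; names are illustrative.
public class ReStamper {
    // Offset so the file's first time stamp lands one frame period after the
    // last live time stamp (all values in time-stamp ticks).
    public static long offset(long lastLiveTs, long firstFileTs, long frameTicks) {
        return lastLiveTs + frameTicks - firstFileTs;
    }

    public static long restamp(long fileTs, long offset) {
        return fileTs + offset; // guarantees continuity of the time stamp sequence
    }
}
```

The data server would then apply the same offset to the pre-computed data time stamps, which is why it must be informed of the re-establishment and the switching point.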
  • Fig. 8 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing video with data in real-time in accordance with still another embodiment of the present invention.
  • AV stream is encoded in the form of an MP4 file and directly added to the DMB AV encoder 610 in the transmitting system of Fig. 8.
  • When AV stream is encoded into an MPEG-4 format and an MP4 file, which is one of the storage file formats, is added to DMB, it should be packetized into MPEG-4 SL and an MPEG-2 transport stream (TS) in an M4onM2 processing module 620, which is shown in Fig. 8.
  • This process may be carried out inside the DMB AV encoder 610 but it also may be performed in an additional device.
  • When an MP4 file is packetized into MPEG-4 SL and MPEG-2 TS, the relative time information inside the MP4 file is transformed into OCR or CTS.
  • the data server 630 already includes time information of video to which data are to be added and a data time stamp for data restoration.
  • FIG. 9 is a block diagram showing a receiving system capable of providing a service where data are synchronized with video in accordance with an embodiment of the present invention.
  • the receiving system capable of providing video synchronized with data includes an RF receiving channel decoder 710, an MSC processor 730, a DMB AV decoder 740, a DMB data decoder 760, and a data presenting apparatus 770.
  • the RF receiving channel decoder 710 receives RF signals, demodulates the RF signals into baseband signals, performs channel decoding, and separates FIC data from MSC data.
  • an FIC analyzer 720 analyzes the FIC data, including multiplexing information and service information, and provides an analysis result to the MSC processor 730.
  • the MSC processor 730 separates data transmitted through an MSC channel into data packets and AV stream.
  • the AV stream and the data packets are inputted to the DMB AV decoder 740 and the DMB data decoder 760, respectively, to be decoded.
  • the DMB AV presenting apparatus 750 and the data presenting apparatus 770 present the AV stream and the data at the same restoration time, respectively.
  • the DMB data decoder 760 receives system reference time information, such as OCR, from the DMB AV decoder 740 and compares the system reference time information with the CTS-based data time stamp information added to the header of the sync metadata or the header of a data object. The time point at which they coincide becomes the restoration time of the data object file.
  • the data presenting apparatus 770 executes an application program directed by the sync metadata, and related data indicated by the sync metadata and instant data added to the sync metadata are presented at the extracted restoration time, respectively, to be synchronized with AV data.
  • the method of the present invention which is described above, can be realized as a program and stored in a computer-readable recording medium such as CD-ROM, RAM, ROM, floppy disks, hard disks, magneto-optical disks and the like. Since the process can be easily implemented by those skilled in the art of the present invention, detailed description on it will not be provided herein.

Abstract

Provided is a method and apparatus for transmitting metadata synchronized with audio/video (AV) data on a satellite, terrestrial, cable or proprietary digital broadcasting network. The method includes the steps of: a) receiving an AV time stamp for the AV data; b) calculating a time stamp of the data (hereinafter, a data time stamp), which is information on a time point when the data are to be presented in a user terminal, based on the AV time stamp; c) generating sync metadata including the calculated data time stamp; and d) encoding the sync metadata and transmitting the encoded sync metadata.

Description

METHOD AND APPARATUS FOR SYNCHRONIZING DATA SERVICE WITH VIDEO SERVICE IN DIGITAL MULTIMEDIA BROADCASTING
Technical Field
The present invention relates to digital multimedia broadcasting (DMB); and more particularly, to a method and apparatus for synchronizing data transmitted based on the Eureka-147 with audio and video (AV) data transmitted after being coded and multiplexed into Moving Picture Experts Group (MPEG) 4 or MPEG-2.
Background Art
A synchronized data service model provides a means for an application program executed in a terrestrial Digital Multimedia Broadcasting (DMB) middleware to make a performance in synchronization with other media, such as a DMB video service. Herein, a video service means an audio/video (AV) service provided based on "Digital Audio Broadcasting (DAB); DMB video service; User Application Specification" (ETSI TS 102 428).
Fig. 1 is an exemplary block diagram showing a conventional terrestrial DMB transmitting system. When an AV signal source 100 generates video signals and audio signals for DMB, a DMB AV encoder 110 encodes the audio and video signals based on a DMB video transmission and reception interface standard to thereby create AV stream. This process includes an MPEG-4 AV encoding procedure and a multiplexing procedure into MPEG-2 transport stream (TS). Meanwhile, to provide a data service based on the Eureka-147, the data signal source 130 generates diverse data having no concern with AV data, and the data encoder 140 encodes the diverse data generated in the data signal source 130 to create data packets. The AV stream and the data packets outputted from the DMB AV encoder 110 and the data encoder 140 are multiplexed into frames of the Ensemble Transport Interface (ETI), i.e., ETI frames, in an ensemble multiplexer 150. The ETI frames go through Coded Orthogonal Frequency Division Multiplexing (COFDM) encoding in a DMB transmitter 160 to be outputted in the form of radio frequency (RF) signals.
Herein, the ETI frames are basically composed of fast information channel (FIC) data and main service channel (MSC) data. The FIC data and the MSC data are generated in an FIC unit 151 and an MSC unit 152 of the ensemble multiplexer 150, respectively.
The FIC is an information channel for fast access to multiplexing information and service information in a Eureka-147 system, and the MSC is a channel for multiplexing service components, each corresponding to a service, according to a multiplexing structure set up through the FIC. Basic audio data, AV stream and diverse additional data are multiplexed and transmitted through the MSC. In the FIC, absolute time is added to FIG type 0 extension 10 (FIG 0/10) in the form of coordinated universal time (UTC), and this provides a reference time based on which the MSC data are decoded and presented. Thus, the UTC absolute time information may be used as a standard for timing information when data are encoded.
MPEG-4 and/or MPEG-2 systems use a clock reference and time stamps to synchronize AV data transmitted over elementary stream (ES) and transmit timing information.
In the MPEG-4 systems, a receiving terminal uses a decoding time stamp (DTS) to define the decoding time point of each access unit in a decoding buffer, and uses a composition time stamp (CTS) to accurately define the composition time point of each composition unit (CU). An object clock reference (OCR) is used to transmit the time mark of a given stream to an ES decoder. An OCR value corresponds to the object time base (OTB) value at the time when the transmitting terminal generates the OCR time stamp. OCR values are included in an SL packet header and transmitted.
Korean terrestrial DMB uses both MPEG-4 and MPEG-2. To transmit and synchronize AV data, the MPEG-2 system uses a program clock reference (PCR) and a presentation time stamp (PTS), and the MPEG-4 system uses an object clock reference (OCR), a composition time stamp (CTS), and a decoding time stamp (DTS). Also, the MPEG-4 system is synchronized with the MPEG-2 system by mapping the SL packets of MPEG-4 to the Packetized Elementary Stream (PES) of MPEG-2 at a one-to-one ratio. A PES header includes the PTS of MPEG-2 only when the corresponding SL packet header of MPEG-4 includes an OCR; otherwise, the MPEG-2 PTS is not used. Also, the object time base (OTB), which defines the time base for the MPEG-4 data stream, is locked to the system time clock of MPEG-2.
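Since the one-to-one SL-to-PES mapping ties both layers to a single clock, time-stamp arithmetic reduces to tick conversion on the shared time base. The sketch below assumes the common 90 kHz rate and the 33-bit field width used for the MPEG-2 PTS; the class and method names are illustrative, not taken from any DMB specification.

```java
/** Sketch: converting between seconds and 90 kHz MPEG time-stamp ticks.
 *  Assumes the common 90 kHz clock used for the MPEG-2 PTS (and, in Korean
 *  terrestrial DMB, for the CTS locked to it); the 33-bit wraparound below
 *  follows the MPEG-2 PTS field width. */
public class Mpeg90kHz {
    public static final long CLOCK_HZ = 90_000L;
    public static final long PTS_MODULO = 1L << 33;   // PTS is a 33-bit field

    public static long secondsToTicks(double seconds) {
        return Math.round(seconds * CLOCK_HZ) % PTS_MODULO;
    }

    public static double ticksToSeconds(long ticks) {
        return (double) ticks / CLOCK_HZ;
    }

    /** Difference b - a in ticks, unwrapping a single 33-bit wraparound. */
    public static long diff(long a, long b) {
        long d = (b - a) % PTS_MODULO;
        if (d < 0) d += PTS_MODULO;
        return d;
    }

    public static void main(String[] args) {
        System.out.println(secondsToTicks(2.0));        // 180000
        System.out.println(diff(PTS_MODULO - 10, 20));  // 30, across the wrap
    }
}
```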
Meanwhile, the data encoder 140 of Fig. 1 transmits timing information with reference to the UTC absolute time information transmitted over the FIC channel when it encodes data based on the Multimedia Object Transfer (MOT) protocol, which is a data transmission specification based on the DMB system specification, i.e., the Eureka-147. In short, the data encoder 140 objectizes data on a file or directory basis and then packetizes the objects. Herein, a time stamp, which is timing information on the time when one data object is decoded and presented, is added to the header of the data object in the form of UTC. However, this method has a problem in that it is hard to exactly synchronize the data with the DMB AV stream.
Disclosure
Technical Problem
It is, therefore, an object of the present invention to provide a method and apparatus for synchronizing data with audio/video (AV) data in Digital Multimedia Broadcasting (DMB) by generating timing information on the time when a particular event is executed in an application program, so as to provide a data service synchronized with a DMB AV service.
Technical Solution
In accordance with one aspect of the present invention, there is provided a method for providing data synchronized with audio/video (AV) data in digital multimedia broadcasting (DMB), the method including the steps of: a) receiving an AV time stamp for the AV data; b) calculating a time stamp of the data, which is information on a time point when the data are to be presented in a user terminal, which will be shortly referred to as data time stamp hereinafter, based on the AV time stamp; c) generating sync metadata including the calculated data time stamp; and d) encoding the sync metadata and transmitting the encoded sync metadata.
In accordance with another aspect of the present invention, there is provided a method for providing data synchronized with AV data in DMB, the method including the steps of: a) separating received signals into an AV stream and data packets; b) receiving system reference time information from an AV stream decoder for decoding the AV stream; c) acquiring a data time stamp from sync metadata included in the data packets; and d) comparing the data time stamp with the system reference time, and decoding and presenting a data object file at the time point when the data time stamp coincides with the system reference time.
Advantageous Effects
The present invention can synchronize data transmitted based on the Eureka-147 with video based on Moving Picture Experts Group (MPEG)-4 and MPEG-2. Thus, the present invention can contribute to the spread of diverse Digital Multimedia Broadcasting (DMB) services.
Description of Drawings
The above and other objects and features of the present invention will become apparent from the following description of the preferred embodiments given in conjunction with the accompanying drawings, in which:
Fig. 1 is an exemplary block diagram showing a conventional terrestrial Digital Multimedia Broadcasting (DMB) transmitting system;
Fig. 2 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with an embodiment of the present invention;
Fig. 3 is a view describing a method of calculating a time point when data are to be presented based on a composition time stamp (CTS) value;
Fig. 4 is a view showing a structure of the data carousel shown in Fig. 3;
Fig. 5 is a view showing a structure of the sync metadata shown in Fig. 3;
Fig. 6 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with another embodiment of the present invention;
Fig. 7 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with yet another embodiment of the present invention;
Fig. 8 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with still another embodiment of the present invention; and
Fig. 9 is a block diagram showing a receiving system capable of providing data synchronized with video data in accordance with an embodiment of the present invention.
Best Mode for the Invention
Other objects and aspects of the invention will become apparent from the following description of the embodiments with reference to the accompanying drawings, which is set forth hereinafter, so that those skilled in the art can easily implement the technological concept of the present invention. When it is determined that a detailed description of a related art may obscure the points of the present invention, the description will not be provided herein. Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Fig. 2 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with an embodiment of the present invention.
The terrestrial DMB transmitting system includes an audio/video (AV) signal source 200 for generating AV contents, a Digital Multimedia Broadcasting (DMB) AV encoder 210 for encoding the generated AV contents into an AV stream based on a terrestrial DMB standard, a data server 230 for providing diverse data services, an ensemble multiplexer 240 for multiplexing the generated AV stream and data packets into ensembles, a DMB transmitter 250 for performing Orthogonal Frequency Division Multiplexing (OFDM) encoding and radio frequency (RF) transmission, and a Network Time Protocol (NTP) server 220 for temporally synchronizing the above constituent elements. The data server 230 is composed of a data signal source 231; a data management and controlling unit 233 for managing and controlling the data signal source 231 and a data encoder 235; and the data encoder 235, which encodes the generated data based on diverse DMB data transmission and reception standards.
When the AV signal source 200 generates AV data in real-time, or takes them out of a storage, to provide a moving picture service in a DMB system, the DMB AV encoder 210 encodes the AV data according to the terrestrial DMB video standard based on MPEG-4 and MPEG-2. Herein, for synchronization, the DMB AV encoder 210 inserts an Object Clock Reference (OCR) and a Composition Time Stamp (CTS) at the MPEG-4 layer, and inserts a Program Clock Reference (PCR) and a Presentation Time Stamp (PTS) at the MPEG-2 layer. The DMB AV encoder 210 supplies the Composition Time Stamp (CTS) of the initial period of a program to the data server 230.
To provide additional data based on the Eureka-147, the data signal source 231 generates and stores diverse detailed data by collecting and authoring JAVA-based application data, texts related to the application data, images, moving pictures and the like. The additional data are encoded in the data encoder 235 and transmitted under the control and management of the data management and controlling unit 233. Also, the NTP server 220 temporally synchronizes the AV signal source 200, the AV encoder 210, and the data server 230. The data management and controlling unit 233 manages the time points when the data from the data signal source 231 are inserted. It also converts the time points when the data are to be presented in a user terminal into video time stamp-based values, i.e., CTS-based values, by using the initial CTS of a program inputted from the DMB AV encoder 210, the known scene information, and the data adding time from the AV signal source 200. The calculated synchronization information is either transmitted directly to the data encoder 235 or turned into metadata for synchronization between the video and the data and then transmitted to the data encoder 235.
Fig. 3 describes a method of calculating a time point when data are to be presented based on a time stamp value, e.g., a composition time stamp (CTS) value.
In the case of a recorded program, the time information (V(b)) for a scene where data synchronized with the video are to be added can be exactly extracted in advance. When the video data are encoded in the DMB AV encoder 210, a time stamp such as the CTS, which is the restoration time of each scene at the user terminal, is added to the header of an SL packet. In a non-streaming data broadcasting system, data files are broadcasted in the form of data carousels prior to the restoration time in the user terminal. Then, the user terminal downloads the data carousels and performs data restoration at the predetermined time point. Herein, the data consist of a downloadable application program and/or data related thereto. The data may be stored in a non-volatile memory in advance. In the present specification, the data management and controlling unit 233 generates sync metadata to present data synchronized with a particular scene, and it transmits the generated sync metadata based on an appropriate data protocol. Fig. 4 is a view showing the structure of the data carousel shown in Fig. 3, and Fig. 5 is a view showing the structure of the sync metadata shown in Fig. 3.
As illustrated in Fig. 5, the sync metadata may be composed of an identifier, a trigger time, which is a video time stamp, e.g., CTS, a related data indicator, and data. The identifier identifies the data. The trigger time includes a data decoding time, a data restoration time, and a data termination time in a user terminal, and it is calculated in advance based on the time stamp of the video, which will be referred to as a video time stamp, e.g., CTS. The related data indicator indicates data which are synchronized with a particular scene and presented in connection with an application program executed in the user terminal. The data include information that is instantly needed.
The sync metadata are added prior to a particular video restoration time (V(b)). Thus, the trigger time (Ts(b)) that constitutes the sync metadata should be estimated prior to the video restoration time (V(b)). The trigger time (Ts(b)) can be calculated based on the known video restoration time (V(b)) and the video time stamp information of the initial period of the program, which includes video information and time stamp information and is inputted from the DMB AV encoder 210. Herein, since all the devices are temporally synchronized by the NTP server, the video time stamp information of the initial period of the program can be easily extracted.
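The estimation above can be sketched as follows. Assuming an ideal 90 kHz clock and NTP-aligned wall-clock times (the names, the millisecond unit, and the linear-clock assumption are ours for illustration, not the specification's), the trigger time Ts(b) is simply the initial CTS advanced by the elapsed scene time:

```java
/** Sketch of the trigger-time calculation described above: given the CTS
 *  observed at the start of the program and the NTP-synchronized wall-clock
 *  time of that moment, estimate the CTS value for the scene time V(b) at
 *  which data must be presented.  The 90 kHz rate and all names here are
 *  illustrative assumptions. */
public class TriggerTimeCalc {
    static final long CLOCK_HZ = 90_000L;

    /**
     * @param initialCts    CTS ticks supplied by the AV encoder at program start
     * @param initialWallMs NTP wall-clock ms at which initialCts was stamped
     * @param sceneWallMs   NTP wall-clock ms of the scene V(b)
     * @return estimated CTS ticks for the scene, i.e. the trigger time Ts(b)
     */
    public static long ctsForScene(long initialCts, long initialWallMs, long sceneWallMs) {
        long elapsedMs = sceneWallMs - initialWallMs;
        return initialCts + elapsedMs * CLOCK_HZ / 1000L;
    }

    public static void main(String[] args) {
        // Scene 5 s after program start, initial CTS of 1000 ticks:
        System.out.println(ctsForScene(1000L, 0L, 5_000L)); // 451000
    }
}
```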
In the case of a simple sync data providing service, as shown in Fig. 4, the video time stamp information is directly added to the header of a data object repeatedly transmitted by a data carousel, without using the sync metadata, to thereby synchronize the restoration time of the data object with the video. Hereinafter, the sync metadata will be described in detail with reference to examples.
In order to provide a DMB service and a sync data service, a trigger time, which is synchronized with the video and indicates the time when a particular event is executed in an application program, should be generated, and the generated trigger time should be transmitted from a data server of a transmitting part to the terrestrial DMB middleware within a short time. In other words, when a user terminal executes an application program, the transmitting part should transmit the trigger time and the data to be executed, or an indicator of the data, to execute an event synchronized with the video. A message composed of the trigger time and the data to be executed, or the indicator of the data, is referred to as a trigger packet. In short, the trigger packet is an example of the sync metadata.
The trigger packet should be scheduled and transmitted from the data server of the transmitting part prior to the predetermined synchronized trigger time to make the application program execute the event. Until it reaches the trigger time, the trigger packet is transmitted repeatedly to make the application program perform the predetermined action indicated by the event at a time synchronized with the video. The terrestrial DMB transmits the trigger packet using the Transparent Data Channel (TDC) packet mode, which has a small overhead and a short waiting time because it does not use data groups. The TDC packet mode is based on ETSI TS 101 759, i.e., the Digital Audio Broadcasting (DAB) Data Broadcasting in Transparent Data Channel (TDC) standard. The trigger packet may also be transmitted by using other transmission protocols.
The following Table 1 shows the format of the trigger packet.

Table 1

    triggerId        identifier of a trigger in the application program
    triggerTime      time point when the event is generated (a video time stamp, e.g., CTS)
    privateDataByte  data needed by the application program to execute the event
In Table 1, "triggerId" is an identifier for identifying a trigger in the application program, and "triggerTime" indicates the time point when an event is generated. A video time stamp, such as the CTS, is used around the event generation time point to provide a linked service. Herein, the video service providing device and the data server need to cooperate so that the CTS can be used as the trigger time.
"privateDataByte" indicates the data that the application program needs in order to execute an event at the trigger time. The privateDataByte may be composed of a related data indicator and data, as shown in Fig. 5.
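As a concrete rendering of the three fields named in Table 1, the sketch below packs them into a byte packet. The field widths (32-bit id, 64-bit time, 16-bit length prefix) are illustrative assumptions, since the specification's exact layout is not reproduced here:

```java
import java.nio.ByteBuffer;

/** Hypothetical trigger packet with the three fields of Table 1:
 *  triggerId, triggerTime, and privateDataByte.  Field widths are
 *  assumptions for illustration only. */
public class TriggerPacket {
    public final int triggerId;
    public final long triggerTime;      // CTS-based video time stamp
    public final byte[] privateData;    // related-data indicator + data

    public TriggerPacket(int id, long time, byte[] data) {
        this.triggerId = id; this.triggerTime = time; this.privateData = data;
    }

    public byte[] encode() {
        ByteBuffer b = ByteBuffer.allocate(4 + 8 + 2 + privateData.length);
        b.putInt(triggerId).putLong(triggerTime)
         .putShort((short) privateData.length).put(privateData);
        return b.array();
    }

    public static TriggerPacket decode(byte[] raw) {
        ByteBuffer b = ByteBuffer.wrap(raw);
        int id = b.getInt();
        long time = b.getLong();
        byte[] data = new byte[b.getShort() & 0xFFFF];
        b.get(data);
        return new TriggerPacket(id, time, data);
    }

    public static void main(String[] args) {
        TriggerPacket p = decode(new TriggerPacket(7, 451_000L, new byte[]{1, 2}).encode());
        System.out.println(p.triggerId + " " + p.triggerTime); // 7 451000
    }
}
```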
API Model
A terrestrial DMB middleware defines a trigger interface in the dmb.io package, which is defined for data reception through a DMB data channel, to provide a synchronized data service. The API model extends javax.microedition.io.Datagram (CLDC 1.1, JSR 139, at http://java.sun.com/products/cldc/index.jsp) and is a datagram including event information. A trigger is used to transmit sync signals to other media. It is linked with the ID of the trigger and the time information indicating when the event designated by the trigger is to be executed. Although a plurality of triggers may be received over a broadcasting network, triggers having the same ID are all treated as the same trigger. A trigger may be transmitted several times to increase the reception probability, or to confirm the time indicated by the trigger, because the time may be changed due to discontinuity in system time clocks. When the time indicated by a trigger is past, the trigger is ignored. The ID of a trigger whose time is past and whose processing is completed may be reused by another trigger. When even one trigger of a given ID is received, a trigger object is delivered instantly. When the time indicated by the trigger has not come yet, doItNow() is false. A trigger whose doItNow() is false is delivered to the application program only once, even though a trigger of the same ID may be transmitted several times. If the trigger time is changed in the middle, however, triggers of the same ID may be delivered several times. When the trigger time is near or has just passed at the event generation time, a trigger whose doItNow() is true is delivered to the application program, and the application program instantly executes the operation indicated by the trigger. After the execution, even if a trigger of the same ID is transmitted, it is treated as a trigger different from the one whose doItNow() was true.
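The delivery rules above can be sketched as a small dispatcher. This is a simplified model: the class, method names, and the millisecond tolerance for "slightly past" triggers are our assumptions, not the middleware's actual implementation.

```java
import java.util.HashMap;
import java.util.Map;

/** Sketch of the receiver-side trigger rules: a trigger whose time is
 *  well past is ignored; an early trigger is announced to the application
 *  only once per (ID, time) pair, with doItNow() false; a trigger whose
 *  time has arrived (or just passed) fires with doItNow() true. */
public class TriggerDispatcher {
    private static final long TOLERANCE_MS = 500;               // "slightly past" still fires
    private final Map<Long, Long> announced = new HashMap<>();  // id -> announced time

    /** Returns "IGNORE", "ANNOUNCE" (doItNow=false) or "FIRE" (doItNow=true). */
    public String onTrigger(long id, long triggerTime, long now) {
        if (triggerTime < now - TOLERANCE_MS) return "IGNORE";  // time already past
        if (triggerTime <= now) {                               // at (or just past) the time
            announced.remove(id);                               // ID may be reused later
            return "FIRE";
        }
        Long prev = announced.get(id);
        if (prev != null && prev == triggerTime) return "IGNORE"; // duplicate retransmission
        announced.put(id, triggerTime);                         // new trigger, or time changed
        return "ANNOUNCE";
    }

    public static void main(String[] args) {
        TriggerDispatcher d = new TriggerDispatcher();
        System.out.println(d.onTrigger(1, 1000, 0));    // ANNOUNCE (early, first copy)
        System.out.println(d.onTrigger(1, 1000, 100));  // IGNORE   (same ID, same time)
        System.out.println(d.onTrigger(1, 1200, 200));  // ANNOUNCE (time was changed)
        System.out.println(d.onTrigger(1, 1200, 1200)); // FIRE     (trigger time reached)
        System.out.println(d.onTrigger(1, 1200, 2000)); // IGNORE   (time well past)
    }
}
```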
PrivateData transmitted in the form of trigger packets is read by using a method of Datagram, which is a superclass of the trigger.
The following Table 2 defines the API for a synchronized service.

Table 2

public interface dmb.io.Trigger

All extended interfaces:
    javax.microedition.io.Datagram

Methods:
    public boolean doItNow()
        Returns false if the trigger is announced prior to the event generation time, or true if the operation should be executed as soon as the event is received.
    public long getID()
        Returns the ID of the trigger.
    public long getTime()
        Returns the time information indicated by the trigger.
Fig. 6 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing data with video data in real-time in accordance with another embodiment of the present invention. The drawing shows an example of a system where a data server 430 uses a time stamp extracting unit 437 when the DMB AV encoder 410 cannot directly output the time information of a video source and the time stamp information thereof to the data server 430. The time stamp extracting unit 437 extracts the video time stamp, e.g., the CTS of the video, from the AV stream outputted from the DMB AV encoder 410, and the extracted time stamp is inputted to the data management and controlling unit 433.
In this case, to calculate the data time stamp (Ts(b)) that constitutes the sync metadata in advance, the encoding delay time of the DMB AV encoder 410, which can be acquired in advance through experiments, should additionally be considered.
Sync data may be serviced more easily when the AV data are encoded in advance and stored in the form of a stream, as shown in Figs. 7 and 8. In this case, the time of a video to which data are to be added and the time stamp thereof can be acquired in advance.
Fig. 7 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing video with data in real-time in accordance with yet another embodiment of the present invention.
In the drawing, the AV stream is stored as an MPEG-2 file or a Forward Error Correction (FEC)-added file thereof.
Referring to Fig. 7, the DMB transmitting system includes an AV signal source 500, an AV encoder 510 for encoding AV signals based on a terrestrial DMB standard, a storage 560 for storing the AV signals encoded in the form of a stream, a data server 530 for generating sync metadata to provide data synchronized with the AV data by using an AV time stamp which is supplied from the storage 560, and multiplexers 520, 540 and 550 for multiplexing the output signals of the AV encoder 510 and the output signals of the data server 530. Herein, the time information of a video to which data are to be added and the data time stamp information for data restoration are inputted to the data server 530 in advance. The data management and controlling unit 533 determines when to add the data and generates the sync metadata based on the time to add the data.
When data are to be provided in the middle of AV stream transmission out of the DMB AV encoder 510 in a general DMB service, which is shown in Fig. 7, stream switching occurs in a switcher. The switcher 520 performs re-stamping, which is a process for guaranteeing the continuity of the time stamps. The time stamp pre-added to a stream file is re-established in the switcher 520 so as to be continuous with the time stamp of the AV stream outputted from the DMB AV encoder 510. In effect, a predetermined value is added to the time stamp. The data server 530 should receive the time stamp of the AV stream re-established in the switcher 520 and information on the time point when the switching occurred, in order to reflect the re-establishment. Since the time stamp re-establishment is a shift of the entire time stamps by a predetermined value, the information needs to be inputted at the moment when the switching occurs.
Fig. 8 is a block diagram illustrating a terrestrial DMB transmitting system for synchronizing video with data in real-time in accordance with still another embodiment of the present invention.
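Returning to the re-stamping performed in the switcher 520 of Fig. 7: the step amounts to shifting every time stamp in the inserted stream file by one constant offset, which the data server must also learn. A minimal sketch, with names and the 90 kHz-tick unit being illustrative assumptions:

```java
/** Sketch of re-stamping: every time stamp in the inserted stream file is
 *  shifted by one constant offset so that it continues from the live
 *  encoder's clock; the data server must shift its pre-computed data time
 *  stamps by the same offset.  Names and units are assumptions. */
public class ReStamper {
    /** Offset that maps the file's first stamp onto the live stream's next stamp. */
    public static long offset(long liveNextStamp, long fileFirstStamp) {
        return liveNextStamp - fileFirstStamp;
    }

    /** Applies the constant offset to every stamp in the file. */
    public static long[] restamp(long[] fileStamps, long offset) {
        long[] out = new long[fileStamps.length];
        for (int i = 0; i < fileStamps.length; i++) out[i] = fileStamps[i] + offset;
        return out;
    }

    public static void main(String[] args) {
        long off = offset(900_000L, 0L);                  // switch occurs at 900000 ticks
        long[] s = restamp(new long[]{0L, 3_000L, 6_000L}, off);
        System.out.println(s[2]);                         // 906000
    }
}
```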
Unlike the transmitting system of Fig. 7, in the transmitting system of Fig. 8 the AV stream is encoded in the form of an MP4 file and directly added to the DMB AV encoder 610.
When the AV stream is encoded into an MPEG-4 format and an MP4 file, which is one of the storage file formats, is added to DMB, it should be packetized into MPEG-4 SL and an MPEG-2 transport stream (TS) in an M4onM2 processing module 620, which is shown in Fig. 8. This process may be carried out inside the DMB AV encoder 610, but it may also be performed in an additional device. When an MP4 file is packetized into MPEG-4 SL and MPEG-2 TS, the relative time information inside the MP4 file is transformed into an OCR or a CTS. The data server 630 already includes the time information of the video to which data are to be added and a data time stamp for data restoration. Thus, if it receives the transform information of the packetization from an MP4 file into MPEG-4 SL and MPEG-2 TS, it can predict the data time stamp, which is the time when the data synchronized with the video are presented.
Fig. 9 is a block diagram showing a receiving system capable of providing a service where data are synchronized with video in accordance with an embodiment of the present invention.
Referring to Fig. 9, the receiving system capable of providing video synchronized with data includes an RF receiving channel decoder 710, an MSC processor 730, a DMB AV decoder 740, a DMB data decoder 760, and a data presenting apparatus 770. The RF receiving channel decoder 710 receives RF signals, demodulates the RF signals into baseband signals, performs channel decoding, and separates FIC data from MSC data. An FIC analyzer 720 analyzes the FIC data, which include multiplexing information and service information, and provides an analysis result to the MSC processor 730. The MSC processor 730 separates the data transmitted through an MSC channel into data packets and an AV stream. The AV stream and the data packets are inputted to the DMB AV decoder 740 and the DMB data decoder 760, respectively, to be decoded. The DMB AV presenting apparatus 750 and the data presenting apparatus 770 present the AV stream and the data at the same restoration time, respectively.
The DMB data decoder 760 receives system reference time information, such as the OCR, from the DMB AV decoder 740 and compares the system reference time information with the CTS-based data time stamp information added to the header of the sync metadata or the header of a data object. The time point at which the two coincide becomes the restoration time of the data object file. Finally, the data presenting apparatus 770 executes the application program directed by the sync metadata, and the related data indicated by the sync metadata and the instant data added to the sync metadata are presented at the extracted restoration time, respectively, so as to be synchronized with the AV data.
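The comparison performed in the DMB data decoder 760 can be sketched as a simple gate: a data object is released for presentation once the system reference time reaches its data time stamp. The tolerance window and all names below are assumptions added for illustration:

```java
/** Sketch of the decoder-side comparison: a data object is presented once
 *  the system reference time (e.g. OCR) reaches its CTS-based data time
 *  stamp.  The small tolerance window is an assumption added to absorb
 *  clock granularity; it is not part of the described method. */
public class SyncPresenter {
    static final long TOLERANCE_TICKS = 90; // ~1 ms at an assumed 90 kHz clock

    public static boolean shouldPresent(long systemRefTime, long dataTimeStamp) {
        return systemRefTime >= dataTimeStamp - TOLERANCE_TICKS;
    }

    public static void main(String[] args) {
        System.out.println(shouldPresent(100_000L, 100_050L)); // true  (within tolerance)
        System.out.println(shouldPresent( 90_000L, 100_000L)); // false (too early)
    }
}
```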
The method of the present invention, which is described above, can be realized as a program and stored in a computer-readable recording medium such as CD-ROM, RAM, ROM, floppy disks, hard disks, magneto-optical disks and the like. Since the process can be easily implemented by those skilled in the art of the present invention, detailed description on it will not be provided herein.
While the present invention has been described with respect to certain preferred embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the scope of the invention as defined in the following claims.

Claims

What is claimed is:
1. A method for providing data synchronized with audio/video (AV) data in digital multimedia broadcasting (DMB), comprising the steps of: a) receiving an AV time stamp for the AV data; b) calculating a time stamp of the data, which is information on a time point when the data are to be presented in a user terminal, which will be shortly referred to as data time stamp hereinafter, based on the AV time stamp; c) generating sync metadata including the calculated data time stamp; and d) encoding the sync metadata and transmitting the encoded sync metadata.
2. The method as recited in claim 1, wherein the AV time stamp is extracted from AV stream acquired from encoding.
3. The method as recited in claim 1, wherein the AV time stamp is provided from an AV stream file in storage.
4. The method as recited in claim 1, wherein the sync metadata are trigger packets composed of information for executing an event which is synchronized with the AV data.
5. The method as recited in claim 4, wherein the trigger packets include a field for identifying a trigger in an application program at a receiving part, a field indicating a time point when an event is generated, and a field indicating data to be executed by the application program at the trigger time point.
6. The method as recited in claim 5, wherein the trigger packets are transmitted repeatedly prior to the trigger time point.
7. A method for providing data synchronized with audio/video (AV) data in digital multimedia broadcasting (DMB), comprising the steps of: a) separating receiving signals into AV stream and data packets; b) receiving a system reference time information from an AV stream decoder for decoding the AV stream; c) acquiring a data time stamp from sync metadata included in the data packets; and d) comparing the data time stamp with the system reference time, and decoding and presenting a data object file at a time point when the data time stamp coincides with the system reference time.
8. The method as recited in claim 7, wherein the sync metadata are trigger packets composed of information for executing an event which is synchronized with the AV data.
9. The method as recited in claim 7, wherein the sync metadata are trigger packets composed of information for executing an event which is synchronized with the AV data.
10. The method as recited in claim 9, wherein the trigger packets include a field for identifying a trigger in an application program at a receiving part, a field indicating a time point when an event is generated, and a field indicating data to be executed by the application program at a trigger time point.
11. The method as recited in claim 9, wherein if the trigger packets are for informing generation of a predetermined event in advance before the event is generated, a false signal is transmitted to the application program of the receiving part, or if the trigger packets are for processing data instantly in response to generation of the event, a true signal is transmitted to the application program of the receiving part.
12. An apparatus for providing data synchronized with audio/video (AV) data in digital multimedia broadcasting (DMB), comprising: an AV encoder for encoding AV signals based on a terrestrial DMB standard and providing an AV time stamp into a data server; a data server for generating sync metadata to provide data synchronized with AV data by using the AV time stamp; a time synchronizing server for temporally synchronizing the AV encoder with the data server; and a multiplexer for multiplexing an output signal of the AV encoder and an output signal of the data server and transmitting a multiplexed signal.
13. The apparatus as recited in claim 12, wherein the data server includes: a data signal source for generating diverse data related to the digital multimedia broadcasting; a data management and controlling unit for calculating a data time stamp, which is a time point when the data are to be presented in a user terminal, based on the AV time stamp, and generating sync metadata including the calculated data time stamp; and a data encoder for encoding the diverse data and the sync metadata into data packets.
14. The apparatus as recited in claim 12, wherein the sync metadata are trigger packets composed of information for executing an event which is synchronized with the AV data.
15. The apparatus as recited in claim 14, wherein the trigger packets include a field for identifying a trigger in an application program at a receiving part, a field indicating a time point when an event is generated, and a field indicating data to be executed by the application program at a trigger time point.
16. The apparatus as recited in claim 14, wherein the trigger packets are transmitted repeatedly before the trigger time point.
17. An apparatus for providing data synchronized with audio/video (AV) data in digital multimedia broadcasting (DMB), comprising: an AV encoder for encoding AV signals based on a terrestrial DMB standard; a storage for storing the encoded AV signals in a form of stream; a data server for generating sync metadata to provide data synchronized with AV data by using an AV time stamp provided from the storage; and a multiplexer for multiplexing an output signal of the AV encoder and an output signal of the data server and transmitting a multiplexed signal.
18. The apparatus as recited in claim 17, wherein the sync metadata are trigger packets composed of information for executing an event which is synchronized with the AV data.
19. The apparatus as recited in claim 18, wherein the trigger packets include a field for identifying a trigger in an application program at a receiving part, a field indicating a time point when an event is generated, and a field indicating data to be executed by the application program at a trigger time point.
20. An apparatus for providing data synchronized with audio/video (AV) data in digital multimedia broadcasting (DMB), comprising: an RF receiving channel decoder for receiving DMB signals, demodulating the DMB signals into baseband signals, and performing channel decoding; an MSC processor for separating AV stream from data packets based on multiplexing information and service information; an AV stream decoder for decoding the AV stream; and a data presenting means for providing the data synchronized with the AV stream by using the sync metadata included in the data packets and a system reference time information provided from the AV stream decoder.
21. The apparatus as recited in claim 20, wherein the sync metadata are trigger packets composed of information for executing an event which is synchronized with the AV data.
22. The apparatus as recited in claim 21, wherein the trigger packets include a field for identifying a trigger in an application program at a receiving part, a field indicating a time point when an event is generated, and a field indicating data to be executed by the application program at a trigger time point.
23. The apparatus as recited in claim 21, wherein if the trigger packets are for informing generation of a predetermined event in advance before the event is generated, a false signal is transmitted to the application program of the receiving part, or if the trigger packets are for processing data instantly in response to generation of the event, a true signal is transmitted to the application program of the receiving part.
EP06768652A 2005-05-26 2006-05-26 Method and apparatus for synchronizing data service with video service in digital multimedia broadcasting Withdrawn EP1884115A4 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR20050044579 2005-05-26
KR20050080642 2005-08-31
PCT/KR2006/002011 WO2006126852A1 (en) 2005-05-26 2006-05-26 Method and apparatus for synchronizing data service with video service in digital multimedia broadcasting

Publications (2)

Publication Number Publication Date
EP1884115A1 EP1884115A1 (en) 2008-02-06
EP1884115A4 true EP1884115A4 (en) 2008-08-06

Family

ID=37452227

Family Applications (1)

Application Number Title Priority Date Filing Date
EP06768652A Withdrawn EP1884115A4 (en) 2005-05-26 2006-05-26 Method and apparatus for synchronizing data service with video service in digital multimedia broadcasting

Country Status (3)

Country Link
EP (1) EP1884115A4 (en)
KR (1) KR100837720B1 (en)
WO (1) WO2006126852A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101226178B1 (en) * 2007-03-27 2013-01-24 삼성전자주식회사 Method and apparatus for displaying video data
KR101382613B1 (en) * 2007-06-20 2014-04-07 한국전자통신연구원 Method of multiplexing transmission frame, apparatus of multiplexing transmission frame and transmission apparatus for digital broadcasting signal
US7936790B2 (en) * 2007-08-30 2011-05-03 Silicon Image, Inc. Synchronizing related data streams in interconnection networks
EP2292013B1 (en) 2008-06-11 2013-12-04 Koninklijke Philips N.V. Synchronization of media stream components
KR101052480B1 (en) * 2008-08-27 2011-07-29 한국전자통신연구원 Broadcast signal transceiver and method
KR101349227B1 (en) 2010-03-29 2014-02-11 한국전자통신연구원 An apparatus and method for providing object information in multimedia system
CN103109540B (en) 2010-07-19 2016-04-27 Lg电子株式会社 The method of transmitting-receiving media file and the device of use the method sending/receiving
KR101236813B1 (en) * 2011-01-13 2013-02-28 주식회사 알티캐스트 Open audio service system in digital broadcast and method thereof
JP5948773B2 (en) 2011-09-22 2016-07-06 ソニー株式会社 Receiving apparatus, receiving method, program, and information processing system
US9118425B2 (en) * 2012-05-31 2015-08-25 Magnum Semiconductor, Inc. Transport stream multiplexers and methods for providing packets on a transport stream
WO2019164361A1 (en) * 2018-02-23 2019-08-29 스타십벤딩머신 주식회사 Streaming device and streaming method
CN111611252B (en) * 2020-04-01 2023-07-18 石化盈科信息技术有限责任公司 Monitoring, device, equipment and storage medium for safety data in data synchronization process

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0680216A2 (en) * 1994-04-28 1995-11-02 Thomson Consumer Electronics, Inc. Apparatus and method for formulating an interactive signal
WO2001026369A1 (en) * 1999-10-05 2001-04-12 Webtv Networks, Inc. Trigger having a time attribute
EP1343323A2 (en) * 2002-03-07 2003-09-10 Chello Broadband NV Display of enhanced content

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000244433A (en) * 1999-02-17 2000-09-08 Sony Corp Data multiplexer and data multiplexing method
KR100392384B1 (en) * 2001-01-13 2003-07-22 한국전자통신연구원 Apparatus and Method for delivery of MPEG-4 data synchronized to MPEG-2 data
JP2002238048A (en) * 2001-02-07 2002-08-23 Nec Corp Mpeg 2 transport stream transmission rate conversion method
CN1269363C (en) * 2001-07-27 2006-08-09 松下电器产业株式会社 Digital broadcast system, sync information replacing apparatus and method
KR100438518B1 (en) * 2001-12-27 2004-07-03 한국전자통신연구원 Apparatus for activating specific region in mpeg-2 video using mpeg-4 scene description and method thereof
KR100406122B1 (en) * 2002-03-29 2003-11-14 한국전자통신연구원 Apparatus and method for injecting synchronized data for digital data broadcasting
KR100646851B1 (en) * 2004-11-03 2006-11-23 한국전자통신연구원 Terrestrial DMB broadcasting tranceiving System for synchronizing data service with audio/video service

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0680216A2 (en) * 1994-04-28 1995-11-02 Thomson Consumer Electronics, Inc. Apparatus and method for formulating an interactive signal
WO2001026369A1 (en) * 1999-10-05 2001-04-12 Webtv Networks, Inc. Trigger having a time attribute
EP1343323A2 (en) * 2002-03-07 2003-09-10 Chello Broadband NV Display of enhanced content

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"Digital Audio Broadcasting (DAB); DMB video service; User Application Specification European Broadcasting Union Union Européenne de Radio-Télévision EBU·UER; ETSI TS 102 428", ETSI STANDARDS, LIS, vol. BC, no. V1.1.1, 1 January 2005 (2005-01-01), XP014030465, ISSN: 0000-0001 *
"Digital Audio Broadcasting (DAB); MOT Slide Show; User Application Specification; ETSI TS 101 499", ETSI STANDARDS, LIS, vol. BC, no. V1.1.1, 1 July 2001 (2001-07-01), XP014006407, ISSN: 0000-0001 *
"DRAFT VERSION 1.1R26 UPDATED 02/02/99; STATE OF THIS DOCUMENT", ADVANCED TELEVISION ENHANCEMENT FORUM SPECIFICATION (ATVEF), XX, XX, 1 February 1998 (1998-02-01), pages 1 - 37, XP002935044 *
See also references of WO2006126852A1 *

Also Published As

Publication number Publication date
EP1884115A1 (en) 2008-02-06
WO2006126852A1 (en) 2006-11-30
KR20060122784A (en) 2006-11-30
KR100837720B1 (en) 2008-06-13

Similar Documents

Publication Publication Date Title
WO2006126852A1 (en) Method and apparatus for synchronizing data service with video service in digital multimedia broadcasting
US10129609B2 (en) Method for transceiving media files and device for transmitting/receiving using same
US10820065B2 (en) Service signaling recovery for multimedia content using embedded watermarks
US7188353B1 (en) System for presenting synchronized HTML documents in digital television receivers
KR20030078354A (en) Apparatus and method for injecting synchronized data for digital data broadcasting
CN102752669A (en) Transfer processing method and system for multi-channel real-time streaming media file and receiving device
CN108111872B (en) Audio live broadcasting system
US10797811B2 (en) Transmitting device and transmitting method, and receiving device and receiving method
US9426506B2 (en) Apparatuses for providing and receiving augmented broadcasting service in hybrid broadcasting environment
CN108174264B (en) Synchronous lyric display method, system, device, medium and equipment
CN101218819A (en) Method and apparatus for synchronizing data service with video service in digital multimedia broadcasting
JP2024040224A (en) Transmission method and transmitting device, and receiving method and receiving device
CN105812961B (en) Adaptive stream media processing method and processing device
EP2814256B1 (en) Method and apparatus for modifying a stream of digital content
US20100205317A1 (en) Transmission, reception and synchronisation of two data streams
EP1487214A1 (en) A method and a system for synchronizing MHP applications in a data packet stream
JP6791344B2 (en) Transmission device and transmission method, and reception device and reception method
JP2021010194A (en) Transmitter, transmission method, receiver and reception method

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20071126

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

A4 Supplementary search report drawn up and despatched

Effective date: 20080704

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20090113

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20141002