US20030088877A1 - Multimedia system with improved data management mechanisms - Google Patents

Multimedia system with improved data management mechanisms

Info

Publication number
US20030088877A1
Authority
US
United States
Prior art keywords
file
multimedia data
request
multimedia
network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/804,946
Inventor
Jason Loveman
Mark Allen
Ronald White
Charles Haynes
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US09/804,946 priority Critical patent/US20030088877A1/en
Publication of US20030088877A1 publication Critical patent/US20030088877A1/en
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034 Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34 Indicating arrangements
    • G11B27/36 Monitoring, i.e. supervising the progress of recording or reproducing
    • G11B2220/00 Record carriers by type
    • G11B2220/40 Combinations of multiple record carriers
    • G11B2220/41 Flat as opposed to hierarchical combination, e.g. library of tapes or discs, CD changer, or groups of record carriers that together store one title
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/48 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually

Definitions

  • the present invention relates generally to a multimedia system with improved data management mechanisms, and more particularly to a method and apparatus for substantially simultaneously encoding multiple versions of a multimedia data signal, and providing substantially simultaneous access and storage of the multiple versions, a correspondence between the multiple versions being generated during storage.
  • To produce a news program, a typical news production organization performs four major operations, which are illustrated in FIG. 1.
  • the operations include video production 10 , graphics production 12 , text production 14 and on-air operations 16 .
  • the results of these operations are rarely combined effectively until the actual broadcast of the news program.
  • Video production 10 includes generating and editing motion video for broadcasting using video information retrieved from a video archive or produced from various sources (e.g., cameras, either studio or field recorded).
  • Text production 14 includes scripting and editing of text gathered from several sources including a text archive.
  • graphics production 12 includes generating and editing graphics data, such as titling and still images gathered from a variety of sources.
  • A conventional process for integrating the major operations is illustrated in FIG. 2.
  • a disk-based video production operation 30 is integrated with a media production process 32 and on-air operations 34 .
  • the use of disk-based digital audio/video storage systems, digital networks, and digital non-linear editing systems has allowed for successful integration of video production, graphics production and on-air operations.
  • Several products are available from Avid Technology, Inc., Tewksbury, Mass., for providing the integration process shown in FIG. 2.
  • the newsroom text production and management system 14 of FIG. 2 is the same text production and management system 14 shown in FIG. 1.
  • although newsroom computer systems have been in use for several years, these computer systems are predominantly text based, and have limited integration capabilities with tape-based or disk-based audio/video production systems.
  • Newsroom computer systems such as those previously available from BaSys, and now from Avid Technology under the name NetStation, have developed from systems which were developed to receive news agency copy and provide simple word processing and communications facilities.
  • add-ons of various kinds have been developed which provide some integration of the text production operation with the audio/video production operation.
  • only limited integration of the text and audio/video data has been achieved, thereby providing only limited multimedia capability.
  • a journalist develops an idea for a story, and determines how various audio/video clips should be used in the story. Often, the journalist will preview audio/video footage that has been archived, and select portions of the archived footage, called clips, for use in the story. Then, the journalist provides instructions to an editor who edits the clips to produce a final form of the story that is suitable for broadcast.
  • the journalist may wish to prepare a rough form of the story and provide the rough form to the editor for final preparation.
  • a rough form of what the journalist expects for the final form of the story is better than verbal or hand written instructions.
  • if the journalist wishes to incorporate video from a previous broadcast that is contained in a video tape archive, the journalist must request that the tape be retrieved manually, and must then review the tape in an edit bay or a similar location.
  • the journalist may then perform some preliminary editing of the archived video, with other material such as video of recent events, text and graphics received over news wire services, and archived text, before providing the rough form to the editor and instructing the editor to prepare the final form of the story for broadcast.
  • the capability to perform the above-identified functions is not available to the journalist in a newsroom system, but as discussed above, must be performed remotely, for example, in an edit bay.
  • a journalist may wish to prepare a story about a particular event while the event unfolds. If the journalist has access to a live feed of the event, it is likely that the journalist will record the event on a video tape using a video tape recorder (VTR), or in a file on a disk using a non-linear disk-based audio/video production system. If the journalist is recording the event on video tape and wishes to prepare a rough form of the story by integrating recorded portions of the event, the journalist must stop the VTR, and rewind the video tape to the specific recorded portions intended for integration.
  • An embodiment of the invention is directed to a multimedia system that includes a multimedia capture and encoding system that captures multimedia data, and provides a first compressed version of the multimedia data having a first resolution and a second compressed version of the multimedia data having a second resolution that is different from the first resolution.
  • the multimedia system further includes a multimedia storage system, coupled to the multimedia capture and encoding system, that stores multimedia information including the first and second compressed versions of the multimedia data.
  • the multimedia system further includes a video editing and playback system coupled to the multimedia storage system.
  • the video editing and playback system includes editing circuitry that generates a composition that uses a portion of the first compressed version, and playback circuitry that plays the composition using a portion of the second compressed version that corresponds to the portion of the first compressed version.
  • Another embodiment of the invention is directed to a multimedia system that includes a multimedia capture and encoding system that captures multimedia data, and provides a compressed version of the multimedia data having a first resolution.
  • the multimedia system further includes a multimedia storage system, coupled to the multimedia capture and encoding system that stores multimedia information including the compressed version of the multimedia data, and provides to a network the compressed version of the multimedia data substantially simultaneously as the compressed version is stored.
  • the multimedia storage system includes a server coupled to the network that sends the compressed version on the network.
  • the multimedia system further includes a video host coupled to the network that sends a first request to the server for a first portion of the compressed version of the multimedia data, determines an amount of time to wait based on a length of the first portion and a response time of the first request, and sends a second request to the server for a second portion of the compressed version of the multimedia data after waiting the determined amount of time.
  • FIG. 1 is a block diagram illustrating components of a typical television news operation;
  • FIG. 2 is a block diagram illustrating components of a typical television news operation having audio/video production capabilities integrated with on-air operations;
  • FIG. 3 is a block diagram of a digital multimedia system according to an embodiment of the present invention;
  • FIG. 4 is a block diagram of a digital multimedia system having a capture manager and an asset manager according to an embodiment of the present invention;
  • FIG. 5 is a block diagram of a digital multimedia system having multiple low resolution encoders and multiple high resolution encoders according to an embodiment of the present invention;
  • FIG. 6 is a block diagram of a digital multimedia system having a browse server according to an embodiment of the present invention;
  • FIG. 7 is a flow diagram of a method performed by a video host of a digital multimedia system, according to an embodiment of the present invention;
  • FIG. 8 is a flow diagram of a method performed by a browse server of a digital multimedia system, according to an embodiment of the invention;
  • FIG. 9 is a view of a dialog window of a capture manager of a digital multimedia system according to an embodiment of the present invention;
  • FIG. 10 is a block diagram of a digital multimedia system having a core newsroom system, a multimedia archive, and a video production system, according to an embodiment of the present invention;
  • FIG. 11 is a view of a graphics user interface of a digital multimedia newsroom production system according to an embodiment of the present invention; and
  • FIG. 12 is a diagram of a multimedia file structure according to an embodiment of the present invention.
  • FIG. 3 shows a digital multimedia system 50 for managing motion video data in accordance with an embodiment of the invention.
  • the multimedia system 50 enables one or more users to effectively manipulate motion video data, text, graphics and audio (i.e., multimedia data) and generate a multimedia composition.
  • the system 50 substantially simultaneously encodes a low resolution version and a high resolution version of multimedia data.
  • a journalist using the system generates a composition using a portion of the low resolution version, and an editor plays the composition using a portion of the high resolution version that corresponds to the portion of the low resolution version.
  • the multimedia system 50 includes a multimedia capture and encoding system 52 that captures multimedia data, and substantially simultaneously provides the first compressed version of the multimedia data having the first resolution, and the second compressed version of the multimedia data having the second resolution that is different than the first resolution.
  • the multimedia system further includes a multimedia storage system 54 , coupled to the multimedia capture and encoding system 52 , that stores multimedia information including the first and second compressed versions of the multimedia data.
  • the multimedia storage system 54 includes a digital computer-readable and writable non-volatile random-access medium, such as a magnetic disk, for storing the first and second compressed versions digitally and non-linearly.
  • the multimedia system 50 further includes a video editing and playback system 56 coupled to the multimedia storage system 54 .
  • the video editing and playback system 56 includes editing circuitry 58 that generates a composition that uses a portion of the first compressed version, and playback circuitry 60 that plays the composition using a portion of the second compressed version that corresponds to the portion of the first compressed version.
  • the composition includes one or more data structures that define a list of video entries. Each video entry indicates a name of a file containing video information, and a range of the file that defines a portion of the video information.
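The composition data structure described above can be sketched as a list of video entries, each naming a media file and the range of it that the story uses. This is a minimal illustrative sketch; the class and field names, frame-based ranges, and file names are assumptions, not taken from the patent:

```python
from dataclasses import dataclass


@dataclass
class VideoEntry:
    """One entry in a composition: a named file plus the portion used."""
    filename: str     # file containing the video information
    start_frame: int  # first frame of the portion
    end_frame: int    # frame after the last frame of the portion


@dataclass
class Composition:
    """An ordered list of video entries defining a story."""
    entries: list

    def duration_frames(self) -> int:
        # Total length of the composition, in frames.
        return sum(e.end_frame - e.start_frame for e in self.entries)


# Example: a two-clip story assembled from archived footage.
story = Composition(entries=[
    VideoEntry("fire_scene.mjpeg", 120, 420),
    VideoEntry("mayor_interview.mjpeg", 0, 900),
])
```

Because a composition stores only file names and ranges, it is small enough to send across the network to an editor without moving any media data, as the patent describes.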
  • the editing circuitry 58 and the playback circuitry 60 are typically used by a journalist and an editor, respectively. Alternatively, both the editing circuitry 58 and the playback circuitry 60 may reside on a single graphics workstation.
  • Another embodiment of the invention is a newsroom production system 700 which is illustrated in FIG. 4.
  • the system 700 substantially simultaneously encodes a low resolution version and a high resolution version of multimedia data, and enables a journalist using the system to generate a composition using a portion of the low resolution version, and an editor to play the composition using a portion of the high resolution version that corresponds to the portion of the low resolution version.
  • the system 700 enables a news production organization to effectively manipulate multimedia data including motion video clips from a variety of sources, as well as text, live presentations by announcers and associated graphics.
  • the system 700 is similar to the system 50 described above in that the system 700 includes a multimedia capture and encoding system 710 , a multimedia storage system 730 and a video editing and playback system 750 .
  • the system further includes a first computer network 704 and a second computer network 706 that are coupled to a bridge 708 .
  • Each of the multimedia capture and encoding system 710 , the multimedia storage system 730 , and the video editing and playback system 750 is coupled to the first network 704 and the second network 706 .
  • the network 706 is an ATM network such as AvidNet available from Avid Technology, Inc., Tewksbury, Mass., which is described in U.S. patent application Ser. No. 08/215,849, which is hereby incorporated by reference.
  • the system 700 also includes an input 702 for receiving multimedia data, from one or more sources.
  • the multimedia capture and encoding system 710 includes a first encoder 712 coupled to the first network 704 , a second encoder 716 coupled to the second network 706 , and an encoding controller 714 interconnected between the encoders 712 and 716 .
  • the encoding controller 714 is also referred to as a capture manager.
  • Each of the encoders 712 and 716 is further coupled to the video input 702 to receive the multimedia data.
  • the multimedia storage system 730 includes a first video server 732 coupled to the first network 704 , a second video server 736 coupled to the second network 706 , and an asset manager 734 .
  • the asset manager 734 is coupled to each of the second video server 736 , the capture manager 714 and the second encoder 716 .
  • the video editing and playback system 750 includes a first graphics workstation 742 coupled to the first network 704 , and a second graphics workstation 744 coupled to the second network 706 .
  • the first graphics workstation includes first editing circuitry 752 coupled to the first network 704 .
  • the second graphics workstation includes playback circuitry 754 coupled to the second network 706 , and second editing circuitry 756 coupled to the second network 706 .
  • the playback circuitry 754 and the second editing circuitry 756 may reside on separate graphics workstations each of which is coupled to the second network 706 . Both the playback circuitry 754 and the second editing circuitry 756 are further coupled to the asset manager 734 .
  • the first and second encoders 712 and 716 substantially simultaneously receive a multimedia data signal from the input 702 .
  • the first encoder 712 outputs over the network 704 a signal containing a first compressed version of the multimedia data.
  • the second encoder 716 outputs over the network 706 a signal containing a second compressed version of the multimedia data.
  • the resolution of the first compressed version is different than the resolution of the second compressed version.
  • the first and second resolutions differ from a time perspective so that one of the versions uses fewer frames than the other over a given interval of time.
  • the first and second resolutions differ spatially, i.e., in the number of pixels used to represent a still image, so that one of the versions provides images of a better clarity than the other version.
  • the first and second resolutions differ both temporally, i.e., the number of images per second of motion video, and spatially.
  • the first compressed version is an MPEG-1 (ISO/IEC 11172-1 through 9) encoded stream
  • the second compressed version is a 60 field per second motion-JPEG (MJPEG) encoded stream of broadcast television quality images so that the first and second compressed versions have different temporal and spatial resolutions.
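The practical consequence of the two resolutions is the difference in storage and network load. The figures below are assumed, illustrative rates (a browse-quality MPEG-1 stream is commonly on the order of 1.5 Mbit/s, while broadcast-quality 60-field motion-JPEG can run tens of Mbit/s); they are not taken from the patent:

```python
# Assumed, illustrative data rates (not from the patent).
LOW_RES_MBPS = 1.5    # browse-quality MPEG-1 stream
HIGH_RES_MBPS = 25.0  # broadcast-quality 60-field motion-JPEG stream


def storage_mbytes(mbps: float, seconds: float) -> float:
    """Storage consumed by a stream of the given rate and duration."""
    return mbps * seconds / 8.0  # megabits -> megabytes


one_hour = 3600
low = storage_mbytes(LOW_RES_MBPS, one_hour)    # roughly 675 MB
high = storage_mbytes(HIGH_RES_MBPS, one_hour)  # roughly 11250 MB
```

Under these assumptions the low resolution version is more than an order of magnitude cheaper to store and move, which is why many journalist workstations can share the low resolution network while the high resolution network is kept isolated.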
  • the first video server 732 receives and stores the first compressed version from the first encoder 712 .
  • the second video server 736 receives and stores the second compressed version from the second encoder 716 . Storage of the first and second compressed versions occurs substantially simultaneously.
  • the first video server 732 is a low resolution video server that stores low resolution multimedia data such as Avid BrowseServer
  • the second video server 736 is a high resolution video server that stores high resolution multimedia data such as Avid MediaServer.
  • Both Avid MediaServer and Avid BrowseServer are motion video storage devices available from Avid Technology, Inc., Tewksbury, Mass.
  • the capture manager 714 controls the asset manager 734 so that a correspondence between the first and second compressed versions is generated.
  • the asset manager 734 initially creates and then maintains a mapping of the first and second compressed versions.
  • the mapping is achieved by storing file identification information and timecode data in a file. If a filename and timecode range identifying a portion of the first compressed version is provided to the asset manager 734 , the asset manager can identify a portion of the second compressed version that corresponds to the portion of the first compressed version. In particular, the asset manager 734 searches the file and retrieves a filename and a timecode range identifying the portion of the second compressed version that corresponds to the portion of the first compressed version. Accordingly, correspondence between the first and second compressed versions is achieved.
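The lookup the asset manager 734 performs can be sketched as a small correspondence table keyed by the low resolution filename. The class, method names, and frame-offset field below are illustrative assumptions; the patent describes only that file identification information and timecode data are stored and searched:

```python
class AssetManager:
    """Minimal sketch of the low/high resolution correspondence table.

    Each record maps a low resolution file to its high resolution
    equivalent, with a timecode offset between the two, so that a
    (filename, timecode range) taken from a browse composition can be
    translated into the matching broadcast-quality portion.
    """

    def __init__(self):
        self._map = {}  # low-res filename -> (high-res filename, offset)

    def register(self, low_file, high_file, offset_frames=0):
        self._map[low_file] = (high_file, offset_frames)

    def resolve(self, low_file, start, end):
        """Return (high-res file, start, end) for a low-res portion."""
        high_file, offset = self._map[low_file]
        return high_file, start + offset, end + offset


am = AssetManager()
am.register("feed1_browse.mpg", "feed1_master.mjpeg", offset_frames=0)
```

With such a table, resolving a portion of the journalist's composition to its high resolution equivalent is a single lookup plus a timecode translation.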
  • Such a mapping mechanism may be implemented by using a form of dynamic linking as disclosed in U.S. Pat. No. 5,267,351 to Reber et al., which is hereby incorporated by reference.
  • the asset manager 734 may maintain indications of high resolution video files which are equivalent to low resolution files. This equivalency can be used to automatically and dynamically associate the appropriate high resolution files with the low resolution files used by the journalist to create a multimedia composition.
  • the capture manager 714 ensures that timecode information is included in the encoded multimedia data. In particular, if timecode information is not included, either the capture manager 714 or the asset manager 734 adds timecode information to the second compressed version before it is stored in the second video server 736 .
  • a journalist in the newsroom operates the first workstation 742 and an editor operates the second workstation 744 .
  • the journalist generates a composition that uses a portion of the first compressed version of the multimedia data having the first resolution. It is not necessary that the composition be suitable for broadcast. Rather, the composition may be a rough form of the journalist's story that an editor can convert into a final form for broadcast.
  • the journalist sends the representation of the composition (not including the media data) to the editor across the first and second networks 704 and 706 through the bridge 708 .
  • the editor can play the composition on the second workstation 744 .
  • the second workstation 744 plays the composition
  • the second workstation plays a portion of the second compressed version of the multimedia data having the second resolution rather than the portion of the first compressed version used by the journalist.
  • the editor converts the composition into the final broadcast form by performing editing operations, such as adding blend and fade transitions between audio/video portions and other special effects, using the second editing circuitry 756 .
  • Another embodiment of the invention is directed to a newsroom production system 800 which is illustrated in FIG. 5.
  • This embodiment is similar to the embodiment of FIG. 4 except that it includes multiple first encoders 812 , multiple second encoders 816 , and multiple first workstations 842 .
  • a news production organization can effectively utilize the system 800 to generate news stories that include broadcast quality motion video clips from a variety of sources.
  • the encoders 812 are low resolution encoders that simultaneously output low resolution compressed versions of multimedia data from various sources.
  • the encoders 816 are high resolution encoders that simultaneously output high resolution compressed versions of the multimedia data from the various sources.
  • a benefit of the system 800 is that the multimedia data transferred through the first network 804 is low resolution data which requires less bandwidth than high resolution data. Accordingly, several workstations 842 can be connected to the first network 804 without experiencing substantial degradation in performance of the first network 804 .
  • the multimedia data transferred through the second network 806 is high resolution data which uses more network bandwidth than low resolution data. Nevertheless, the second network 806 provides suitable performance since it is isolated from network traffic caused by the first workstations 842 . Accordingly, more than one second workstation 844 may be connected to the second network 806 .
  • the bridge 808 allows certain signals to pass from one network to the other. In particular, the bridge 808 allows a journalist working on one of the first workstations 842 to send a generated composition, i.e., one or more data structures that define a story, to an editor working on one of the second workstations 844 .
  • the system 800 can capture, encode and store both low resolution and high resolution versions of more than one audio/video feed simultaneously.
  • the capture manager 814 maintains control of the multiple encoding sessions simultaneously.
  • the input 802 may have a first terminal that is connected to a satellite feed so that the satellite feed can be encoded and stored by the system 800 .
  • the input 802 may have a second terminal that is connected to a live camera so that the camera feed can be encoded and stored simultaneously by the system 800 .
  • a user can view and edit an encoded version of an audio/video feed while the encoded version is being encoded and stored in a file on one of the video servers.
  • a system 890 includes an encoder, a browse server, and a workstation, as shown in FIG. 6. These devices can be a portion of the multimedia system 800 of FIG. 5.
  • the first video server 832 illustrated in FIG. 5 is suitable as the browse server 832 in FIG. 6.
  • the browse server 832 includes a buffer cache 870 and disk-based memory 880 , as shown in FIG. 6.
  • portions of the encoded version are initially cached in the buffer cache 870 by the browse server's operating system.
  • the operating system writes, i.e., flushes these portions from the buffer cache 870 to a file on the disk-based memory 880 .
  • as the operating system writes the portions to the memory, it simultaneously sends network packets including these portions onto the network 804 to one or more workstations 842 for viewing and possible editing, if a request for the encoded version is received from the one or more workstations 842 .
  • the browse server 832 uses the Windows NT operating system available from Microsoft Corporation, Redmond, Wash., which permits data to be stored into a file, and simultaneously read from the file without file contention problems.
  • the system 890 utilizes file access operations provided by the Windows NT operating system so that multimedia data can be flushed from buffer cache 870 to the disk-based memory 880 , and simultaneously sent to one or more workstations 842 through the network 804 .
  • the system utilizes a feature of the NT file system providing the ability to read data from a file while data is appended to the file.
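The read-while-append access pattern the system relies on can be sketched as follows: one handle appends encoded portions to a file while a second handle reads whatever has been stored so far. This is an illustrative single-process sketch (the patent's browse server serves other machines over a network, and relies on the NT file system for cross-process behaviour); the file name and chunk contents are assumptions:

```python
import os
import tempfile

# One handle appends to the file while another reads from it.
path = os.path.join(tempfile.mkdtemp(), "feed.dat")
writer = open(path, "ab")
reader = open(path, "rb")


def append_chunk(data: bytes) -> None:
    writer.write(data)
    writer.flush()  # flush the buffered portion out to the file


def read_new_bytes() -> bytes:
    # Reads from the reader's current offset up to the present end
    # of file, i.e. only the bytes appended since the last read.
    return reader.read()


append_chunk(b"frame-001")
first = read_new_bytes()   # sees the first appended chunk
append_chunk(b"frame-002")
second = read_new_bytes()  # sees only the newly appended bytes
```

The reader never blocks the writer, mirroring the contention-free store-and-read behaviour the patent attributes to the NT file system.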
  • the portions are multicast (or “pushed”) over the network 804 , and any workstations 842 wishing to have access to the portions simply register a request with the browse server 832 . Then, the browse server multicasts the portions to the workstations 842 over the network 804 using IP multicasting as the browse server simultaneously stores the portions.
  • the browse server 832 responds specifically to individual requests for encoded portions, i.e., the portions are “pulled” from the browse server 832 to the workstation 842 .
  • the browse server 832 functions as a web server by providing packets of information onto a computer network in response to individual requests from various workstations 842 .
  • the workstation 842 sends a request onto the network 804 that is received by the browse server 832 .
  • the browse server 832 responds by sending a network packet containing a portion of the encoded version back to the workstation 842 substantially simultaneously as the encoded version is stored into a file on the browse server 832 .
  • the workstation 842 performs the method 1000 illustrated in FIG. 7.
  • the workstation 842 sends a request to the browse server 832 for one or more portions of the encoded version that is being simultaneously stored in the browse server 832 .
  • the workstation 842 waits until it receives portions of the encoded version from the browse server 832 in response to the request.
  • the workstation 842 receives and plays one or more portions, and determines when to send a next request for more portions. The time for sending a next request depends on both the amount of video data received, e.g., the number of portions, and the time it took between sending the request and receiving the data.
  • in step 1006 , the workstation sends the next request expecting to receive one or more new portions of the encoded version a predetermined amount of time before the workstation 842 is through playing the earlier received portions. Accordingly, the workstation 842 attempts to maintain some predetermined amount of lead time. In one embodiment, this lead time is approximately 0.5 seconds so that the workstation 842 sends the next request expecting that the next portions will be received 0.5 seconds before the previous portion is through playing.
  • in step 1008 , the workstation 842 checks whether the end of the file that stores the encoded version has been reached. If so, the method 1000 terminates. Otherwise, the workstation 842 repeats the method 1000 .
  • the workstation 842 uses an active reader thread to acquire the new portions. If more than 6 seconds worth of material is stored by the workstation 842 , the reader thread sleeps for a predetermined amount of time or until it is activated.
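The timing rule in the workstation's request loop can be sketched as a simple calculation: wait until the received material is nearly exhausted, minus the measured response time of the previous request and the desired lead time. The function name and the exact formula are illustrative assumptions; only the inputs (amount of video received, request response time) and the 0.5 second lead come from the text above:

```python
def next_request_delay(play_seconds: float,
                       response_seconds: float,
                       lead_seconds: float = 0.5) -> float:
    """How long the workstation waits before sending the next request.

    play_seconds:     playing time of the portions just received
    response_seconds: time between sending the request and receiving data
    lead_seconds:     new data should arrive this long before playout
                      of the earlier portions finishes
    """
    delay = play_seconds - response_seconds - lead_seconds
    return max(0.0, delay)  # never wait a negative amount of time


# Received 4 s of video; the request took 0.3 s to answer; with a
# 0.5 s lead the client waits 3.2 s before asking for more.
```

If the server is slow (response time approaching the playing time of a portion), the delay collapses to zero and the workstation requests continuously, which is the behaviour one would want at the edge of starvation.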
  • the browse server 832 performs the method 1100 illustrated in FIG. 8.
  • the browse server 832 opens a file to store the encoded version of multimedia data.
  • the browse server 832 polls the network 804 for requests for portions of the encoded version, and when a request is received from the workstation 842 , the browse server 832 sends one or more portions of the encoded version to the workstation 842 .
  • the browse server 832 can track which portions of the encoded version have been sent to the workstation 842 and which portions to send in response to the next request.
  • the browse server 832 can use the ID to determine which file and which read block need to be accessed, and then send the read block and other information such as timecode and length information of the portion or portions of the encoded version defined by the read block. Alternatively, the workstation 842 tracks which portions of the encoded version are needed next, and sends an indication of which portions it needs with the next request. In step 1106 , the browse server 832 determines whether the encoded version has been completely stored, e.g., whether the encoder 812 has been stopped.
  • the browse server 832 proceeds to step 1108 and closes the file, and sends an end of file indication along with any remaining unsent portions when a next request is received from the workstation 842 . Otherwise, the browse server 832 proceeds to step 1104 to continue polling the network and storing the encoded version in the file.
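A minimal sketch of the browse server's side of this exchange (method 1100) might look like the following; the class, its methods, and the per-session bookkeeping are illustrative assumptions rather than the patent's implementation.

```python
# Illustrative sketch of the browse server's request handling: portions of
# the encoded version are appended to a single growing store while requests
# from a workstation are answered with the next unsent read blocks, and an
# end-of-file indication is returned once the encoder has stopped and all
# portions have been sent.

class BrowseSession:
    def __init__(self):
        self.blocks = []        # read blocks of the encoded version, in order
        self.encoder_done = False
        self.next_unsent = {}   # workstation id -> index of next block to send

    def store_block(self, block):
        """Append a newly encoded portion while encoding continues."""
        self.blocks.append(block)

    def handle_request(self, workstation_id):
        """Return (blocks, end_of_file) in response to a workstation request."""
        i = self.next_unsent.get(workstation_id, 0)
        out = self.blocks[i:]
        self.next_unsent[workstation_id] = len(self.blocks)
        eof = self.encoder_done and self.next_unsent[workstation_id] == len(self.blocks)
        return out, eof
```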
  • the workstation 842 may be an Internet client by having an IP address, and the browser server 832 is effectively a server, such as an http server or other kind of server that uses the TCP/IP protocol.
  • communications between the workstation 842 and the browse server 832 are “connectionless.” That is, the requests sent from the workstation 842 to the browse server 832 establish a connection only for the period of time required to transmit network packets of the request. Similarly, another connection is established between the browse server 832 and the workstation 842 for transfer of one or more portions of the encoded version across the network 804 . Otherwise, no connection exists, i.e., no connection stream remains open.
  • http server software may be used by the browse server 832 to handle requests from the workstations 842 , which are configured as web clients.
  • Examples of such software are Microsoft Internet Information Server and Microsoft Peer Web Services, both available from Microsoft Corporation, Redmond, Wash.
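The connectionless exchange described above might be sketched with standard HTTP machinery as follows; the host, path, and query parameter are hypothetical, and only the open-request-close pattern comes from the description.

```python
import http.client

# Sketch of a "connectionless" request: a TCP connection is opened only
# long enough to carry one request and its response, then closed, so no
# connection stream remains open between requests.

def fetch_next_portions(host, clip_id, start_block):
    """Fetch the next portions of an encoded version from a browse server."""
    conn = http.client.HTTPConnection(host, timeout=5)
    try:
        # "Connection: close" asks the server not to keep the socket open.
        conn.request("GET", "/browse/%s?start=%d" % (clip_id, start_block),
                     headers={"Connection": "close"})
        resp = conn.getresponse()
        return resp.status, resp.read()
    finally:
        conn.close()   # no connection persists between requests
```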
  • a journalist working at the workstation 842 may view and edit an encoded version of the multimedia data while it is being stored in the browse server 832 . Accordingly, the journalist may prepare a composition that includes portions of an encoded version of multimedia data, while the multimedia data is being simultaneously stored in a browse server. Furthermore, the journalist is not burdened with having to store the encoded version in multiple files on the same browse server 832 . Portions of the encoded version on the workstations 842 can be accessed with a maximum of 5 seconds of delay from the time the audio/video feed is first provided to the input 802 .
  • Each journalist may create a recording session and maintain control of the recording session using a graphical user interface of the capture manager 814 .
  • This user interface also includes some access features (e.g., viewing and editing capabilities described above) allowing the journalist to access portions of an encoded version as the encoded version is being simultaneously recorded and stored.
  • An example of the graphical user interface is illustrated in FIG. 9.
  • the interface is in the form of a dialog window 900 that includes one or more property pages, each displaying a respective encoding configuration.
  • Each property page includes buttons that enable the user to send commands and information to the capture manager 814 using conventional input methods with a mouse and a keyboard. Other conventional input mechanisms can be substituted for the mouse and keyboard.
  • the dialog window 900 has six property pages named: Network Feed1, Network Feed2, Archive VTR, Projects VTR, Satellite Feed1, and Satellite Feed 2.
  • the property page for Network Feed1 is shown as being presently in the foreground by “Network Feed1” being displayed as the source 902 .
  • the other property pages are shown in the background by tabs 940 with their respective names.
  • the high resolution encoder 816 is named “Jupiter” and the low resolution encoder is named “MR1”.
  • the bottom area 904 of the dialog window 900 displays a plurality of buttons including an “Exit” button 906 for exiting the graphical user interface of the capture manager 814 , a “New” button 908 for creating a new property page for a new encoding configuration, a “Delete” button 910 for deleting a property page, and a “Help” button 912 for obtaining help through a help window (not shown).
  • the dialog window 900 further displays recording status information including an elapsed time 914 of the encoding session, a start time 916 that is assigned to the encoded version of the multimedia data being stored, a recorded headframe 918 that is used as a graphical image representation of the encoded version, and a flashing status 920 that indicates a current state of the encoding session.
  • the dialog window 900 further displays additional control buttons depending on the configuration of the encoding session as identified by its property page.
  • the Network Feed1 property page includes encoder control buttons 922 : “Standby” 924 , “Start” 926 and “Stop” 928 , that allow the user to respectively pause, start and stop an encoding session.
  • the Network Feed1 property page further includes a “Previewer” (not shown) that allows the user to view progress of the encoding session, a “Synchronizer” 932 that allows the user to advance to the end of the currently encoded video to view the latest results of the encoding session, a “Metadata Edit Controller” 930 that allows the user to view and modify portions of the encoded version, and a “Headframe Grabber” 934 that allows the user to select, as the headframe for the encoded version, any frame in the encoded version that has been stored.
  • Each of the property pages in the dialog window 900 is tabbed, as shown in the area 940 of FIG. 9.
  • the capture manager 814 displays the property page associated with the selected tab in the foreground of the dialog window 900 . If the user cannot find an appropriate configuration to select and determines that a new configuration is needed, the user may create a new configuration and a new property page associated with the new configuration by pressing the “New” button 908 .
  • the capture manager 814 will respond by prompting the user for information regarding the new configuration until it has enough information to begin a new encoding session.
  • the capture manager 814 begins encoding when the user selects the “Start” button 926 .
  • the capture manager 814 sends a signal to the low resolution encoder 812 through connection 820 (see FIG. 5) causing it to begin encoding.
  • the capture manager 814 sends this signal to the low resolution encoder 812 when the capture manager 814 receives a signal from the high resolution encoder 816 through connection 818 indicating that the high resolution encoder 816 has started encoding. Accordingly, if the user has started the high resolution encoder 816 , the low resolution encoder 812 is started automatically and simultaneously.
  • the user interface enables a journalist to control multiple live feeds simultaneously from one graphics workstation.
  • the journalist brings the property page for that encoding session to the foreground in the dialog window 900 and performs the desired operations. Then, the journalist can perform an operation on a different encoding session by bringing it to the foreground.
  • the journalist using the capture manager's dialog window 900 can view any portion of the encoded version as long as it has been stored in a file in the browse server 832 .
  • the journalist may jump to the beginning of the version, jump to the middle of the version, and jump to the end of the version. All of these access methods can occur while browse server 832 continues storing additional portions of the encoded version in the same file.
  • the journalist may add markers to the portions of the encoded version in real time.
  • the journalist is not required to wait until an encoding session is over before viewing and marking multimedia data.
  • Another embodiment of the invention is directed to a multimedia newsroom production system 90 , as illustrated in FIG. 10.
  • the newsroom production system 90 enables a news production organization to manipulate effectively multimedia data to generate news stories for broadcasting. Each generated news story may include several broadcast quality motion video clips from various sources.
  • the system 90 includes three major systems, a core newsroom system 100 , a multimedia archive 200 , and a video production system 300 .
  • the components of the systems are interconnected through a single digital network.
  • the single digital network is a 100 Mb/s network.
  • the components of the core newsroom system and the multimedia archive are interconnected using a first digital network 400
  • the components of the video production system are interconnected with a second digital network 410
  • An adaptor box 420 is connected to both the first digital network 400 and the second digital network 410 to enable communication between the two networks.
  • the first digital network 400 is implemented using an Ethernet system having a data rate equal to, or greater than, 100 Mb/s
  • the second digital network 410 is implemented using an Ethernet system having a data rate equal to, or greater than, 10 Mb/s.
  • the adaptor box 420 may be implemented using one of a number of commercially available products such as a FastNet 10 available from Cabletron Systems, Inc, Rochester, N.H.
  • the video production system 300 provides audio/video capture, media data editing, and management and control of high quality multimedia data suitable for broadcast.
  • the multimedia data can be any form of information that can be represented in a digital form.
  • the video production system includes a digital playback system 310 , a video editor 320 , a media recorder 330 connected to an MPEG encoder 340 , a media server 350 including an asset manager 360 , a high bandwidth data network 364 , and a graphics workstation 370 .
  • the media server 350 is a large scale computer that stores and delivers high quality audio and motion JPEG video (MJPEG), suitable for broadcast, in conjunction with the other devices of the video production system 300 .
  • the media server 350 can also function as an archive system for multimedia data produced in the video production system 300 .
  • additional near-line storage and off-line storage is provided on a digital data storage medium, such as tape or optical disks, to relieve the media server of archive responsibilities and to provide additional on-line storage capabilities within the media server 350 .
  • An asset manager 360 is an integral part of the media server 350 and is implemented as software in the media server 350 .
  • the asset manager 360 stores information and is the tool used to manage the data stored in the near-line storage and the off-line storage.
  • the material stored in the media archive can be automatically moved to on-line status on the media server by the asset manager 360 .
  • the asset manager 360 contains search support data for locating media objects stored in the media server 350 , in the near-line storage system and in the offline storage system.
  • the asset manager 360 also contains composition information that can be used to capture, edit, and play back the media objects stored in the media server 350 .
  • the media server 350 also provides translation of low resolution media data compositions, generated within the core newsroom system, to high resolution media data compositions for editing and playback within the video production system.
  • the media server 350 is implemented using an Avid MediaServer™ available from Avid Technology, Inc., Tewksbury, Mass.
  • the media recorder 330 is a disk-based digital recording workstation which is used to capture audio/video data and provide digitization and compression of the audio/video data.
  • the media recorder 330 digitizes, compresses and records audio/video material and transmits the digitized compressed data to the media server over the high speed network for storage on the media server 350 .
  • the media recorder 330 uses an MJPEG encoding scheme to generate high quality, high resolution, compressed digital data suitable for broadcast.
  • an MPEG encoder 340 is coupled to the media recorder 330 to also provide MPEG compression capability.
  • the addition of the MPEG encoder 340 to the media recorder 330 provides the system with a dual-digitizing capability for media data recorded by the media recorder 330 .
  • the MPEG encoder provides greater compression of the data than the media recorder 330 , thereby allowing the data to be efficiently transmitted over the Ethernet network 400 to be played on the journalist workstations 110 .
  • the MPEG encoder 340 has a direct connection to the digital network 400 to provide MPEG encoded media data to the multimedia archive 200 .
  • the media recorder 330 is implemented using an Avid Media Recorder™ available from Avid Technology Inc., Tewksbury, Mass.
  • the video editor 320 is a full-featured, digital, non-linear video editing workstation specifically tailored to provide functions for news editing.
  • the video editor provides editing of high resolution broadcast quality images provided by the media server 350 .
  • the video editor is implemented using an Avid NewsCutter™ or an Avid Media Composer®, both of which are available from Avid Technology Inc., Tewksbury, Mass.
  • the digital playback system 310 is a digital, disk-based playback system that manages the broadcast to air of multimedia data produced and stored within the video production system 300 .
  • the digital playback system 310 plays materials stored either locally or on the media server 350 in accordance with play lists generated from a program lineup created on one of the journalist workstations 110 within the core newsroom system 100 , or on a workstation directly coupled to the video production system (not shown).
  • the digital playback system 310 is implemented using an Avid AirPlay® available from Avid Technology, Inc., Tewksbury, Mass.
  • the high bandwidth network 364 provides high speed communication between the components of the video production system 300 .
  • the high bandwidth network 364 is implemented using an ATM network as described in co-pending U.S. patent application Ser. No. 08/249,849, titled An Apparatus and Computer Implemented Process For Providing Real-Time Multimedia Data Transport in a Distributed Computing System, which is incorporated herein by reference.
  • the high bandwidth network 364 supports real time playback of broadcast quality MJPEG video and multi-track audio over fiber optic networks.
  • the graphics workstation 370 is used for generating and editing graphics material for broadcast from and storage in the video production system.
  • the graphics workstation 370 is implemented using a Matador Workstation available from Avid Technology, Inc., Tewksbury, Mass.
  • the media recorder 330 and the MPEG encoder 340 form a multimedia capture and encoding system, as illustrated in the embodiment of FIG. 3.
  • the combination of the media recorder 330 and the MPEG encoder 340 captures multimedia data, and substantially simultaneously provides a first compressed version of the multimedia data having a first resolution (e.g., MPEG), and a second compressed version of the multimedia data having a second resolution (e.g., MJPEG) that is different than the first resolution.
  • the graphics workstation 370 forms playback circuitry 60 of a video editing and playback system 56 , as illustrated in FIG. 3.
  • the graphics workstation plays compositions that use compressed versions of multimedia data stored in the media server 350 .
  • the compositions may be generated by the core newsroom system 100 using different compressed versions of multimedia data stored in the multimedia archive system 200 .
  • the core newsroom system 100 consists primarily of a number of journalist workstations 110 and a pair of news servers 120 .
  • FIG. 10 shows a newsroom system having three journalist workstations 110 .
  • the number of workstations 110 actually used may be much greater than three; the actual number of journalist workstations 110 that may be used in the system depends on several factors, including the amount of network activity generated by each user of the workstations and the amount of delay each user will tolerate in accessing the system.
  • each of the journalist workstations 110 is implemented using an MPC III compliant workstation.
  • the journalist workstation 110 provides access to multimedia data from a variety of sources and includes the tools (i.e. software) necessary to create a multimedia storyboard of a news story for broadcast.
  • the multimedia data available to the journalist includes the low resolution MPEG video data captured by the media recorder.
  • each of the journalist workstations 110 includes a video port for receiving video from, for example, a VTR.
  • Each of the journalist workstations 110 also includes a serial port for controlling the VTR.
  • the graphics user interface of the journalist workstation 110 and the functions available to a user of the journalist workstation 110 are described in greater detail below.
  • the news servers 120 provide management and storage of the multimedia data in the newsroom environment.
  • the news servers 120 are configured as distributed processors with mirrored databases to provide maximum reliability and performance. Other centralized functions, such as communications functions, are managed by the news servers 120 .
  • the news servers 120 are implemented using an Avid NewsServer available from Avid Technology, Inc., Tewksbury, Mass.
  • the news servers 120 have external connections 122 for providing access to news wire services and to allow remote access to the news servers 120 from users external to the core newsroom system.
  • the core newsroom system 100 may also include one or more terminal servers 140 to provide connection to the digital network 400 for user terminals 130 .
  • the user terminals may be one of several different terminals used in prior art systems primarily for text processing and communications functions.
  • a device controller 150 or a number of device controllers 150 , may also be coupled to the digital network 400 to provide control of several multimedia devices, such as teleprompters, from the journalist workstations.
  • a journalist workstation 110 of the core newsroom system 100 in combination with a graphics workstation 370 of the video production system 300 form a video editing and playback system 56 , as illustrated in the embodiment of FIG. 3.
  • the journalist workstation 110 forms editing circuitry 58 that generates a composition that uses a portion of a first compressed version of multimedia data having a first resolution.
  • the graphics workstation 370 forms playback circuitry 60 that plays the composition using a portion of a second compressed version of the multimedia data stored in the media server 350 .
  • the multimedia archive (MMA) 200 includes a library server 210 and one or more object servers 220 .
  • the library server 210 holds catalog and search support meta data for locating objects stored in the multimedia archive 200 .
  • the object server 220 provides the primary storage media for browsing and archival of material generated during news gathering and production processes.
  • the object server 220 works in conjunction with the library server 210 to facilitate distribution of multimedia material to the journalist workstations 110 .
  • the objects stored in the multimedia archive can be low resolution versions of video, audio, graphics, and text.
  • the MMA can be used to store finished stories, audio, video and other content for reuse in creating new stories.
  • the multimedia archive 200 is implemented using the IBM Digital Library 5765-258.
  • the multimedia archive system 200 in combination with the media server 350 of the video production system form a multimedia storage system 54 , as illustrated in the embodiment of FIG. 3.
  • the multimedia archive system 200 and the media server 350 are coupled to the media recorder 330 and the MPEG encoder 340 that form the multimedia capture and encoding system 52 , and are further coupled to the journalist workstations 110 and the graphics workstation 370 that form the video editing and playback system 56 .
  • the multimedia archive system 200 and the media server 350 store multimedia information including the first and second compressed versions of the multimedia data, which are described above.
  • the operation of the digital multimedia newsroom production system 90 shown in FIG. 10 is described below.
  • the operation of the system 90 can be described as a collection of distinct function specific workloads characterized at a high level as asset creation, asset use, asset storage, and asset administration.
  • the system 90 provides the capability for the following functions:
  • the news servers 120 provide capability for capture and storage of news wire text data through the external interfaces 122 .
  • News wire text stories are captured by the news servers 120 and cataloged in a database of the news servers 120 .
  • a user of one of the journalist workstations 110 may access the news servers' databases as a system librarian to search, browse and retrieve the wire service data stored in the databases of the news servers 120 . It is not generally necessary to store all text stories captured by the news servers 120 in the multimedia archive 200 .
  • a system administrator may access the news servers through one of the journalist workstations 110 , browse the catalog of data received from the news wires, determine what stories are appropriate for storage in the multimedia archive 200 and command the news servers 120 to transfer selected data to the multimedia archive 200 for storage.
  • a user of the journalist workstation 110 can access text through the news servers 120 and can create text and scripts from scratch or can use existing text and scripts stored in the news servers 120 or in the multimedia archive 200 in the creation of text and scripts.
  • the user can search, browse and retrieve text data stored in the news servers 120 and the multimedia archive 200 .
  • the user can perform this searching and browsing using complex, full-text search techniques, thereby allowing efficient research by focusing the searching to retrieve data specifically relevant to the user's needs.
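The kind of focused full-text query described above might be sketched as follows; real news servers would use an indexed search engine, and this linear scan over a hypothetical catalog is purely illustrative.

```python
# Sketch of a focused full-text search: a catalog of stories is filtered
# so that only entries containing every search term are returned, which is
# how the searching can be narrowed to data specifically relevant to the
# user's needs.

def full_text_search(catalog, terms):
    """Return story ids whose text contains all of the given terms."""
    terms = [t.lower() for t in terms]
    return [story_id for story_id, text in catalog.items()
            if all(t in text.lower() for t in terms)]
```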
  • High resolution media data utilized by the video production system is captured in the system by the media recorder 330 .
  • the high resolution media data is captured in the media recorder 330 , digitized and compressed using a broadcast quality compression technique such as MJPEG.
  • the media data captured by the media recorder 330 is transferred in compressed form to the media server 350 and is registered and stored in the media server 350 by the asset manager 360 .
  • a low resolution version of the media data is simultaneously created with the high resolution media data.
  • the high resolution media data can be browsed and edited using the video editor 320 and can be broadcast to air using the digital playback system 310 .
  • low resolution video is used by the journalist workstations 110 to provide limited editing capability.
  • a user of the video production system 300 may wish to edit low resolution media data.
  • the low resolution media data may either be a low resolution composition created by a user of a journalist workstation 110 or a low resolution version of media data captured by the media recorder 330 .
  • the video production system 300 user may search the multimedia archive 200 over the network 400 or may search the asset manager 360 over the network 400 to retrieve the low resolution media data.
  • the video editor 320 may transfer edited low resolution media data to the multimedia archive 200 for cataloging and storage therein.
  • the low resolution media data is stored in, cataloged by and retrieved from the multimedia archive 200 .
  • the low resolution media data is captured in the system 90 using the media recorder 330 .
  • the media recorder 330 performs a dual resolution digitization of media data to be captured by the system 90 .
  • the media recorder 330 in conjunction with the MPEG encoder 340 , performs a dual resolution digitization of the media data to simultaneously produce a high resolution version of the media data and a low resolution version of the media data.
  • the high resolution version of the media data is digitized and compressed in a preferred embodiment using an MJPEG encoding format.
  • the low resolution video is compressed in a preferred embodiment using known, high compression encoding techniques such as MPEG or Quick Time, available from Apple Computer, Inc, Cupertino, CA. Although it is preferred to use either MPEG or Quick Time, another compression technique which results in a high compression ratio of the media data may also be used.
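The dual-resolution capture described above can be illustrated schematically; the encoder functions below are stand-ins for real MJPEG and MPEG codecs, and the size figures are arbitrary, chosen only to show that the low resolution version has a much higher compression ratio.

```python
# Sketch of dual-resolution digitization: one capture pass yields both a
# broadcast-quality high resolution version (MJPEG-style, low compression)
# and a low resolution version (MPEG-style, high compression) suitable for
# browsing over the network.

def encode_high_res(frames):
    # stand-in for broadcast-quality MJPEG encoding (low compression ratio)
    return {"format": "MJPEG", "size": len(frames) * 100}

def encode_low_res(frames):
    # stand-in for MPEG-style encoding (high compression ratio)
    return {"format": "MPEG", "size": len(frames) * 10}

def dual_capture(frames):
    """Capture once, producing both versions substantially simultaneously."""
    return encode_high_res(frames), encode_low_res(frames)
```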
  • One of the primary features of the system 90 shown in FIG. 10 is the ability to provide a user of the journalist workstations 110 with low resolution video to allow browsing and editing of the low resolution video to create storyboards which may ultimately be used by an editor using the video editor 320 to create broadcast quality media data.
  • the low resolution editing feature allows the journalist to become more involved in the finished media product and to incorporate archived media data into storyboards without the need for manual retrieval of video tapes and operation of a video tape player in an edit bay as in previous systems.
  • a journalist using the journalist workstation 110 , can search the data contained within the library server 210 of the multimedia archive 200 for low resolution video data, audio data and text related to a story that the journalist is composing on the journalist workstation 110 .
  • in response to a key word search, the multimedia archive provides a list of material contained therein related to the key words. The journalist can then select media data for browsing or editing on the journalist workstation 110 from the list of material.
  • the graphics user interface for storyboard creation provided to the journalist at the journalist workstation 110 is shown in FIG. 11.
  • the user interface 500 includes a number of windows including a viewing window 510 , a clipnotes window 520 , a storyboard window 530 , a storynotes window 540 and a script window 550 .
  • the script window 550 provides an area in which the journalist can write the main script of a story being composed on the journalist workstation 110 .
  • Text can be generated in this window using standard word processing commands.
  • Graphics, including painting functions, can be performed on the journalist workstation 110 and incorporated into the storyboard.
  • the viewing window 510 displays a low resolution video component of low resolution media data to be viewed and edited on the journalist workstation 110 .
  • the viewing window also displays the time code 516 of the video being displayed, machine controls 518 , and editing functions such as mark in 512 a and mark out 512 b buttons.
  • the machine controls 518 provide controls for playing a video clip in the viewing window and are similar to standard VTR controls.
  • the machine controls can be selected by the user using a pointing device, such as a mouse, or by using special function keys on a keyboard of the journalist workstation 110 . Selecting a clip for display in the viewing window may be done by dragging a clip from the storyboard window 530 (described below) or by selecting a new clip from the multimedia archive 200 .
  • a second viewing window can be opened on the screen at the same time as the viewing window 510 .
  • the second viewing window, in a preferred embodiment, is made visible by either shrinking or eliminating the storynotes window 540 .
  • the mark in button 512 a and the mark out button 512 b are super-imposed in the upper left and upper right corners of the viewing window. These buttons are used to perform editing functions at the journalist workstation 110 .
  • audio data associated with the video data is played on speakers of the journalist workstation 110 .
  • a “video only” or “audio only” indication will appear on the video window when the media data being displayed or played on the workstation consists of audio only or video only.
  • the clipnotes window 520 provides a notepad for entry of short notes for each clip viewed on the viewing window 510 .
  • the storynotes window 540 provides an area for the entry of notes that apply to the whole story to be edited, as opposed to the clipnotes window 520 , which is for notes on individual clips.
  • the storyboard window 530 allows clips and subclips to be laid out in sequence.
  • Each of the clips 532 shown in the storyboard window 530 typically shows the first frame of the corresponding clip; however, the user may select a frame other than the first frame to be shown in the storyboard window.
  • The collection of clips stored in the storyboard window is referred to as a bin.
  • the journalist has the option of playing one of the clips in the viewing window or playing the bin of clips as arranged in the storyboard window.
  • the final pre-edited composition contained on the journalist workstation 110 may be transferred to the multimedia archive 200 for reuse by the journalist or other journalists on other journalist workstations 110 and for final editing and playout by a user of the video production system 300 .
  • a composition produced during a low resolution activity on a journalist workstation 110 may be played out in different ways.
  • a user of a journalist workstation 110 may play the low resolution composition by retrieving the composition data from the multimedia archive 200 , or a user of the video production system 300 , for example a user of the video editor 320 , may play and edit a high resolution version of the composition.
  • the translation of the low resolution composition to its high resolution equivalent is transparent to the user of the video editor 320 .
  • the asset manager 360 , using registration information for each of the low resolution sources used in the composition, can identify the equivalent high resolution sources and translate the low resolution composition into its high resolution equivalent. Efficient translation by the asset manager 360 requires a unique registration system for each of the clips stored within the system. Further, the registration method must include means for identifying the corresponding high resolution version of low resolution media data. A preferred registration method is described in detail further below.
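The translation step might be sketched as follows, assuming each clip is registered under a unique ID shared by its low and high resolution versions; the registry layout and function signature are assumptions made for illustration.

```python
# Sketch of translating a low resolution composition into its high
# resolution equivalent: each edit references a registered clip ID, and
# the registration record identifies the corresponding high resolution
# version of the low resolution source.

def translate_composition(composition, registry):
    """Replace each low resolution source with its high resolution equivalent.

    composition: list of (clip_id, mark_in, mark_out) edits
    registry:    clip_id -> {"low": source, "high": source}
    """
    return [(registry[clip_id]["high"], mark_in, mark_out)
            for clip_id, mark_in, mark_out in composition]
```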
  • An editor, using the video editor 320 , receives the high resolution version of the low resolution composition created by the journalist, and can further edit the composition in broadcast quality format to provide more precise editing cuts than those accomplished by the journalist.
  • the media data is organized in a media container 600 as shown in FIG. 12.
  • the media container 600 is divided into five subsections including container data 610, container timing 620, media security 630, meta data 640 and media or media pointers 650.
  • the information contained within the container data 610 describes the container itself and may include the following information: the name of the person that created the container data; the name of the person that approved the container data; an identification of the container security; a creation time stamp; the name of all people that have modified the data; a modification time stamp; a user's log; cost information associated with the data; and other user defined elements.
  • Container timing 620 includes information related to a relationship over time of the media in the container. This information is only applicable to a story being prepared for broadcast.
  • the media security segment 630 provides further information concerning the security level of the media contained within the container. This information can be used to restrict access to the media contained within the container to specified personnel.
  • the meta data information describes the media stored in the container.
  • the meta data contains the following information for each media object in the container: the name of the person that approved the data; the name of the person that created the data; a creation time stamp; a media identifier; media status; media type; names of all people that have modified the data; a modification time stamp; a reference number; research descriptors; timing information; title; and other user defined elements.
  • the media and media pointers 650 are the actual raw data stored in the container. Media objects of many types may be stored in a single container. The media pointers point to a media object stored in another container. By storing a media pointer to another container, rather than the media of the other container itself, maximum storage efficiency can be attained throughout the system.
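The container structure described in the preceding bullets could be sketched as follows. This is a minimal illustration, not the patent's actual file layout; the field names and types are assumptions, and each subsection is reduced to a simple Python value. Note how a MediaPointer stands in for media held in another container, which is the storage-efficiency mechanism described above.

```python
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class MediaPointer:
    """Reference to a media object stored in another container,
    kept instead of a duplicate copy of the raw media."""
    container_id: str
    media_id: str

@dataclass
class MediaObject:
    """A raw media object stored directly in this container."""
    media_id: str
    media_type: str          # e.g. "video", "audio", "still", "text"
    raw_data: bytes

@dataclass
class MediaContainer:
    """Sketch of the five-part media container 600 (FIG. 12)."""
    container_data: dict     # creator, approver, security id, timestamps, log, cost
    container_timing: dict   # time relationships of the media (broadcast stories only)
    media_security: dict     # access restrictions on the contained media
    meta_data: List[dict]    # one descriptor per media object (title, status, type, ...)
    media: List[Union[MediaObject, MediaPointer]] = field(default_factory=list)

# A pointer avoids duplicating media that already lives in container "c-0042":
c = MediaContainer({}, {}, {}, [],
                   [MediaPointer(container_id="c-0042", media_id="m-7")])
```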
  • File structures other than the container file structure described above may be used for storing the media data in the digital multimedia newsroom production system, for example, the OMF™ (Open Media Framework) file structure.
  • Another feature of the system shown in FIG. 10 is the ability to uniquely identify the media objects stored within the system and to locate other versions of media data that correspond to the media objects.
  • the ability of the system 90 to locate a high resolution version of media data, corresponding to a low resolution version of the same media data, allows the asset manager 360 to provide a high resolution translation of combinations or storyboards generated by the journalist workstation 110, such that the translation is transparent to an editor using the video editor 320.
  • the asset manager can uniquely identify the low resolution and high resolution media data in a number of ways.
  • the media data, when captured by the media recorder 330, is assigned a unique time code stamp corresponding to the date and time that the media data is captured by the media recorder 330.
  • the low resolution version of the media data and the high resolution version of the media data are assigned the same identification number.
  • because the low resolution media data is stored in the multimedia archive 200 and the high resolution media data is stored in the media server, there is no opportunity for confusion between the versions of the media data.
  • the asset manager, in translating a combination or storyboard from a low resolution version to a high resolution version, can locate the high resolution version of each media object of the combination in the media server based on the identification number of the corresponding low resolution version of the media object.
  • the above-described media data identifying method is not preferred for use at broadcast locations that do not maintain a unique timecode stamp.
  • the asset manager 360 may be implemented using Media File Manager (MFM) and Source Manager (SM) software as described in U.S. Pat. No. 5,267,351 to Reber et al., which is incorporated herein by reference.
  • This software provides a unique identifier to media data captured by the system and maintains a table of relationships between media objects contained within the system such that the asset manager 360 can identify a corresponding version of low resolution or high resolution media data.
  • a digital multimedia newsroom production system consists only of the core newsroom system 100 and the multimedia archive system 200 coupled by the digital network 400.
  • a low resolution capture device is coupled to the network 400 to capture low resolution media data for storage in the news servers 120 and the multimedia archive system 200.
  • the journalist workstations 110 provide the full storyboard functions described above with respect to the system 90 shown in FIG. 10.
  • Embodiments of the invention overcome limitations of prior art systems by providing a fully integrated digital multimedia newsroom.
  • a journalist in a newsroom may create a multimedia storyboard of a news story which is electronically transferred over a digital network to an editing and production system for final editing and broadcast to air.
  • Embodiments of the invention have been described with respect to a multimedia production system in a newsroom environment; however, embodiments of the invention are not limited to a newsroom environment, but rather may be used in other multimedia environments as well, such as radio and the production of entertainment programming.
  • the multimedia data processed on the journalist workstation 110 has been described as low resolution multimedia data.
  • the user interface provided by the journalist workstation 110 may also be used to create storyboards using high resolution multimedia data.
  • the embodiments have been described in a newsroom context.
  • the invention may be applied anywhere in the movie, television and cable industry where multimedia data, and particularly motion video data, is to be processed.
  • the invention is suitable for active movie systems, video conferencing, and cable pay-per-view systems.

Abstract

A digital multimedia newsroom production system allows users of the system to create, browse and catalog multimedia assets. The system includes a multimedia capture and encoding system that captures multimedia data, and substantially simultaneously provides a first compressed version of the multimedia data having a first resolution, and a second compressed version of the multimedia data having a second resolution that is different than the first resolution; a multimedia storage system, coupled to the multimedia capture and encoding system, that stores multimedia information including the first and second compressed versions of the multimedia data; and a video editing and playback system coupled to the multimedia storage system. The video editing and playback system includes editing circuitry that generates a composition that uses a portion of the first compressed version, and playback circuitry that plays the composition using a portion of the second compressed version that corresponds to the portion of the first compressed version. The multimedia storage system stores multimedia information including the compressed versions of the multimedia data, and provides to a network the first compressed version of the multimedia data substantially simultaneously as the first compressed version is stored.

Description

    CROSS-REFERENCE OF RELATED APPLICATION
  • This application is a continuing application of U.S. patent application Ser. No. 09/173,815, filed Oct. 16, 1998, which is a continuing application of U.S. patent application Ser. No. 09/019,945, filed Feb. 6, 1998, which is a continuing application of U.S. patent application Ser. No. 08/832,868, filed Apr. 4, 1997, now abandoned.[0001]
  • FIELD OF THE INVENTION
  • The present invention relates generally to a multimedia system with improved data management mechanisms, and more particularly to a method and apparatus for substantially simultaneously encoding multiple versions of a multimedia data signal, and providing substantially simultaneous access and storage of the multiple versions, a correspondence between the multiple versions being generated during storage. [0002]
  • BACKGROUND OF THE INVENTION
  • Over the last few decades, the process for producing broadcast news programs has undergone several changes. Increased competition brought about by the expansion of cable outlets and other news sources, and changes in technology, have forced news broadcasters to use their resources more effectively. [0003]
  • To produce a news program, a typical news production organization performs four major operations, which are illustrated in FIG. 1. In particular, the operations include video production 10, graphics production 12, text production 14 and on-air operations 16. Unfortunately, the results of these operations are rarely combined effectively until the actual broadcast of the news program. [0004]
  • Video production 10 includes generating and editing motion video for broadcasting using video information retrieved from a video archive or produced from various sources (e.g., cameras, either studio or field recorded). Text production 14 includes scripting and editing of text gathered from several sources including a text archive. Similar to video production 10 and text production 14, graphics production 12 includes generating and editing graphics data, such as titling and still images gathered from a variety of sources. [0005]
  • In order to produce a final news product for broadcast, results from video production 10, graphics production 12 and text production 14 must be properly integrated during the on-air operations 16. Existing news broadcast systems are capable of such integration. In particular, these systems permit complete management of the audio and video elements of the news program from acquisition, through editing, distribution and on-air play. [0006]
  • A conventional process for integrating the major operations is illustrated in FIG. 2. As shown in FIG. 2, a disk-based video production operation 30 is integrated with a media production process 32 and on-air operations 34. The use of disk-based digital audio/video storage systems, digital networks, and digital non-linear editing systems has allowed for successful integration of video production, graphics production and on-air operations. Several products are available from Avid Technology, Inc., Tewksbury, Mass., for providing the integration process shown in FIG. 2. [0007]
  • The newsroom text production and management system 14 of FIG. 2 is the same text production and management system 14 shown in FIG. 1. Although newsroom computer systems have been in use for several years, these computer systems are predominantly text based, and have limited integration capabilities with tape-based or disk-based audio/video production systems. Newsroom computer systems, such as those previously available from BaSys, and now from Avid Technology under the name NetStation, have evolved from systems which were developed to receive news agency copy and provide simple word processing and communications facilities. In more recent years, add-ons of various kinds have been developed which provide some integration of the text production operation with the audio/video production operation. However, only limited integration of the text and audio/video data has been achieved, thereby providing only limited multimedia capability. [0008]
  • In a typical news production organization, a journalist develops an idea for a story, and determines how various audio/video clips should be used in the story. Often, the journalist will preview audio/video footage that has been archived, and select portions of the archived footage, called clips, for use in the story. Then, the journalist provides instructions to an editor who edits the clips to produce a final form of the story that is suitable for broadcast. [0009]
  • In some instances, particularly if the story is complex, the journalist may wish to prepare a rough form of the story and provide the rough form to the editor for final preparation. A rough form of what the journalist expects for the final form of the story is better than verbal or hand written instructions. To this end, if the journalist wishes to incorporate video from a previous broadcast that is contained in a video tape archive, the journalist must request that the tape be retrieved manually, and must then review the tape in an edit bay or a similar location. The journalist may then perform some preliminary editing of the archived video, with other material such as video of recent events, text and graphics received over news wire services, and archived text, before providing the rough form to the editor and instructing the editor to prepare the final form of the story for broadcast. In present day systems, the capability to perform the above-identified functions is not available to the journalist in a newsroom system, but as discussed above, must be performed remotely, for example, in an edit bay. [0010]
  • Furthermore, a journalist may wish to prepare a story about a particular event while the event unfolds. If the journalist has access to a live feed of the event, it is likely that the journalist will record the event on a video tape using a video tape recorder (VTR), or in a file on a disk using a non-linear disk-based audio/video production system. If the journalist is recording the event on video tape and wishes to prepare a rough form of the story by integrating recorded portions of the event, the journalist must stop the VTR, and rewind the video tape to the specific recorded portions intended for integration. If new developments occur while the journalist is using the VTR to integrate the recorded portions, the live feed of these new developments will be lost unless the live feed is recorded simultaneously on a second tape using a second VTR. Similarly, if the journalist is using a conventional non-linear disk-based audio/video production system to record the live feed in a file, the journalist must terminate the recording before the journalist can access the recorded portions from the file for integration into the story. To record additional developments of the event on the disk-based system, the journalist must record the additional developments into a second file. Storage of the event among multiple tapes and files is inefficient and requires additional overhead to keep track of multiple tapes and files. [0011]
  • SUMMARY OF THE INVENTION
  • An embodiment of the invention is directed to a multimedia system that includes a multimedia capture and encoding system that captures multimedia data, and provides a first compressed version of the multimedia data having a first resolution and a second compressed version of the multimedia data having a second resolution that is different from the first resolution. The multimedia system further includes a multimedia storage system, coupled to the multimedia capture and encoding system, that stores multimedia information including the first and second compressed versions of the multimedia data. The multimedia system further includes a video editing and playback system coupled to the multimedia storage system. The video editing and playback system includes editing circuitry that generates a composition that uses a portion of the first compressed version, and playback circuitry that plays the composition using a portion of the second compressed version that corresponds to the portion of the first compressed version. [0012]
  • Another embodiment of the invention is directed to a multimedia system that includes a multimedia capture and encoding system that captures multimedia data, and provides a compressed version of the multimedia data having a first resolution. The multimedia system further includes a multimedia storage system, coupled to the multimedia capture and encoding system that stores multimedia information including the compressed version of the multimedia data, and provides to a network the compressed version of the multimedia data substantially simultaneously as the compressed version is stored. [0013]
  • According to an embodiment of the invention, the multimedia storage system includes a server coupled to the network that sends the compressed version on the network. [0014]
  • According to another embodiment, the multimedia system further includes a video host coupled to the network that sends a first request to the server for a first portion of the compressed version of the multimedia data, determines an amount of time to wait based on a length of the first portion and a response time of the first request, and sends a second request to the server for a second portion of the compressed version of the multimedia data after waiting the determined amount of time.[0015]
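The request-pacing behavior of the video host described above can be sketched as follows. This is an illustrative sketch only: `request_portion` is a hypothetical stand-in for the actual request sent to the server, and the representation of a portion's length as a play duration in seconds is an assumption. The key idea from the text is that the wait before the next request is derived from the length of the portion just fetched minus the response time of the request for it.

```python
import time

def fetch_portions(request_portion, portions, playback_rate=1.0):
    """Request portions of a compressed stream one at a time, waiting between
    requests so that data arrives roughly at playback rate.

    request_portion -- hypothetical callable that sends one request to the
                       server and returns the received data
    portions        -- dicts with a "length" key: play duration in seconds
    """
    received = []
    for portion in portions:
        start = time.monotonic()
        received.append(request_portion(portion))
        response_time = time.monotonic() - start
        # Wait = portion's play time minus how long the request itself took,
        # never negative; then issue the next request.
        wait = max(0.0, portion["length"] / playback_rate - response_time)
        time.sleep(wait)
    return received
```

A design note: pacing requests this way keeps the host from draining the server faster than playback consumes the data, which matters when the file is still being written by the encoder, as in the browse-while-record embodiments described later.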
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • For a better understanding of the present invention, reference is made to the accompanying drawings which are incorporated herein by reference and in which: [0016]
  • FIG. 1 is a block diagram illustrating components of a typical television news operation; [0017]
  • FIG. 2 is a block diagram illustrating components of a typical television news operation having audio/video production capabilities integrated with on-air operations; [0018]
  • FIG. 3 is a block diagram of a digital multimedia system according to an embodiment of the present invention; [0019]
  • FIG. 4 is a block diagram of a digital multimedia system having a capture manager and an asset manager according to an embodiment of the present invention; [0020]
  • FIG. 5 is a block diagram of a digital multimedia system having multiple low resolution encoders and multiple high resolution encoders according to an embodiment of the present invention; [0021]
  • FIG. 6 is a block diagram of a digital multimedia system having a browse server according to an embodiment of the present invention; [0022]
  • FIG. 7 is a flow diagram of a method performed by a video host of a digital multimedia system, according to an embodiment of the present invention; [0023]
  • FIG. 8 is a flow diagram of a method performed by a browse server of a digital multimedia system, according to an embodiment of the invention; [0024]
  • FIG. 9 is a view of a dialog window of a capture manager of a digital multimedia system according to an embodiment of the present invention; [0025]
  • FIG. 10 is a block diagram of a digital multimedia system having a core newsroom system, a multimedia archive, and a video production system, according to an embodiment of the present invention; [0026]
  • FIG. 11 is a view of a graphics user interface of a digital multimedia newsroom production system according to an embodiment of the present invention; and [0027]
  • FIG. 12 is a diagram of a multimedia file structure according to an embodiment of the present invention.[0028]
  • DETAILED DESCRIPTION
  • FIG. 3 shows a digital multimedia system 50 for managing motion video data in accordance with an embodiment of the invention. The multimedia system 50 enables one or more users to manipulate effectively motion video data, text, graphics and audio (i.e., multimedia data) and generate a multimedia composition. In particular, the system 50 substantially simultaneously encodes a low resolution version and a high resolution version of multimedia data. A journalist using the system generates a composition using a portion of the low resolution version, and an editor plays the composition using a portion of the high resolution version that corresponds to the portion of the low resolution version. [0029]
  • The multimedia system 50 includes a multimedia capture and encoding system 52 that captures multimedia data, and substantially simultaneously provides a first compressed version of the multimedia data having a first resolution, and a second compressed version of the multimedia data having a second resolution that is different than the first resolution. [0030]
  • The multimedia system further includes a multimedia storage system 54, coupled to the multimedia capture and encoding system 52, that stores multimedia information including the first and second compressed versions of the multimedia data. In particular, the multimedia storage system 54 includes a digital computer-readable and writable non-volatile random-access medium, such as a magnetic disk, for storing the first and second compressed versions digitally and non-linearly. [0031]
  • The multimedia system 50 further includes a video editing and playback system 56 coupled to the multimedia storage system 54. The video editing and playback system 56 includes editing circuitry 58 that generates a composition that uses a portion of the first compressed version, and playback circuitry 60 that plays the composition using a portion of the second compressed version that corresponds to the portion of the first compressed version. The composition includes one or more data structures that define a list of video entries. Each video entry indicates a name of a file containing video information, and a range of the file that defines a portion of the video information. The editing circuitry 58 and the playback circuitry 60 are typically used by a journalist and an editor, respectively. Alternatively, both the editing circuitry 58 and the playback circuitry 60 may reside on a single graphics workstation. [0032]
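The composition data structure just described, a list of video entries where each entry names a file and a range within it, might look like the following sketch. The frame-based range and the field names are illustrative assumptions; the patent does not specify the on-disk format.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class VideoEntry:
    """One entry of a composition: a named file plus the range
    of that file that defines the portion to play."""
    filename: str
    start_frame: int
    end_frame: int

@dataclass
class Composition:
    """A composition is one or more data structures defining a list
    of video entries; the media data itself is not included."""
    entries: List[VideoEntry]

    def duration_frames(self) -> int:
        # Total play length of the composition, in frames.
        return sum(e.end_frame - e.start_frame for e in self.entries)

# A two-clip story as a journalist's rough cut might reference it:
story = Composition([
    VideoEntry("feed_a.mpg", 120, 300),
    VideoEntry("feed_b.mpg", 0, 90),
])
```

Because the composition holds only file names and ranges, it is small enough to send across the networks and through the bridge without moving any media data, which is exactly how the journalist hands the rough cut to the editor in the embodiments below.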
  • Another embodiment of the invention is a newsroom production system 700 which is illustrated in FIG. 4. The system 700 substantially simultaneously encodes a low resolution version and a high resolution version of multimedia data, and enables a journalist using the system to generate a composition using a portion of the low resolution version, and an editor to play the composition using a portion of the high resolution version that corresponds to the portion of the low resolution version. As in the embodiment of FIG. 3, the system 700 enables a news production organization to manipulate effectively multimedia data including motion video clips from a variety of sources, as well as text, live presentations by announcers and associated graphics. [0033]
  • The system 700 is similar to the system 50 described above in that the system 700 includes a multimedia capture and encoding system 710, a multimedia storage system 730 and a video editing and playback system 750. The system further includes a first computer network 704 and a second computer network 706 that are coupled to a bridge 708. Each of the multimedia capture and encoding system 710, the multimedia storage system 730, and the video editing and playback system 750 is coupled to the first network 704 and the second network 706. According to an embodiment of the invention, the network 706 is an ATM network such as AvidNet available from Avid Technology, Inc., Tewksbury, Mass., which is described in U.S. patent application Ser. No. 08/215,849, which is hereby incorporated by reference. The system 700 also includes an input 702 for receiving multimedia data from one or more sources. [0034]
  • The multimedia capture and encoding system 710 includes a first encoder 712 coupled to the first network 704, a second encoder 716 coupled to the second network 706, and an encoding controller 714 interconnected between the encoders 712 and 716. The encoding controller 714 is also referred to as a capture manager. Each of the encoders 712 and 716 is further coupled to the video input 702 to receive the multimedia data. [0035]
  • The multimedia storage system 730 includes a first video server 732 coupled to the first network 704, a second video server 736 coupled to the second network 706, and an asset manager 734. The asset manager 734 is coupled to each of the second video server 736, the capture manager 714 and the second encoder 716. [0036]
  • The video editing and playback system 750 includes a first workstation 742 coupled to the first graphics network 704, and a second graphics workstation 744 coupled to the second network 706. The first graphics workstation includes first editing circuitry 752 coupled to the first network 704. The second graphics workstation includes playback circuitry 754 coupled to the second network 706, and second editing circuitry 756 coupled to the second network 706. Alternatively, the playback circuitry 754 and the second editing circuitry 756 may reside on separate graphics workstations each of which is coupled to the second network 706. Both the playback circuitry 754 and the second editing circuitry 756 are further coupled to the asset manager 734. [0037]
  • When the system 700 is in operation, the first and second encoders 712 and 716 substantially simultaneously receive a multimedia data signal from the input 702. The first encoder 712 outputs over the network 704 a signal containing a first compressed version of the multimedia data. The second encoder 716 outputs over the network 706 a signal containing a second compressed version of the multimedia data. The resolution of the first compressed version is different than the resolution of the second compressed version. In one embodiment, the first and second resolutions differ from a time perspective so that one of the versions uses fewer frames than the other over a given interval of time. In another embodiment, the first and second resolutions differ spatially, i.e., in the number of pixels used to represent a still image, so that one of the versions provides images of a better clarity than the other version. In yet another embodiment, the first and second resolutions differ both temporally, i.e., in the number of images per second of motion video, and spatially. In a particular embodiment of the invention, the first compressed version is an MPEG-1 (ISO/IEC 11172-1 through 9) encoded stream, and the second compressed version is a 60 field per second motion-JPEG (MJPEG) encoded stream of broadcast television quality images so that the first and second compressed versions have different temporal and spatial resolutions. [0038]
  • The first video server 732 receives and stores the first compressed version from the first encoder 712. The second video server 736 receives and stores the second compressed version from the second encoder 716. Storage of the first and second compressed versions occurs substantially simultaneously. In a preferred embodiment, the first video server 732 is a low resolution video server that stores low resolution multimedia data such as Avid BrowseServer, and the second video server 736 is a high resolution video server that stores high resolution multimedia data such as Avid MediaServer. Both Avid MediaServer and Avid BrowseServer are motion video storage devices available from Avid Technology, Inc., Tewksbury, Mass. The capture manager 714 controls the asset manager 734 so that a correspondence between the first and second compressed versions is generated. In particular, the asset manager 734 initially creates and then maintains a mapping of the first and second compressed versions. In one embodiment, the mapping is achieved by storing file identification information and timecode data in a file. If a filename and timecode range identifying a portion of the first compressed version is provided to the asset manager 734, the asset manager can identify a portion of the second compressed version that corresponds to the portion of the first compressed version. In particular, the asset manager 734 searches the file and retrieves a filename and a timecode range identifying the portion of the second compressed version that corresponds to the portion of the first compressed version. Accordingly, correspondence between the first and second compressed versions is achieved. [0039]
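The mapping lookup described above can be sketched as a simple table keyed by the low resolution filename. The table contents, filenames, and the assumption that the timecode range carries over unchanged between versions (plausible since both were captured from the same feed) are all illustrative, not the asset manager's actual format.

```python
# Hypothetical mapping as the asset manager might maintain it: each
# low resolution file is paired with its high resolution equivalent,
# recorded when the two versions were captured together.
MAPPING = {
    "browse/news_0117.mpg": "media/news_0117.mjpeg",
}

def to_high_res(low_filename: str, timecode_range: tuple) -> tuple:
    """Given a filename and timecode range identifying a portion of the
    low resolution version, return the filename and timecode range of
    the corresponding portion of the high resolution version."""
    high_filename = MAPPING[low_filename]
    # The timecode range is shared between versions because both streams
    # were encoded substantially simultaneously from one feed.
    return high_filename, timecode_range

# to_high_res("browse/news_0117.mpg", ("01:02:03:00", "01:02:10:00"))
```

Repeating this lookup for every entry of a journalist's composition yields the high resolution translation that the editor plays, without the editor ever seeing the low resolution files.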
  • Such a mapping mechanism may be implemented by using a form of dynamic linking as disclosed in U.S. Pat. No. 5,267,351 to Reber et al., which is hereby incorporated by reference. In particular, the asset manager 734 may maintain indications of high resolution video files which are equivalent to low resolution files. This equivalency can be used to automatically and dynamically associate the appropriate high resolution files with the low resolution files used by the journalist to create a multimedia composition. [0040]
  • If timecode information is unavailable on the audio/video feed received by the input 702, the capture manager 714 ensures that timecode information is included in the encoded multimedia data. In particular, if timecode information is not included, either the capture manager 714 or the asset manager 734 adds timecode information to the second compressed version before it is stored in the second video server 736. [0041]
  • The operation of the system 700 will now be described in connection with a newsroom setting. A journalist in the newsroom operates the first workstation 742 and an editor operates the second workstation 744. The journalist generates a composition that uses a portion of the first compressed version of the multimedia data having the first resolution. It is not necessary that the composition be suitable for broadcast. Rather, the composition may be a rough form of the journalist's story that an editor can convert into a final form for broadcast. In particular, the journalist sends the representation of the composition (not including the media data) to the editor across the first and second networks 704 and 706 through the bridge 708. When the editor receives the composition, the editor can play the composition on the second workstation 744. When the second workstation 744 plays the composition, the second workstation plays a portion of the second compressed version of the multimedia data having the second resolution rather than the portion of the first compressed version used by the journalist. The editor converts the composition into the final broadcast form by performing editing operations, such as adding blend and fade transitions between audio/video portions and other special effects, using the second editing circuitry 756. [0042]
  • It should be understood that when the journalist generates the composition using the first workstation 742, the first compressed version of the multimedia data is transferred only through the first network 704. Similarly, when the editor plays the composition using the second workstation 744, the second compressed version of the multimedia data is transferred only through the second network 706. [0043]
  • Also, it should be understood that, using the system 700, neither the journalist nor the editor leaves their respective workstations to retrieve audio/video footage for integration into the composition. The journalist has access to the first compressed version stored in the first video server 732. Similarly, the editor has access to the second compressed version stored in the second video server 736. [0044]
  • Another embodiment of the invention is directed to a newsroom production system 800 which is illustrated in FIG. 5. This embodiment is similar to the embodiment of FIG. 4 except that it includes multiple first encoders 812, multiple second encoders 816, and multiple first workstations 842. As in the other embodiments of the invention previously described, a news production organization can utilize effectively the system 800 to generate news stories that include broadcast quality motion video clips from a variety of sources. The encoders 812 are low resolution encoders that simultaneously output low resolution compressed versions of multimedia data from various sources. The encoders 816 are high resolution encoders that simultaneously output high resolution compressed versions of the multimedia data from the various sources. [0045]
  • A benefit of the system [0046] 800 is that the multimedia data transferred through the first network 804 is low resolution data which requires less bandwidth than high resolution data. Accordingly, several workstations 842 can be connected to the first network 804 without experiencing substantial degradation in performance of the first network 804. The multimedia data transferred through the second network 806 is high resolution data which uses more network bandwidth than low resolution data. Nevertheless, the second network 806 provides suitable performance since it is isolated from network traffic caused by the first workstations 842. Accordingly, more than one second workstation 844 may be connected to the second network 806. The bridge 808 allows certain signals to pass from one network to the other. In particular, the bridge 808 allows a journalist working on one of the first workstations 842 to send a generated composition, i.e., one or more data structures that define a story, to an editor working on one of the second workstations 844.
  • Since the system [0047] 800 includes more than one low resolution encoder 812 and more than one high resolution encoder 816, as illustrated in FIG. 5, the system 800 can capture, encode and store both low resolution and high resolution versions of more than one audio/video feed simultaneously. The capture manager 814 maintains control of the multiple encoding sessions simultaneously. For example, the input 802 may have a first terminal that is connected to a satellite feed so that the satellite feed can be encoded and stored by the system 800. The input 802 may have a second terminal that is connected to a live camera so that the camera feed can be encoded and stored simultaneously by the system 800.
  • According to embodiments of the invention, a user can view and edit an encoded version of an audio/video feed while the encoded version is being encoded and stored in a file on one of the video servers. In accordance with these embodiments, a [0048] system 890 includes an encoder, a browse server, and a workstation, as shown in FIG. 6. These devices can be a portion of the multicast system 800 of FIG. 5.
  • The [0049] first video server 832 illustrated in FIG. 5 is suitable as the browse server 832 in FIG. 6. The browse server 832 includes a buffer cache 870 and disk-based memory 880, as shown in FIG. 6. As the browse server 832 receives an encoded version of an audio/video feed from the low resolution video encoder 812, portions of the encoded version are initially cached in the buffer cache 870 by the browse server's operating system. The operating system writes, i.e., flushes, these portions from the buffer cache 870 to a file on the disk-based memory 880. As the operating system writes the portions to the memory, the operating system simultaneously sends network packets including these portions onto the network 804 to one or more workstations 842 for viewing and possible editing if a request for the encoded version is received from the one or more workstations 842.
  • In one embodiment of the invention, the [0050] browse server 832 uses the Windows NT operating system available from Microsoft Corporation, Redmond, Wash., which permits data to be stored into a file and simultaneously read from the file without file contention problems. The system 890 utilizes file access operations provided by the Windows NT operating system so that multimedia data can be flushed from buffer cache 870 to the disk-based memory 880, and simultaneously sent to one or more workstations 842 through the network 804. In particular, according to an embodiment of the invention, the system utilizes a feature of the NT file system providing the ability to read data from a file while data is appended to the file.
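The read-while-append file access described above can be sketched as follows. This is a minimal illustration in Python rather than the Windows NT file APIs the patent names; the file name and helper functions are assumptions for illustration, and most modern operating systems similarly permit one handle to append to a file while a second, independent handle reads from it.

```python
import os
import tempfile

def append_portion(write_handle, portion: bytes) -> None:
    """Flush a newly encoded portion from the buffer cache to the file."""
    write_handle.write(portion)
    write_handle.flush()              # push the portion out of the buffer cache
    os.fsync(write_handle.fileno())   # force it onto the disk-based memory

def read_new_portions(read_handle) -> bytes:
    """Read any portions appended since the last read (may be empty)."""
    return read_handle.read()

path = os.path.join(tempfile.mkdtemp(), "feed.mpg")
writer = open(path, "ab")    # encoder side: append-only handle
reader = open(path, "rb")    # workstation side: independent read cursor

append_portion(writer, b"portion-1")
assert read_new_portions(reader) == b"portion-1"

append_portion(writer, b"portion-2")   # file grows while the reader stays open
assert read_new_portions(reader) == b"portion-2"

writer.close()
reader.close()
```

The key property, as in the NT file system feature the patent relies on, is that the reader sees each portion as soon as it has been flushed, without waiting for the file to be closed.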
  • In one embodiment, the portions are multicast (or “pushed”) over the [0051] network 804, and any workstations 842 wishing to have access to the portions simply register a request with the browse server 832. Then, the browse server multicasts the portions to the workstations 842 over the network 804 using IP multicasting as the browse server simultaneously stores the portions.
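The registration step of this "push" model can be sketched as follows (Python; the class and method names are hypothetical). For simplicity the fan-out loop below delivers to each registered workstation directly, where the patent's embodiment would instead transmit a single IP multicast packet to a group address on the network 804:

```python
class BrowseServerMulticast:
    def __init__(self):
        self.registered = []   # workstations that registered a request
        self.stored = []       # portions flushed to the file so far

    def register(self, workstation) -> None:
        """A workstation registers a request for the encoded portions."""
        self.registered.append(workstation)

    def store_and_push(self, portion: bytes) -> None:
        """Store a portion and simultaneously push it to every registrant."""
        self.stored.append(portion)    # stand-in for writing to the file
        for ws in self.registered:     # one IP multicast send in practice
            ws.receive(portion)

class Workstation:
    def __init__(self):
        self.received = []

    def receive(self, portion: bytes) -> None:
        self.received.append(portion)

push_server = BrowseServerMulticast()
ws1, ws2 = Workstation(), Workstation()
push_server.register(ws1)
push_server.register(ws2)
push_server.store_and_push(b"p1")
push_server.store_and_push(b"p2")
assert ws1.received == [b"p1", b"p2"] and ws2.received == ws1.received
```

Note that storage and delivery happen in the same operation, mirroring the simultaneity the paragraph describes.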
  • In another embodiment, the [0052] browse server 832 responds specifically to individual requests for encoded portions, i.e., the portions are “pulled” from the browse server 832 to the workstation 842. In this embodiment, the browse server 832 functions as a web server by providing packets of information onto a computer network in response to individual requests from various workstations 842. In particular, when a workstation 842 wishes to receive a portion of the encoded version of multimedia data, the workstation 842 sends a request onto the network 804 that is received by the browse server 832. The browse server 832 responds by sending a network packet containing a portion of the encoded version back to the workstation 842 substantially simultaneously as the encoded version is stored into a file on the browse server 832.
  • In accordance with an embodiment of the invention, the [0053] workstation 842 performs the method 1000 illustrated in FIG. 7. In step 1002, the workstation 842 sends a request to the browse server 832 for one or more portions of the encoded version that is being simultaneously stored in the browse server 832. In step 1004, the workstation 842 waits until it receives portions of the encoded version from the browse server 832 in response to the request. In step 1006, the workstation 842 receives and plays one or more portions, and determines when to send a next request for more portions. The time for sending a next request depends on both the amount of video data received, e.g., the number of portions, and the time it took between sending the request and receiving the data. In step 1006, the workstation sends the next request expecting to receive one or more new portions of the encoded version a predetermined amount of time before the workstation 842 is through playing the earlier received portions. Accordingly, the workstation 842 attempts to maintain some predetermined amount of lead time. In one embodiment, this lead time is approximately 0.5 seconds so that the workstation 842 sends the next request expecting that the next portions will be received 0.5 seconds before the previous portion is through playing. In step 1008, the workstation 842 checks whether the end of the file that stores the encoded version has been reached. If so, the method 1000 terminates. Otherwise, the workstation 842 repeats the method 1000.
  • In accordance with an embodiment of the invention, the [0054] workstation 842 uses an active reader thread to acquire the new portions. If more than 6 seconds worth of material is stored by the workstation 842, the reader thread sleeps for a predetermined amount of time or until it is activated.
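The request timing and buffering rules of the two preceding paragraphs can be sketched as follows (Python; the constant and function names are illustrative assumptions, not taken from the patent):

```python
LEAD_TIME = 0.5      # seconds: new portions should arrive this early
BUFFER_LIMIT = 6.0   # seconds of material buffered before the reader sleeps

def next_request_time(playback_end: float, round_trip_time: float) -> float:
    """Absolute time at which to send the next request, so that the
    response arrives LEAD_TIME seconds before playback runs out."""
    send_at = playback_end - LEAD_TIME - round_trip_time
    return max(send_at, 0.0)   # never schedule a request in the past

def reader_should_sleep(buffered_seconds: float) -> bool:
    """The active reader thread sleeps once more than BUFFER_LIMIT
    seconds' worth of material is stored at the workstation."""
    return buffered_seconds > BUFFER_LIMIT

# Received portions play until t = 10.0 s; the last request/response
# round trip took 1.2 s, so the next request goes out at t = 8.3 s.
assert abs(next_request_time(10.0, 1.2) - 8.3) < 1e-9
assert reader_should_sleep(7.0) and not reader_should_sleep(3.0)
```

The round-trip time term is what lets the workstation adapt to network delay: a slow previous response pushes the next request earlier, preserving the half-second lead.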
  • In accordance with an embodiment of the invention, the [0055] browse server 832 performs the method 1100 illustrated in FIG. 8. In step 1102, the browse server 832 opens a file to store the encoded version of multimedia data. In step 1104, the browse server 832 polls the network 804 for requests for portions of the encoded version, and when a request is received from the workstation 842, the browse server 832 sends one or more portions of the encoded version to the workstation 842. The browse server 832 can track which portions of the encoded version have been sent to the workstation 842 and which portions to send in response to the next request. In particular, if the workstation 842 includes an identification ID with its request, the browse server 832 can use the ID to determine which file and which read block need to be accessed, and then send the read block and other information such as timecode and length information of the portion or portions of the encoded version defined by the read block. Alternatively, the workstation 842 tracks which portions of the encoded version are needed next, and sends an indication of which portions it needs with the next request. In step 1106, the browse server 832 determines whether the encoded version has been completely stored, e.g., whether the encoder 812 has been stopped. If so, the browse server 832 proceeds to step 1108 and closes the file, and sends an end of file indication along with any remaining unsent portions when a next request is received from the workstation 842. Otherwise, the browse server 832 proceeds to step 1104 to continue polling the network and storing the encoded version in the file.
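The server side of method 1100 can be sketched as follows (Python; the class, its bookkeeping structure, and the block size are hypothetical, and a real implementation would read blocks from the disk file rather than an in-memory buffer):

```python
BLOCK_SIZE = 4   # tiny for illustration; real read blocks would be larger

class BrowseServer:
    def __init__(self):
        self.files = {}          # request ID -> [data stored so far, read offset]
        self.encoding_done = {}  # request ID -> has the encoder been stopped?

    def open_file(self, request_id: str) -> None:
        """Step 1102: open a file to store the encoded version."""
        self.files[request_id] = [b"", 0]
        self.encoding_done[request_id] = False

    def store(self, request_id: str, portion: bytes) -> None:
        """Append a newly encoded portion to the file."""
        self.files[request_id][0] += portion

    def stop_encoding(self, request_id: str) -> None:
        """Step 1106 condition: the encoder has been stopped."""
        self.encoding_done[request_id] = True

    def handle_request(self, request_id: str):
        """Step 1104: use the request ID to locate the next unsent read
        block; return (block, end_of_file)."""
        data, offset = self.files[request_id]
        block = data[offset:offset + BLOCK_SIZE]
        self.files[request_id][1] = offset + len(block)
        at_end = (self.encoding_done[request_id]
                  and self.files[request_id][1] == len(data))
        return block, at_end

browse = BrowseServer()
browse.open_file("feed1")
browse.store("feed1", b"abcdefgh")           # encoding still in progress
assert browse.handle_request("feed1") == (b"abcd", False)
browse.stop_encoding("feed1")                # encoder stopped
assert browse.handle_request("feed1") == (b"efgh", True)
```

The end-of-file flag is returned only once the encoder has stopped and every stored portion has been sent, matching steps 1106 and 1108.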
  • It should be understood that the [0056] workstation 842 may be an Internet client by having an IP address, and the browse server 832 is effectively a server, such as an http server or other kind of server that uses the TCP/IP protocol. According to a preferred embodiment, communications between the workstation 842 and the browse server 832 are “connectionless.” That is, the requests sent from the workstation 842 to the browse server 832 establish a connection only for the period of time required to transmit network packets of the request. Similarly, another connection is established between the browse server 832 and the workstation 842 for transfer of one or more portions of the encoded version across the network 804. Otherwise, no connection exists, i.e., no connection stream remains open.
  • In one embodiment, http server software may be used by the [0057] browse server 832 to handle requests from the workstations 842, which are configured as web clients. Examples of such software are Microsoft Internet Information Server and Microsoft Peer Web Services, both available from Microsoft Corporation, Redmond, Wash.
  • Using either of the foregoing embodiments, a journalist working at the [0058] workstation 842 may view and edit an encoded version of the multimedia data while it is being stored in the browse server 832. Accordingly, the journalist may prepare a composition that includes portions of an encoded version of multimedia data, while the multimedia data is being simultaneously stored in a browse server. Furthermore, the journalist is not burdened with having to store the encoded version in multiple files on the same browse server 832. Portions of the encoded version on the workstations 842 can be accessed with a maximum of 5 seconds of delay from the time the audio/video feed is first provided to the input 802.
  • Each journalist may create a recording session and maintain control of the recording session using a graphical user interface of the [0059] capture manager 814. This user interface also includes some access features (e.g., viewing and editing capabilities described above) allowing the journalist to access portions of an encoded version as the encoded version is being simultaneously recorded and stored. An example of the graphical user interface is illustrated in FIG. 9. The interface is in the form of a dialog window 900 that includes one or more property pages, each displaying a respective encoding configuration. Each property page includes buttons that enable the user to send commands and information to the capture manager 814 using conventional input methods with a mouse and a keyboard. Other conventional input mechanisms can be substituted for the mouse and keyboard. In the particular example shown in FIG. 9, the dialog window 900 has six property pages named: Network Feed1, Network Feed2, Archive VTR, Projects VTR, Satellite Feed1, and Satellite Feed2. The property page for Network Feed1 is shown as being presently in the foreground by “Network Feed1” being displayed as the source 902. The other property pages are shown in the background by tabs 940 with their respective names. In this particular example, the high resolution encoder 816 is named “Jupiter” and the low resolution encoder is named “MR1”. The bottom area 904 of the dialog window 900 displays a plurality of buttons including an “Exit” button 906 for exiting the graphical user interface of the capture manager 814, a “New” button 908 for creating a new property page for a new encoding configuration, a “Delete” button 910 for deleting a property page, and a “Help” button 912 for obtaining help through a help window (not shown).
  • The [0060] dialog window 900 further displays recording status information including an elapsed time 914 of the encoding session, a start time 916 that is assigned to the encoded version of the multimedia data being stored, a recorded headframe 918 that is used as a graphical image representation of the encoded version, and a flashing status 920 that indicates a current state of the encoding session.
  • The [0061] dialog window 900 further displays additional control buttons depending on the configuration of the encoding session as identified by its property page. For example, as shown in FIG. 9, the Network Feed1 property page includes encoder control buttons 922: “Standby” 924, “Start” 926 and “Stop” 928, that allow the user to respectively pause, start and stop an encoding session. The Network Feed1 property page further includes a “Previewer” (not shown) that allows the user to view progress of the encoding session, a “Synchronizer” 932 that allows the user to advance to the end of the currently encoded video to view the latest results of the encoding session, a “Metadata Edit Controller” 930 that allows the user to view and modify portions of the encoded version, and a “Headframe Grabber” 934 that allows the user to select, as the headframe for the encoded version, any frame in the encoded version that has been stored.
  • Some of the operations of the [0062] capture manager 814 will be described in further detail. Each of the property pages in the dialog window 900 is tabbed, as shown in the area 940 of FIG. 9. When the user selects one of the tabs, the capture manager 814 displays the property page associated with the selected tab in the foreground of the dialog window 900. If the user cannot find an appropriate configuration to select and determines that a new configuration is needed, the user may create a new configuration and a new property page associated with the new configuration by pressing the “New” button 908. The capture manager 814 will respond by prompting the user for information regarding the new configuration until it has enough information to begin a new encoding session. The capture manager 814 begins encoding when the user selects the “Start” button 926. In particular, the capture manager 814 sends a signal to the low resolution encoder 812 through connection 820 (see FIG. 5) causing it to begin encoding. Alternatively, the capture manager 814 sends this signal to the low resolution encoder 812 when the capture manager 814 receives a signal from the high resolution encoder 816 through connection 818 indicating that the high resolution encoder 816 has started encoding. Accordingly, if the user has started the high resolution encoder 816, the low resolution encoder 812 is started automatically and simultaneously.
  • It should be understood that the user interface enables a journalist to control multiple live feeds simultaneously from one graphics workstation. When an operation is desired for one of the encoding sessions, the journalist brings the property page for that encoding session to the foreground in the [0063] dialog window 900 and performs the desired operations. Then, the journalist can perform an operation on a different encoding session by bringing it to the foreground.
  • The journalist using the capture manager's [0064] dialog window 900 can view any portion of the encoded version as long as it has been stored in a file in the browse server 832. In particular, the journalist may jump to the beginning of the version, jump to the middle of the version, and jump to the end of the version. All of these access methods can occur while browse server 832 continues storing additional portions of the encoded version in the same file.
  • Furthermore, the journalist may add markers to the portions of the encoded version in real time. The journalist is not required to wait until an encoding session is over before viewing and marking multimedia data. [0065]
  • Another embodiment of the invention is directed to a multimedia [0066] newsroom production system 90, as illustrated in FIG. 10. This system is described in U.S. patent application Ser. No. 08/631,441, filed on Apr. 12, 1996, which is hereby incorporated by reference. The newsroom production system 90 enables a news production organization to manipulate effectively multimedia data to generate news stories for broadcasting. Each generated news story may include several broadcast quality motion video clips from various sources. The system 90 includes three major systems, a core newsroom system 100, a multimedia archive 200, and a video production system 300.
  • In one embodiment, the components of the systems are interconnected through a single digital network. Preferably, the single digital network is a 100 Mb/s network. [0067]
  • In another embodiment, the components of the core newsroom system and the multimedia archive are interconnected using a first [0068] digital network 400, and the components of the video production system are interconnected with a second digital network 410. An adaptor box 420 is connected to both the first digital network 400 and the second digital network 410 to enable communication between the two networks. In a preferred embodiment of the invention, the first digital network 400 is implemented using an Ethernet system having a data rate equal to, or greater than, 100 Mb/s, and the second digital network 410 is implemented using an Ethernet system having a data rate equal to, or greater than, 10 Mb/s. The adaptor box 420 may be implemented using one of a number of commercially available products such as a FastNet 10 available from Cabletron Systems, Inc., Rochester, N.H.
  • Each of the major components of the [0069] newsroom production system 90 is described in greater detail below.
  • [0070] Video Production System 300
  • The [0071] video production system 300 provides audio/video capture, media data editing, and management and control of high quality multimedia data suitable for broadcast. The multimedia data can be any form of information that can be represented in a digital form. The video production system includes a digital playback system 310, a video editor 320, a media recorder 330 connected to an MPEG encoder 340, a media server 350 including an asset manager 360, a high bandwidth data network 364, and a graphics workstation 370.
  • The [0072] media server 350 is a large scale computer that stores and delivers high quality audio and motion JPEG video (MJPEG), suitable for broadcast, in conjunction with the other devices of the video production system 300. The media server 350 can also function as an archive system for multimedia data produced in the video production system 300. In a preferred embodiment of the invention, additional near-line storage and off-line storage is provided on a digital data storage medium, such as tape or optical disks, to relieve the media server 350 of archive responsibilities and to provide additional on-line storage capacity within the media server 350.
  • An [0073] asset manager 360 is an integral part of the media server 350 and is implemented as software in the media server 350. The asset manager 360 stores information and is the tool used to manage the data stored in the near-line storage and the off-line storage. The material stored in the media archive can be automatically moved to on-line status on the media server by the asset manager 360. The asset manager 360 contains search support data for locating media objects stored in the media server 350, in the near-line storage system and in the offline storage system. The asset manager 360 also contains composition information that can be used to capture, edit, and play back the media objects stored in the media server 350. As described below in greater detail, the media server 350 also provides translation of low resolution media data compositions, generated within the core newsroom system, to high resolution media data compositions for editing and playback within the video production system. In a preferred embodiment, the media server 350 is implemented using an Avid MediaServer™ available from Avid Technology, Inc., Tewksbury, Mass.
  • The [0074] media recorder 330 is a disk-based digital recording workstation which is used to capture audio/video data and provide digitization and compression of the audio/video data. The media recorder 330 digitizes, compresses and records audio/video material and transmits the digitized compressed data to the media server over the high speed network for storage on the media server 350.
  • In a preferred embodiment of the invention, the [0075] media recorder 330 uses an MJPEG encoding scheme to generate high quality, high resolution, compressed digital data suitable for broadcast. In the preferred embodiment, an MPEG encoder 340 is coupled to the media recorder 330 to also provide MPEG compression capability. As described in greater detail below, the addition of the MPEG encoder 340 to the media recorder 330 provides the system with a dual-digitizing capability for media data recorded by the media recorder 330. The MPEG encoder provides greater compression of the data than the media recorder 330, thereby allowing the data to be efficiently transmitted over the Ethernet network 400 to be played on the journalist workstations 110. As shown in FIG. 10, the MPEG encoder 340 has a direct connection to the digital network 400 to provide MPEG encoded media data to the multimedia archive 200.
  • In a preferred embodiment, the [0076] media recorder 330 is implemented using an Avid Media Recorder™ available from Avid Technology Inc., Tewksbury, Mass.
  • The [0077] video editor 320 is a full-feature, digital, non-linear video editing workstation specifically tailored to provide functions for news editing. The video editor provides editing of high resolution broadcast quality images provided by the media server 350. In a preferred embodiment, the video editor is implemented using an Avid NewsCutter™ or an Avid Media Composer®, both of which are available from Avid Technology Inc., Tewksbury, Mass. The digital playback system 310 is a digital, disk-based playback system that manages the broadcast to air of multimedia data produced and stored within the video production system 300. The digital playback system 310 plays materials stored either locally or on the media server 350 in accordance with play lists generated from a program lineup created on one of the journalist workstations 110 within the core newsroom system 100, or on a workstation directly coupled to the video production system (not shown). In a preferred embodiment of the invention, the digital playback system 310 is implemented using an Avid AirPlay® available from Avid Technology, Inc., Tewksbury, Mass.
  • The high bandwidth network [0078] 364 provides high speed communication between the components of the video production system 300. In a preferred embodiment of the invention, the high bandwidth network 364 is implemented using an ATM network as described in co-pending U.S. patent application Ser. No. 08/249,849, titled An Apparatus and Computer Implemented Process For Providing Real-Time Multimedia Data Transport in a Distributed Computing System, which is incorporated herein by reference. The high bandwidth network 364 supports real time playback of broadcast quality MJPEG video and multi-track audio over fiber optic networks.
  • The [0079] graphics workstation 370 is used for generating and editing graphics material for broadcast from and storage in the video production system. In a preferred embodiment, the graphics workstation 370 is implemented using a Matador Workstation available from Avid Technology, Inc., Tewksbury, Mass.
  • It should be understood that the [0080] media recorder 330 and the MPEG encoder 340 form a multimedia capture and encoding system, as illustrated in the embodiment of FIG. 3. In particular, the combination of the media recorder 330 and the MPEG encoder 340 captures multimedia data, and substantially simultaneously provides a first compressed version of the multimedia data having a first resolution (e.g., MPEG), and a second compressed version of the multimedia data having a second resolution (e.g., MJPEG) that is different than the first resolution.
  • It should be further understood that the [0081] graphics workstation 370 forms playback circuitry 60 of a video editing and playback system 56, as illustrated in FIG. 3. In particular, the graphics workstation plays compositions that use compressed versions of multimedia data stored in the media server 350. As will be described below, the compositions may be generated by the core newsroom system 100 using different compressed versions of multimedia data stored in the multimedia archive system 200.
  • Core Newsroom System [0082] 100
  • The core newsroom system [0083] 100 consists primarily of a number of journalist workstations 110 and a pair of news servers 120. FIG. 10 shows a newsroom system having three journalist workstations 110. In embodiments of the invention, the number of workstations 110 actually used may be much greater than three; the practical number of journalist workstations 110 is based on several factors, including the amount of network activity generated by each user of the workstations and the amount of delay each user will tolerate in accessing the system.
  • In a preferred embodiment of the invention, each of the [0084] journalist workstations 110 is implemented using an MPC III compliant workstation.
  • The [0085] journalist workstation 110 provides access to multimedia data from a variety of sources and includes the tools (i.e. software) necessary to create a multimedia storyboard of a news story for broadcast. The multimedia data available to the journalist includes the low resolution MPEG video data captured by the media recorder. In one embodiment of the invention, each of the journalist workstations 110 includes a video port for receiving video from, for example, a VTR. Each of the journalist workstations 110 also includes a serial port for controlling the VTR. The graphics user interface of the journalist workstation 110 and the functions available to a user of the journalist workstation 110 are described in greater detail below.
  • The news servers [0086] 120 provide management and storage of the multimedia data in the newsroom environment. The news servers 120 are configured as distributed processors with mirrored data bases to provide maximum reliability and performance. Other centralized functions, such as communications functions, are managed by the news servers 120. In a preferred embodiment, the news servers 120 are implemented using an Avid NewsServer available from Avid Technology, Inc., Tewksbury, Mass. The news servers 120 have external connections 122 for providing access to news wire services and to allow remote access to the news servers 120 from users external to the core newsroom system.
  • The core newsroom system [0087] 100 may also include one or more terminal servers 140 to provide connection to the digital network 400 for user terminals 130. The user terminals may be one of several different terminals used in prior art systems primarily for text processing and communications functions. A device controller 150, or a number of device controllers 150, may also be coupled to the digital network 400 to provide control of several multimedia devices, such as teleprompters, from the journalist workstations.
  • It should be understood that a [0088] journalist workstation 110 of the core newsroom system 100 in combination with a graphics workstation 370 of the video production system 300 form a video editing and playback system 56, as illustrated in the embodiment of FIG. 3. The journalist workstation 110 forms editing circuitry 58 that generates a composition that uses a portion of a first compressed version of multimedia data having a first resolution. As stated above, the graphics workstation 370 forms playback circuitry 60 that plays the composition using a portion of a second compressed version of the multimedia data stored in the media server 350.
  • [0089] Multimedia Archive System 200
  • The multimedia archive (MMA) [0090] 200 includes a library server 210 and one or more object servers 220. The library server 210 holds catalog and search support meta data for locating objects stored in the multimedia archive 200.
  • The [0091] object server 220 provides the primary storage media for browsing and archival of material generated during news gathering and production processes. The object server 220 works in conjunction with the library server 210 to facilitate distribution of multimedia material to the journalist workstations 110. The objects stored in the multimedia archive can be low resolution versions of video, audio, graphics, and text. The MMA can be used to store finished stories, audio, video and other content for reuse in creating new stories. In a preferred embodiment, the multimedia archive 200 is implemented using the IBM Digital Library 5765-258.
  • It should be understood that the [0092] multimedia archive system 200 in combination with the media server 350 of the video production system form a multimedia storage system 54, as illustrated in the embodiment of FIG. 3. The multimedia archive system 200 and the media server 350 are coupled to the media recorder 330 and the MPEG encoder 340 that form the multimedia capture and encoding system 52, and are further coupled to the journalist workstations 110 and the graphics workstation 370 that form the video editing and playback system 56. The multimedia archive system 200 and the media server 350 store multimedia information including the first and second compressed versions of the multimedia data, which are described above.
  • Operation of the [0093] Newsroom Production System 90
  • The operation of the digital multimedia [0094] newsroom production system 90 shown in FIG. 10 is described below. The operation of the system 90 can be described as a collection of distinct, function-specific workloads characterized at a high level as asset creation, asset use, asset storage, and asset administration. The system 90 provides the capability for the following functions:
  • News wire text capture, storage, and catalog; [0095]
  • News story text creation, storage, and catalog; [0096]
  • High resolution video capture, edit, playout, storage and catalog; [0097]
  • Video production system low resolution media data editing; [0098]
  • Real-time dual resolution digitization, storage and catalog; [0099]
  • Low resolution video browsing and editing; and [0100]
  • High-resolution playout and editing of low resolution compositions. [0101]
  • Each of the functions described above, along with user interfaces for accomplishing these functions, is described below in greater detail.
  • News Wire Text Capture, Storage and Catalog [0102]
  • The news servers [0103] 120 provide capability for capture and storage of news wire text data through the external interfaces 122. News wire text stories are captured by the news servers 120 and cataloged in a database of the news servers 120. A user of one of the journalist workstations 110 may access the news servers' databases as a system librarian to search, browse and retrieve the wire service data stored in the databases of the news servers 120. It is not generally necessary to store all text stories captured by the news servers 120 in the multimedia archive 200. A system administrator may access the news servers through one of the journalist workstations 110, browse the catalog of data received from the news wires, determine what stories are appropriate for storage in the multimedia archive 200 and command the news servers 120 to transfer selected data to the multimedia archive 200 for storage.
  • News Story Text Creation, Storage, and Catalog [0104]
  • A user of the [0105] journalist workstation 110 can access text through the news servers 120 and can create text and scripts from scratch, or can draw on existing text and scripts stored in the news servers 120 or in the multimedia archive 200. The user can search, browse and retrieve text data stored in the news servers 120 and the multimedia archive 200. The user can perform this searching and browsing using complex, full-text search techniques, thereby allowing efficient research by focusing the search to retrieve data specifically relevant to the user's needs.
  • High Resolution Video Capture, Edit, Playout, Storage and Catalog [0106]
  • High resolution media data utilized by the video production system is captured in the system by the [0107] media recorder 330. The high resolution media data is captured in the media recorder 330, digitized and compressed using a broadcast quality compression technique such as MJPEG. The media data captured by the media recorder 330 is transferred in compressed form to the media server 350 and is registered and stored in the media server 350 by the asset manager 360. As discussed further below, in a preferred embodiment of the invention, a low resolution version of the media data is simultaneously created with the high resolution media data.
  • The high resolution media data can be browsed and edited using the [0108] video editor 320 and can be broadcast to air using the digital playback system 310.
  • Video Production System Low Resolution Media Data Editing [0109]
  • As discussed above, low resolution video is used by the [0110] journalist workstations 110 to provide limited editing capability. A user of the video production system 300, for example a user of the video editor 320, may wish to edit low resolution media data. The low resolution media data may either be a low resolution composition created by a user of a journalist workstation 110 or a low resolution version of media data captured by the media recorder 330. In either case, the video production system 300 user may search the multimedia archive 200 over the network 400 or may search the asset manager 360 over the network 400 to retrieve the low resolution media data. After editing the low resolution media data, the video editor 320 may transfer edited low resolution media data to the multimedia archive 200 for cataloging and storage therein.
  • Real-time Dual Resolution Digitization, Storage and Catalog [0111]
  • As described above, news video production from the [0112] journalist workstation 110 requires that an editable form of media data be available to a user of the journalist workstation 110. The low resolution media data is stored in, cataloged by and retrieved from the multimedia archive 200. The low resolution media data is captured in the system 90 using the media recorder 330. The media recorder 330 performs a dual resolution digitization of media data to be captured by the system 90.
  • When media data is captured, the [0113] media recorder 330, in conjunction with the MPEG encoder 340, performs a dual resolution digitization of the media data to simultaneously produce a high resolution version of the media data and a low resolution version of the media data. As discussed above, the high resolution version of the media data is digitized and compressed in a preferred embodiment using an MJPEG encoding format. The low resolution video is compressed in a preferred embodiment using known, high compression encoding techniques such as MPEG or Quick Time, available from Apple Computer, Inc., Cupertino, Calif. Although it is preferred to use either MPEG or Quick Time, another compression technique which results in a high compression ratio of the media data may also be used. By performing simultaneous capture of both the high resolution version and the low resolution version of the media data, both forms of media data are immediately available in the system 90, so that story editing can be performed to meet the stringent deadlines encountered in broadcast news operations, even with late breaking material.
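The dual resolution capture path described above can be sketched as follows. This is a minimal illustration only, not the patented implementation: the encoder functions below are hypothetical stand-ins for real MJPEG and MPEG codecs, and merely model the difference in compression ratio between the intra-frame broadcast format and the high-compression browse format.

```python
from dataclasses import dataclass

@dataclass
class EncodedClip:
    clip_id: str
    codec: str
    data: bytes

def encode_mjpeg(frames: list[bytes]) -> bytes:
    # Stand-in for a broadcast-quality MJPEG encoder: intra-frame only,
    # so every frame is retained (modeled here by simple concatenation).
    return b"".join(frames)

def encode_mpeg(frames: list[bytes]) -> bytes:
    # Stand-in for a high-compression MPEG-style encoder: inter-frame
    # compression is modeled by keeping only every fourth frame.
    return b"".join(frames[::4])

def dual_resolution_capture(clip_id: str,
                            frames: list[bytes]) -> tuple[EncodedClip, EncodedClip]:
    """Produce the high and low resolution versions of one capture in a
    single pass, so both are immediately available in the system."""
    high = EncodedClip(clip_id, "MJPEG", encode_mjpeg(frames))
    low = EncodedClip(clip_id, "MPEG", encode_mpeg(frames))
    return high, low
```

Both versions carry the same clip identifier, which is what later allows the asset manager to pair them.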
  • Low Resolution Video Browsing and Editing [0114]
  • One of the primary features of the [0115] system 90 shown in FIG. 10 is the ability to provide a user of the journalist workstations 110 with low resolution video to allow browsing and editing of the low resolution video to create storyboards which may ultimately be used by an editor using the video editor 320 to create broadcast quality media data. The low resolution editing feature allows the journalist to become more involved in the finished media product and to incorporate archived media data into storyboards without the need for manual retrieval of video tapes and operation of a video tape player in an edit bay as in previous systems.
  • A journalist, using the [0116] journalist workstation 110, can search the data contained within the library server 210 of the multimedia archive 200 for low resolution video data, audio data and text related to a story that the journalist is composing on the journalist workstation 110. In response to key search words provided by the journalist, the multimedia archive provides a list of material contained therein related to the key words. The journalist can then select media data for browsing or editing on the journalist workstation 110 from the list of material.
  • The graphics user interface for storyboard creation provided to the journalist at the [0117] journalist workstation 110 is shown in FIG. 11. The user interface 500 includes a number of windows including a viewing window 510, a clipnotes window 520, a storyboard window 530, a storynotes window 540 and a script window 550.
  • The [0118] script window 550 provides an area in which the journalist can write the main script of a story being composed on the journalist workstation 110. Text can be generated in this window using standard word processing commands. Graphics, including painting functions, can be performed on the journalist workstation 110 and incorporated into the storyboard.
  • The [0119] viewing window 510 displays a low resolution video component of low resolution media data to be viewed and edited on the journalist workstation 110. The viewing window also displays the time code 516 of the video being displayed, machine controls 518, and editing functions such as mark in 512a and mark out 512b buttons. The machine controls 518 provide controls for playing a video clip in the viewing window and are similar to standard VTR controls. The machine controls can be selected by the user using a pointing device, such as a mouse, or by using special function keys on a keyboard of the journalist workstation 110. Selecting a clip for display in the viewing window may be done by dragging a clip from the storyboard window 530 (described below) or by selecting a new clip from the multimedia archive 200.
  • A second viewing window can be opened on the screen at the same time as the [0120] viewing window 510. The second viewing window, in a preferred embodiment, is made visible by either shrinking or eliminating the storynotes window 540.
  • The mark in [0121] button 512a and the mark out button 512b are superimposed in the upper left and upper right corners of the viewing window. These buttons are used to perform editing functions at the journalist workstation 110. When a video clip is being played in the viewing window 510, audio data associated with the video data is played on speakers of the journalist workstation 110. A “video only” or “audio only” indication will appear on the video window when the media data being displayed or played on the workstation consists of video only or audio only.
  • The [0122] clipnotes window 520 provides a notepad for entry of short notes for each clip viewed on the viewing window 510. The storynotes window 540 provides an area for the entry of notes that apply to the whole story to be edited, as opposed to the clipnotes window 520, which is for notes on individual clips.
  • The [0123] storyboard window 530 allows clips and subclips to be laid out in sequence. Each of the clips 532 shown in the storyboard window 530 typically shows the first frame of the corresponding clip; however, the user may select a frame other than the first frame to be shown in the storyboard window. The collection of clips stored in the storyboard window is referred to as a bin. The journalist has the option of playing one of the clips in the viewing window or playing the bin of clips as arranged in the storyboard window.
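The storyboard model described above can be sketched as a small data structure. The names below (Clip, play_bin) are illustrative assumptions, not identifiers from the patent: each clip in the bin carries the in and out points set with the mark in and mark out buttons, and playing the bin plays the clips in storyboard order.

```python
from dataclasses import dataclass

@dataclass
class Clip:
    source_id: str
    mark_in: int   # frame chosen with the mark in button
    mark_out: int  # frame chosen with the mark out button

    def duration(self) -> int:
        # Length of the subclip selected between the in and out points.
        return self.mark_out - self.mark_in

def play_bin(bin_clips: list[Clip]) -> list[tuple[str, int, int]]:
    """Playback order for a bin: each clip's source and its marked
    in/out points, in the sequence laid out in the storyboard window."""
    return [(c.source_id, c.mark_in, c.mark_out) for c in bin_clips]
```

Because the bin records only source identifiers and in/out points, the same storyboard can later be rendered from either the low or the high resolution sources.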
  • The final pre-edited composition contained on the [0124] journalist workstation 110 may be transferred to the multimedia archive 200 for reuse by the journalist or other journalists on other journalist workstations 110 and for final editing and playout by a user of the video production system 300.
  • High Resolution Playout and Editing of Low Resolution Compositions [0125]
  • A composition produced during a low resolution activity on a [0126] journalist workstation 110 may be played out in different ways. A user of a journalist workstation 110 may play the low resolution composition by retrieving the composition data from the multimedia archive 200, or a user of the video production system 300, for example a user of the video editor 320, may play and edit a high resolution version of the composition. The translation of the low resolution composition to its high resolution equivalent is transparent to the user of the video editor 320. The asset manager 360 using registration information of each of the low resolution sources used in the composition can identify the equivalent high resolution sources and translate the low resolution composition into its high resolution equivalent. Efficient translation by the asset manager 360 requires a unique registration system for each of the clips stored within the system. Further, the registration method must include means for identifying the corresponding high resolution version of low resolution media data. A preferred registration method is described in detail further below.
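The asset manager's translation step can be illustrated as a lookup against the registration information. This sketch assumes a plain dictionary registry keyed by clip identifier; the patent's actual registration method, described further below, is richer than this.

```python
def translate_composition(low_res_clip_ids: list[str],
                          registry: dict[str, str]) -> list[str]:
    """Replace each low resolution source in a composition with its
    registered high resolution equivalent from the media server.

    Raises KeyError if any source has no registered high resolution
    twin, since the translation must be complete for the result to be
    transparent to the editor.
    """
    missing = [cid for cid in low_res_clip_ids if cid not in registry]
    if missing:
        raise KeyError(f"no high resolution source registered for: {missing}")
    return [registry[cid] for cid in low_res_clip_ids]
```

The editor never sees this mapping; the composition simply arrives at the video editor 320 already referencing high resolution sources.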
  • An editor, using the [0127] video editor 320, receives the high resolution version of the low resolution composition created by the journalist, and can further edit the composition in broadcast quality format, to provide more precise editing cuts than accomplished by the journalist.
  • In order to provide efficient transmission and storage of media data in the system shown in FIG. 10, a standard file structure is used for the media data contained within the system. In one embodiment of the invention, the media data is organized in a [0128] media container 600 as shown in FIG. 12. The media container 600 is divided into five subsections including container data 610, container timing 620, media security 630, meta data 640 and media or media pointers 650.
  • The information contained within the [0129] container data 610 describes the container itself and may include the following information: the name of the person that created the container data; the name of the person that approved the container data; an identification of the container security; a creation time stamp; the name of all people that have modified the data; a modification time stamp; a user's log; cost information associated with the data; and other user defined elements.
  • [0130] Container timing 620 includes information related to a relationship over time of the media in the container. This information is only applicable to a story being prepared for broadcast.
  • The [0131] media security segment 630 provides further information concerning the security level of the media contained within the container. This information can be used to restrict access to specified personnel of media contained within the container.
  • The meta data information describes the media stored in the container. In one embodiment, the meta data contains the following information for each media object in the container: the name of the person that approved the data; the name of the person that created the data; a creation time stamp; a media identifier; media status; media type; names of all people that have modified the data; a modification time stamp; a reference number; research descriptors; timing information; title; and other user defined elements. [0132]
  • The media and media pointers [0133] 650 are the actual raw data stored in the container. Media objects of many types may be stored in a single container. A media pointer points to a media object stored in another container. By storing a media pointer to another container, rather than a copy of the other container's media itself, maximum storage efficiency can be attained throughout the system.
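The five-part container layout described above might be modeled as follows. The class and field names are illustrative assumptions only and do not appear in the patent; the point is the separation of descriptive sections from raw media, and the use of pointers so an object is never stored twice.

```python
from dataclasses import dataclass, field

@dataclass
class MediaPointer:
    # References a media object held in another container, so the same
    # object need never be duplicated across containers.
    container_id: str
    media_id: str

@dataclass
class MediaContainer:
    container_data: dict    # creator, approver, time stamps, cost, user log, ...
    container_timing: dict  # time relationships; broadcast stories only
    media_security: dict    # access restrictions on the contained media
    meta_data: list = field(default_factory=list)  # one record per media object
    media: list = field(default_factory=list)      # raw objects or MediaPointers

def resolve(item, containers: dict):
    """Follow a media pointer to the raw object it references in the
    target container; return non-pointer items unchanged."""
    if isinstance(item, MediaPointer):
        target = containers[item.container_id]
        return next(m for m in target.media if m["id"] == item.media_id)
    return item
```

Here raw media objects are modeled as dictionaries with an "id" key, purely for illustration.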
  • File structures, other than the container file structure described above, may be used for storing the media data in the digital multimedia newsroom production system. For example, the Open Media Framework (OMF™) file structure, described in Avid Technology, Inc. publication OMF™ Interchange Specification, which is incorporated herein by reference, may be used as the file structure for media files in the system. The file structure described in Published PCT Application WO 93/21636, A Method and Apparatus For Representing and Editing Multimedia Compositions, incorporated herein by reference, may also be used in embodiments of the invention. [0134]
  • Another feature of the system shown in FIG. 10 is the ability to uniquely identify the media objects stored within the system and to locate other versions of media data that correspond to the media objects. The ability of the [0135] system 90 to locate a high resolution version of media data, corresponding to a low resolution version of the same media data, allows the asset manager 360 to provide a high resolution translation of combinations or storyboards generated by the journalist workstation 110, such that the translation is transparent to an editor using the video editor 320.
  • The asset manager can uniquely identify the low resolution and high resolution media data in a number of ways. In one embodiment of the invention, the media data, when captured by the [0136] media recorder 330, is assigned a unique time code stamp corresponding to the date and time that the media data is captured by the media recorder 330. Using this scheme, the low resolution version of the media data and the high resolution version of the media data are assigned the same identification number. However, since the low resolution media data is stored in the multimedia archive 200, and the high resolution media data is stored in the media server, there is no opportunity for confusion between the versions of the media data. The asset manager, in translating a combination or storyboard from a low resolution version to a high resolution version, can locate the high resolution version of each media object of the combination in the media server based on the identification number of the corresponding low resolution version of the media object. The above-described media data identifying method is not preferred for use at broadcast locations that do not maintain a unique timecode stamp.
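The time-stamp identification scheme can be sketched like so. The format string and index structure below are assumptions for illustration; the patent specifies only that the identifier derives from the date and time of capture and is shared by both resolution versions.

```python
from datetime import datetime

def capture_id(capture_time: datetime) -> str:
    """Identifier assigned at capture time. The high and low resolution
    versions of the same media data share this single ID; they are told
    apart by where they are stored (media server vs. multimedia
    archive), not by the identifier itself."""
    return capture_time.strftime("%Y%m%d-%H%M%S%f")

def find_high_res(low_res_id: str, media_server_index: dict) -> str:
    # The asset manager locates the high resolution twin in the media
    # server directly from the shared identifier.
    return media_server_index[low_res_id]
```

As the specification notes, this scheme assumes the capture site maintains a unique timecode; sites without one would need a different registration method.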
  • In one embodiment of the invention, the [0137] asset manager 360 may be implemented using Media File Manager (MFM) and Source Manager (SM) software as described in U.S. Pat. No. 5,267,351 to Reber et al which is incorporated herein by reference. This software provides a unique identifier to media data captured by the system and maintains a table of relationships between media objects contained within the system such that the asset manager 360 can identify a corresponding version of low resolution or high resolution media data.
  • In an alternate embodiment of the invention, a digital multimedia newsroom production system consists only of the core newsroom system [0138] 100 and the multimedia archive system 200 coupled by the digital network 400. In this alternate embodiment, a low resolution capture device is coupled to the network 400 to capture low resolution media data for storage in the news servers 120 and the multimedia archive system 200. In this embodiment, the journalist workstations 110 provide the full storyboard functions described above with respect to the system 90 shown in FIG. 10.
  • Embodiments of the invention overcome limitations of prior art systems by providing a fully integrated digital multimedia newsroom. In embodiments of the invention, a journalist in a newsroom may create a multimedia storyboard of a news story which is electronically transferred over a digital network to an editing and production system for final editing and broadcast to air. Embodiments of the invention have been described with respect to a multimedia production system in a newsroom environment; however, embodiments of the invention are not limited to a newsroom environment, but rather may be used in other multimedia environments as well, such as radio and the production of entertainment programming. [0139]
  • In embodiments of the invention described above, the multimedia data processed on the [0140] journalist workstation 110 has been described as low resolution multimedia data. The user interface provided by the journalist workstation 110 may also be used to create storyboards using high resolution multimedia data.
  • Furthermore, the embodiments have been described in a newsroom context. However, the invention may be applied anywhere in the movie, television and cable industry, where multimedia data, and particularly, motion video data, is to be processed. In particular, the invention is suitable for active movie systems, video conferencing, and cable pay per view systems. [0141]
  • Having thus described at least one illustrative embodiment of the invention, various alterations, modifications and improvements will readily occur to those skilled in the art. Such alterations, modifications and improvements are intended to be within the scope and spirit of the invention. Accordingly, the foregoing description is by way of example only and is not intended as limiting. The invention is limited only as defined in the following claims and the equivalents thereto. [0142]
  • What is claimed is: [0143]

Claims (68)

1. A multimedia system, comprising:
a multimedia capture and encoding system that captures multimedia data, and provides an encoded version of the multimedia data; and
a file system, coupled to the multimedia capture and encoding system, that opens a file at least for write in response to a request from a first request source, opens the file at least for read in response to a request from a second request source, writes the encoded multimedia data to the file while the file is open at least for write and at least for read, and reads encoded multimedia data from the file while the file is open at least for write and at least for read.
2. The multimedia system of claim 1, wherein the multimedia system includes a server coupled to a network, that sends the encoded version of the multimedia data on the network in multicast form.
3. The multimedia system of claim 1, wherein the multimedia system includes a server coupled to a network, that sends the encoded version of the multimedia data on the network in response to a request from a video host.
4. The multimedia system of claim 3, further comprising:
a video host, coupled to a network, that sends a first request to the server for a first portion of the encoded version of the multimedia data, determines an amount of time to wait based on a length of the first portion and a response time of the first request, and sends a second request to the server for a second portion of the encoded version of the multimedia data after waiting the determined amount of time.
5. A video host for connection via a computer network to a multimedia storage system for storing multimedia data, comprising:
means for sending a first request to the multimedia storage system over the computer network for a first portion of the multimedia data;
means for determining an amount of time to wait based on a length of the first portion and a response time of the first request; and
means for sending a second request to the multimedia storage system over the computer network for a second portion of the multimedia data after waiting the determined amount of time.
6. The multimedia system of claim 1 wherein the file system writes the encoded version of the multimedia data from the multimedia capture and encoding system by appending to the encoded multimedia data written in the file.
7. The multimedia system of claim 1, wherein the file system writes the encoded version of the multimedia data to a computer data file on a computer readable and writable random access medium.
8. A multimedia system comprising:
means for capturing multimedia data and providing an encoded version of the multimedia data; and
means for opening a file at least for write in response to a request from a first request source;
means for opening the file at least for read in response to a request from a second request source;
means for writing the encoded multimedia data to the file while the file is open at least for write and at least for read; and
means for reading encoded multimedia data from the file while the file is open at least for write and at least for read.
9. The multimedia system of claim 8, wherein the multimedia system further includes means, coupled to a network, for sending the encoded version of the multimedia data on the network in multicast form.
10. The multimedia system of claim 8, wherein the multimedia system further includes:
means, coupled to a network, for receiving a request from a video host; and
means, coupled to the network, for sending the encoded version of the multimedia data video to the video host in response to the request.
11. The multimedia system of claim 10, further comprising a video host, the video host comprising:
means for sending a first request for a first portion of the encoded version of the multimedia data;
means for determining an amount of time to wait based on a length of the first portion and a response time of the first request; and
means for sending a second request for a second portion of the encoded version of the multimedia data after waiting the determined amount of time.
12. The multimedia system of claim 8, wherein the means for writing includes means for appending to the encoded multimedia data written in the file.
13. The multimedia system of claim 8, wherein the means for writing the encoded multimedia data to the file comprises means for writing the encoded multimedia data to a computer data file on a computer readable and writable medium.
14. A method for use in a multimedia system, comprising:
capturing multimedia data and providing an encoded version of the multimedia data;
opening a file at least for write in response to a request from a first request source;
opening the file at least for read in response to a request from a second request source;
writing the encoded multimedia data to the file while the file is open at least for write and at least for read; and
reading encoded multimedia data from the file while the file is open at least for write and at least for read.
15. The method of claim 14, further comprising sending the encoded version of the multimedia data on a network in multicast form.
16. The method of claim 14, further comprising:
receiving a request over a network from a video host; and
sending the encoded version of the multimedia data video over the network to the video host in response to the request.
17. The method of claim 16, further comprising:
sending a first request over a network for a first portion of the encoded version of the multimedia data;
determining an amount of time to wait based on a length of the first portion and a response time of the first request; and
sending a second request over a network for a second portion of the encoded version of the multimedia data after waiting the determined amount of time.
18. The method of claim 14, wherein the writing includes appending to the encoded multimedia data written in the file.
19. The method of claim 14, wherein the writing the encoded multimedia data to the file comprises writing the encoded multimedia data to a computer data file on a computer readable and writable medium.
20. A computer program product comprising:
a computer readable medium; and
information stored on the computer readable medium indicative of a program to be executed by a computer to carry out the method of:
capturing multimedia data and providing an encoded version of the multimedia data; and
opening a file at least for write in response to a request from a first request source;
opening the file at least for read in response to a request from a second request source;
writing the encoded multimedia data to the file while the file is open at least for write and at least for read; and
reading encoded multimedia data from the file while the file is open at least for write and at least for read.
21. The computer program product of claim 20, wherein the method further comprises sending the encoded version of the multimedia data on a network in multicast form.
22. The computer program product of claim 20, wherein the method further comprises:
receiving a request over a network from a video host; and
sending the encoded version of the multimedia data video over the network to the video host in response to the request.
23. The computer program product of claim 22, wherein the method further comprises:
sending a first request over a network for a first portion of the encoded version of the multimedia data;
determining an amount of time to wait based on a length of the first portion and a response time of the first request; and
sending a second request over a network for a second portion of the encoded version of the multimedia data after waiting the determined amount of time.
24. The computer program product of claim 20, wherein the writing includes appending to the encoded multimedia data written in the file.
25. The computer program product of claim 20, wherein the writing the encoded multimedia data to the file comprises writing the encoded multimedia data to a computer data file on a computer readable and writable medium.
26. A multimedia system, comprising:
a multimedia capture and encoding system that captures multimedia data and provides an encoded version of the multimedia data; and
an operating system, coupled to the multimedia capture and encoding system, that provides to a network at least a portion of the encoded version of the multimedia data, and substantially simultaneously stores, in a file, the encoded version of the multimedia data.
27. The multimedia system of claim 26, wherein the operating system provides the at least a portion of the encoded version of the multimedia data without having to read the at least a portion of the multimedia data from the file.
28. The multimedia system of claim 26, wherein the operating system reads the at least a portion of the encoded version of the multimedia data from the file and provides the at least a portion of the multimedia data read from the file to the network.
29. A multimedia system comprising:
means for capturing multimedia data and providing an encoded version of the multimedia data;
means for providing to a network at least a portion of the encoded version of the multimedia data; and
means for substantially simultaneously storing, in a file, the encoded version of the multimedia data.
30. The multimedia system of claim 29, wherein the means for providing to a network at least a portion of the encoded version of the multimedia data comprises means for providing to a network at least a portion of the encoded version of the multimedia data without having to read the at least a portion of the multimedia data from the file.
31. The multimedia system of claim 29, wherein the means for providing to a network at least a portion of the encoded version of the multimedia data comprises means for reading the at least a portion of the encoded version of the multimedia data from the file and means for providing the at least a portion of the multimedia data read from the file to a network.
32. A method for use in a multimedia system, the method comprising:
capturing multimedia data and providing an encoded version of the multimedia data;
providing to a network at least a portion of the encoded version of the multimedia data; and
substantially simultaneously storing, in a file, the encoded version of the multimedia data.
33. The method of claim 32, wherein the providing to a network at least a portion of the encoded version of the multimedia data comprises providing to a network at least a portion of the encoded version of the multimedia data without having to read the at least a portion of the multimedia data from the file.
34. The multimedia system of claim 32, wherein the providing to a network at least a portion of the encoded version of the multimedia data comprises reading the at least a portion of the encoded version of the multimedia data from the file and providing the at least a portion of the multimedia data read from the file to a network.
35. A computer program product comprising:
a computer readable medium; and
information stored on the computer readable medium indicative of a program to be executed by a computer to carry out the method of:
capturing multimedia data and providing an encoded version of the multimedia data;
providing to a network at least a portion of the encoded version of the multimedia data; and
substantially simultaneously storing, in a file, the encoded version of the multimedia data.
36. The computer program product of claim 35, wherein the providing to a network at least a portion of the encoded version of the multimedia data comprises providing to a network at least a portion of the encoded version of the multimedia data without having to read the at least a portion of the multimedia data from the file.
37. The computer program product of claim 35, wherein the providing to a network at least a portion of the encoded version of the multimedia data comprises reading the at least a portion of the encoded version of the multimedia data from the file and providing the at least a portion of the multimedia data read from the file to a network.
38. A multimedia system including:
a file system that receives a request from a first request source to open a file at least for write, opens the file at least for write in response to the request from the first request source, receives a request from a second request source to open a file at least for read, opens the file at least for read in response to the request from the second request source, receives while the file is open at least for write in response to the request from the first request source and open at least for read in response to the request from the second request source, a request from the first request source to write multimedia data to the file, writes data indicative of multimedia data to the file in response to the request from the first request source to write, receives, while the file is open at least for write in response to the request from the first request source and open at least for read in response to the request from second request source, a request from the second request source to read multimedia data from the file, reads multimedia data from the file in response to the request from the second request source, and provides data indicative of the multimedia data read from the file in response to the request from the second request source.
39. A multimedia system including:
a) means for receiving a request from a first request source to open a file at least for write;
b) means for opening the file at least for write in response to a);
c) means for receiving a request from a second request source to open a file at least for read;
d) means for opening the file at least for read in response to c);
e) means for receiving, while the file is open at least for write in response to a) and open at least for read in response to c), a request from the first request source to write multimedia data to the file;
f) means for writing data indicative of multimedia data to the file in response to e);
g) means for receiving, while the file is open at least for write in response to a) and open at least for read in response to c), a request from the second request source to read multimedia data from the file;
h) means for reading multimedia data from the file in response to g); and
i) means for providing data indicative of the multimedia data read from the file in h).
40. A method for operating a file system of a multimedia system, the method including:
a) receiving a request from a first request source to open a file at least for write;
b) opening the file at least for write in response to a);
c) receiving a request from a second request source to open the file at least for read;
d) opening the file at least for read in response to c);
e) receiving, while the file is open at least for write in response to a) and open at least for read in response to c), a request from the first request source to write multimedia data to the file;
f) writing data indicative of multimedia data to the file in response to e);
g) receiving, while the file is open at least for write in response to a) and open at least for read in response to c), a request from the second request source to read multimedia data from the file;
h) reading multimedia data from the file in response to g); and
i) providing data indicative of the multimedia data read from the file in h).
41. A computer program product comprising:
a computer readable medium; and
information stored on the computer readable medium indicative of a program to be executed by a computer to carry out the method of:
a) receiving a request from a first request source to open a file at least for write;
b) opening the file at least for write in response to a);
c) receiving a request from a second request source to open the file at least for read;
d) opening the file at least for read in response to c);
e) receiving, while the file is open at least for write in response to a) and open at least for read in response to c), a request from the first request source to write multimedia data to the file;
f) writing data indicative of multimedia data to the file in response to e);
g) receiving, while the file is open at least for write in response to a) and open at least for read in response to c), a request from the second request source to read multimedia data from the file;
h) reading multimedia data from the file in response to g); and
i) providing data indicative of the multimedia data read from the file in h).
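The concurrent access recited in claims 38 through 41, in which one request source holds the file open for write while a second holds the same file open for read, and reads observe data written after both opens, can be sketched directly with ordinary file handles. The path, file name, and chunk contents below are illustrative assumptions.

```python
import os
import tempfile

# First request source holds the file open for write while a second
# request source holds the same file open for read; the reader observes
# data written after both opens.
path = os.path.join(tempfile.mkdtemp(), "capture.mpg")

writer = open(path, "wb")      # first request source: open for write
writer.write(b"frame-0001")    # request to write multimedia data
writer.flush()                 # make the bytes visible to readers

reader = open(path, "rb")      # second request source: open for read
first = reader.read()          # read while the writer stays open

writer.write(b"frame-0002")    # more data arrives while both are open
writer.flush()
second = reader.read()         # the reader picks up the new data

writer.close()
reader.close()
```

The reader's second `read` continues from its previous position, so it returns exactly the bytes written while both handles were open.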
42. An apparatus for use in a multimedia system, the apparatus comprising:
a video host that sends a first request to a multimedia storage system over a computer network for a first portion of multimedia data, determines an amount of time to wait based on a length of the first portion and a response time of the first request, and sends a second request to the multimedia storage system over the computer network for a second portion of the multimedia data after waiting the determined amount of time.
43. A method for use in a multimedia system, the method comprising:
sending a first request to a multimedia storage system over a computer network for a first portion of multimedia data;
determining an amount of time to wait based on a length of the first portion and a response time of the first request; and
sending a second request to the multimedia storage system over the computer network for a second portion of the multimedia data after waiting the determined amount of time.
44. A computer program product comprising:
a computer readable medium; and
information stored on the computer readable medium indicative of a program to be executed by a computer to carry out the method of:
sending a first request to a multimedia storage system over a computer network for a first portion of multimedia data;
determining an amount of time to wait based on a length of the first portion and a response time of the first request; and
sending a second request to the multimedia storage system over the computer network for a second portion of the multimedia data after waiting the determined amount of time.
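One way to realize the wait-time determination of claims 42 through 44 is to convert the first portion's length into a playback duration and subtract the observed response time. The function name, the bitrate-based conversion, and the safety margin below are illustrative assumptions; the claims only require that the wait depend on the portion length and the response time.

```python
def wait_before_next_request(portion_bytes, bitrate_bps, response_time_s,
                             margin_s=0.5):
    # The first portion plays for portion_bytes * 8 / bitrate_bps
    # seconds; subtracting the observed response time (and an assumed
    # safety margin) aims the second request so its data should arrive
    # shortly before playback of the first portion runs out.
    playback_s = portion_bytes * 8 / bitrate_bps
    return max(0.0, playback_s - response_time_s - margin_s)
```

For example, a 1.5 MB portion at 4 Mb/s plays for 3.0 seconds; with a 0.4 second response time and the assumed 0.5 second margin, the host waits about 2.1 seconds before sending the second request.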
45. A system for reading video data over a network from a file of a file system for storing at least one file in which an encoded version of video data is stored, the file system being responsive to requests to provide at least a portion of the at least one file, the system comprising:
a video host that requests from the file system over the network an initial portion of the at least one file, determines a response time for receiving the initial portion, and requests further portions of the at least one file from the file system over the network according to the response time for receiving the initial portion and a desired playback rate of the received video data.
46. A system for reading video data over a network from a file of a file system for storing at least one file in which an encoded version of video data is stored, the file system being responsive to requests to provide at least a portion of the at least one file, the system comprising:
means for requesting from the file system over the network an initial portion of the at least one file;
means for determining a response time for receiving the initial portion; and
means for requesting further portions of the at least one file from the file system over the network according to the response time for receiving the initial portion and a desired playback rate of the received video data.
47. A method for reading video data over a network from a file of a file system for storing at least one file in which an encoded version of video data is stored, the file system being responsive to requests to provide at least a portion of the at least one file, the method comprising:
requesting from the file system over the network an initial portion of the at least one file;
determining a response time for receiving the initial portion; and
requesting further portions of the at least one file from the file system over the network according to the response time for receiving the initial portion and a desired playback rate of the received video data.
48. A computer program product comprising:
a computer readable medium; and
information stored on the computer readable medium indicative of a program to be executed by a computer to carry out the method of reading video data over a network from a file of a file system for storing at least one file in which an encoded version of video data is stored, the file system being responsive to requests to provide at least a portion of the at least one file, the method comprising:
requesting from the file system over the network an initial portion of the at least one file;
determining a response time for receiving the initial portion; and
requesting further portions of the at least one file from the file system over the network according to the response time for receiving the initial portion and a desired playback rate of the received video data.
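The pacing of claims 45 through 48, requesting further portions according to the response time measured for the initial portion and the desired playback rate, can be sketched as a schedule of send times. The function name, and the choice of expressing the playback rate as seconds of playback per portion, are illustrative assumptions.

```python
def request_schedule(initial_response_s, portion_play_s, n_portions):
    # Absolute send times, in seconds from the first send.  The first
    # portion is requested at t = 0; every further request is advanced
    # by the response time observed for the initial portion, so each
    # portion is expected to arrive about when the previous one
    # finishes playing at the desired rate.  A request is never
    # scheduled earlier than the one before it.
    times = [0.0]
    for k in range(1, n_portions):
        send = k * portion_play_s - initial_response_s
        times.append(max(times[-1], send))
    return times
```

With a 0.5 second initial response time and 2.0 seconds of playback per portion, the second and third requests go out at 1.5 s and 3.5 s, each half a second ahead of when its data is needed.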
49. A system comprising:
a video host that sends a first request over a network for one or more portions of multimedia data, receives and plays the one or more portions of multimedia data corresponding to the first request, and determines a time to send a second request for one or more portions of multimedia data, the time to send a second request being determined based at least in part on an amount of time needed to be through playing the one or more portions corresponding to the first request and an amount of time taken between sending the first request and receiving the data corresponding to the first request.
50. The system of claim 49, wherein the time is determined so as to expect to begin to receive the one or more portions corresponding to the second request a predetermined amount of time before being through playing the one or more portions corresponding to the first request.
51. The system of claim 49, wherein the video host further determines the amount of one or more portions of multimedia data to request based at least in part on the amount of time needed to be through playing the one or more portions corresponding to the first request, and the amount of time taken between sending the first request and receiving the data corresponding to the first request.
52. A system comprising:
means for sending a first request over a network for one or more portions of multimedia data;
means for receiving and playing the one or more portions of multimedia data corresponding to the first request; and
means for determining a time to send a second request for one or more portions of multimedia data, the means for determining including means for determining the time to send a second request based at least in part on an amount of time needed to be through playing the one or more portions corresponding to the first request, and an amount of time taken between sending the first request and receiving the data corresponding to the first request.
53. The system of claim 52, wherein the means for determining a time to send a second request includes means for determining a time to send a second request so as to expect to begin to receive the one or more portions corresponding to the second request a predetermined amount of time before being through playing the one or more portions corresponding to the first request.
54. The system of claim 52, wherein the means for determining a time to send a second request further includes means for determining the amount of one or more portions of multimedia data to request based at least in part on the amount of time needed to be through playing the one or more portions corresponding to the first request, and the amount of time taken between sending the first request and receiving the data corresponding to the first request.
55. A method comprising:
sending a first request over a network for one or more portions of multimedia data;
receiving and playing the one or more portions of multimedia data corresponding to the first request;
determining a time to send a second request for one or more portions of multimedia data, based at least in part on an amount of time needed to be through playing the one or more portions corresponding to the first request, and an amount of time taken between sending the first request and receiving the data corresponding to the first request.
56. The method of claim 55, wherein the determining a time to send a second request includes determining a time to send a second request so as to expect to begin to receive the one or more portions corresponding to the second request a predetermined amount of time before being through playing the one or more portions corresponding to the first request.
57. The method of claim 55, wherein the determining a time to send a second request further includes determining the amount of one or more portions of multimedia data to request based at least in part on the amount of time needed to be through playing the one or more portions corresponding to the first request, and the amount of time taken between sending the first request and receiving the data corresponding to the first request.
58. A computer program product comprising:
a computer readable medium; and
data stored on the computer readable medium indicative of instructions to be executed by a computer to carry out the method of:
sending a first request over a network for one or more portions of multimedia data;
receiving and playing the one or more portions of multimedia data corresponding to the first request; and
determining a time to send a second request for one or more portions of multimedia data, based at least in part on an amount of time needed to be through playing the one or more portions corresponding to the first request, and an amount of time taken between sending the first request and receiving the data corresponding to the first request.
59. The computer program product of claim 58, wherein the determining a time to send a second request includes determining a time to send a second request so as to expect to begin to receive the one or more portions corresponding to the second request a predetermined amount of time before being through playing the one or more portions corresponding to the first request.
60. The computer program product of claim 58, wherein the determining a time to send a second request further includes determining the amount of one or more portions of multimedia data to request based at least in part on the amount of time needed to be through playing the one or more portions corresponding to the first request, and the amount of time taken between sending the first request and receiving the data corresponding to the first request.
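The timing and sizing determinations of claims 49 through 60 reduce to two small calculations: when to send the second request so its data is expected to begin arriving a predetermined lead time before playback of the current portions completes, and how much data to ask for given the playback interval and the observed latency. The function names, the fixed lead margin, and the bitrate-based sizing below are illustrative assumptions.

```python
def next_request_time(play_done_at_s, observed_latency_s, lead_s=1.0):
    # Send the second request so its data is expected to begin arriving
    # lead_s seconds before the currently playing portions finish (the
    # claim-50 refinement); lead_s is an assumed fixed margin.
    return play_done_at_s - observed_latency_s - lead_s

def next_request_amount(portion_play_s, observed_latency_s, bitrate_bps):
    # One way to size the next request (the claim-51 refinement): ask
    # for enough bytes to cover the playback interval plus the latency.
    return int((portion_play_s + observed_latency_s) * bitrate_bps / 8)
```

If playback of the first portions finishes at t = 30.0 s and the observed request-to-receive latency was 0.5 s, the second request goes out at t = 28.5 s, and at 4 Mb/s the host would ask for about 1.25 MB to cover a 2.0 second playback interval plus that latency.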
61. A system that receives a multimedia data stream and stores an encoded version of the multimedia data stream in a file, and while receiving and storing, prompts a user to select any one or more portions of the encoded version of the multimedia data stream from the file, receives an indication of the user's selection of one or more portions of the encoded version of the multimedia data from the file, and, further while receiving and storing, interactively plays for the user the selected one or more portions of the encoded version of the multimedia data from the file.
62. The system of claim 61, wherein the system receives a plurality of multimedia data streams and stores a corresponding plurality of encoded versions of multimedia data streams in a corresponding plurality of files, and while receiving and storing, prompts a user to select any one or more portions of the encoded versions of the multimedia data streams from the files, receives an indication of the user's selection of any one or more portions of two or more of the encoded versions of the multimedia data streams from the files, and further while receiving and storing, interactively plays for the user, the selected one or more portions of the two or more of the encoded versions of the multimedia data streams from the files.
63. A system comprising:
means for receiving a multimedia data stream and storing an encoded version of the multimedia data stream in a file;
means for prompting, while receiving and storing, a user to select any one or more portions of the encoded version of the multimedia data stream from the file;
means for receiving an indication of the user's selection of one or more portions of the encoded version of the multimedia data from the file; and
means for interactively playing, while receiving and storing, the selected one or more portions of the encoded version of the multimedia data from the file for the user.
64. The system of claim 63, wherein the means for receiving a multimedia data stream and storing includes means for receiving a plurality of multimedia data streams and storing a corresponding plurality of encoded versions of multimedia data streams in a corresponding plurality of files, and wherein the means for prompting includes means for prompting, while receiving and storing, a user to select any one or more portions of the encoded versions of the multimedia data streams from the files, and wherein the means for receiving an indication includes means for receiving an indication of a user's selection of any one or more portions of two or more of the encoded versions of the multimedia data streams from the files, and wherein the means for interactively playing further includes means for interactively playing, while receiving and storing, the selected one or more portions of the two or more of the encoded versions of the multimedia data streams from the files.
65. A method comprising:
receiving a multimedia data stream and storing an encoded version of the multimedia data stream in a file;
prompting, while receiving and storing, a user to select any one or more portions of the encoded version of the multimedia data stream from the file;
receiving an indication of the user's selection of one or more portions of the encoded version of the multimedia data from the file; and
interactively playing, while receiving and storing, the selected one or more portions of the encoded version of the multimedia data from the file for the user.
66. The method of claim 65, wherein the receiving a multimedia data stream and storing includes receiving a plurality of multimedia data streams and storing a corresponding plurality of encoded versions of multimedia data streams in a corresponding plurality of files, and wherein the prompting includes prompting, while receiving and storing, a user to select any one or more portions of the encoded versions of the multimedia data streams from the files, and wherein the receiving an indication includes receiving an indication of a user's selection of any one or more portions of two or more of the encoded versions of the multimedia data streams from the files, and wherein the interactively playing further includes interactively playing, while receiving and storing, the selected one or more portions of the two or more of the encoded versions of the multimedia data streams from the files.
67. A computer program product comprising:
a computer readable medium; and
data stored on the computer readable medium indicative of a set of instructions to be executed by a computer to carry out a method of:
receiving a multimedia data stream and storing an encoded version of the multimedia data stream in a file;
prompting, while receiving and storing, a user to select any one or more portions of the encoded version of the multimedia data stream from the file;
receiving an indication of the user's selection of one or more portions of the encoded version of the multimedia data from the file; and
interactively playing, while receiving and storing, the selected one or more portions of the encoded version of the multimedia data from the file for the user.
68. The computer program product of claim 67, wherein the receiving a multimedia data stream and storing includes receiving a plurality of multimedia data streams and storing a corresponding plurality of encoded versions of multimedia data streams in a corresponding plurality of files, and wherein the prompting includes prompting, while receiving and storing, a user to select any one or more portions of the encoded versions of the multimedia data streams from the files, and wherein the receiving an indication includes receiving an indication of a user's selection of any one or more portions of two or more of the encoded versions of the multimedia data streams from the files, and wherein the interactively playing further includes interactively playing, while receiving and storing, the selected one or more portions of the two or more of the encoded versions of the multimedia data streams from the files.
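The record-while-playback behavior recited in claims 61 through 68, storing an incoming stream while the user selects and interactively plays any already-stored portions, can be sketched with a minimal in-memory store. The `LiveRecording` class and its method names are invented for illustration, and the prompting user interface is omitted.

```python
class LiveRecording:
    # Minimal stand-in for the claimed system: chunks of an incoming
    # stream are stored while any already-stored portion can be
    # selected and played back, in any order, as ingest continues.

    def __init__(self):
        self._chunks = []        # encoded stream, in arrival order

    def ingest(self, chunk):     # called as the stream keeps arriving
        self._chunks.append(chunk)

    def portions(self):          # indices the user may select from
        return list(range(len(self._chunks)))

    def play(self, selection):   # interactive playback of a selection
        return b"".join(self._chunks[i] for i in selection)

rec = LiveRecording()
rec.ingest(b"A")                 # recording is under way...
rec.ingest(b"B")
clip = rec.play([1, 0])          # ...while the user replays portions
rec.ingest(b"C")                 # and ingest continues afterwards
```

Because playback only indexes into already-stored chunks, selections can reorder or repeat portions without disturbing the ongoing recording.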
US09/804,946 1997-04-04 2001-03-13 Multimedia system with improved data management mechanisms Abandoned US20030088877A1 (en)

Priority Applications (1)

US09/804,946 (US20030088877A1), "Multimedia system with improved data management mechanisms": priority date 1997-04-04, filing date 2001-03-13.

Applications Claiming Priority (5)

US83286897A: priority date 1997-04-04, filing date 1997-04-04.
US1994598A: priority date 1998-02-06, filing date 1998-02-06.
US17381598A: priority date 1998-10-16, filing date 1998-10-16.
US09/322,810 (US6211869B1), "Simultaneous storage and network transmission of multimedia data with video host that requests stored data according to response time from a server": priority date 1997-04-04, filing date 1999-05-27.
US09/804,946 (US20030088877A1), "Multimedia system with improved data management mechanisms": priority date 1997-04-04, filing date 2001-03-13.

Related Parent Applications (1)

US09/322,810 (US6211869B1), division parent, "Simultaneous storage and network transmission of multimedia data with video host that requests stored data according to response time from a server": priority date 1997-04-04, filing date 1999-05-27.

Publications (1)

US20030088877A1, published 2003-05-08.

Family

ID=27361329

Family Applications (2)

US09/322,810 (US6211869B1, Expired - Lifetime), "Simultaneous storage and network transmission of multimedia data with video host that requests stored data according to response time from a server": priority date 1997-04-04, filing date 1999-05-27.
US09/804,946 (US20030088877A1, Abandoned), "Multimedia system with improved data management mechanisms": priority date 1997-04-04, filing date 2001-03-13.

Family Applications Before (1)

US09/322,810 (US6211869B1, Expired - Lifetime), "Simultaneous storage and network transmission of multimedia data with video host that requests stored data according to response time from a server": priority date 1997-04-04, filing date 1999-05-27.

Country Status (1)

US (2): US6211869B1 (en)

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020087970A1 (en) * 2000-04-05 2002-07-04 Dorricott Martin Rex Electronic media distribution
US20030078958A1 (en) * 2000-09-01 2003-04-24 Pace Charles P. Method and system for deploying an asset over a multi-tiered network
US20030112268A1 (en) * 2001-09-11 2003-06-19 Sony Corporation Device for producing multimedia presentation
US20030200212A1 (en) * 2002-04-23 2003-10-23 International Business Machiness Corporation Method, system, and program product for transaction management in a distributed content management application
US20050069296A1 (en) * 2003-09-30 2005-03-31 Victor Company Of Japan, Ltd. Video recording apparatus and method, and edit-data forming apparatus, method and program
US20050117055A1 (en) * 2003-12-01 2005-06-02 Sharp Laboratories Of America, Inc. Low-latency random access to compressed video
US20050213935A1 (en) * 2003-04-04 2005-09-29 Sony Corporation Data processing method, device thereof, video recording device
US20050215432A1 (en) * 2001-09-28 2005-09-29 Christian Schlatter Aqueous neonicotinoid compositions for seed treatment
US20050262536A1 (en) * 2004-05-18 2005-11-24 Kaoru Urata Video data reproducing apparatus, video data reproducing method, video data transfer system and data transfer method for video data transfer system
US20060067654A1 (en) * 2004-09-24 2006-03-30 Magix Ag Graphical user interface adaptable to multiple display devices
US20060136457A1 (en) * 2004-11-29 2006-06-22 Park Seung W Method for supporting scalable progressive downloading of video signal
US20060156219A1 (en) * 2001-06-27 2006-07-13 Mci, Llc. Method and system for providing distributed editing and storage of digital media over a network
US20060236221A1 (en) * 2001-06-27 2006-10-19 Mci, Llc. Method and system for providing digital media management using templates and profiles
US20060253542A1 (en) * 2000-06-28 2006-11-09 Mccausland Douglas Method and system for providing end user community functionality for publication and delivery of digital media content
US20070089151A1 (en) * 2001-06-27 2007-04-19 Mci, Llc. Method and system for delivery of digital media experience via common instant communication clients
US20070106419A1 (en) * 2005-09-07 2007-05-10 Verizon Business Network Services Inc. Method and system for video monitoring
US20070107012A1 (en) * 2005-09-07 2007-05-10 Verizon Business Network Services Inc. Method and apparatus for providing on-demand resource allocation
US20070106680A1 (en) * 2001-06-27 2007-05-10 Mci, Llc. Digital media asset management system and method for supporting multiple users
US20070113184A1 (en) * 2001-06-27 2007-05-17 Mci, Llc. Method and system for providing remote digital media ingest with centralized editorial control
US20070127667A1 (en) * 2005-09-07 2007-06-07 Verizon Business Network Services Inc. Method and apparatus for providing remote workflow management
US20070260968A1 (en) * 2004-04-16 2007-11-08 Howard Johnathon E Editing system for audiovisual works and corresponding text for television news
US20080005205A1 (en) * 2006-06-30 2008-01-03 Broadcom Corporation Fast and efficient method for deleting very large files from a filesystem
US20080005206A1 (en) * 2006-06-30 2008-01-03 Broadcom Corporation Method for automatically managing disk fragmentation
US20080273862A1 (en) * 2004-10-14 2008-11-06 Keishi Okamoto Recording Apparatus, Editing Apparatus, Digital Video Recording System, and File Format
US20090103835A1 (en) * 2006-01-13 2009-04-23 Yahoo! Inc. Method and system for combining edit information with media content
US20100135383A1 (en) * 2008-11-28 2010-06-03 Microsoft Corporation Encoder with multiple re-entry and exit points
US20110102670A1 (en) * 2008-07-10 2011-05-05 Panasonic Corporation Audio video recording device
US20110286533A1 (en) * 2010-02-23 2011-11-24 Fortney Douglas P Integrated recording and video on demand playback system
US20120266203A1 (en) * 2011-04-13 2012-10-18 Dalet, S.A. Ingest-once write-many broadcast video production system
US20120317302A1 (en) * 2011-04-11 2012-12-13 Vince Silvestri Methods and systems for network based video clip generation and management
US20130091431A1 (en) * 2011-10-05 2013-04-11 Microsoft Corporation Video clip selector
US20140007172A1 (en) * 2012-06-29 2014-01-02 Samsung Electronics Co. Ltd. Method and apparatus for transmitting/receiving adaptive media in a multimedia system
US8775480B2 (en) 2011-01-28 2014-07-08 Apple Inc. Media clip management
US9401080B2 (en) 2005-09-07 2016-07-26 Verizon Patent And Licensing Inc. Method and apparatus for synchronizing video frames
US9871842B2 (en) 2012-12-08 2018-01-16 Evertz Microsystems Ltd. Methods and systems for network based video clip processing and management
US9997196B2 (en) 2011-02-16 2018-06-12 Apple Inc. Retiming media presentations
US10324605B2 (en) 2011-02-16 2019-06-18 Apple Inc. Media-editing application with novel editing tools
US10534525B1 (en) * 2014-12-09 2020-01-14 Amazon Technologies, Inc. Media editing system optimized for distributed computing systems
US11747972B2 (en) 2011-02-16 2023-09-05 Apple Inc. Media-editing application with novel editing tools

Families Citing this family (110)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9714624D0 (en) * 1997-07-12 1997-09-17 Trevor Burke Technology Limite Visual programme distribution system
US20050039177A1 (en) * 1997-07-12 2005-02-17 Trevor Burke Technology Limited Method and apparatus for programme generation and presentation
GB0225339D0 (en) * 2002-10-31 2002-12-11 Trevor Burke Technology Ltd Method and apparatus for programme generation and classification
JP4086344B2 (en) * 1997-07-31 2008-05-14 キヤノン株式会社 Image transmitting apparatus and control method
US6360234B2 (en) 1997-08-14 2002-03-19 Virage, Inc. Video cataloger system with synchronized encoders
US7263659B2 (en) * 1998-09-09 2007-08-28 Ricoh Company, Ltd. Paper-based interface for multimedia information
US7596755B2 (en) * 1997-12-22 2009-09-29 Ricoh Company, Ltd. Multimedia visualization and integration environment
US7954056B2 (en) * 1997-12-22 2011-05-31 Ricoh Company, Ltd. Television-based visualization and navigation interface
US6763523B1 (en) * 1998-04-03 2004-07-13 Avid Technology, Inc. Intelligent transfer of multimedia data files from an editing system to a playback device
US6833865B1 (en) * 1998-09-01 2004-12-21 Virage, Inc. Embedded metadata engines in digital capture devices
US7263671B2 (en) 1998-09-09 2007-08-28 Ricoh Company, Ltd. Techniques for annotating multimedia information
US7215436B2 (en) * 1998-09-09 2007-05-08 Ricoh Company, Ltd. Device for generating a multimedia paper document
US7266782B2 (en) * 1998-09-09 2007-09-04 Ricoh Company, Ltd. Techniques for generating a coversheet for a paper-based interface for multimedia information
US8560951B1 (en) * 1998-12-18 2013-10-15 Thomson Licensing System and method for real time video production and distribution
US6452612B1 (en) * 1998-12-18 2002-09-17 Parkervision, Inc. Real time video production system and method
US20040027368A1 (en) * 2002-05-09 2004-02-12 Parkervision, Inc. Time sheet for real time video production system and method
US20030001880A1 (en) * 2001-04-18 2003-01-02 Parkervision, Inc. Method, system, and computer program product for producing and distributing enhanced media
US11109114B2 (en) 2001-04-18 2021-08-31 Grass Valley Canada Advertisement management method, system, and computer program product
US7024677B1 (en) 1998-12-18 2006-04-04 Thomson Licensing System and method for real time video production and multicasting
US9123380B2 (en) 1998-12-18 2015-09-01 Gvbb Holdings S.A.R.L. Systems, methods, and computer program products for automated real-time execution of live inserts of repurposed stored content distribution, and multiple aspect ratio automated simulcast production
US7835920B2 (en) * 1998-12-18 2010-11-16 Thomson Licensing Director interface for production automation control
US6760916B2 (en) * 2000-01-14 2004-07-06 Parkervision, Inc. Method, system and computer program product for producing and distributing enhanced media downstreams
US6909874B2 (en) * 2000-04-12 2005-06-21 Thomson Licensing Sa. Interactive tutorial method, system, and computer program product for real time media production
US6952221B1 (en) 1998-12-18 2005-10-04 Thomson Licensing S.A. System and method for real time video production and distribution
US6532218B1 (en) * 1999-04-05 2003-03-11 Siemens Information & Communication Networks, Inc. System and method for multimedia collaborative conferencing
US6766357B1 (en) * 1999-04-15 2004-07-20 Avid Technology, Inc. Apparatus and method for efficient transfer of multimedia data for playback
EP1194872A1 (en) * 1999-06-11 2002-04-10 CCI Europe A/S (Stibo A/S) A content management computer system for managing publishing content objects
US6745368B1 (en) 1999-06-11 2004-06-01 Liberate Technologies Methods, apparatus, and systems for storing, retrieving and playing multimedia data
US7996878B1 (en) 1999-08-31 2011-08-09 At&T Intellectual Property Ii, L.P. System and method for generating coded video sequences from still media
FI112427B (en) * 1999-11-05 2003-11-28 Nokia Corp A method for determining the capabilities of a wireless terminal in a multimedia messaging service, a multimedia messaging service, and a multimedia terminal
US6868440B1 (en) 2000-02-04 2005-03-15 Microsoft Corporation Multi-level skimming of multimedia content using playlists
JP2001238193A (en) * 2000-02-18 2001-08-31 Sony Corp Video display device and video supply method
US6785739B1 (en) * 2000-02-23 2004-08-31 Eastman Kodak Company Data storage and retrieval playback apparatus for a still image receiver
US7788686B1 (en) * 2000-03-01 2010-08-31 Andrews Christopher C Method of and apparatus for describing, promoting, publishing, aggregating, distributing and accessing live content information
US7260564B1 (en) * 2000-04-07 2007-08-21 Virage, Inc. Network video guide and spidering
US7222163B1 (en) * 2000-04-07 2007-05-22 Virage, Inc. System and method for hosting of video content over a network
US7962948B1 (en) 2000-04-07 2011-06-14 Virage, Inc. Video-enabled community building
US8171509B1 (en) 2000-04-07 2012-05-01 Virage, Inc. System and method for applying a database to video multimedia
FI112307B (en) * 2000-08-02 2003-11-14 Nokia Corp communication Server
EP1325408A2 (en) * 2000-10-06 2003-07-09 Ampex Corporation System and method for transferring data between recording devices
WO2002043396A2 (en) * 2000-11-27 2002-05-30 Intellocity Usa, Inc. System and method for providing an omnimedia package
US20020063715A1 (en) * 2000-11-30 2002-05-30 Silicon Graphics, Inc. System, method, and computer program product for capturing a visualization session
US20020108115A1 (en) * 2000-12-11 2002-08-08 The Associated Press News and other information delivery system and method
US7054887B2 (en) * 2001-01-30 2006-05-30 IBM Corporation Method and system for object replication in a content management system
US7280738B2 (en) 2001-04-09 2007-10-09 International Business Machines Corporation Method and system for specifying a selection of content segments stored in different formats
US6870887B2 (en) * 2001-04-09 2005-03-22 International Business Machines Corporation Method and system for synchronization between different content encoding formats
US6996393B2 (en) * 2001-08-31 2006-02-07 Nokia Corporation Mobile content delivery system
GB2416099B (en) * 2001-10-24 2006-05-31 Accenture Global Services GmbH Data processing system and method
US7747655B2 (en) * 2001-11-19 2010-06-29 Ricoh Co. Ltd. Printable representations for time-based media
US8635531B2 (en) * 2002-02-21 2014-01-21 Ricoh Company, Ltd. Techniques for displaying information stored in multiple multimedia documents
US7861169B2 (en) 2001-11-19 2010-12-28 Ricoh Co. Ltd. Multimedia print driver dialog interfaces
US7743347B2 (en) * 2001-11-19 2010-06-22 Ricoh Company, Ltd. Paper-based interface for specifying ranges
US7788080B2 (en) * 2001-11-19 2010-08-31 Ricoh Company, Ltd. Paper interface for simulation environments
US7149957B2 (en) 2001-11-19 2006-12-12 Ricoh Company, Ltd. Techniques for retrieving multimedia information using a paper-based interface
US7703044B2 (en) 2001-11-19 2010-04-20 Ricoh Company, Ltd. Techniques for generating a static representation for time-based media information
US8539344B2 (en) * 2001-11-19 2013-09-17 Ricoh Company, Ltd. Paper-based interface for multimedia information stored by multiple multimedia documents
US7495795B2 (en) * 2002-02-21 2009-02-24 Ricoh Company, Ltd. Interface for printing multimedia information
US6820116B1 (en) 2001-12-21 2004-11-16 Nokia Corporation Mobile browsing booster system
GB2387086A (en) * 2002-03-25 2003-10-01 Sony UK Ltd System
GB2387087A (en) * 2002-03-25 2003-10-01 Sony UK Ltd System
US8213917B2 (en) 2006-05-05 2012-07-03 Waloomba Tech Ltd., L.L.C. Reusable multimodal application
AU2003228529A1 (en) * 2002-04-15 2003-11-03 Lakeview Capital Trust Method and system for internet-based interactive television
US20040032486A1 (en) * 2002-08-16 2004-02-19 Shusman Chad W. Method and apparatus for interactive programming using captioning
US20040210947A1 (en) 2003-04-15 2004-10-21 Shusman Chad W. Method and apparatus for interactive video on demand
US20030196206A1 (en) 2002-04-15 2003-10-16 Shusman Chad W. Method and apparatus for internet-based interactive programming
WO2004049709A1 (en) * 2002-11-22 2004-06-10 Sony Corporation System and method for referencing AV data accumulated in AV server
US20040166798A1 (en) * 2003-02-25 2004-08-26 Shusman Chad W. Method and apparatus for generating an interactive radio program
GB2400254A (en) * 2003-03-31 2004-10-06 Sony UK Ltd Video processing
US20040234934A1 (en) * 2003-05-23 2004-11-25 Kevin Shin Educational and training system
US7330733B2 (en) 2003-07-08 2008-02-12 Motorola, Inc. Method and apparatus for reducing paging-related delays for anticipated target mobile stations
JP4117616B2 (en) * 2003-07-28 2008-07-16 Sony Corporation Editing system, control method thereof and editing apparatus
GB2406014B (en) * 2003-09-10 2007-01-31 Thales UK Plc Video system
JP2005094391A (en) * 2003-09-18 2005-04-07 Pioneer Electronic Corp Device, method and program for editing and recording data, and recording medium having data editing and recording program recorded thereon
US7882436B2 (en) * 2004-03-10 2011-02-01 Trevor Burke Technology Limited Distribution of video data
US7779355B1 (en) 2004-03-30 2010-08-17 Ricoh Company, Ltd. Techniques for using paper documents as media templates
US8843978B2 (en) * 2004-06-29 2014-09-23 Time Warner Cable Enterprises Llc Method and apparatus for network bandwidth allocation
US7567565B2 (en) 2005-02-01 2009-07-28 Time Warner Cable Inc. Method and apparatus for network bandwidth conservation
WO2006089140A2 (en) * 2005-02-15 2006-08-24 Cuvid Technologies Method and apparatus for producing re-customizable multi-media
JP2006331591A (en) * 2005-05-30 2006-12-07 Sony Corp Information processor and method, and program
US7835158B2 (en) * 2005-12-30 2010-11-16 Micron Technology, Inc. Connection verification technique
EP1929407A4 (en) * 2006-01-13 2009-09-23 Yahoo Inc Method and system for online remixing of digital multimedia
US8170065B2 (en) 2006-02-27 2012-05-01 Time Warner Cable Inc. Methods and apparatus for selecting digital access technology for programming and data delivery
US8458753B2 (en) 2006-02-27 2013-06-04 Time Warner Cable Enterprises Llc Methods and apparatus for device capabilities discovery and utilization within a content-based network
EP2005325A4 (en) * 2006-04-10 2009-10-28 Yahoo Inc Video generation based on aggregate user data
US7962937B2 (en) * 2006-08-01 2011-06-14 Microsoft Corporation Media content catalog service
US20080235746A1 (en) 2007-03-20 2008-09-25 Michael James Peters Methods and apparatus for content delivery and replacement in a network
US9071859B2 (en) 2007-09-26 2015-06-30 Time Warner Cable Enterprises Llc Methods and apparatus for user-based targeted content delivery
US8561116B2 (en) 2007-09-26 2013-10-15 Charles A. Hasek Methods and apparatus for content caching in a video network
US8099757B2 (en) 2007-10-15 2012-01-17 Time Warner Cable Inc. Methods and apparatus for revenue-optimized delivery of content in a network
US20090182712A1 (en) * 2008-01-15 2009-07-16 Kamal Faiza H Systems and methods for rapid delivery of media content
US8813143B2 (en) 2008-02-26 2014-08-19 Time Warner Enterprises LLC Methods and apparatus for business-based network resource allocation
JP4672788B2 (en) * 2008-09-16 2011-04-20 Toshiba Corporation Video data processing system, video server, gateway server, and video data management method
KR20100059011A (en) * 2008-11-25 2010-06-04 Samsung Electronics Co., Ltd. Broadcast receiver for providing time-shifted broadcast signal, broadcast output apparatus, and method for providing time-shifted image
AU2008255228B8 (en) * 2008-12-10 2012-02-16 Canon Kabushiki Kaisha Method of selecting a frame from motion video
FR2940481B1 (en) * 2008-12-23 2011-07-29 Thales SA Method, device and system for editing enriched media
US9866609B2 (en) 2009-06-08 2018-01-09 Time Warner Cable Enterprises Llc Methods and apparatus for premises content distribution
US20120030550A1 (en) * 2010-07-28 2012-02-02 Chin Ai Method for editing multimedia
US9767195B2 (en) 2011-04-21 2017-09-19 Touchstream Technologies, Inc. Virtualized hosting and displaying of content using a swappable media player
US20130031589A1 (en) * 2011-07-27 2013-01-31 Xavier Casanova Multiple resolution scannable video
US9063938B2 (en) * 2012-03-30 2015-06-23 Commvault Systems, Inc. Search filtered file system using secondary storage, including multi-dimensional indexing and searching of archived files
US9639297B2 (en) 2012-03-30 2017-05-02 Commvault Systems, Inc. Shared network-available storage that permits concurrent data access
US9854280B2 (en) 2012-07-10 2017-12-26 Time Warner Cable Enterprises Llc Apparatus and methods for selective enforcement of secondary content viewing
US9131283B2 (en) 2012-12-14 2015-09-08 Time Warner Cable Enterprises Llc Apparatus and methods for multimedia coordination
US8976223B1 (en) * 2012-12-21 2015-03-10 Google Inc. Speaker switching in multiway conversation
US20150296169A1 (en) * 2014-04-14 2015-10-15 Lamie Saif Time-Space Storage Solution (TSSS)
US10687115B2 (en) 2016-06-01 2020-06-16 Time Warner Cable Enterprises Llc Cloud-based digital content recorder apparatus and methods
US10911794B2 (en) 2016-11-09 2021-02-02 Charter Communications Operating, Llc Apparatus and methods for selective secondary content insertion in a digital network
US11109290B2 (en) 2017-08-04 2021-08-31 Charter Communications Operating, Llc Switching connections over frequency bands of a wireless network
US10939142B2 (en) 2018-02-27 2021-03-02 Charter Communications Operating, Llc Apparatus and methods for content storage, distribution and security within a content distribution network
EP4143699A1 (en) * 2020-04-28 2023-03-08 Editshare, LLC Heterogeneous media editing across storage platforms

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8617076D0 (en) * 1986-07-14 1986-08-20 British Broadcasting Corp Video scanning systems
US5426513A (en) * 1989-06-16 1995-06-20 Harris Corporation Prioritized image transmission system and method
US5706290A (en) * 1994-12-15 1998-01-06 Shaw; Venson Method and apparatus including system architecture for multimedia communication
US5623690A (en) 1992-06-03 1997-04-22 Digital Equipment Corporation Audio/video storage and retrieval for multimedia workstations by interleaving audio and video data in data file
US5581784A (en) 1992-11-17 1996-12-03 Starlight Networks Method for performing I/O's in a storage system to maintain the continuity of a plurality of video streams
DE69428180T2 (en) 1993-12-13 2002-05-02 Sony Corp Cutting systems and processes
JPH07231309A (en) * 1994-02-17 1995-08-29 Hitachi Ltd Information distribution system
US5710895A (en) * 1994-03-22 1998-01-20 Intel Corporation Method and apparatus for capturing and compressing video data in real time
US5732239A (en) 1994-05-19 1998-03-24 Starlight Networks Method for operating a disk storage system which stores video data so as to maintain the continuity of a plurality of video streams
US5642171A (en) 1994-06-08 1997-06-24 Dell Usa, L.P. Method and apparatus for synchronizing audio and video data streams in a multimedia system

Cited By (75)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020087970A1 (en) * 2000-04-05 2002-07-04 Dorricott Martin Rex Electronic media distribution
US20060253542A1 (en) * 2000-06-28 2006-11-09 Mccausland Douglas Method and system for providing end user community functionality for publication and delivery of digital media content
US9038108B2 (en) 2000-06-28 2015-05-19 Verizon Patent And Licensing Inc. Method and system for providing end user community functionality for publication and delivery of digital media content
US20030078958A1 (en) * 2000-09-01 2003-04-24 Pace Charles P. Method and system for deploying an asset over a multi-tiered network
US7209921B2 (en) * 2000-09-01 2007-04-24 Op40, Inc. Method and system for deploying an asset over a multi-tiered network
US8977108B2 (en) 2001-06-27 2015-03-10 Verizon Patent And Licensing Inc. Digital media asset management system and method for supporting multiple users
US20070089151A1 (en) * 2001-06-27 2007-04-19 Mci, Llc. Method and system for delivery of digital media experience via common instant communication clients
US7970260B2 (en) 2001-06-27 2011-06-28 Verizon Business Global Llc Digital media asset management system and method for supporting multiple users
US20110217023A1 (en) * 2001-06-27 2011-09-08 Verizon Business Global Llc Digital media asset management system and method for supporting multiple users
US8972862B2 (en) * 2001-06-27 2015-03-03 Verizon Patent And Licensing Inc. Method and system for providing remote digital media ingest with centralized editorial control
US8990214B2 (en) 2001-06-27 2015-03-24 Verizon Patent And Licensing Inc. Method and system for providing distributed editing and storage of digital media over a network
US20070113184A1 (en) * 2001-06-27 2007-05-17 Mci, Llc. Method and system for providing remote digital media ingest with centralized editorial control
US20060156219A1 (en) * 2001-06-27 2006-07-13 Mci, Llc. Method and system for providing distributed editing and storage of digital media over a network
US20070106680A1 (en) * 2001-06-27 2007-05-10 Mci, Llc. Digital media asset management system and method for supporting multiple users
US20060236221A1 (en) * 2001-06-27 2006-10-19 Mci, Llc. Method and system for providing digital media management using templates and profiles
US7120859B2 (en) * 2001-09-11 2006-10-10 Sony Corporation Device for producing multimedia presentation
US20030112268A1 (en) * 2001-09-11 2003-06-19 Sony Corporation Device for producing multimedia presentation
US20050215432A1 (en) * 2001-09-28 2005-09-29 Christian Schlatter Aqueous neonicotinoid compositions for seed treatment
US6873995B2 (en) * 2002-04-23 2005-03-29 International Business Machines Corporation Method, system, and program product for transaction management in a distributed content management application
US20030200212A1 (en) * 2002-04-23 2003-10-23 International Business Machines Corporation Method, system, and program product for transaction management in a distributed content management application
US20050213935A1 (en) * 2003-04-04 2005-09-29 Sony Corporation Data processing method, device thereof, video recording device
US7587123B2 (en) * 2003-04-04 2009-09-08 Sony Corporation Method of processing data, system of the same and video recording system
US7450822B2 (en) * 2003-09-30 2008-11-11 Victor Company Of Japan, Ltd. Video recording apparatus and method, and edit-data forming apparatus, method and program
US20050069296A1 (en) * 2003-09-30 2005-03-31 Victor Company Of Japan, Ltd. Video recording apparatus and method, and edit-data forming apparatus, method and program
US20050117055A1 (en) * 2003-12-01 2005-06-02 Sharp Laboratories Of America, Inc. Low-latency random access to compressed video
US8327411B2 (en) * 2003-12-01 2012-12-04 Sharp Laboratories Of America, Inc. Low-latency random access to compressed video
US7836389B2 (en) * 2004-04-16 2010-11-16 Avid Technology, Inc. Editing system for audiovisual works and corresponding text for television news
US20070260968A1 (en) * 2004-04-16 2007-11-08 Howard Johnathon E Editing system for audiovisual works and corresponding text for television news
US20050262536A1 (en) * 2004-05-18 2005-11-24 Kaoru Urata Video data reproducing apparatus, video data reproducing method, video data transfer system and data transfer method for video data transfer system
US20060067654A1 (en) * 2004-09-24 2006-03-30 Magix Ag Graphical user interface adaptable to multiple display devices
US20080273862A1 (en) * 2004-10-14 2008-11-06 Keishi Okamoto Recording Apparatus, Editing Apparatus, Digital Video Recording System, and File Format
US7813620B2 (en) 2004-10-14 2010-10-12 Panasonic Corporation Recording apparatus, editing apparatus, digital video recording system, and file format
US8635356B2 (en) * 2004-11-29 2014-01-21 Lg Electronics Inc. Method for supporting scalable progressive downloading of video signal
US20060136457A1 (en) * 2004-11-29 2006-06-22 Park Seung W Method for supporting scalable progressive downloading of video signal
US20070106419A1 (en) * 2005-09-07 2007-05-10 Verizon Business Network Services Inc. Method and system for video monitoring
US20070127667A1 (en) * 2005-09-07 2007-06-07 Verizon Business Network Services Inc. Method and apparatus for providing remote workflow management
US9401080B2 (en) 2005-09-07 2016-07-26 Verizon Patent And Licensing Inc. Method and apparatus for synchronizing video frames
US9076311B2 (en) 2005-09-07 2015-07-07 Verizon Patent And Licensing Inc. Method and apparatus for providing remote workflow management
US20070107012A1 (en) * 2005-09-07 2007-05-10 Verizon Business Network Services Inc. Method and apparatus for providing on-demand resource allocation
US8631226B2 (en) 2005-09-07 2014-01-14 Verizon Patent And Licensing Inc. Method and system for video monitoring
US20090103835A1 (en) * 2006-01-13 2009-04-23 Yahoo! Inc. Method and system for combining edit information with media content
US20100290755A1 (en) * 2006-06-30 2010-11-18 Broadcom Corporation Fast and efficient method for deleting very large files from a filesystem
US7860896B2 (en) 2006-06-30 2010-12-28 Broadcom Corporation Method for automatically managing disk fragmentation
US7660837B2 (en) 2006-06-30 2010-02-09 Broadcom Corporation Method for automatically managing disk fragmentation
US7966351B2 (en) 2006-06-30 2011-06-21 Broadcom Corporation Fast and efficient method for deleting very large files from a filesystem
US20080005205A1 (en) * 2006-06-30 2008-01-03 Broadcom Corporation Fast and efficient method for deleting very large files from a filesystem
US20080005206A1 (en) * 2006-06-30 2008-01-03 Broadcom Corporation Method for automatically managing disk fragmentation
US7765244B2 (en) * 2006-06-30 2010-07-27 Broadcom Corporation Fast and efficient method for deleting very large files from a filesystem
US9282291B2 (en) * 2008-07-10 2016-03-08 Socionext Inc. Audio video recording device
US20110102670A1 (en) * 2008-07-10 2011-05-05 Panasonic Corporation Audio video recording device
US8320448B2 (en) 2008-11-28 2012-11-27 Microsoft Corporation Encoder with multiple re-entry and exit points
US20100135383A1 (en) * 2008-11-28 2010-06-03 Microsoft Corporation Encoder with multiple re-entry and exit points
US20110286533A1 (en) * 2010-02-23 2011-11-24 Fortney Douglas P Integrated recording and video on demand playback system
US8954477B2 (en) 2011-01-28 2015-02-10 Apple Inc. Data structures for a media-editing application
US9870802B2 (en) 2011-01-28 2018-01-16 Apple Inc. Media clip management
US9099161B2 (en) 2011-01-28 2015-08-04 Apple Inc. Media-editing application with multiple resolution modes
US9251855B2 (en) 2011-01-28 2016-02-02 Apple Inc. Efficient media processing
US8886015B2 (en) 2011-01-28 2014-11-11 Apple Inc. Efficient media import
US8775480B2 (en) 2011-01-28 2014-07-08 Apple Inc. Media clip management
US11747972B2 (en) 2011-02-16 2023-09-05 Apple Inc. Media-editing application with novel editing tools
US9997196B2 (en) 2011-02-16 2018-06-12 Apple Inc. Retiming media presentations
US10324605B2 (en) 2011-02-16 2019-06-18 Apple Inc. Media-editing application with novel editing tools
US11157154B2 (en) 2011-02-16 2021-10-26 Apple Inc. Media-editing application with novel editing tools
US20120317302A1 (en) * 2011-04-11 2012-12-13 Vince Silvestri Methods and systems for network based video clip generation and management
US9996615B2 (en) 2011-04-11 2018-06-12 Evertz Microsystems Ltd. Methods and systems for network based video clip generation and management
US10078695B2 (en) * 2011-04-11 2018-09-18 Evertz Microsystems Ltd. Methods and systems for network based video clip generation and management
US10575031B2 (en) 2011-04-11 2020-02-25 Evertz Microsystems Ltd. Methods and systems for network based video clip generation and management
US11240538B2 (en) 2011-04-11 2022-02-01 Evertz Microsystems Ltd. Methods and systems for network based video clip generation and management
US20120266203A1 (en) * 2011-04-13 2012-10-18 Dalet, S.A. Ingest-once write-many broadcast video production system
US20130091431A1 (en) * 2011-10-05 2013-04-11 Microsoft Corporation Video clip selector
WO2014003515A1 (en) * 2012-06-29 2014-01-03 Samsung Electronics Co., Ltd. Method and apparatus for transmitting adaptive media structure in multimedia system
US20140007172A1 (en) * 2012-06-29 2014-01-02 Samsung Electronics Co. Ltd. Method and apparatus for transmitting/receiving adaptive media in a multimedia system
US9871842B2 (en) 2012-12-08 2018-01-16 Evertz Microsystems Ltd. Methods and systems for network based video clip processing and management
US10542058B2 (en) 2012-12-08 2020-01-21 Evertz Microsystems Ltd. Methods and systems for network based video clip processing and management
US10534525B1 (en) * 2014-12-09 2020-01-14 Amazon Technologies, Inc. Media editing system optimized for distributed computing systems

Also Published As

Publication number Publication date
US6211869B1 (en) 2001-04-03

Similar Documents

Publication Publication Date Title
US6211869B1 (en) Simultaneous storage and network transmission of multimedia data with video host that requests stored data according to response time from a server
EP0895623B1 (en) A multimedia system with improved data management mechanisms
US5852435A (en) Digital multimedia editing and data management system
US5832171A (en) System for creating video of an event with a synchronized transcript
JP3726957B2 (en) Method and system for specifying selection of content segments stored in different formats
US8972862B2 (en) Method and system for providing remote digital media ingest with centralized editorial control
US7769270B2 (en) Editing system and control method thereof
JP5112287B2 (en) Method and system for providing distributed editing and storage of digital media over a network
US8977108B2 (en) Digital media asset management system and method for supporting multiple users
US6134378A (en) Video signal processing device that facilitates editing by producing control information from detected video signal information
US7424202B2 (en) Editing system and control method using a readout request
US6870887B2 (en) Method and system for synchronization between different content encoding formats
US9210482B2 (en) Method and system for providing a personal video recorder utilizing network-based digital media content
GB2312081A (en) Text-based video editing system
JP2002354423A (en) Method for accommodating contents
JP4278189B2 (en) Digital multimedia editing and data management system
US20060168521A1 (en) Edition device and method
JPH10285534A (en) Video signal processor
WO2004088990A2 (en) Media storage control
AU3785400A (en) A multimedia production system
JP2004015436A (en) Program, record medium, methodology, and instrument for video image content creation
JP2004015437A (en) Program, recording medium, method, and apparatus for recording and reproducing video / audio data
JP2003169297A (en) Data processing apparatus
JP2001136479A (en) Program production transmitter
Chadwick et al. Using a high-performance file system in video production

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION