WO2010017217A2 - Methods, equipment and systems utilizing pre-stored picture data to represent missing video data - Google Patents

Methods, equipment and systems utilizing pre-stored picture data to represent missing video data

Info

Publication number
WO2010017217A2
WO2010017217A2 (PCT/US2009/052734)
Authority
WO
WIPO (PCT)
Prior art keywords
video data
video
network media
data
media appliance
Prior art date
Application number
PCT/US2009/052734
Other languages
French (fr)
Other versions
WO2010017217A3 (en
Inventor
Brian Scott Kersey
William M. Leblanc
Original Assignee
Adtech-Gesi, Llc
Priority date
Filing date
Publication date
Application filed by Adtech-Gesi, Llc filed Critical Adtech-Gesi, Llc
Publication of WO2010017217A2 publication Critical patent/WO2010017217A2/en
Publication of WO2010017217A3 publication Critical patent/WO2010017217A3/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N 21/44016 Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for substituting a video clip
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/234 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N 21/23424 Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for inserting or substituting an advertisement
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/81 Monomedia components thereof
    • H04N 21/8146 Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
    • H04N 21/8153 Monomedia components thereof involving graphical data comprising still images, e.g. texture, background image


Abstract

Methods, equipment and systems are configured to send a pre-stored, user-definable picture for display to an end user to visually represent the loss of video data in a sequence of video data images.

Description

Attorney Docket No.: 141794-Steel Customer No: 87343
PATENT COOPERATION TREATY
TITLE
METHODS, EQUIPMENT AND SYSTEMS UTILIZING PRE-STORED PICTURE DATA
TO REPRESENT MISSING VIDEO DATA
INVENTORS
BRIAN SCOTT KERSEY; WILLIAM M. LEBLANC
PRIORITY CLAIM
[001] This patent application claims priority to U.S. Provisional Patent Application No. 61/086,009, filed August 4, 2008, the disclosure of which is incorporated herein by reference in its entirety.

FIELD
[002] The present application relates to video image data processing methods and equipment and, more particularly, to providing solutions addressing issues stemming from lost video data.

BACKGROUND

[003] Conventional hardware encoders and associated equipment use a frame grabber to convert analog video into an encoded picture. However, this process consumes a substantial amount of a system's processing capabilities and is susceptible to problems related to lost video sources.
SUMMARY
[004] The present disclosure describes several exemplary embodiments of the present invention.
[005] In accordance with at least one embodiment, methods, equipment and systems are configured to continually send a pre-stored, user-definable picture for display to an end user to visually represent the loss of video data in a sequence of video data images.

BRIEF DESCRIPTION OF THE DRAWINGS
[006] Various aspects of the present disclosure are described herein below with reference to the accompanying figures.
[007] FIG. 1 is a block diagram illustrating an exemplary operating environment for performing the disclosed method and including, among other equipment, one or more system components configured to implement the disclosed method so as to operate as the disclosed system.

[008] FIG. 2 is a block diagram illustrating an example of a network media appliance in combination with other pieces of equipment utilized in, or interacting with, a system utilizing pre-stored picture data to represent missing video data in accordance with at least one embodiment of the system.
[009] FIG. 3 illustrates a method of utilizing pre-stored picture data to represent missing video data in accordance with at least one embodiment of the system.

DETAILED DESCRIPTION
[0010] Before the present methods, equipment and systems are disclosed and described, it is to be understood that the methods, equipment and systems are not limited to specific synthetic methods, specific components, or to particular compositions, as such may, of course, vary. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting.
[0011] The present methods, equipment and systems may be understood more readily by reference to the following detailed description of embodiments and to the Figures and their previous and following description.

[0012] Conventional hardware encoders and associated equipment use a frame grabber to convert analog video into an encoded picture. However, this process consumes a substantial amount of a system's processing capabilities.
[0013] Moreover, when an encoder or other video equipment loses the video source (e.g., through connectivity problems, problems with access to a video data storage device, intentional removal of video for privacy or legal reasons, etc.), the frame grabber continues to encode pictures, and the encoded picture is black because there is no longer a source providing the video to be encoded. As a result, in conventional hardware encoders that use frame grabbers, this black picture is sent to a storage device, consuming both I/O bandwidth to the storage device and storage media to actually store the picture. Additionally, during playback the black pictures must be read back from the storage media, requiring more I/O bandwidth from the storage device.
[0014] Other encoders recognize the loss of video source and may encode a lost video image on the fly for inclusion into the outbound digital stream, repeatedly send the last sampled picture again or stop sending pictures in the outbound stream altogether.
[0015] When a lost video image is encoded on the fly, the encoding requires a process to transcode a picture into the established resolution and format using the agreed-upon encoding parameters. As a result, such process requirements continue to tax encoding and bandwidth resources. Moreover, the video image is predefined and is not user definable; therefore, it may be difficult for an end user to determine whether he is viewing such a lost video image or video image data that has not been lost.
[0016] When an encoder repeatedly sends the last sampled picture again, there is no visual indication or added information to the end user of the video samples to indicate that video data has been lost; moreover, this approach continues to tax bandwidth resources without providing any real benefit.
[0017] When an encoder simply stops sending pictures in an outbound stream altogether, the approach relieves the waste of resources resulting from other conventional approaches; however, again, there is no real visual information in the stream.
[0018] Conventional devices that do not include encoding functionality are limited to simply encoding a lost video image on the fly or repeatedly sending a last sampled picture again for their in-line handling of video loss. As a result, many conventional devices are forced to adopt non-standard, non-inline event notification methods.
[0019] To the contrary, in accordance with at least one embodiment, the disclosed methods, equipment and systems can continually send a user-definable, pre-stored encoded picture to an end user to visually represent the loss of video data in a sequence of video data images. The encoded picture can be sent once a second for the duration of the gap in recorded video data.
[0020] When video data from a video source, e.g., a video camera or the like, is being recorded to a storage medium, no meaningful video data can be stored to the storage medium during the time that the video source is lost. As explained above, conventional systems would typically record a blank or black screen or may display the last video image generated by the video source.
[0021] By transmitting a pre-stored, user-definable encoded picture to indicate that the video source is unavailable during this loss of the video source (and corresponding video data), equipment may be configured to stop transmission, recordation or other processing of such non-meaningful video data. This saves substantially on processing consumption, on I/O to a storage device and on the amount of storage media required during the time that the video source/data is unavailable.
[0022] Additionally, in accordance with at least one embodiment of the invention, during playback of stored video image data recorded with this feature, the pre-stored replacement picture may be pulled from RAM and inserted into the data stream. Pulling this picture from RAM rather than from the storage media storing the video data itself may further save substantially on I/O to the storage device during the time that the video source/data was unavailable.
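By way of a rough, hypothetical illustration (the figures here are assumptions chosen for arithmetic only, not values from this disclosure): if an encoder emits, say, 10 KB black pictures at 30 pictures per second, recording a one-hour outage consumes about 10 KB x 30/s x 3600 s = 1.08 GB of storage plus the corresponding write bandwidth, and playback later re-reads that same volume. Substituting a pre-stored picture sent once per second from RAM touches the storage path not at all and moves only about 10 KB/s on the network.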
[0023] FIG. 2 illustrates an example of a network media appliance in combination with other pieces of equipment utilized in, or interacting with, a system utilizing pre-stored picture data to represent missing video data. As illustrated in FIG. 2, a network media appliance 201 is coupled to one or more video sources 202a, b, and c via, for example, one or more communication and/or control networks 203 (which may or may not include the Internet). Such video sources 202a, b, c may be, for example, video cameras, encoders of video data, video storage devices, etc. The network media appliance 201 is also coupled to one or more storage media 204 (either directly or through the one or more networks 203) that are configured to store video data for playback and/or data associated with and representing one or more user-defined pictures that are used to represent missing video data in video data sequences received from the video sources 202a, b, c.

[0024] The network media appliance 201 is further coupled (either directly or via the one or more networks 203) to one or more end user devices 205. Such user devices 205 may be, for example, one or more analog monitors configured to enable viewing of video data transmitted from the network media appliance 201, one or more computer applications running on one or more general or specific purpose computers to perform
remote monitoring functions, one or more general or specific purpose computers running one or more analytic programs configured to analyze, store, compress and/or otherwise process the video data transmitted from the network media appliance 201, and/or one or more pieces of command and control equipment implemented on one or more general or specific purpose computers and configured to support command and control functionality performed on the basis of the video data transmitted from the network media appliance 201.
[0025] In accordance with at least one embodiment of the invention, the network media appliance 201 may be configured in a number of alternative implementations to provide one or more optional functions. For example, although the methods, equipment and systems provided do not require the network media appliance to encode a picture, the inclusion of such functionality in the network media appliance 201 may enable a further reduction in the amount of processing required by a Central Processing Unit (CPU) used to implement some or all of the disclosed functionality.

[0026] However, it should also be understood that, in accordance with at least one embodiment, the network media appliance 201 may be what is construed as a "transport layer"-oriented device in that it does not encode or decode video data.
[0027] Further, in accordance with at least one embodiment, the methods, equipment and systems provided do not require the network media appliance 201 to store the pre-stored, user-definable picture to the storage media 204, to a storage medium included in the network media appliance 201 itself, and/or elsewhere; however, the inclusion of such functionality in the network media appliance 201 may enable a further reduction in the amount of storage capacity needed to store a particular sequence of video data with one or more gaps in video image data.

[0028] Additionally, in accordance with at least one embodiment, the methods, equipment and systems provided do not require the network media appliance 201 to read the pre-stored pictures from the storage media 204 or from a storage medium included in the network media appliance 201; however, inclusion of such functionality in the network media appliance 201 may require less I/O bandwidth from the storage media than in conventional video data processing methodologies.
[0029] It should be understood that the methods, equipment and systems provided in accordance with at least one embodiment of the invention can use a pre-stored encoded picture for purposes of presenting a currently unavailable video source (e.g., encoder) when attempting to display live video. Further, in accordance with at least one
embodiment of the invention, the methods, equipment and systems provided can use a pre-stored encoded picture for purposes of presenting gaps in time for recorded video during video playback.
[0030] It should be understood that, for the purposes of this disclosure, the term "picture" refers to a discrete, two-dimensional sample of pixels. Thus, the term "picture" should be construed with this characterization in mind as well as being consistent with the relevant Moving Picture Experts Group (MPEG) standards referred to in this disclosure.
[0031] The user-definable picture may be defined or selected by a user or user organization that is utilizing the disclosed methods, equipment and/or systems. As such, the identity and characteristics of the picture may differ from user to user. However, it should be appreciated that increased utility may be provided by utilizing a picture that most effectively indicates when video stream data is not available. When a user (or user device or user application) is able to effectively determine that video stream data has been lost, is unavailable, or is frozen and/or hung up, it can most effectively be determined whether diagnostics need to be performed on the equipment and/or system, whether the video stream data may not be relied upon for certain purposes, and/or whether various maintenance operations or investigations need to be performed.
[0032] It should be appreciated that a replacement picture may be acquired by the network media appliance for a specified resolution and encoding profile. This may involve the acquisition of a single replacement picture (e.g., in the case of intra-coded picture data) or more than one (e.g., in the case of predictive picture data).
[0033] It should be understood that data for the pre-stored, user-definable picture can be stored for each resolution and each encoder type. That data can be stored by requesting a video stream from an encoder that is supplying a black video. The picture data from this stream can be captured and converted to an array and stored in a header file. For example, resolutions for MJPEG, MPEG-2 and MPEG-4 can be stored. H.264 and other encoders can also be used. The picture data can be stored locally in an array in RAM for quick access. By way of example, one or more of the formats and sizes (and the like) for the visual representation, graphic or picture illustrated in Tables 1-6 can be supported.
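A minimal sketch in C of the storage scheme described above follows. The header file name, the enum, the array contents and the lookup helper are hypothetical illustrations rather than anything specified by this disclosure; a real generated header would carry the actual bitstream bytes captured from the black stream for every supported codec and resolution.

    /* replacement_pictures.h -- hypothetical generated header holding
     * pre-encoded "video lost" pictures, one per (codec, resolution)
     * pair, kept resident in RAM for quick access. */
    #include <stddef.h>
    #include <stdint.h>

    typedef enum { CODEC_MJPEG, CODEC_MPEG2, CODEC_MPEG4, CODEC_H264 } codec_t;

    typedef struct {
        codec_t        codec;
        uint16_t       width, height;
        const uint8_t *data;   /* pre-encoded picture bytes */
        size_t         len;
    } replacement_picture_t;

    /* Placeholder payload; a generator would dump the bytes captured
     * from an encoder supplying black video into arrays like this. */
    static const uint8_t pic_mpeg4_352x240[] = { 0x00 /* ... captured bytes ... */ };

    static const replacement_picture_t k_pictures[] = {
        { CODEC_MPEG4, 352, 240, pic_mpeg4_352x240, sizeof pic_mpeg4_352x240 },
        /* ... one entry per supported codec/resolution from Tables 1-6 ... */
    };

    /* Linear scan is adequate: the table has at most a few dozen entries. */
    static const replacement_picture_t *
    find_replacement(codec_t codec, uint16_t w, uint16_t h)
    {
        for (size_t i = 0; i < sizeof k_pictures / sizeof k_pictures[0]; i++)
            if (k_pictures[i].codec == codec &&
                k_pictures[i].width == w && k_pictures[i].height == h)
                return &k_pictures[i];
        return NULL;  /* no replacement stored for this profile */
    }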
TABLE 1 (MJPEG): 160x120, 240x180, 320x240, 480x360, 640x480, 704x480 (4CIF), 704x480 (2CIFE), 704x240 (2CIF), 352x240 (CIF), 176x120 (QCIF)

TABLE 2 (MPEG-4): 160x120, 240x180, 320x240, 480x360, 640x480, 704x480 (4CIF), 704x480 (2CIFE), 704x240 (2CIF), 352x240 (CIF), 176x120 (QCIF)

TABLE 3 (H.264): 704x480 (4CIF), 704x480 (2CIFE), 704x240 (2CIF), 352x240 (CIF), 176x120 (QCIF), 160x120, 240x180, 320x240, 480x360, 640x480

TABLE 4 (MJPEG): 800x600, 1024x768, 1280x720, 1280x960, 1280x1024, 1600x1200, 2048x1536, 2560x1920, 704x576 (4CIF), 720x576 (4CIF), 720x288 (2CIF), 352x288 (CIF), 176x144 (QCIF)

TABLE 5 (MPEG-4): 1280x960, 1280x1024, 1600x1200, 720x576 (4CIF), 720x480 (D1), 704x576 (4CIF), 720x288 (2CIF), 352x576 (Half D1), 352x288 (CIF), 176x144 (QCIF)

TABLE 6 (H.264): 1280x960, 1280x1024, 1600x1200, 720x576 (4CIF), 704x576 (4CIF), 720x288 (2CIF), 352x288 (CIF), 176x144 (QCIF)
[0034] In accordance with at least one embodiment, the network media appliance may be configured to manage video switching and storage tasks to enable simultaneous live viewing, recording and playback of hundreds of concurrent video and audio streams.
[0035] In accordance with at least one embodiment, the network media appliance may be configured to control distribution of video data by being implemented as part of a communication and/or control network (which may be, for example, a LAN or WAN, and/or coupled to one or more other communication networks of various different types, including those networks that are collectively referred to as the Internet). Accordingly, the network media appliance may be configured to actively manage resources rather than simply flooding video streams into the network infrastructure, as in conventional Network Video Recorder (NVR), Digital Video Recorder (DVR) and Personal Computer (PC) solutions.

[0036] In accordance with at least one embodiment, the network media appliance may be configured to manage, replicate, switch and change the quality of video data, based on situational awareness, making live and stored streams transparent and immediately available.
[0037] In accordance with at least one embodiment, the network media appliance may utilize a purpose-built operating system configured for packet switching video data. Such data may be provided in security and surveillance networks (for example, Closed Circuit Television (CCTV) systems). In such networks, it should be understood that the
amount of video data to be recorded is quite large, and storage and archiving activities require ongoing purging to ensure the ongoing storage of significant data (e.g., recent data such as data captured in the last thirty days, or tagged data such as data identified based on one or more criteria as being actually or potentially significant).

[0038] In accordance with at least one embodiment, some or all of the hardware and software used to implement the network media appliance may be configured such that multiple network media appliances may be coupled together to provide increased processing capability and/or functional redundancy. In such an implementation, network media appliance hardware may be provided in a single component sized and shaped to be accommodated in a compact rack space, allowing for incorporation of multiple appliance units to provide unlimited network size and capabilities.
[0039] As explained above, in accordance with at least one embodiment, a replacement picture can be transmitted as an indication that video data that would normally be sent in the video data sequence is not available. Such a replacement picture can be sent, for example, at a rate of once per second until such time as the video data becomes available, for example, because a corresponding video feed is restored, an end user jumps or otherwise triggers movement to a time when video data from a disk or other data storage medium is available, the end user stops or deletes the playback, etc.
[0040] However, it should be appreciated that various picture/frame rates may be utilized. Thus, for example, for progressive video data, the picture may be sent once per frame, whereas, for interlaced video, multiple pictures may be sent per frame.
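As a concrete illustration of [0040] (the frame rates here are assumptions, not values from this disclosure): for 30 frames-per-second progressive video, one picture per frame amounts to 30 pictures per second, whereas for 30 frames-per-second interlaced video, where each frame comprises two fields, one picture per field amounts to 60 pictures per second.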
[0041] Moreover, a replacement picture can be sent when a playback is requested for a time for which there is no data available from disk or other data storage medium. Thus, the replacement picture can be sent, for example, at a rate of once per second until such time as the video data becomes available. Such a replacement picture may be sent, for example, until the video is available from disk, the user jumps or otherwise triggers movement to a time when video from disk is available, the end user jumps the video to live and video is available from a video source (e.g., a storage medium, an encoder, a video camera or other video data acquisition device, etc.), the end user stops or deletes the playback, etc.
[0042] Further, a replacement picture can be transmitted when a playback from a disk or other storage medium reaches the end of a file including video data. Thus, the replacement picture can be transmitted as an indication that no additional video data is available. Such a replacement picture can be sent, for example, at a rate of once per
second until such time as additional video data is available from a disk or other data storage medium, the user jumps or otherwise triggers movement to a time when video data from disk is available, the user jumps the video to live and video is available from the encoder, the user stops or deletes the playback, etc. It should be understood that replacement pictures can also be sent in the same manner when a playback from a disk or other recording mechanism reaches the beginning of a file.
[0043] FIG. 3 illustrates a method of utilizing pre-stored picture data to represent missing video data in accordance with at least one embodiment of the system. As shown in FIG. 3, the method operations begin at 301 and control proceeds to 302, at which output (e.g., playback) of a video stream (live or recorded) is initiated with a specified resolution and encoding profile. Control then proceeds to 303, at which a next sample time t[i] is set; this may be set based, at least in part, on an end user's playback control options such as pause, jump, fast-forward, rewind, step, slow motion, etc. Control then proceeds to 304, where a determination is made whether there is a picture (e.g., video data from a designated video source) for the specified sample time t[i]. If there is no picture available, control proceeds to 305, at which a pre-stored, user-definable picture is retrieved for the specified resolution and encoding profile for the video stream. Control then proceeds to 306.
[0044] If there is a picture available, control proceeds directly to 306, at which the picture identified in either 304 or 305 is packaged for transport. This may be done by a simple modification to header fields. Control then proceeds to 307, at which the picture is sent to the transport layer. Control then returns to 303 if additional sample times are to be processed; otherwise control proceeds to 308, at which method operations end.
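The FIG. 3 flow can be sketched in C as follows. Every type and helper below (next_sample_time, fetch_picture, and so on) is a hypothetical stand-in for whichever internal interfaces an implementation actually exposes; the numbers in the comments map the code back to the reference numerals of FIG. 3.

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    typedef struct { const uint8_t *data; size_t len; } picture_t;

    /* Assumed helpers, not defined by the disclosure: */
    bool      next_sample_time(double *t);             /* 303: honors pause, jump, etc. */
    bool      fetch_picture(double t, picture_t *pic); /* 304: from live source or disk */
    picture_t get_replacement(void);                   /* 305: from the table in RAM    */
    void      package_for_transport(picture_t *pic);   /* 306: adjust header fields     */
    void      send_to_transport(const picture_t *pic); /* 307: hand off to transport    */

    void playback_loop(void)                           /* 301: begin */
    {
        double t;
        while (next_sample_time(&t)) {                 /* 303 */
            picture_t pic;
            if (!fetch_picture(t, &pic))               /* 304: no picture for t[i]? */
                pic = get_replacement();               /* 305 */
            package_for_transport(&pic);               /* 306 */
            send_to_transport(&pic);                   /* 307 */
        }
    }                                                  /* 308: end */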
[0045] Video data playbacks that support replacement pictures may be sent, for example, via one or more transport protocols, including Hypertext Transfer Protocol (HTTP), Real-time Transport Protocol (RTP), User Datagram Protocol (UDP), the MPEG-2 transport stream protocol, etc.
[0046] When the replacement picture is sent via RTP, for example, various pieces of information can be maintained and accurately updated, including the Synchronization Source (SSRC), the RTP timestamp and the RTP sequence number.
[0047] The SSRC is the source of a stream of RTP packets, identified by a 32-bit numeric SSRC identifier carried in the RTP header so as not to be dependent upon the network address. The SSRC value used for normal playback of video can be re-used when sending replacement pictures during the same RTP session.
[0048] The RTP timestamp reflects the sampling instant of the first octet in the RTP data packet. The sampling instant can be derived from a clock that increments monotonically and linearly in time to allow synchronization and jitter calculations.

[0049] The RTP sequence number can increment by one for each RTP data packet sent, and can be used by the receiver to detect packet loss and to restore packet sequence. When switching back and forth between normal playback of video and inserting replacement pictures, the RTP sequence number can be maintained and incremented appropriately.
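A minimal sketch, in C, of how those three fields might be carried across the switch between recorded video and replacement pictures. The struct layout and helper are illustrative assumptions; the sketch further assumes one packet per picture and the conventional 90 kHz RTP media clock for video.

    #include <stdint.h>

    #define RTP_VIDEO_CLOCK_HZ 90000u  /* conventional RTP clock rate for video */

    typedef struct {
        uint32_t ssrc;       /* fixed for the whole RTP session            */
        uint32_t timestamp;  /* advances with the sampling instant         */
        uint16_t seq;        /* +1 per packet; peer uses it for loss/order */
    } rtp_state_t;

    /* Advance the state for one outgoing picture, whether it came from
     * normal playback or from the pre-stored replacement table.
     * elapsed_s is the time since the previous picture's sampling
     * instant (1.0 when replacement pictures go out once per second). */
    static void rtp_next_packet(rtp_state_t *s, double elapsed_s)
    {
        s->timestamp += (uint32_t)(elapsed_s * RTP_VIDEO_CLOCK_HZ);
        s->seq += 1;  /* wraps modulo 2^16, as RTP expects */
    }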
[0050] The processing of the disclosed methods, equipment and systems can be performed by software components. The disclosed system and method can be described in the general context of computer-executable instructions, such as program modules, being executed by one or more computers or other devices. Generally, program modules comprise computer code, routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The disclosed method can also be practiced in grid-based and distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules can be located in both local and remote computer storage media, including memory storage devices.

[0051] FIG. 1 is a block diagram illustrating an exemplary operating environment for performing one method according to the present invention. This exemplary operating environment is only an example of an operating environment and is not intended to suggest any limitation as to the scope of use or functionality of operating environment architecture. Neither should the operating environment be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment.
[0052] The present methods, equipment and systems can be operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that can be suitable for use with the system and method comprise, but are not limited to, personal computers, server computers, laptop devices, and multiprocessor systems. Additional examples comprise set top boxes, programmable consumer
electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that comprise any of the above systems or devices, and the like.
[0053] It should be understood by one of ordinary skill in the art that one or more of the components illustrated in FIG. 2 may be implemented using one or more of the components illustrated in FIG. 1. More specifically, it should be understood that the network media appliance 201 illustrated in FIG. 2 may be implemented in whole or in part using the computer 101 illustrated in FIG. 1 and/or any of the components included in computer 101.
[0054] Further, one skilled in the art will appreciate that the system and method disclosed herein can be implemented via a general-purpose computing device in the form of a computer 101. The components of the computer 101 can comprise, but are not limited to, one or more processors or processing units 103, a system memory 112, and a system bus 113 that couples various system components including the processor 103 to the system memory 112. In the case of multiple processing units 103, the system can utilize parallel computing.
[0055] The system bus 113 represents one or more of several possible types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures can comprise an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, an Accelerated Graphics Port (AGP) bus, a Peripheral Component Interconnect (PCI) bus, a PCI Express bus, a Personal Computer Memory Card International Association (PCMCIA) bus, a Universal Serial Bus (USB), and the like. The bus 113, and all buses specified in this description, can also be implemented over a wired or wireless network connection, and each of the subsystems, including the processor 103, a mass storage device 104, an operating system 105, software 106, data 107, a network adapter 108, system memory 112, an Input/Output Interface 110, a display adapter 109, a display device 111, and a human machine interface 102, can be contained within one or more remote computing devices 114a, b, c at physically separate locations, connected through buses of this form, in effect implementing a fully distributed system.
[0056] The computer 101 typically comprises a variety of computer readable media. Exemplary readable media can be any available media that is accessible by the computer 101 and comprises, for example and not meant to be limiting, both volatile and non-volatile media, removable and non-removable media. The system memory 112 comprises computer readable media in the form of volatile memory, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read Only Memory (ROM). The system memory 112 typically contains data such as data 107 and/or program modules such as operating system 105 and software 106 that are immediately accessible to and/or are presently operated on by the processing unit 103.
[0057] In another aspect, the computer 101 can also comprise other removable/non-removable, volatile/non-volatile computer storage media. By way of example, FIG. 1 illustrates a mass storage device 104 which can provide non-volatile storage of computer code, computer readable instructions, data structures, program modules, and other data for the computer 101. For example and not meant to be limiting, a mass storage device 104 can be a hard disk, a removable magnetic disk, a removable optical disk, magnetic cassettes or other magnetic storage devices, flash memory cards, CD-ROM, Digital Versatile Disks (DVD) or other optical storage, RAM, ROM, Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
[0058] Optionally, any number of program modules can be stored on the mass storage device 104, including by way of example, an operating system 105 and software 106. Each of the operating system 105 and software 106 (or some combination thereof) can comprise elements of the programming and the software 106. Data 107 can also be stored on the mass storage device 104. Data 107 can be stored in any of one or more databases known in the art. Examples of such databases comprise DB2®, Microsoft® Access, Microsoft® SQL Server, Oracle®, mySQL, PostgreSQL, and the like. The databases can be centralized or distributed across multiple systems.
[0059] In another aspect, the user can enter commands and information into the computer 101 via an input device (not shown). Examples of such input devices comprise, but are not limited to, a keyboard, a pointing device (e.g., a "mouse"), a microphone, a joystick, a scanner, tactile input devices such as gloves and other body coverings, and the like. These and other input devices can be connected to the processing unit 103 via a human machine interface 102 that is coupled to the system bus 113, but can be connected by other interface and bus structures, such as a parallel port, a game port, an IEEE 1394 Port (also known as a Firewire port), a serial port, or a USB.
[0060] In yet another aspect, a display device 111 can also be connected to the system bus 113 via an interface, such as a display adapter 109. It is contemplated that the computer 101 can have more than one display adapter 109 and the computer 101 can have more than one display device 111. For example, a display device can be a monitor, a Liquid Crystal Display (LCD), or a projector. In addition to the display device 111, other output peripheral devices can comprise components such as speakers (not shown) and a printer (not shown) which can be connected to the computer 101 via Input/Output Interface 110. Any step and/or result of the methods can be output in any form to an output device. Such output can be any form of visual representation, including, but not limited to, textual, graphical, animation, audio, tactile, and the like.
[0061] The computer 101 can operate in a networked environment using logical connections to one or more remote computing devices 114a, b, and c. By way of example, a remote computing device can be a personal computer, portable computer, a server, a router, a network computer, a peer device or other common network node, and so on. Logical connections between the computer 101 and a remote computing device 114a, b, c can be made via a Local Area Network (LAN) and a general Wide Area Network (WAN). Such network connections can be through a network adapter 108. A network adapter 108 can be implemented in both wired and wireless environments. Such networking environments are conventional and commonplace in offices, enterprise-wide computer networks, intranets, and the Internet 115.
[0062] For purposes of illustration, application programs and other executable program components such as the operating system 105 are illustrated herein as discrete blocks, although it is recognized that such programs and components reside at various times in different storage components of the computing device 101, and are executed by the data processor(s) of the computer. An implementation of software 106 can be stored on or transmitted across some form of computer readable media. Any of the disclosed methods can be performed by computer readable instructions embodied on computer readable media. Computer readable media can be any available media that can be accessed by a computer. By way of example and not meant to be limiting, computer readable media can comprise "computer storage media" and "communications media." "Computer storage media" comprise volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules, or other data. Exemplary computer storage media comprises, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, DVD or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer.
[0063] The methods, equipment and systems can employ Artificial Intelligence techniques such as machine learning and iterative learning. Examples of such techniques include, but are not limited to, expert systems, case based reasoning, Bayesian networks, behavior based AI, neural networks, fuzzy systems, evolutionary computation (e.g., genetic algorithms), swarm intelligence (e.g., ant algorithms), and hybrid intelligent systems (e.g., expert inference rules generated through a neural network or production rules from statistical learning).

[0064] While the methods, equipment and systems have been described in connection with specific embodiments, it is not intended that the scope be limited to the particular embodiments set forth, as the embodiments herein are intended in all respects to be illustrative rather than restrictive.
[0065] Unless otherwise expressly stated, it is in no way intended that any method set forth herein be construed as requiring that its steps be performed in a specific order. Accordingly, where a method claim does not actually recite an order to be followed by its steps or it is not otherwise specifically stated in the claims or descriptions that the steps are to be limited to a specific order, it is in no way intended that an order be inferred, in any respect. This holds for any possible non-express basis for interpretation, including: matters of logic with respect to arrangement of steps or operational flow; plain meaning derived from grammatical organization or punctuation; and the number or type of embodiments described in the specification.
[0066] As used in the specification and the appended claims, the singular forms "a," "an" and "the" include plural referents unless the context clearly dictates otherwise. Ranges may be expressed herein as from "about" one particular value, and/or to "about" another particular value. When such a range is expressed, another embodiment includes from the one particular value and/or to the other particular value. Similarly, when values are expressed as approximations, by use of the antecedent "about," it will be understood that the particular value forms another embodiment. It will be further understood that the endpoints of each of the ranges are significant both in relation to the other endpoint, and independently of the other endpoint.
[0067] "Optional" or "optionally" means that the subsequently described event or circumstance may or may not occur, and that the description includes instances where said event or circumstance occurs and instances where it does not.
[0068] Throughout the description and claims of this specification, the word "comprise" and variations of the word, such as "comprising" and "comprises," mean "including but not limited to," and are not intended to exclude, for example, other additives, components, integers or steps. "Exemplary" means "an example of" and is not intended to convey an indication of a preferred or ideal embodiment. "Such as" is not used in a restrictive sense, but for explanatory purposes.
[0069] Disclosed are components that can be used to perform the disclosed methods, equipment and systems. These and other components are disclosed herein, and it is understood that, when combinations, subsets, interactions, groups, etc. of these components are disclosed, while specific reference to each of the various individual and collective combinations and permutations of these may not be explicitly disclosed, each is specifically contemplated and described herein for all methods, equipment and systems. This applies to all aspects of this application including, but not limited to, steps in disclosed methods. Thus, if there are a variety of additional steps that can be performed, it is understood that each of these additional steps can be performed with any specific embodiment or combination of embodiments of the disclosed methods.
[0070] It will be apparent to those skilled in the art that various modifications and variations can be made without departing from the scope or spirit. Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit being indicated by the following inventive concepts.

Claims

What is claimed is:
1. A network media appliance configured to utilize pre-stored picture data to represent missing video data in video data received from at least one video source, the network media appliance configured to:
a) determine whether video data is available from a video source at a specified time;
b) select a pre-stored replacement picture with a resolution and encoding profile that matches that of the video source if video data is unavailable from the video source at the specified time;
c) package the video data from the video source at the specified time if the video data is available and, alternatively, package the selected, pre-stored picture data if the video data is unavailable; and
d) send the packaged data to a transport layer.
2. The network media appliance of Claim 1, wherein the video source is a video data recording medium.
3. The network media appliance of Claim 1, wherein the video source is a video camera.
4. The network media appliance of Claim 1, wherein the video source is a video data encoder.
5. The network media appliance of Claim 1, wherein the pre-stored replacement picture is a discrete, two-dimensional sample of pixels.
6. The network media appliance of Claim 1, wherein the pre-stored replacement picture is user-defined.
7. The network media appliance of Claim 1, wherein the pre-stored replacement picture is stored in a storage medium that is included in or accessible by the network media appliance.
8. The network media appliance of Claim 1, wherein the format of the replacement picture is selected from one of: MJPEG, MPEG-2, MPEG-4 and H.264.
9. The network media appliance of Claim 1, wherein the network media appliance is configured to transmit the selected picture to an end user device at a rate of once per second until such a time that the video data becomes available.
10. The network media appliance of Claim 9, wherein the video data is available when connectivity between the network media appliance and the video source is restored.
11. The network media appliance of Claim 9, wherein the video data is available when a user of an end user device coupled to the network media appliance issues a command to alter the specified time to a time when video data is available from the video source.
12. The network media appliance of Claim 11, wherein the video source is a video storage medium.
13. The network media appliance of Claim 1, wherein the pre-stored replacement picture is transmitted to the transport layer via a transport protocol selected from the group consisting of: Hypertext Transfer Protocol, Real-time Transport Protocol, User Datagram Protocol, and MPEG-2 transport protocol.
14. A method of utilizing pre-stored picture data to represent missing video data in video data received from at least one video source, the method comprising:
a) determining whether video data is available from a video source at a specified time;
b) selecting a pre-stored replacement picture with a resolution and encoding profile that matches that of the video source if video data is unavailable from the video source at the specified time;
c) packaging the video data from the video source at the specified time if the video data is available and, alternatively, packaging the selected, pre-stored picture data if the video data is unavailable; and
d) sending the packaged data to a transport layer.
15. The method of Claim 14, wherein the video source is one of a video data recording medium, a video camera, and a video data encoder.
16. The method of Claim 14, wherein the pre-stored replacement picture is a discrete, two-dimensional sample of pixels.
17. The method of Claim 14, wherein the format of the replacement picture is selected from one of: MJPEG, MPEG-2, MPEG-4 and H.264.
18. The method of Claim 14, further comprising transmitting the selected picture to an end user device at a rate of once per second until such a time that the video data becomes available.
19. The method of Claim 18, wherein the video data is available when one of the following occurs: connectivity between the network media appliance and the video source is restored; and an end user device coupled to the network media appliance issues a command to alter the specified time to a time when video data is available from the video source.
20. A remote monitoring system configured to support remote monitoring and recording of video data, the system comprising the network media appliance of Claim 1 and at least one end user device configured to record, analyze and/or output the video data transmitted from the network media appliance.
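For readers tracing Claims 1, 9 and 14 through the recited steps (a) through (d), the following Python sketch shows one possible control loop under stated assumptions. Every name (stream_with_replacement, source.read, store.select, package) is hypothetical and chosen only for illustration, and the packaging step is a placeholder for whichever transport protocol of Claim 13 is used; this is a sketch of the claimed behavior, not the patented implementation.

    import time

    def package(payload):
        # Placeholder for step (c): wrap the payload for the transport
        # layer, e.g. RTP, HTTP, UDP, or an MPEG-2 transport stream.
        return payload

    def stream_with_replacement(source, store, transport, period_s=1.0):
        # All parameter objects are assumed to exist with the attributes
        # used below; none of these names appear in the disclosure.
        while True:
            # Step (a): is video data available at the specified time?
            frame = source.read()  # assumed to return None when missing
            if frame is not None:
                # Steps (c)/(d): package and send the real video data.
                transport.send(package(frame))
            else:
                # Step (b): select a replacement picture whose resolution
                # and encoding profile match those of the video source.
                picture = store.select(resolution=source.resolution,
                                       profile=source.encoding_profile)
                # Steps (c)/(d), repeated about once per second per
                # Claims 9 and 18, until video data becomes available.
                transport.send(package(picture))
                time.sleep(period_s)

When connectivity is restored, or when an end user device moves the specified time to a point where video exists (Claims 10 and 11), source.read() again returns data and the loop resumes normal playback without any discontinuity visible to the transport layer.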
PCT/US2009/052734 2008-08-04 2009-08-04 Methods, equipment and systems utilizing pre-stored picture data to represent missing video data WO2010017217A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US8600908P 2008-08-04 2008-08-04
US61/086,009 2008-08-04

Publications (2)

Publication Number Publication Date
WO2010017217A2 true WO2010017217A2 (en) 2010-02-11
WO2010017217A3 WO2010017217A3 (en) 2010-04-01

Family

ID=41664171

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2009/052734 WO2010017217A2 (en) 2008-08-04 2009-08-04 Methods, equipment and systems utilizing pre-stored picture data to represent missing video data

Country Status (1)

Country Link
WO (1) WO2010017217A2 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5870521A (en) * 1992-09-30 1999-02-09 Kabushiki Kaisha Toshiba Edited coded data decoding apparatus with substitute image replacement
US6078328A (en) * 1998-06-08 2000-06-20 Digital Video Express, Lp Compressed video graphics system and methodology
US6157396A (en) * 1999-02-16 2000-12-05 Pixonics Llc System and method for using bitstream information to process images for use in digital display systems
US7116342B2 (en) * 2003-07-03 2006-10-03 Sportsmedia Technology Corporation System and method for inserting content into an image sequence

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101860532A (en) * 2010-05-07 2010-10-13 深圳市共进电子有限公司 Saving and loading method for multiprocess configuration files

Also Published As

Publication number Publication date
WO2010017217A3 (en) 2010-04-01

Similar Documents

Publication Publication Date Title
JP6753902B2 (en) Storage management of data streamed from video source devices
US9819973B2 (en) Embedded appliance for multimedia capture
CN108259934B (en) Method and apparatus for playing back recorded video
EP3100245B1 (en) Selection and display of adaptive rate streams in video security system
US20070043875A1 (en) Systems and methods for media stream processing
KR20150106351A (en) Method and system for playback of motion video
US20140196102A1 (en) Method for transmitting video signals from an application on a server over an ip network to a client device
CN112543348A (en) Remote screen recording method, device, equipment and computer readable storage medium
WO2010017217A2 (en) Methods, equipment and systems utilizing pre-stored picture data to represent missing video data
US20110161515A1 (en) Multimedia stream recording method and program product and device for implementing the same
EP3276967A1 (en) Systems and methods for adjusting the frame rate of transmitted video based on the level of motion in the video
AU2019204751B2 (en) Embedded appliance for multimedia capture
Vuppala et al. Measurement of user-related performance problems of live video streaming in the user interface
CN115604496A (en) Display device, live broadcast channel switching method and storage medium
Janson A comparison of different multimedia streaming strategies over distributed IP networks State of the art report [J]
CA2914803A1 (en) Embedded appliance for multimedia capture
AU2013254937A1 (en) Embedded Appliance for Multimedia Capture
AU2012202843A1 (en) Embedded appliance for multimedia capture

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09805455

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09805455

Country of ref document: EP

Kind code of ref document: A2