US20110126250A1 - System and method for account-based storage and playback of remotely recorded video data

System and method for account-based storage and playback of remotely recorded video data

Info

Publication number
US20110126250A1
Authority
US
United States
Prior art keywords
video data
video
streams
live video
remote sites
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/019,072
Inventor
Brian Turner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US11/819,206 (published as US20090007197A1)
Application filed by Individual
Priority to US13/019,072
Publication of US20110126250A1
Legal status: Abandoned

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/27 Server based end-user applications
    • H04N21/274 Storing end-user multimedia data in response to end-user request, e.g. network recorder
    • H04N21/2747 Remote storage of video programs received via the downstream path, e.g. from the server
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47202 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting content on demand, e.g. video on demand
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/475 End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data
    • H04N21/4751 End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data for defining user accounts, e.g. accounts for children
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/16 Analogue secrecy systems; Analogue subscription systems
    • H04N7/173 Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N7/17309 Transmission or handling of upstream communications
    • H04N7/17318 Direct or substantially direct transmission and handling of requests

Definitions

  • a method can look for blocks of pixels in the motion map that have all been flagged for motion. This block of pixels is referred to as the “bloom” because the motion detection technique blooms out in a diamond pattern around the target pixel when testing for motion.
  • the bloom level is configurable (e.g., in a configuration file or software command-line “switch”). A bloom of size 5 is shown in FIG. 6A. A bloom size of 7 (shown in FIG. 6B) is also possible, as is a bloom size of 9 (not shown) or any other assigned value.
  • the method scans the entire motion map, applying the bloom processing from top-left to bottom-right. This means that the range of pixels considered is actually smaller than the size of the bitmap, because it is impossible for a pixel in the corner of the bitmap to pass the bloom test. Therefore, to improve performance, the process only tests pixels with sufficient area surrounding them to pass the test. For example, with a size 5 bloom, and a bitmap coordinate system that begins in the top left with (0,0), the scanning would begin at (2,2). The process looks at the target pixel in the motion map. If the pixel was flagged for motion in the previous step, the bloom test is applied. If the pixel fails the bloom test (meaning not all surrounding pixels in the bloom area were also flagged in the motion map), then the pixel's flag is cleared.
  • if the pixel passes the bloom test, the entire square area surrounding the pixel is flagged in the motion map—meaning that the corner areas surrounding the diamond are flagged for motion as well, if they were not flagged already.
  • the method continues moving through the motion map left to right, top to bottom, applying this logic to each pixel.
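  • A minimal sketch of this second-pass bloom test follows, assuming the diamond pattern and the top-left to bottom-right scan order just described; the function name, the nested-list motion map, and the in-place update are illustrative, not from the patent.

```python
def bloom_pass(motion_map, bloom=5):
    """Second motion-detection pass: a flagged pixel survives only if the
    whole diamond ("bloom") around it is also flagged. Scans left-to-right,
    top-to-bottom, modifying the motion map in place as the text describes."""
    height, width = len(motion_map), len(motion_map[0])
    r = bloom // 2  # bloom size 5 -> radius 2, so scanning starts at (2, 2)
    for y in range(r, height - r):
        for x in range(r, width - r):
            if not motion_map[y][x]:
                continue
            # Diamond test: every pixel with |dx| + |dy| <= r must be flagged.
            diamond_ok = all(
                motion_map[y + dy][x + dx]
                for dy in range(-r, r + 1)
                for dx in range(-r, r + 1)
                if abs(dx) + abs(dy) <= r)
            if diamond_ok:
                # Passing pixels flag the whole surrounding square, including
                # the corner areas outside the diamond.
                for dy in range(-r, r + 1):
                    for dx in range(-r, r + 1):
                        motion_map[y + dy][x + dx] = True
            else:
                motion_map[y][x] = False  # clear the candidate flag
    return motion_map
```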
  • because the bloom test eliminates isolated false positives, the process is able to use a relatively low value for the noise threshold. This allows even small amounts of motion to be detected without triggering false positives.
  • with a low noise threshold, many pixels are flagged as candidates for motion in the first pass, but they are disregarded in the second-pass bloom test unless all of the surrounding pixels were also flagged for motion.
  • the process also can dynamically adjust the noise threshold.
  • the process attempts to maintain a noise threshold value that results in 25% of the pixels reporting motion at any given time. Even with 25% of the pixels reporting potential motion, the bloom test is able to reliably eliminate false positives.
  • the process recalibrates the value for noise threshold once every second. It does this by testing to see what percentage of the pixels were flagged during the first pass.
  • the noise threshold will be adjusted up or down based on how the current percentage compares to the target percentage. If the current percentage is greater than the target percentage, the noise threshold is increased by one, and if the current percentage is less than the target percentage, the noise threshold is decreased by one.
  • the noise threshold is never adjusted by more than one unit at a time, and the recalibration happens only once per second.
  • the process defines minimum and maximum noise threshold values (e.g., 4 and 85, respectively) and prevents the noise threshold from being adjusted beyond those values.
  • the noise recalibration feature of the process helps the process to remain accurate as conditions viewed by the video device change over time. For example, for many video devices, noise levels increase as the captured image becomes darker. This is particularly important for cameras that capture images outside or in rooms where the lights may be turned on or off.
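  • A sketch of the once-per-second recalibration under the stated rules (a target of 25% flagged pixels, one-unit adjustments, and exemplary bounds of 4 and 85); the function and constant names are illustrative.

```python
# Exemplary bounds and target percentage taken from the description above.
MIN_THRESHOLD, MAX_THRESHOLD = 4, 85
TARGET_PERCENT = 25.0  # aim for ~25% of pixels flagged in the first pass

def recalibrate(noise_threshold, flagged_pixels, total_pixels):
    """Called once per second: nudge the threshold one unit toward the value
    that keeps ~25% of pixels reporting motion, clamped to the bounds."""
    current_percent = 100.0 * flagged_pixels / total_pixels
    if current_percent > TARGET_PERCENT:
        noise_threshold += 1   # too many flags -> become less sensitive
    elif current_percent < TARGET_PERCENT:
        noise_threshold -= 1   # too few flags -> become more sensitive
    return max(MIN_THRESHOLD, min(MAX_THRESHOLD, noise_threshold))
```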
  • the process counts the number of pixels that passed the bloom test and compares this count against a threshold value, which also is configurable (e.g., 8). If the number of pixels that passed the bloom test meets or exceeds this threshold, the process concludes that motion was detected in this frame.
  • the process contains several values that can be tuned to affect its performance, and these values may be changed later (e.g., by a system administrator) to improve the process's performance. Exemplary values drawn from this description include a bloom size of 5, a bloom-test pixel-count threshold of 8, minimum and maximum noise thresholds of 4 and 85, and a target flagged-pixel percentage of 25%.
  • a first data retrieval interface comprises a World Wide Web browser that connects to a server associated with the data management facility.
  • the user authenticates itself to the server and is provided a list of available cameras from which he/she can obtain video.
  • the number of cameras and their sources that can be seen can be configured on a user-by-user and client-by-client basis. Thus, some cameras may be available to only one user while other cameras may be accessible to multiple users.
  • the user authentication can also specify what kinds of video data a user can see and what he/she can do with that video data. For example, some users may only be able to see delayed live feeds, and others may be able to control the feed (e.g., rewind, pan, zoom). Also, some users may be allowed to download stored video while others may not.
  • upon selecting one of the available cameras, the user begins to see video from that camera.
  • the browser may be supplemented with one or more active components (e.g., an ACTIVEX, JAVA, ADOBE FLASH or SILVERLIGHT user interface control, as well as JAVASCRIPT programming language instructions).
  • the video may be either the most recently received video, thereby forming a time-delayed live feed (as the video is first received from the remote source and then sent to the interface), or it may be recorded video data from earlier in the day or from a previous day.
  • Viewing video feeds using web browsers is a known technology in the area of traffic cameras in several metropolitan areas. For example, trafficland.com provides such a service for monitoring cameras in the Washington, D.C. area.
  • the interface preferably includes the ability to select video from multiple sources simultaneously.
  • the video can be displayed in a matrix of sub-windows inside a browser where the user selects which cameras of the available cameras are to be displayed.
  • the user may be able to specify the order and placement of the cameras within the matrix (e.g., in order to get different views of the same area in close proximity to each other).
  • Each sub-window operates independently of the other sub-windows in the matrix and can be used to view video from any of the cameras available to the user.
  • the same camera feed could be displayed in more than one sub-window in the matrix if desired.
  • the user could choose to view the live feed from a camera in one sub-window in the matrix while simultaneously viewing recorded video from the same camera in another sub-window, or to show a digitally zoomed-in view of a camera feed in one sub-window while simultaneously displaying the full view of the same camera feed in another sub-window.
  • the interface may provide the ability to superimpose the camera name and/or a date and time stamp on top of the video image being displayed, as shown in FIG. 2 .
  • the interface may provide the user with a method to enable or disable these features individually for each video display sub-window in the window matrix (e.g., check boxes).
  • when the date and time stamp is activated, the date and time will be automatically converted to and displayed in the user's current time zone format, even if the video feed originates from a camera located in another time zone.
  • the interface can further enable any one of the windows to rotate between sources.
  • a list of sources may be displayed upon request (e.g., by right clicking on the video window).
  • sources may be non-live video sources such as VCRs or digital video recorders (DVRs).
  • the user's interface may include, but is not limited to, a series of controls to pause the feed, reverse the feed, and return the feed to the delayed live feed.
  • a user interface that provides the ability to display multiple video feeds simultaneously using a matrix of sub-windows can allow the user to apply these feed control options to one video feed without affecting the other sub-windows displayed in the matrix (e.g., rewinding one video feed while continuing to view live video from the other streams displayed in the matrix).
  • the user interface may also include an input area for specifying a time of day for a particular day that the system should rewind to (e.g., to see the cause of an alarm).
  • the system may further include a user interface that is integrated with or runs on a cellular telephone.
  • Video may be delivered to the cellular telephone by utilizing a web browser thereon, using a combination of active languages or controls (e.g., an ACTIVEX, SILVERLIGHT, FLASH, or JAVA user interface control, as well as JAVASCRIPT programming language instructions), or through an application built to run on the cellular telephone's native application development platform.
  • the phone may either receive an actual stream or may make a series of rapid, successive requests for parts of the recorded or delayed video which grab a sufficient number of frames per second to achieve the appearance of a stream.
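  • A sketch of the successive-request approach from the phone side; the endpoint URL, the JPEG frame format, and the polling rate are hypothetical assumptions, not from the patent.

```python
import time
import urllib.request

# Hypothetical frame endpoint; the patent does not specify a URL scheme.
FRAME_URL = "https://example-dms.invalid/feeds/{feed_id}/latest.jpg"

def pseudo_stream(feed_id, display, fps=5):
    """Approximate a live stream by requesting the newest frame several
    times per second, giving the appearance of streaming on phones that
    cannot receive an actual stream."""
    interval = 1.0 / fps
    while True:
        with urllib.request.urlopen(FRAME_URL.format(feed_id=feed_id)) as resp:
            display(resp.read())   # hand the frame bytes to the phone UI
        time.sleep(interval)
```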
  • TABLET PCs and other portable computing devices are used to connect to the data management facility.
  • the user interface may be exposed by utilizing a web browser thereon, using a combination of active languages or controls (e.g. an ACTIVEX, SILVERLIGHT, FLASH, or JAVA user interface control, as well as JAVASCRIPT programming language instructions), or through an application built to run on the portable computing device's native application development platform.
  • the user interfaces may further be supplemented with user interface controls for remotely controlling the operation of the camera.
  • the cameras may be controlled to zoom in or out, pan and tilt (e.g., using buttons for those functions or gestures on touch screen interfaces supporting gestures).
  • Such control may result in actual physical movement of the camera, if supported, or may be achieved virtually.
  • a camera can appear to pan right and/or left by performing a virtual zoom (i.e., enlarging one area of an image without actually changing the focus) and then moving to the right or left of the actual image and enlarging the new virtually zoomed image.
  • Physical movement of a camera would affect the video stream provided by the camera to all users subscribed to the video feed.
  • a user could apply a virtual zoom to a camera feed, enlarging the user's view of the camera feed, without affecting the video feed displayed to other users subscribed to the video feed.
  • the interface may further provide the ability to specify a portion of a previously recorded stream that should be downloaded to the user's computer (or phone).
  • the downloaded file may be in any format (e.g., MPEG, MPEG-2, MPEG-4, WINDOWS MEDIA PLAYER).
  • Because video encoding is a time-consuming operation, the user interface may allow the user to request that a video file be created and then continue with other tasks while the video file is generated in the background by the data management server.
  • the data management server will provide the user with a notification (e.g., email or SMS text message) informing the user that the video file is available for download with the message including, for example, a link to the created video file or instructions on how to access the file.
  • Because the video data may contain sensitive information, the video data is preferably encrypted along each of the transmission links. Thus, the link from the cameras to the data storage facility would be encrypted, and likewise the link from the data storage facility to the user's playback device would be encrypted.
  • the cameras, the computer connected to the cameras, or the data storage facility may additionally provide a motion sensing service such that a user is notified of motion in an area where none is expected. For example, no motion is expected in a locked warehouse, so if motion occurs, then the user could be notified by a specified communications mechanism (e.g., by email, phone, cellular phone, pager, etc.). If supported by the notification delivery method, the system may include additional details (e.g., attached video clip of the incident, or web hyperlink to allow user to quickly connect to the server and review).
  • To support an organization (e.g., a company), various account provisioning controls may be added to the above-described system.
  • multiple accounts can be stored on one or more centrally managed computers to which an organization can be given access.
  • the video streams of a first organization can then be stored separately from the video streams of a second organization such that only those persons authorized by the first organization can get access to the video streams associated with the first organization's account.
  • only those persons authorized by the second organization can get access to the video streams associated with the second organization's account.
  • a manager of an organization may be reviewing a recorded stream while a supervisor is watching a live video; however, neither the manager nor the supervisor need be at the location where the video is being recorded from nor at the same location as each other.
  • the users can authenticate themselves to the system using known authentication methods (e.g., username/password, RSA SECURID).
  • the account of an organization may also include the threshold information that controls the amount of data that can be stored and/or played back for that account, as well as the number and/or types of actions that may be generated. For example, the number of streams that can be recorded simultaneously by an account may be limited, or the amount of video (e.g., megabytes/hour) may be limited. Similarly, the amount and length of video streams as well as the replacement policy may be controlled for an account. For example, an account may be configured to purge old video streams based on a selected replacement policy (e.g., a first-in-first-out erasure method) or based on location-specific information (e.g., erase recordings from camera A before erasing recordings from camera B).
  • the number of simultaneous viewer requests for streams associated with an account or organization may be limited.
  • the number and type of generated actions may be defined or limited (e.g., generate notification e-mails but do not activate on-premise alarm lights or sirens) based on association with a defined subscription type.
  • Such thresholds serve to limit the system's incoming bandwidth utilization, outgoing bandwidth utilization, storage space (hard drive) utilization, and business liability on an account-level. The system also need not utilize the same threshold levels for all organizations.
  • the first organization may pay a higher price for 5 incoming streams, 3 outgoing streams, 300 GB of storage space, and on-premise alarm activation while a second organization may pay a lower price for 3 incoming streams, 1 outgoing stream, 100 GB of storage space, and basic e-mail or SMS text alerting.
  • the transmitting adapters can be configured to implement a motion sensing procedure or method, switching to a mode where frames are only sent at a specified interval during times of non-activity, as described in greater detail herein.
  • the system can further be configured to programmatically throttle the adapters which transmit live video streams into the system.
  • the system may provide throttling at an account level such that an amount of data transmitted to the system by all cameras associated with an account does not exceed the threshold level for the account.
  • the system can send one or more messages to the cameras associated with the account to request that video be sent at a lower level.
  • the cameras can then be configured to return to the original level if the cause for the increased level is addressed. For example, if an account has four cameras associated with it and the cameras are each transmitting at a medium resolution, motion at a first camera may cause the need for higher resolution images from the first camera, and the first camera may begin transmitting higher resolution images (or more frequent images). However, if the combined bandwidth of the four cameras exceeds the receiving bandwidth threshold, then the other three cameras may be sent messages by the system to instruct them to transmit at a next lower resolution (e.g., low resolution being the next lower resolution from medium resolution) or less frequently (e.g., 10 frames per second (fps) instead of 15 fps).
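  • A sketch of that account-level throttling decision, assuming hypothetical per-camera rate reports and a send_message callback for reaching the cameras; the resolution step names come from the example above, but the data shapes are assumptions.

```python
RESOLUTION_STEPS = ["low", "medium", "high"]  # illustrative ordering

def enforce_account_threshold(cameras, account_limit_kbps, send_message):
    """If the combined inbound rate for an account exceeds its threshold,
    ask every camera except the busiest one (e.g., the camera seeing
    motion) to step down to the next lower resolution."""
    total = sum(cam["kbps"] for cam in cameras)
    if total <= account_limit_kbps:
        return
    busiest = max(cameras, key=lambda cam: cam["kbps"])
    for cam in cameras:
        if cam is busiest:
            continue
        step = RESOLUTION_STEPS.index(cam["resolution"])
        if step > 0:
            # e.g., "medium" steps down to "low", per the example above
            send_message(cam["id"], RESOLUTION_STEPS[step - 1])
```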
  • an organization may require various levels or types of access to video streams associated with the organization.
  • an account control system may be provided so that access lists may be associated with an account such that the permissions of various users associated with an account can be specified.
  • an account with four users (e.g., two clerks, a supervisor, and a manager) may have some users who can access the recorded or live video of some cameras that others cannot (e.g., clerk 1 can access camera A and clerk 2 can access camera B, but not vice versa) while other users (e.g., the supervisor and the manager) can access the video from all the cameras.
  • some users may have rights to perform actions on stored videos that others do not.
  • a supervisor may review recorded video from when he/she was on duty but not from other times, and the supervisor cannot delete recorded video.
  • a manager's access may be configured to allow all recorded (and live) video to be reviewed, and may even allow video to be erased.
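  • One plausible shape for such an access list, using the four-user example above; the structure and names are assumptions, and time-of-day restrictions (such as the supervisor's on-duty limit) are omitted for brevity.

```python
# Illustrative access list for the four-user account described above.
ACCESS_LIST = {
    "clerk1":     {"cameras": {"A"},      "can_delete": False},
    "clerk2":     {"cameras": {"B"},      "can_delete": False},
    "supervisor": {"cameras": {"A", "B"}, "can_delete": False},
    "manager":    {"cameras": {"A", "B"}, "can_delete": True},
}

def authorize(user, camera, action):
    """Return True if the account's access list permits this request."""
    entry = ACCESS_LIST.get(user)
    if entry is None or camera not in entry["cameras"]:
        return False
    if action == "delete":
        return entry["can_delete"]
    return True  # "view" and "playback" requests on an allowed camera
```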
  • interfaces with similar functionality can also be provided using other technologies, such as World Wide Web interfaces and/or Dynamic HTML interfaces.
  • the system may also be configured to allow certain users to access video from particular streams by specifying a time of the video to be viewed. For example, a manager may wish to see the number of people present on the video at the opening or closing of a store or at lunch time. A user with sufficient permissions/rights may then also extract time slices of the recorded video in order to transfer it (e.g., to store it on another medium). For example, if a manager sees that a shoplifter has stolen from the store, the manager may tell the system to extract the relevant stored frames of video, encode them using commonly available encoding software, and store the compiled video file (e.g., an MPEG file, Windows Media Player file, or Advanced Systems Format file). Such a file can be played back with a conventional video player application running on a computer after having been transferred to a third party (e.g., an insurance company or the police).
  • the system can use communication prioritization, and such prioritization preferably controls the flow of data between clients and the data management service.
  • a client connects to the data management service by opening a network connection to the server and passing the required authentication information to the server.
  • the server validates the information and then, if everything is OK, sets up the session and returns a message to the client indicating that the connection was successful.
  • a single network session may contain multiple data channels.
  • the first channel is referred to as the “system channel” and is responsible for carrying protocol messages between the client and the server. Traffic in the system channel always has the highest priority. When messages are waiting to be sent across the system channel, pending traffic for all other channels must wait until the system channel data is first processed. This ensures that the client and server can always exchange protocol messages (change operating parameters, notify clients of events and status changes, etc.) even in cases where a large volume of data is being sent across the network.
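  • A minimal sketch of that dispatch rule; the system_queue is assumed to be a FIFO of pending protocol messages, and the video scheduler is sketched after the “queue of queues” discussion below.

```python
def next_outgoing(system_queue, video_scheduler):
    """Pick the next item to send on this session: pending system-channel
    protocol messages always preempt waiting video frames."""
    if system_queue:
        return system_queue.popleft()       # highest priority, always
    return video_scheduler.next_frame()     # video traffic shares the rest
```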
  • a client may have a number of additional channels.
  • the infrastructure need not place any limit on the number of additional data channels that can be open between the client and the server, although the data management service will enforce limits on the maximum number of channels based on account subscription levels, network bandwidth limits, etc.
  • clients may also support a configuration where only the system channel is active, in cases where no additional channels are required (e.g., video server which has cameras temporarily disabled, or a viewing client that has not yet selected any cameras for viewing).
  • Video traffic between clients and the data management service is sent using a frame-by-frame paradigm.
  • Each video channel can be envisioned as a queue, or a waiting list, of frames waiting to be sent across the network channel.
  • Frames in each video channel queue operate using a “first in, first out” method, meaning that the frame that has been waiting in line the longest will always be the next frame scheduled to be sent across that video channel.
  • each queue may contain numerous frames of video that are waiting to be sent.
  • if the infrastructure waited until the queue of frames from the first camera was completely sent across the network before processing the pending frames for the second camera, it would result in camera two falling behind “real time” or, if the bandwidth was severely restricted, possibly never being sent to the server at all.
  • instead, the infrastructure maintains a “queue of queues” identifying all of the different video channels that have one or more frames waiting to be sent.
  • Each of these video channels is “in line” and waiting for data to be sent to the server. When a video channel is at the front of the line, the next frame for that channel is sent to the data management server, and then that video channel is moved to the end of the line. This ensures that traffic for each video channel shares the available network bandwidth equally.
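  • A minimal sketch of this “queue of queues” round-robin under the frame-by-frame paradigm described above; the class and method names are illustrative.

```python
from collections import deque

class VideoScheduler:
    """Queue of queues: each video channel keeps a FIFO of pending frames,
    and channels with frames waiting take turns, so every active channel
    gets an equal share of the available bandwidth."""
    def __init__(self):
        self.ready = deque()   # channels that currently have frames waiting
        self.queues = {}       # channel id -> FIFO of frames

    def enqueue(self, channel_id, frame):
        q = self.queues.setdefault(channel_id, deque())
        if not q:
            self.ready.append(channel_id)   # channel joins the line
        q.append(frame)

    def next_frame(self):
        if not self.ready:
            return None
        channel_id = self.ready.popleft()       # front of the line
        frame = self.queues[channel_id].popleft()
        if self.queues[channel_id]:
            self.ready.append(channel_id)       # move to the end of the line
        return frame
```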
  • the frame-by-frame paradigm provides a standardized unit of measure that the system can use to control network usage, rather than simply thinking of network traffic in bytes.
  • the frame-by-frame design allows the data management service to dynamically adjust network bandwidth across video channels, while still maintaining a “real time” video link with each camera. For example, if motion is detected on one camera, the data management service may send a request to that camera to switch to a higher resolution to capture more detail, and simultaneously instruct the other cameras to switch to a lower resolution mode, in effect, giving the camera with activity a larger share of the available bandwidth.
  • all of the video channels would still receive an equal send priority, so even though the size of the frame from the active camera (and therefore, the bandwidth required) would be larger than the frame size for the other cameras, each video channel would still continue to send frames at an equal rate.
  • a video server may include a system channel and at least one video channel (when at least one camera is active).
  • the video channels are shown with a “->” designation that shows the direction of video flow (i.e., from the video server to the data management service).
  • the history server similarly has a system channel but also can have a number of recording video channels (e.g., 1 to N) (in an in-bound direction) and a number of playback video channels (e.g., 1 to M) (in an out-bound direction), where the number of recording and playback video channels need not be the same.
  • a video viewer also has a system channel and at least one (in-bound) video channel when at least one video feed is active.
  • the Encoding service also includes a system channel and a number of (in-bound) play video channels.
  • video clips may be generated in a standard video file format (e.g., an MPEG file, Windows Media Player file, or Advanced Systems Format file) such that clips can be played back with a conventional video player application running on a computer after having been transferred to a third party (e.g., an insurance company or the police).
  • this encoding functionality is provided by an encoding service, which is a client connected to the data management service. As shown in FIG. 14, when the encoding service receives a request to create a video clip, it first validates the request and then opens a connection to the data management service.
  • the encoding service will send a playback request, including information about the desired camera and the start and end times for the video clip, to the data management service.
  • the data management service will forward this request to the software/hardware responsible for providing video history recording and playback services in the system.
  • the history service begins publishing a playback feed upon receiving the playback request, at which time a response will be returned to the encoding service indicating that the requested playback feed is available.
  • the encoding service will subscribe to the playback feed, and will begin receiving frames of video through the subscription.
  • the video encoding service applies a pre-processing procedure to the video frames, superimposing a video caption containing the name of the video camera, as well as a timestamp.
  • Because video provider implementations may choose to publish frames based on activity/motion detection, rather than at a consistent frames-per-second (fps) rate, the encoding service is responsible for calculating the period of time that each video frame shall be displayed when playing back the encoded video file.
  • Implementations that choose to superimpose timestamp data on top of the video frames may need to generate multiple output frames from the one source frame provided by the history service to provide a constantly running timestamp in the video output file.
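  • A minimal sketch of that duration calculation, assuming each frame carries the capture timestamp recorded with it; the handling of the final frame is an assumed default, since the patent does not specify it.

```python
def frame_durations(timestamps, tail_seconds=1.0):
    """Given per-frame capture timestamps (in seconds), return how long
    each frame should be displayed in the encoded file. Needed because
    frames may arrive on motion, not at a fixed fps; each frame is shown
    until the next frame's timestamp."""
    durations = [timestamps[i + 1] - timestamps[i]
                 for i in range(len(timestamps) - 1)]
    durations.append(tail_seconds)  # assumed duration for the last frame
    return durations
```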
  • An implementation may use standard methods/existing libraries to encode the video data into a standard video file format and write this file to disk.
  • an implementation may provide a notification mechanism to inform the user that the video file is available for download (e.g., by sending a link, such as a URL, to the user).
  • notification methods may include, but are not limited to, Email, SMS text message, instant messaging, pager, etc.
  • an implementation may provide the user with an encrypted, direct download link to the video file.
  • the encrypted link provides the user with direct access to the video file without requiring a system log in, and prevents users from editing the link or otherwise tampering with the link to attempt unauthorized access to other files.
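  • The patent does not say how the link is protected; an HMAC-signed URL (signed rather than literally encrypted) is one common construction that meets the stated tamper-resistance goal, sketched here as an assumption with an illustrative URL.

```python
import hashlib
import hmac
import secrets

SIGNING_KEY = secrets.token_bytes(32)  # server-side secret

def make_download_link(clip_id):
    """Build a tamper-evident direct link: editing clip_id invalidates the
    signature, so a user cannot alter the link to reach other files."""
    sig = hmac.new(SIGNING_KEY, clip_id.encode(), hashlib.sha256).hexdigest()
    return f"https://example-dms.invalid/clips/{clip_id}?sig={sig}"

def verify_download_link(clip_id, sig):
    expected = hmac.new(SIGNING_KEY, clip_id.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)
```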
  • the data management service also may provide a threshold that defines how long each video clip that is generated should remain available for the user to download. This threshold may vary on a customer, account, or video adapter level. Implementations should provide a mechanism to periodically review generated video clips and automatically remove files that are older than the threshold defined by the data management service.
  • video recording and playback is controlled by a video history service.
  • the history service is a client connected to the data management service that runs under an administrative account that allows access to all published video feeds.
  • the history service requests a list of all published feeds from the data management service, and subscribes to all available feeds.
  • As shown in FIG. 15A, as each feed publishes new frames of video, the frames are delivered by the data management service to the video history service provider.
  • the history service records each frame to permanent storage (e.g., hard drive or storage array), along with metadata associated with the video frame (identifier for the video feed that published the frame, date/time stamp when the frame was taken, etc.).
  • An implementation of the video history service may use any method to store the data (indexed file system, relational database, etc.) provided that the storage mechanism used includes the ability to quickly seek to a range of frames to be retrieved while simultaneously continuing to record incoming data from other live feeds.
  • the storage mechanism must also provide the ability to store metadata in addition to the frame itself, and provide the ability to delete video frames that are older than the threshold defined by the data management service that controls how long a customer's video should be kept.
  • the threshold may vary on a per-customer or per-video adapter level.
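  • As one concrete possibility, a minimal sketch using SQLite as the indexed store; the patent allows any mechanism (indexed file system, relational database, etc.) with fast range seeks, metadata storage, and threshold-based deletion, and the schema here is an assumption.

```python
import sqlite3
import time

db = sqlite3.connect("history.db")
db.execute("""CREATE TABLE IF NOT EXISTS frames (
    feed_id TEXT, taken_at REAL, frame BLOB)""")
# Index supports seeking quickly to a range of frames for one feed.
db.execute("CREATE INDEX IF NOT EXISTS by_feed_time ON frames (feed_id, taken_at)")

def record(feed_id, frame_bytes):
    """Store an incoming frame with its metadata."""
    db.execute("INSERT INTO frames VALUES (?, ?, ?)",
               (feed_id, time.time(), frame_bytes))
    db.commit()

def playback_range(feed_id, start, end):
    """Seek to a range of frames while recording continues elsewhere."""
    return db.execute(
        "SELECT taken_at, frame FROM frames WHERE feed_id = ? "
        "AND taken_at BETWEEN ? AND ? ORDER BY taken_at",
        (feed_id, start, end))

def purge(feed_id, retention_seconds):
    """Delete frames older than the retention threshold for this feed."""
    db.execute("DELETE FROM frames WHERE feed_id = ? AND taken_at < ?",
               (feed_id, time.time() - retention_seconds))
    db.commit()
```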
  • when a video viewer sends a video playback request to the data storage service, the data storage service will forward this request to the video history service.
  • the history service will confirm that the request is valid, and then ask the data storage service for permission to publish a new video feed.
  • if the data storage service approves the request, the history service will begin publishing the playback feed, and a response to the original request will be returned to the video viewer client indicating that the playback feed is now available.
  • the video viewer client will subscribe to the playback video feed in the same manner as it would subscribe to any other feed available to it through that data management service.
  • a notification is sent to the history service informing it that a subscriber has attached to the playback feed.
  • the history service begins loading frames of video from storage and publishes them, frame by frame, to the data storage service. Publication will continue until the history service reaches the end of the data, as defined by the date range provided by the user. The history service then ceases publication, and the video viewer is notified that no more data is available.

Abstract

A user interacts with a data storage service which enables one or more feeds from video and/or web cameras to be streamed to the data storage service. The data storage service then provides storage and/or playback services to the subscriber (e.g., for a monthly fee or a usage fee). Once the video streams have been established between at least one camera and the data storage facility, the user may access the recorded data from any one of several sources (such as a World Wide Web browser or a cellular phone). The interface may provide a matrix of displays such that the user can see multiple areas or multiple parts of the same area simultaneously.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is related to and is a continuation-in-part of co-pending U.S. patent application Ser. No. 11/819,206, filed Jun. 26, 2007, naming Brian Turner as an inventor, the contents of which are incorporated herein by reference.
  • FIELD OF INVENTION
  • The present invention is directed to a method and system for assembling real-time and non-real-time information sources, and in one embodiment to a method and system for integrating real-time recording of video from at least one of a camera and a web camera and playback from a remote storage system.
  • DISCUSSION OF THE BACKGROUND
  • As the available bandwidth on the Internet grows, additional services are now being provided which previously were impractical or impossible from a bandwidth perspective. For example, it is now possible to conduct not only video conferencing but also real-time recording of video for security purposes. However, in some circumstances, most of the time the recorded video has little value (e.g., when recording a remote location where nothing is occurring). It is only when an important event occurs (e.g., the remote location is broken into) that the video needs to be reviewed. As a result, it is not necessarily economical to require each company that wishes to record video to have its own expensive video storage and playback facility.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A more complete appreciation of the invention and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
  • FIG. 1 is an exemplary screenshot of a World Wide Web interface for controlling a first video stream;
  • FIG. 2 is an exemplary screenshot of a World Wide Web interface for controlling first and second video streams;
  • FIG. 3 is an exemplary screenshot of a World Wide Web interface for controlling first and second video streams which alternate in the same video window;
  • FIG. 4 is a block diagram of a system for viewing remotely stored video data recorded at plural remote sites;
  • FIG. 5 is a ladder diagram showing messages between various components in the system;
  • FIGS. 6A and 6B are bloom patterns used in determining if motion exists between corresponding pixels of consecutive video frames;
  • FIGS. 7-10 are exemplary graphical user interfaces for creating access lists associated with an account such that the permissions of various users associated with an account can be specified;
  • FIG. 11 is a block diagram showing various communications channels in a communication session between a client and a Data Management Service;
  • FIG. 12 is a flowchart illustrating a prioritized communication system for providing priority to system channel data over video channel data;
  • FIG. 13 is a block diagram showing network sessions between various components of a system such as the system shown in FIG. 4;
  • FIG. 14 is a ladder diagram showing messages between components in the system to support creation of a downloadable video clip; and
  • FIGS. 15A and 15B are a ladder diagrams showing messages between various components in the system during video recording and video playback, respectively.
  • DETAILED DESCRIPTION
  • According to one aspect of a system described herein, a user subscribes to a data management service which enables one or more feeds from a video camera and/or a web camera or other device defined within the data management service to be streamed to the data management service. The data management service then provides storage, playback, and/or event trigger services to the subscriber (e.g., for a monthly fee or a usage fee). In this way the subscriber does not need to have the storage to save the video, nor does it need the backup services required to protect the stored video from loss, nor does it need to have a programmed data interaction facility to generate triggered actions or events from the streamed data.
  • When storing the video from at least one video camera or web camera, the data management service may be requested to playback video from an earlier time or generate a triggered action in addition to continuing to record. The recording, playback, and event generation therefore compete for network utilization. The real-time data stream can be lost if it is not captured as it is being recorded, or a critical triggered event may be missed. However, if the playback becomes slower, then there is no information loss, just unresponsiveness and/or choppiness. As a result, the communications protocol is preferably designed such that event generation and data recording are provided a greater priority than playback. For example, if an alarm at a remotely monitored facility causes a user to receive an e-mail, the user may wish to see why the alarm occurred, but it is more important to continue to send immediate notifications and to record the conditions that caused the alarm.
  • To this end, a user arranges for data management with a data management facility. For example, the data management facility may charge a fee to the user for every camera that is connected, for a given number of triggered events, for a set amount of data to be recorded, or for a number of users who may have live interaction with the data streams. Alternatively, the data management facility may not charge a fee for interaction but instead provide advertisements on the screen while the video is being played back or other data streams are being monitored. Furthermore, the fees for storage, playback, and event generation may be combined or separate.
  • As shown in FIG. 4, the user then connects user-side camera systems to the data storage facility, either by pushing the video data to the data storage facility or by allowing the data storage facility to pull the data by accessing a user-side camera system (e.g., a camera with integrated software or a video server computer to which one or more cameras are attached). Cameras may be connected to a video server computer through a standard interface (e.g., a USB interface) or through custom hardware (e.g., a video capture board). It should be understood that in embodiments using a camera with integrated software, a server may be inside the camera itself. Alternatively, there may be a video server computer or other specialized data aggregating device that aggregates the video data from several cameras (e.g., connected to various USB ports of the same computer) before passing them on to the data storage facility.
  • The data storage facility includes wired and/or wireless communications adapters (e.g., WiFi adapters, Ethernet adapters, WiMax adapters, telephone line adapters such as modems) for receiving and transmitting the video data from/to the cameras and the users. Such communications adapters may use any unreliable or reliable transmission protocol (e.g., UDP/IP or TCP/IP). The data storage facility also includes computer storage devices (e.g., hard disks, hard disk arrays and optical disks and/or arrays) for storage of the video data. The data storage facility further includes a command interpreter for interpreting the requests from the user which specify how the video data is to be played back to the user.
  • As part of (virtually) connecting the camera(s) to the data storage facility (as described above), a number of messages are sent between the various components of the system, as shown in FIG. 5. When the user-side camera system is powered on and the video server software is loaded, the user-side camera system opens a network connection to at least a portion of the server equipment (e.g., to a server acting as an XML router). The user-side camera system sends a login request to the XML router, including the customer account that the user-side camera system is running under and the password. The XML router will authenticate that the account exists and the password is valid and then return a response to the user-side camera system indicating that login was successful. The network connection that was opened remains open for as long as the user-side camera system is powered on. This network connection is referred to as the system channel, and it is used by the user-side camera system and the XML router to send protocol messages back and forth.
  • After the connection is established, the user-side camera system verifies that each video adapter is still connected and working properly, and then, for each video adapter, it sends a publication request (including a unique identifier) to the XML router over the system channel. This request tells the XML router which video adapter (identified by a unique identifier) is requesting the rights to publish a video feed. The XML router validates that this video adapter is authorized to publish video, subject to limitations on the number of video feeds that are allowed by the customer's subscription plan. The XML router also looks at the customer's account information to determine the maximum frames per second (FPS) rate that is allowed for this video adapter. The XML router has the ability to control the FPS rate for each of the customer's video adapters individually using account configuration values that are stored in the administrative database. The XML router sends a response back to the user-side camera system indicating that publishing a video feed is OK. This response includes the maximum FPS rate that the video adapter is allowed to capture video.
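  • A sketch of this login-and-publish handshake over the system channel; the patent names an XML router but gives no message schema, so the element and attribute names, and the channel's send/receive interface, are illustrative assumptions.

```python
# Hypothetical XML bodies; the real schema is not given in the patent.
LOGIN = '<login account="{account}" password="{password}"/>'
PUBLISH = '<publish adapter-id="{adapter_id}"/>'

def connect_and_publish(channel, account, password, adapter_ids):
    """Log in over the persistent system channel, then request publication
    rights for each video adapter; the router's reply carries the maximum
    FPS rate at which that adapter is allowed to capture video."""
    channel.send(LOGIN.format(account=account, password=password))
    reply = channel.receive()               # e.g., <login-ok/>
    assert reply.name == "login-ok"
    max_fps = {}
    for adapter_id in adapter_ids:
        channel.send(PUBLISH.format(adapter_id=adapter_id))
        reply = channel.receive()           # e.g., <publish-ok max-fps="15"/>
        max_fps[adapter_id] = int(reply.attrs["max-fps"])
    return max_fps
```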
  • The user-side camera system initializes each video adapter, and configures the video drivers to capture video at the maximum FPS rate specified in the response from the XML router for each video adapter. A secondary communication channel is opened (a “video publication channel”) between each video adapter and the XML router. The purpose of this channel is to transport video frames to the XML router. Transporting these frames across a secondary channel allows protocol messages to continue to be exchanged across the system channel without experiencing any delay due to high network traffic.
• After each video adapter is initialized, it enters into a continuous loop that will continue to run for as long as the user-side camera system is active. Each time a frame is captured from the video device, the video adapter first checks to make sure that frames are being received from the adapter at a speed less than or equal to the maximum FPS rate specified by the XML router. This is a secondary check to ensure that the maximum FPS setting is respected even in cases where the driver does not support FPS throttling. If frames are received faster than allowed, they are ignored. Following this FPS speed test, the video adapter preferably uses a motion detection algorithm to determine whether motion occurred between the incoming frame and the previous frame that was sent to the XML router. This bandwidth-saving feature avoids sending redundant frames, improving performance and reducing network usage. If motion was detected, the frame is sent across the video publication channel to the XML router for distribution to clients that are subscribed to this video feed.
  • To keep the network connection active, each video adapter will send frames to the server occasionally (e.g., at a rate of not less than one frame per minute), even if no motion is detected in the frames that are captured by the device.
  • Even in cases where no motion exists between two frames captured by the video device, two frames will never match exactly. Minor differences in the color values for pixels in the image may be present. The actual amount of variance in the color values varies based on the quality of the video capture device and device driver software implementation. This variance in color values is referred to as “noise”. Differences in frames due to noise make it impossible to do a simple bit by bit comparison between two frames to test for motion. Instead, a motion detection method is utilized. Motion detection methods preferably are video hardware independent and detect cases where even a small amount of motion exists between two frames without triggering false positives due to noise.
  • According to one embodiment of a motion detection method, network traffic is reduced by not requiring clients to send video frames unless motion is detected. This allows the software to run even on networks with a small amount of bandwidth.
• According to one embodiment, the user-side camera system remembers the bitmap data for the last frame that was sent to the XML router. Remembering the previous frame is necessary because the method needs two frames to compare against each other in order to execute a motion test: the last frame that was sent to the server is compared against frames received from the video adapter to test for motion. When a frame is received from the video adapter, it is first converted from color to a gray scale image. This can be done using standard, publicly available techniques that convert RGB values to a luminance value.
• A bitmap also is created with the same size as the incoming video frame, which will be referred to as the "motion map". The motion map will indicate, for each pixel in the frame, whether or not the pixel is still being considered as a possible source of motion. For each pixel in the incoming frame, the method will calculate the difference in luminance values between the incoming frame and the same pixel in the previously sent frame. If this value exceeds the noise threshold, then the corresponding pixel in the motion map is flagged. This represents the first pass in the motion detection method.
  • However, if the noise threshold is too low, false positives will be detected. If the noise threshold is too high, true motion will not be detected by the method. To address those issues, a method can look for blocks of pixels in the motion map that have all been flagged for motion. This block of pixels is referred to as the “bloom” because the motion detection technique blooms out in a diamond pattern around the target pixel when testing for motion. The bloom level is configurable (e.g., in a configuration file or software command-line “switch”). A bloom of size 5 is shown in FIG. 6A, where the center X is the pixel that is being considered for motion, and the surrounding X's represent the surrounding pixels in the motion map that must also be flagged for motion in order to keep the target pixel marked as a candidate for motion. Otherwise, the flag for the target pixel in the motion map is cleared, dropping this pixel as a false positive. As shown in FIG. 6B, a bloom size of 7 is also possible, as is a bloom size of 9 (not shown) or any other assigned value.
  • The method scans the entire motion map, applying the bloom processing from top-left to bottom-right. This means that the range of pixels considered is actually smaller than the size of the bitmap, because it is impossible for a pixel in the corner of the bitmap to pass the bloom test. Therefore, to improve performance, the process only tests pixels with sufficient area surrounding them to pass the test. For example, with a size 5 bloom, and a bitmap coordinate system that begins in the top left with (0,0), the scanning would begin at (2,2). The process looks at the target pixel in the motion map. If the pixel was flagged for motion in the previous step, the bloom test is applied. If the pixel fails the bloom test (meaning not all surrounding pixels in the bloom area were also flagged in the motion map), then the pixel's flag is cleared. If the pixel passes the bloom test, the entire square area surrounding the pixel is flagged in the motion map—meaning that the corner areas surrounding the diamond are flagged for motion as well, if they were not flagged already. The method continues moving through the motion map left to right, top to bottom, applying this logic to each pixel.
  • With a bloom size set to a value larger than 3, the process is able to use a relatively low value for the noise threshold. This allows for even small amounts of motion to be detected without triggering false positives. With a low noise threshold, many pixels are flagged as candidates for motion in the first pass, but are disregarded in the second pass bloom test unless all of the surrounding pixels were also flagged for motion.
• The process also can dynamically adjust the noise threshold. The process attempts to maintain a noise threshold value that results in 25% of the pixels reporting motion at any given time. Even with 25% of the pixels reporting potential motion, the bloom test is able to reliably eliminate false positives. The process recalibrates the value for the noise threshold once every second. It does this by testing what percentage of the pixels were flagged during the first pass. The noise threshold will be adjusted up or down based on how the current percentage compares to the target percentage. If the current percentage is greater than the target percentage, the noise threshold is increased by one, and if the current percentage is less than the target percentage, the noise threshold is decreased by one. The noise threshold is never adjusted by more than one unit at a time, and the recalibration happens only once per second. This prevents rapid swings in the noise threshold value during periods of motion. Additionally, the process defines minimum and maximum noise threshold values (e.g., 4 and 85, respectively) and prevents the noise threshold from being adjusted beyond those values. The noise recalibration feature helps the process remain accurate as the conditions viewed by the video device change over time. For example, for many video devices, noise levels increase as the captured image becomes darker. This is particularly important for cameras that capture images outside or in rooms where the lights may be turned on or off.
  • The process counts the number of pixels that passed the bloom test and compares this count against a threshold value, which also is configurable (e.g., 8). If the number of pixels that passed the bloom test meets or exceeds this threshold, the process concludes that motion was detected in this frame.
• The process contains several values that can be tuned to affect its performance. These values may be changed later (e.g., by a system administrator) to improve performance. Exemplary values are shown in the table below.
| Parameter Name | Parameter Value | Description |
| --- | --- | --- |
| Min. Noise Threshold | 4 | Minimum noise threshold value |
| Max. Noise Threshold | 85 | Maximum noise threshold value |
| Target Percentage | 25% | The process adjusts the current noise threshold so that this percentage of the pixels consistently exceeds the noise threshold when comparing the current frame to the previous frame |
| Recalibration Frequency | Every second | How often the process adjusts the noise threshold value |
| Bloom Size | 9 | The size of the "bloom" used to filter out false positives |
| Threshold | 8 | The minimum number of pixels that must pass the bloom test for the frame to be considered as having motion |
  • In a preferred embodiment, once the video streams have been established between at least one camera and the data management facility, the user may access the recorded data from any one of several interfaces. As shown in FIG. 1, a first data retrieval interface comprises a World Wide Web browser that connects to a server associated with the data management facility. The user authenticates itself to the server and is provided a list of available cameras from which he/she can obtain video. The number of cameras and their sources that can be seen can be configured on a user-by-user and client-by-client basis. Thus, some cameras may be available to only one user while other cameras may be accessible to multiple users. The user authentication can also specify what kinds of video data a user can see and what he/she can do with that video data. For example, some users may only be able to see delayed live feeds, and others may be able to control the feed (e.g., rewind, pan, zoom). Also, some users may be allowed to download stored video while others may not.
• Once at least one camera is selected, the user begins to see video from that camera. To receive the video, the browser may be supplemented with one or more active components (e.g., an ACTIVEX, JAVA, ADOBE FLASH or SILVERLIGHT user interface control, as well as JAVASCRIPT programming language instructions). The video may be either the most recently received video, thereby forming a time-delayed live feed (as the video is first received from the remote source and then sent to the interface), or it may be recorded video data from earlier in the day or from a previous day. Delivering video feeds through web browsers is a known technology in the area of traffic cameras in several metropolitan areas. For example, trafficland.com provides such a service for monitoring cameras in the Washington, D.C. area.
• To facilitate an examination of what is happening in a site with multiple cameras, the interface preferably includes the ability to select video from multiple sources simultaneously. Thus, as shown in FIG. 2, the video can be displayed in a matrix of sub-windows inside a browser where the user selects which cameras of the available cameras are to be displayed. Optionally, the user may be able to specify the order and placement of the cameras within the matrix (e.g., in order to get different views of the same area in close proximity to each other). Each sub-window operates independently of the other sub-windows in the matrix and can be used to view video from any of the cameras available to the user. The same camera feed could be displayed in more than one sub-window in the matrix if desired. For example, the user could choose to view the live feed from a camera in one sub-window in the matrix while simultaneously viewing recorded video from the same camera in another sub-window, or to show a digitally zoomed in view of a camera feed in one sub-window while simultaneously displaying the full view of the same camera feed in another sub-window.
  • The interface may provide the ability to superimpose the camera name and/or a date and time stamp on top of the video image being displayed, as shown in FIG. 2. The interface may provide the user with a method to enable or disable these features individually for each video display sub-window in the window matrix (e.g., check boxes). When the date and time stamp is activated, the date and time will be automatically converted to and displayed in the user's current time zone format, even if the video feed originates from a camera located in another time zone.
  • In addition to using multiple windows to view different sources simultaneously, the interface can further enable any one of the windows to rotate between sources. As shown in FIG. 3, a list of sources may be displayed upon request (e.g., by right clicking on the video window). To rotate between two sources (e.g., “Tahoe Blvd” and “Parking Lot”), the user would select both sources (and may be provided confirmation of the selection by a check mark appearing next to the selection). As further shown in FIG. 3, sources may be non-live video sources such as VCRs or digital video recorders (DVRs).
• In addition to delayed live feeds, the user's interface may include, but is not limited to, a series of controls to pause the feed, reverse the feed, and return the feed to the delayed live feed. A user interface that provides the ability to display multiple video feeds simultaneously using a matrix of sub-windows can allow the user to apply these feed control options to one video feed without affecting the other sub-windows displayed in the matrix (e.g., rewinding one video feed while continuing to view live video from the other streams displayed in the matrix). Furthermore, the user interface may also include an input area for specifying a time of day for a particular day that the system should rewind to (e.g., to see the cause of an alarm).
• The system may further include a user interface that is integrated with or runs on a cellular telephone. Video may be delivered to the cellular telephone by utilizing a web browser thereon, using a combination of active languages or controls (e.g. an ACTIVEX, SILVERLIGHT, FLASH, or JAVA user interface control, as well as JAVASCRIPT programming language instructions), or through an application built to run on the cellular telephone's native application development platform. The phone may either receive an actual stream or may make a series of rapid, successive requests for parts of the recorded or delayed video which grab a sufficient number of frames per second to achieve the appearance of a stream.
  • In yet another embodiment, TABLET PCs and other portable computing devices (e.g. APPLE IPAD, APPLE IPOD, WINDOWS TABLET PC, ANDROID TABLET PC, WINDOWS CE and cellular telephones) are used to connect to the data management facility. The user interface may be exposed by utilizing a web browser thereon, using a combination of active languages or controls (e.g. an ACTIVEX, SILVERLIGHT, FLASH, or JAVA user interface control, as well as JAVASCRIPT programming language instructions), or through an application built to run on the portable computing device's native application development platform.
• The user interfaces may further be supplemented with user interface controls for remotely controlling the operation of the camera. For example, the cameras may be controlled to zoom in or out, pan and tilt (e.g., using buttons for those functions or gestures on touch screen interfaces supporting gestures). Such control may result in actual physical movement of the camera, if the camera supports it, or may be achieved virtually. For example, a camera can appear to pan right and/or left by performing a virtual zoom (i.e., enlarging one area of an image without actually changing the focus) and then moving to the right or left of the actual image and enlarging the new virtually zoomed image. Physical movement of a camera would affect the video stream provided by the camera to all users subscribed to the video feed. However, a user could apply a virtual zoom to a camera feed, enlarging that user's view of the camera feed, without affecting the video feed displayed to other users subscribed to the same feed.
  • In addition to the playback of video, the interface may further provide the ability to specify a portion of a previously recorded stream that should be downloaded to the user's computer (or phone). The downloaded file may be in any format (e.g., MPEG, MPEG-2, MPEG-4, WINDOWS MEDIA PLAYER). Because video encoding is a time consuming operation, the user interface may allow the user to request that a video file be created and then continue with other tasks while the video file is generated in the background by the data management server. The data management server will provide the user with a notification (e.g., email or SMS text message) informing the user that the video file is available for download with the message including, for example, a link to the created video file or instructions on how to access the file.
  • Because the video data may contain sensitive information, the video data is preferably encrypted along each of the transmission links. Thus, from the camera to the data storage facility the video data would be encrypted. Likewise, from the data storage facility to the user's playback device the video data would be encrypted.
  • The cameras, the computer connected to the cameras, or the data storage facility may additionally provide a motion sensing service such that a user is notified of motion in an area where none is expected. For example, no motion is expected in a locked warehouse, so if motion occurs, then the user could be notified by a specified communications mechanism (e.g., by email, phone, cellular phone, pager, etc.). If supported by the notification delivery method, the system may include additional details (e.g., attached video clip of the incident, or web hyperlink to allow user to quickly connect to the server and review).
  • When an organization (e.g., a company) wishes to record from more than one location or to provide access to recorded video to more than one person, various account provisioning controls may be added to the above-described system. For example, multiple accounts can be stored on one or more centrally managed computers to which an organization can be given access. The video streams of a first organization can then be stored separately from the video streams of a second organization such that only those persons authorized by the first organization can get access to the video streams associated with the first organization's account. Similarly, only those persons authorized by the second organization can get access to the video streams associated with the second organization's account. In an embodiment where the recording facility can record and pass on a live stream simultaneously, a manager of an organization may be reviewing a recorded stream while a supervisor is watching a live video; however, neither the manager nor the supervisor need be at the location where the video is being recorded from nor at the same location as each other. (The users can authenticate themselves to the system using known authentication methods (e.g., username/password, RSA SECURID).)
• The account of an organization may also include the threshold information that controls the amount of data that can be stored and/or played back for that account, as well as the number and/or types of actions that may be generated. For example, the number of streams that can be recorded simultaneously by an account may be limited, or the amount of video (e.g., megabytes/hour) may be limited. Similarly, the amount and length of video streams, as well as the replacement policy, may be controlled for an account. For example, an account may be configured to purge old video streams based on a selected replacement policy (e.g., a first-in-first-out erasure method) or based on location-specific information (e.g., erase recordings from camera A before erasing recordings from camera B). Likewise, the number of simultaneous viewer requests for streams associated with an account or organization may be limited. Likewise, the number and type of generated actions may be defined or limited (e.g., generate notification e-mails but do not activate on-premise alarm lights or sirens) based on association with a defined subscription type. Such thresholds serve to limit the system's incoming bandwidth utilization, outgoing bandwidth utilization, storage space (hard drive) utilization, and business liability on an account level. The system also need not utilize the same threshold levels for all organizations. For example, a first organization may pay a higher price for 5 incoming streams, 3 outgoing streams, 300 GB of storage space, and on-premise alarm activation while a second organization may pay a lower price for 3 incoming streams, 1 outgoing stream, 100 GB of storage space, and basic e-mail or SMS text alerting.
  • In order to reduce bandwidth and storage utilizations, the transmitting adapters can be configured to implement a motion sensing procedure or method, switching to a mode where frames are only sent at a specified interval during times of non-activity, as described in greater detail herein. The system can further be configured to programmatically throttle the adapters which transmit live video streams into the system. In keeping with the above discussion on account-level configurations, the system may provide throttling at an account level such that an amount of data transmitted to the system by all cameras associated with an account does not exceed the threshold level for the account. In a condition where the video received for the account exceeds the threshold, the system can send one or more messages to the cameras associated with the account to request that video be sent at a lower level. The cameras can then be configured to return to the original level if the cause for the increased level is addressed. For example, if an account has four cameras associated with it and the cameras are each transmitting at a medium resolution, motion at a first camera may cause the need for higher resolution images from the first camera, and the first camera may begin transmitting higher resolution images (or more frequent images). However, if the combined bandwidth of the four cameras exceeds the receiving bandwidth threshold, then the other three cameras may be sent messages by the system to instruct them to transmit at a next lower resolution (e.g., low resolution being the next lower resolution from medium resolution) or less frequently (e.g., 10 frames per second (fps) instead of 15 fps).
• As described above, an organization may require various levels or types of access to video streams associated with the organization. In order to address this, as shown in FIGS. 7-10, an account control system may be provided so that access lists may be associated with an account such that the permissions of various users associated with the account can be specified. For example, an account with four users (e.g., two clerks, a supervisor and a manager) associated with it may have some users who can access the recorded or live video of some cameras that others cannot (e.g., clerk 1 can access camera A and clerk 2 can access camera B, but not vice versa) while some users (e.g., the supervisor and the manager) can access the video from all the cameras. Also, some users may have rights to perform actions on stored videos that others do not. For example, a supervisor may review recorded video from when he/she was on duty but not from other times, and the supervisor cannot delete recorded video. By comparison, a manager's access may be configured to allow all recorded (and live) video to be reviewed, and may even allow video to be erased. (While an application-style interface has been illustrated as an exemplary embodiment, interfaces with similar functionality can be provided using other interfaces, such as World Wide Web interfaces and/or Dynamic HTML interfaces.)
• The system may also be configured to allow certain users to access video from particular streams by specifying a time of the video to be viewed. For example, a manager may wish to see the number of people present on the video at the opening or closing of a store or at lunch time. A user with sufficient permissions/rights may then also extract time slices of the recorded video in order to transfer it (e.g., to store it on another medium). For example, if a manager sees that a shoplifter has stolen from the store, the manager may tell the system to extract the relevant stored frames of video, encode them using commonly available encoding software, and store the compiled video file (e.g., an MPEG file, Windows Media Player file, or Advanced Systems Format file). Such a file can be played back with a conventional video player application running on a computer after having been transferred to a third party (e.g., an insurance company or the police).
• As described above, the system can use communication prioritization, and such prioritization preferably controls the flow of data between clients and the data management service. A client connects to the data management service by opening a network connection to the server and passing the required authentication information to the server. The server will validate the information and then, if everything is in order, set up the session and return a message to the client indicating that the connection was successful.
  • As shown in FIG. 11, a single network session may contain multiple data channels. The first channel is referred to as the “system channel” and is responsible for carrying protocol messages between the client and the server. Traffic in the system channel always has the highest priority. When messages are waiting to be sent across the system channel, pending traffic for all other channels must wait until the system channel data is first processed. This ensures that the client and server can always exchange protocol messages (change operating parameters, notify clients of events and status changes, etc.) even in cases where a large volume of data is being sent across the network.
  • In addition to the system channel, a client may have a number of additional channels. The infrastructure need not place any limit on the number of additional data channels that can be open between the client and the server, although the data management service will enforce limits on the maximum number of channels based on account subscription levels, network bandwidth limits, etc. Additionally, clients may also support a configuration where only the system channel is active, in cases where no additional channels are required (e.g., video server which has cameras temporarily disabled, or a viewing client that has not yet selected any cameras for viewing).
• Video traffic between clients and the data management service is sent using a frame-by-frame paradigm. Each video channel can be envisioned as a queue, or a waiting list, of frames waiting to be sent across the network channel. Frames in each video channel queue operate using a "first in, first out" method, meaning that the frame that has been waiting in line the longest will always be the next frame scheduled to be sent across that video channel. Particularly in cases where a client is running over a low-bandwidth network connection, each queue may contain numerous frames of video that are waiting to be sent. For clients that support multiple video channels, it is important that all video channels receive an equal priority. For example, if the client were a two-camera video server, data captured from both cameras should be sent to the data management server at an equal rate. If the infrastructure waited until the queue of frames from the first camera was completely sent across the network before processing the pending frames for the second camera, camera two would fall behind "real time" or, if the bandwidth was severely restricted, its frames might never be sent to the server at all. As shown in FIG. 12, to ensure that all video channels receive equal priority, the infrastructure maintains a "queue of queues" identifying all of the different video channels that have one or more frames waiting to be sent. Each of these video channels is "in line" and waiting for data to be sent to the server. When a video channel is at the front of the line, the next frame for that channel is sent to the data management server, and then that video channel is moved to the end of the line. This ensures that traffic for each video channel shares the available network bandwidth equally.
  • The frame-by-frame paradigm provides a standardized unit of measure that the system can use to control network usage, rather than simply thinking of network traffic in bytes. The frame-by-frame design allows the data management service to dynamically adjust network bandwidth across video channels, while still maintaining a “real time” video link with each camera. For example, if motion is detected on one camera, the data management service may send a request to that camera to switch to a higher resolution to capture more detail, and simultaneously instruct the other cameras to switch to a lower resolution mode, in effect, giving the camera with activity a larger share of the available bandwidth. However, all of the video channels would still receive an equal send priority, so even though the size of the frame from the active camera (and therefore, the bandwidth required) would be larger than the frame size for the other cameras, each video channel would still continue to send frames at an equal rate.
  • As shown in FIG. 13, the network sessions for the various components in the system can vary between the types of communicating components. As discussed above, a video server may include a system channel and at least one video channel (when at least one camera is active). The video channels are shown with a “->” designation that shows the direction of video flow (i.e., from the video server to the data management service). The history server similarly has a system channel but also can have a number of recording video channels (e.g., 1 to N) (in an in-bound direction) and a number of playback video channels (e.g., 1 to M) (in an out-bound direction), where the number of recording and playback video channels need not be the same. A video viewer also has a system channel and at least one (in-bound) video channel when at least one video feed is active. The Encoding service also includes a system channel and a number of (in-bound) play video channels.
• Interactions with the Encoding server are further shown in FIG. 14. In one embodiment, video clips may be generated in a standard video file format (e.g., an MPEG file, Windows Media Player file, or Advanced Systems Format file) such that clips can be played back with a conventional video player application running on a computer after having been transferred to a third party (e.g., an insurance company or the police). In one such embodiment, this encoding functionality is provided by an encoding service, which is a client connected to the data management service. As shown in FIG. 14, when the encoding service receives a request to create a video clip, it first validates the request and then opens a connection to the data management service. The encoding service will send a playback request, including the desired camera and the start and end times for the video clip, to the data management service. The data management service will forward this request to the software/hardware responsible for providing video history recording and playback services in the system. The history service begins publishing a playback feed upon receiving the playback request, at which time a response will be returned to the encoding service indicating that the requested playback feed is available. The encoding service will subscribe to the playback feed, and will begin receiving frames of video through the subscription.
  • In one implementation, the video encoding service applies a pre-processing procedure to the video frames, superimposing a video caption containing the name of the video camera, as well as a timestamp. Because video provider implementations may choose to publish frames based on activity/motion detection, rather than at a consistent frames-per-second (fps) rate, the encoding service will be responsible for calculating the period of time that each video frame shall be displayed when playing back the encoded video file. Implementations that choose to superimpose timestamp data on top of the video frames may need to generate multiple output frames from the one source frame provided by the history service to provide a constantly running timestamp in the video output file. An implementation may use standard methods/existing libraries to encode the video data into a standard video file format and write this file to disk.
  • After encoding is complete, an implementation may provide a notification mechanism to inform the user that the video file is available for download (e.g., by sending a link, such as a URL, to the user). Such notification methods may include, but are not limited to, Email, SMS text message, instant messaging, pager, etc.
  • For delivery methods that support it, an implementation may provide the user with an encrypted, direct download link to the video file. The encrypted link provides the user with direct access to the video file without requiring a system log in, and prevents users from editing the link or otherwise tampering with the link to attempt unauthorized access to other files.
  • The data management service also may provide a threshold that defines how long each video clip that is generated should remain available for the user to download. This threshold may vary on a customer, account, or video adapter level. Implementations should provide a mechanism to periodically review generated video clips and automatically remove files that are older than the threshold defined by the data management service.
• As shown in FIGS. 15A and 15B, in one embodiment, video recording and playback is controlled by a video history service. The history service is a client connected to the data management service that runs under an administrative account that allows access to all published video feeds. The history service requests a list of all published feeds from the data management service, and subscribes to all available feeds. As shown in FIG. 15A, as each feed publishes new frames of video, the frames are delivered by the data management service to the video history service provider. The history service records each frame to permanent storage (e.g., hard drive or storage array), along with metadata associated with the video frame (identifier for the video feed that published the frame, date/time stamp when the frame was taken, etc.).
  • An implementation of the video history service may use any method to store the data (indexed file system, relational database, etc.) provided that the storage mechanism used includes the ability to quickly seek to a range of frames to be retrieved while simultaneously continuing to record incoming data from other live feeds. The storage mechanism must also provide the ability to store metadata in addition to the frame itself, and provide the ability to delete video frames that are older than the threshold defined by the data management service that controls how long a customer's video should be kept. The threshold may vary on a per-customer or per-video adapter level.
• As shown in FIG. 15B, when a video viewer sends a video playback request to the data management service, the data management service will forward this request to the video history service. The history service will confirm that the request is valid, and then ask the data management service for permission to publish a new video feed. When the data management service approves the request, the history service will begin publishing the playback feed, and a response to the original request will be returned to the video viewer client indicating that the playback feed is now available. The video viewer client will subscribe to the playback video feed in the same manner as it would subscribe to any other feed available to it through the data management service. When the subscription occurs, a notification is sent to the history service informing it that a subscriber has attached to the playback feed. At this point, the history service begins loading frames of video from storage and publishes them, frame by frame, to the data management service. Publication will continue until the history service reaches the end of the data, as defined by the date range provided by the user. The history service then ceases publication, and the video viewer is notified that no more data is available.
  • Obviously, numerous modifications and variations of the present invention are possible in light of the above teachings. It is therefore to be understood that, within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein.

Claims (24)

1. A system for remote storage and playback of video data recorded at plural remote sites, the system comprising:
a first communications adapter for receiving streams of live video data recorded at the plural remote sites and for receiving requests from users for interactions with the streams of live video data recorded at the plural remote sites;
data storage devices for storing the streams of live video data recorded at the plural remote sites;
a command interpreter for controlling playback of the streams of live video data recorded at the plural remote sites;
a second communications adapter for playing back the streams of live video data recorded at the plural remote sites; and
an account control system for controlling access to the streams of live video data and to the recorded video data on a per-user basis.
2. The system as claimed in claim 1, wherein the first and second communications adapters are the same communications adapter.
3. The system as claimed in claim 1, wherein the first and second communications adapters are different communications adapters.
4. The system as claimed in claim 1, wherein the streams of live video data recorded at the plural remote sites comprise encrypted streams of live data.
5. The system as claimed in claim 1, wherein the command interpreter controls the streaming of the streams of live video data recorded at the plural remote sites to achieve at least one of pausing, rewinding and fast forwarding the streams of live video data recorded at the plural remote sites.
6. The system as claimed in claim 1, wherein the account control system comprises an access control list for specifying permissions on a per-user basis.
7. The system as claimed in claim 1, wherein the account control system controls whether a user can access a stream of live video data from a specified camera.
8. The system as claimed in claim 1, wherein the account control system controls whether a user can extract a clip from the recorded video data.
9. The system as claimed in claim 1, wherein the account control system controls whether a user can delete a recorded stream of video data.
10. The system as claimed in claim 1, wherein the account control system controls the addition of additional users for an account.
11. A system for viewing remotely stored video data recorded at plural remote sites, the system comprising:
a first communications adapter for receiving streams of live video data recorded at the plural remote sites and for receiving requests from users for interactions with the streams of live video data recorded at the plural remote sites;
data storage devices for storing the streams of live video data recorded at the plural remote sites;
a command interpreter for controlling playback of the streams of live video data recorded at the plural remote sites;
a second communications adapter for playing back portions of the streams of live video data recorded at the plural remote sites;
a playback device including (a) a third communications adapter for receiving from the second communications adapter the portions of the streams of live video data recorded at the plural remote sites and (b) a display for displaying the portions of the streams of live video data received from the third communications adapter; and
an account control system for controlling access to the streams of live video data and to the recorded video data on a per-user basis.
12. The system as claimed in claim 11, wherein the first and second communications adapters are the same communications adapter.
13. The system as claimed in claim 11, wherein the first and second communications adapters are different communications adapters.
14. The system as claimed in claim 11, wherein the portions of the streams of live video data recorded at the plural remote sites comprise encrypted live data.
15. The system as claimed in claim 11, wherein the command interpreter controls the delivery of the portions of the streams of live video data recorded at the plural remote sites to achieve at least one of pausing, rewinding and fast forwarding the portions of the streams of live video data recorded at the plural remote sites.
16. The system as claimed in claim 11, wherein the playback device comprises a computer running a World Wide Web browser.
17. The system as claimed in claim 11, wherein the playback device comprises a PDA.
18. The system as claimed in claim 11, wherein the playback device comprises a cellular phone.
19. The system as claimed in claim 18, wherein the cellular phone comprises means for requesting the portions of the streams of live video data recorded at the plural remote sites sufficiently rapidly to simulate a video stream.
20. The system as claimed in claim 11, wherein the account control system comprises an access control list for specifying permissions on a per-user basis.
21. The system as claimed in claim 11, wherein the account control system controls whether a user can access a stream of live video data from a specified camera.
22. The system as claimed in claim 11, wherein the account control system controls whether a user can extract a clip from the recorded video data.
23. The system as claimed in claim 11, wherein the account control system controls whether a user can delete a recorded stream of video data.
24. The system as claimed in claim 11, wherein the account control system controls the addition of additional users for an account.
US13/019,072 2007-06-26 2011-02-01 System and method for account-based storage and playback of remotely recorded video data Abandoned US20110126250A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/019,072 US20110126250A1 (en) 2007-06-26 2011-02-01 System and method for account-based storage and playback of remotely recorded video data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/819,206 US20090007197A1 (en) 2007-06-26 2007-06-26 System and method for storage and playback of remotely recorded video data
US13/019,072 US20110126250A1 (en) 2007-06-26 2011-02-01 System and method for account-based storage and playback of remotely recorded video data

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/819,206 Continuation-In-Part US20090007197A1 (en) 2007-06-26 2007-06-26 System and method for storage and playback of remotely recorded video data

Publications (1)

Publication Number Publication Date
US20110126250A1 true US20110126250A1 (en) 2011-05-26

Family

ID=44063086

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/019,072 Abandoned US20110126250A1 (en) 2007-06-26 2011-02-01 System and method for account-based storage and playback of remotely recorded video data

Country Status (1)

Country Link
US (1) US20110126250A1 (en)

Cited By (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090193338A1 (en) * 2008-01-28 2009-07-30 Trevor Fiatal Reducing network and battery consumption during content delivery and playback
US20120005704A1 (en) * 2010-06-30 2012-01-05 At&T Intellectual Property I, L.P. System and method of selective channel or advertising delivery
US20120198335A1 (en) * 2010-09-10 2012-08-02 Sextant Navigation, Inc. Apparatus and method for automatic realtime cloud computing processing for live multimedia content
US20120254759A1 (en) * 2011-03-31 2012-10-04 Greenberg David S Browser-based recording of content
US8468126B2 (en) 2005-08-01 2013-06-18 Seven Networks, Inc. Publishing data in an information community
US8484314B2 (en) 2010-11-01 2013-07-09 Seven Networks, Inc. Distributed caching in a wireless network of content delivered for a mobile application over a long-held request
US8494510B2 (en) 2008-06-26 2013-07-23 Seven Networks, Inc. Provisioning applications for a mobile device
US8539040B2 (en) 2010-11-22 2013-09-17 Seven Networks, Inc. Mobile network background traffic data management with optimized polling intervals
US8621075B2 (en) 2011-04-27 2013-12-31 Seven Metworks, Inc. Detecting and preserving state for satisfying application requests in a distributed proxy and cache system
US8700728B2 (en) 2010-11-01 2014-04-15 Seven Networks, Inc. Cache defeat detection and caching of content addressed by identifiers intended to defeat cache
US8738050B2 (en) 2007-12-10 2014-05-27 Seven Networks, Inc. Electronic-mail filtering for mobile devices
US8750123B1 (en) 2013-03-11 2014-06-10 Seven Networks, Inc. Mobile device equipped with mobile network congestion recognition to make intelligent decisions regarding connecting to an operator network
US8761756B2 (en) 2005-06-21 2014-06-24 Seven Networks International Oy Maintaining an IP connection in a mobile network
US8774844B2 (en) 2007-06-01 2014-07-08 Seven Networks, Inc. Integrated messaging
US8775631B2 (en) 2012-07-13 2014-07-08 Seven Networks, Inc. Dynamic bandwidth adjustment for browsing or streaming activity in a wireless network based on prediction of user behavior when interacting with mobile applications
US8787947B2 (en) 2008-06-18 2014-07-22 Seven Networks, Inc. Application discovery on mobile devices
US8812695B2 (en) 2012-04-09 2014-08-19 Seven Networks, Inc. Method and system for management of a virtual network connection without heartbeat messages
US8811952B2 (en) 2002-01-08 2014-08-19 Seven Networks, Inc. Mobile device power management in data synchronization over a mobile network with or without a trigger notification
US8832228B2 (en) 2011-04-27 2014-09-09 Seven Networks, Inc. System and method for making requests on behalf of a mobile device based on atomic processes for mobile network traffic relief
US8838783B2 (en) 2010-07-26 2014-09-16 Seven Networks, Inc. Distributed caching for resource and mobile network traffic management
US8839412B1 (en) 2005-04-21 2014-09-16 Seven Networks, Inc. Flexible real-time inbox access
US8843153B2 (en) 2010-11-01 2014-09-23 Seven Networks, Inc. Mobile traffic categorization and policy for network use optimization while preserving user experience
US8862657B2 (en) 2008-01-25 2014-10-14 Seven Networks, Inc. Policy based content service
US8868753B2 (en) 2011-12-06 2014-10-21 Seven Networks, Inc. System of redundantly clustered machines to provide failover mechanisms for mobile traffic management and network resource conservation
US8874761B2 (en) 2013-01-25 2014-10-28 Seven Networks, Inc. Signaling optimization in a wireless network for traffic utilizing proprietary and non-proprietary protocols
US8903954B2 (en) 2010-11-22 2014-12-02 Seven Networks, Inc. Optimization of resource polling intervals to satisfy mobile device requests
US8909759B2 (en) 2008-10-10 2014-12-09 Seven Networks, Inc. Bandwidth measurement
US8909202B2 (en) 2012-01-05 2014-12-09 Seven Networks, Inc. Detection and management of user interactions with foreground applications on a mobile device in distributed caching
US8934414B2 (en) 2011-12-06 2015-01-13 Seven Networks, Inc. Cellular or WiFi mobile traffic optimization based on public or private network destination
US8984581B2 (en) 2011-07-27 2015-03-17 Seven Networks, Inc. Monitoring mobile application activities for malicious traffic on a mobile device
US20150089024A1 (en) * 2013-09-25 2015-03-26 Samsung Techwin Co., Ltd. Network system and network method
US9002828B2 (en) 2007-12-13 2015-04-07 Seven Networks, Inc. Predictive content delivery
US9009250B2 (en) 2011-12-07 2015-04-14 Seven Networks, Inc. Flexible and dynamic integration schemas of a traffic management system with various network operators for network traffic alleviation
US9021021B2 (en) 2011-12-14 2015-04-28 Seven Networks, Inc. Mobile network reporting and usage analytics system and method aggregated using a distributed traffic optimization system
US9043433B2 (en) 2010-07-26 2015-05-26 Seven Networks, Inc. Mobile network traffic coordination across multiple applications
US9055102B2 (en) 2006-02-27 2015-06-09 Seven Networks, Inc. Location-based operations and messaging
US9065765B2 (en) 2013-07-22 2015-06-23 Seven Networks, Inc. Proxy server associated with a mobile carrier for enhancing mobile traffic management in a mobile network
US9084105B2 (en) 2011-04-19 2015-07-14 Seven Networks, Inc. Device resources sharing for network resource conservation
US9161258B2 (en) 2012-10-24 2015-10-13 Seven Networks, Llc Optimized and selective management of policy deployment to mobile clients in a congested network to prevent further aggravation of network congestion
US9173128B2 (en) 2011-12-07 2015-10-27 Seven Networks, Llc Radio-awareness of mobile device for sending server-side control signals using a wireless network optimized transport protocol
US9203864B2 (en) 2012-02-02 2015-12-01 Seven Networks, Llc Dynamic categorization of applications for network access in a mobile network
US9241314B2 (en) 2013-01-23 2016-01-19 Seven Networks, Llc Mobile device with application or context aware fast dormancy
US9251193B2 (en) 2003-01-08 2016-02-02 Seven Networks, Llc Extending user relationships
US9307493B2 (en) 2012-12-20 2016-04-05 Seven Networks, Llc Systems and methods for application management of mobile device radio state promotion and demotion
US9325662B2 (en) 2011-01-07 2016-04-26 Seven Networks, Llc System and method for reduction of mobile network traffic used for domain name system (DNS) queries
US9326189B2 (en) 2012-02-03 2016-04-26 Seven Networks, Llc User as an end point for profiling and optimizing the delivery of content and data in a wireless network
US20160307378A1 (en) * 2013-12-06 2016-10-20 Cosworth Group Holdings Limited Processing video and sensor data associated with a vehicle
CN108882042A (en) * 2017-05-10 2018-11-23 北京元美传媒科技有限责任公司 A kind of live streaming hot shears volume back method
US10263899B2 (en) 2012-04-10 2019-04-16 Seven Networks, Llc Enhanced customer service for mobile carriers using real-time and historical mobile application and traffic or optimization data associated with mobile devices in a mobile network
US10477281B2 (en) * 2013-12-30 2019-11-12 Telecom Italia S.P.A. Method and system for automatically selecting parts of a video and/or audio media content based on information obtained from social networks
US10503264B1 (en) * 2015-06-16 2019-12-10 Snap Inc. Radial gesture navigation
US10530731B1 (en) 2016-03-28 2020-01-07 Snap Inc. Systems and methods for chat with audio and video elements
US20200221018A1 (en) * 2019-01-07 2020-07-09 Hanwha Techwin Co., Ltd. Setting system for a camera and control method thereof
CN111475565A (en) * 2020-04-21 2020-07-31 北京邮电大学 Visual target historical geographic information data playback system and method
US20200404309A1 (en) * 2018-09-30 2020-12-24 Beijing Microlive Vision Technology Co., Ltd Video watermark adding method and apparatus, and electronic device and storage medium
US11323377B2 (en) * 2019-04-09 2022-05-03 Charter Communications Operating, Llc Dynamic prioritization of data flows

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5692213A (en) * 1993-12-20 1997-11-25 Xerox Corporation Method for controlling real-time presentation of audio/visual data on a computer system
US20020056083A1 (en) * 2000-03-29 2002-05-09 Istvan Anthony F. System and method for picture-in-browser scaling
US20020108127A1 (en) * 2001-02-07 2002-08-08 Yon Lew Low bandwidth transmission
US6476858B1 (en) * 1999-08-12 2002-11-05 Innovation Institute Video monitoring and security system
US20070076094A1 (en) * 2005-09-09 2007-04-05 Agilemesh, Inc. Surveillance apparatus and method for wireless mesh network
US20080303903A1 (en) * 2003-12-02 2008-12-11 Connexed Technologies Inc. Networked video surveillance system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5692213A (en) * 1993-12-20 1997-11-25 Xerox Corporation Method for controlling real-time presentation of audio/visual data on a computer system
US6476858B1 (en) * 1999-08-12 2002-11-05 Innovation Institute Video monitoring and security system
US20020056083A1 (en) * 2000-03-29 2002-05-09 Istvan Anthony F. System and method for picture-in-browser scaling
US20020108127A1 (en) * 2001-02-07 2002-08-08 Yon Lew Low bandwidth transmission
US20080303903A1 (en) * 2003-12-02 2008-12-11 Connexed Technologies Inc. Networked video surveillance system
US20070076094A1 (en) * 2005-09-09 2007-04-05 Agilemesh, Inc. Surveillance apparatus and method for wireless mesh network

Cited By (80)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8811952B2 (en) 2002-01-08 2014-08-19 Seven Networks, Inc. Mobile device power management in data synchronization over a mobile network with or without a trigger notification
US9251193B2 (en) 2003-01-08 2016-02-02 Seven Networks, Llc Extending user relationships
US8839412B1 (en) 2005-04-21 2014-09-16 Seven Networks, Inc. Flexible real-time inbox access
US8761756B2 (en) 2005-06-21 2014-06-24 Seven Networks International Oy Maintaining an IP connection in a mobile network
US8468126B2 (en) 2005-08-01 2013-06-18 Seven Networks, Inc. Publishing data in an information community
US9055102B2 (en) 2006-02-27 2015-06-09 Seven Networks, Inc. Location-based operations and messaging
US8805425B2 (en) 2007-06-01 2014-08-12 Seven Networks, Inc. Integrated messaging
US8774844B2 (en) 2007-06-01 2014-07-08 Seven Networks, Inc. Integrated messaging
US8738050B2 (en) 2007-12-10 2014-05-27 Seven Networks, Inc. Electronic-mail filtering for mobile devices
US9002828B2 (en) 2007-12-13 2015-04-07 Seven Networks, Inc. Predictive content delivery
US8862657B2 (en) 2008-01-25 2014-10-14 Seven Networks, Inc. Policy based content service
US8799410B2 (en) 2008-01-28 2014-08-05 Seven Networks, Inc. System and method of a relay server for managing communications and notification between a mobile device and a web access server
US8838744B2 (en) 2008-01-28 2014-09-16 Seven Networks, Inc. Web-based access to data objects
US20090193338A1 (en) * 2008-01-28 2009-07-30 Trevor Fiatal Reducing network and battery consumption during content delivery and playback
US11102158B2 (en) 2008-01-28 2021-08-24 Seven Networks, Llc System and method of a relay server for managing communications and notification between a mobile device and application server
US8787947B2 (en) 2008-06-18 2014-07-22 Seven Networks, Inc. Application discovery on mobile devices
US8494510B2 (en) 2008-06-26 2013-07-23 Seven Networks, Inc. Provisioning applications for a mobile device
US8909759B2 (en) 2008-10-10 2014-12-09 Seven Networks, Inc. Bandwidth measurement
US10271073B2 (en) 2010-06-30 2019-04-23 At&T Intellectual Property I, L.P. Method for selecting advertisements and channel distribution service preferences for delivery to a set top box
US20120005704A1 (en) * 2010-06-30 2012-01-05 At&T Intellectual Property I, L.P. System and method of selective channel or advertising delivery
US9449332B2 (en) 2010-06-30 2016-09-20 At&T Intellectual Property I, L.P. System and method for providing advertising content in media program content
US8555314B2 (en) * 2010-06-30 2013-10-08 At&T Intellectual Property I, L.P. System and method of selective channel or advertising delivery
US9021521B2 (en) 2010-06-30 2015-04-28 At&T Intellectual Property I, L.P. System and method for delivering advertising content according to a selection received from subscriber equipment
US8838783B2 (en) 2010-07-26 2014-09-16 Seven Networks, Inc. Distributed caching for resource and mobile network traffic management
US9043433B2 (en) 2010-07-26 2015-05-26 Seven Networks, Inc. Mobile network traffic coordination across multiple applications
US9049179B2 (en) 2010-07-26 2015-06-02 Seven Networks, Inc. Mobile network traffic coordination across multiple applications
US20120198335A1 (en) * 2010-09-10 2012-08-02 Sextant Navigation, Inc. Apparatus and method for automatic realtime cloud computing processing for live multimedia content
US8990874B2 (en) * 2010-09-10 2015-03-24 Jeffrey Huang Apparatus and method for automatic realtime cloud computing processing for live multimedia content
US8843153B2 (en) 2010-11-01 2014-09-23 Seven Networks, Inc. Mobile traffic categorization and policy for network use optimization while preserving user experience
US8484314B2 (en) 2010-11-01 2013-07-09 Seven Networks, Inc. Distributed caching in a wireless network of content delivered for a mobile application over a long-held request
US8782222B2 (en) 2010-11-01 2014-07-15 Seven Networks, Inc. Timing of keep-alive messages used in a system for mobile network resource conservation and optimization
US8700728B2 (en) 2010-11-01 2014-04-15 Seven Networks, Inc. Cache defeat detection and caching of content addressed by identifiers intended to defeat cache
US8903954B2 (en) 2010-11-22 2014-12-02 Seven Networks, Inc. Optimization of resource polling intervals to satisfy mobile device requests
US8539040B2 (en) 2010-11-22 2013-09-17 Seven Networks, Inc. Mobile network background traffic data management with optimized polling intervals
US9325662B2 (en) 2011-01-07 2016-04-26 Seven Networks, Llc System and method for reduction of mobile network traffic used for domain name system (DNS) queries
US20120254759A1 (en) * 2011-03-31 2012-10-04 Greenberg David S Browser-based recording of content
US9300719B2 (en) 2011-04-19 2016-03-29 Seven Networks, Inc. System and method for a mobile device to use physical storage of another device for caching
US9084105B2 (en) 2011-04-19 2015-07-14 Seven Networks, Inc. Device resources sharing for network resource conservation
US8832228B2 (en) 2011-04-27 2014-09-09 Seven Networks, Inc. System and method for making requests on behalf of a mobile device based on atomic processes for mobile network traffic relief
US8621075B2 (en) 2011-04-27 2013-12-31 Seven Networks, Inc. Detecting and preserving state for satisfying application requests in a distributed proxy and cache system
US8984581B2 (en) 2011-07-27 2015-03-17 Seven Networks, Inc. Monitoring mobile application activities for malicious traffic on a mobile device
US8977755B2 (en) 2011-12-06 2015-03-10 Seven Networks, Inc. Mobile device and method to utilize the failover mechanism for fault tolerance provided for mobile traffic management and network/device resource conservation
US8934414B2 (en) 2011-12-06 2015-01-13 Seven Networks, Inc. Cellular or WiFi mobile traffic optimization based on public or private network destination
US8868753B2 (en) 2011-12-06 2014-10-21 Seven Networks, Inc. System of redundantly clustered machines to provide failover mechanisms for mobile traffic management and network resource conservation
US9009250B2 (en) 2011-12-07 2015-04-14 Seven Networks, Inc. Flexible and dynamic integration schemas of a traffic management system with various network operators for network traffic alleviation
US9208123B2 (en) 2011-12-07 2015-12-08 Seven Networks, Llc Mobile device having content caching mechanisms integrated with a network operator for traffic alleviation in a wireless network and methods therefor
US9173128B2 (en) 2011-12-07 2015-10-27 Seven Networks, Llc Radio-awareness of mobile device for sending server-side control signals using a wireless network optimized transport protocol
US9277443B2 (en) 2011-12-07 2016-03-01 Seven Networks, Llc Radio-awareness of mobile device for sending server-side control signals using a wireless network optimized transport protocol
US9021021B2 (en) 2011-12-14 2015-04-28 Seven Networks, Inc. Mobile network reporting and usage analytics system and method aggregated using a distributed traffic optimization system
US9131397B2 (en) 2012-01-05 2015-09-08 Seven Networks, Inc. Managing cache to prevent overloading of a wireless network due to user activity
US8909202B2 (en) 2012-01-05 2014-12-09 Seven Networks, Inc. Detection and management of user interactions with foreground applications on a mobile device in distributed caching
US9203864B2 (en) 2012-02-02 2015-12-01 Seven Networks, Llc Dynamic categorization of applications for network access in a mobile network
US9326189B2 (en) 2012-02-03 2016-04-26 Seven Networks, Llc User as an end point for profiling and optimizing the delivery of content and data in a wireless network
US8812695B2 (en) 2012-04-09 2014-08-19 Seven Networks, Inc. Method and system for management of a virtual network connection without heartbeat messages
US10263899B2 (en) 2012-04-10 2019-04-16 Seven Networks, Llc Enhanced customer service for mobile carriers using real-time and historical mobile application and traffic or optimization data associated with mobile devices in a mobile network
US8775631B2 (en) 2012-07-13 2014-07-08 Seven Networks, Inc. Dynamic bandwidth adjustment for browsing or streaming activity in a wireless network based on prediction of user behavior when interacting with mobile applications
US9161258B2 (en) 2012-10-24 2015-10-13 Seven Networks, Llc Optimized and selective management of policy deployment to mobile clients in a congested network to prevent further aggravation of network congestion
US9307493B2 (en) 2012-12-20 2016-04-05 Seven Networks, Llc Systems and methods for application management of mobile device radio state promotion and demotion
US9271238B2 (en) 2013-01-23 2016-02-23 Seven Networks, Llc Application or context aware fast dormancy
US9241314B2 (en) 2013-01-23 2016-01-19 Seven Networks, Llc Mobile device with application or context aware fast dormancy
US8874761B2 (en) 2013-01-25 2014-10-28 Seven Networks, Inc. Signaling optimization in a wireless network for traffic utilizing proprietary and non-proprietary protocols
US8750123B1 (en) 2013-03-11 2014-06-10 Seven Networks, Inc. Mobile device equipped with mobile network congestion recognition to make intelligent decisions regarding connecting to an operator network
US9065765B2 (en) 2013-07-22 2015-06-23 Seven Networks, Inc. Proxy server associated with a mobile carrier for enhancing mobile traffic management in a mobile network
US9794317B2 (en) * 2013-09-25 2017-10-17 Hanwha Techwin Co., Ltd. Network system and network method
US20150089024A1 (en) * 2013-09-25 2015-03-26 Samsung Techwin Co., Ltd. Network system and network method
US20160307378A1 (en) * 2013-12-06 2016-10-20 Cosworth Group Holdings Limited Processing video and sensor data associated with a vehicle
US10832505B2 (en) * 2013-12-06 2020-11-10 Cosworth Group Holdings Limited Processing video and sensor data associated with a vehicle
US10477281B2 (en) * 2013-12-30 2019-11-12 Telecom Italia S.P.A. Method and system for automatically selecting parts of a video and/or audio media content based on information obtained from social networks
US10503264B1 (en) * 2015-06-16 2019-12-10 Snap Inc. Radial gesture navigation
US11861068B2 (en) 2015-06-16 2024-01-02 Snap Inc. Radial gesture navigation
US11132066B1 (en) 2015-06-16 2021-09-28 Snap Inc. Radial gesture navigation
US10530731B1 (en) 2016-03-28 2020-01-07 Snap Inc. Systems and methods for chat with audio and video elements
US11063898B1 (en) 2016-03-28 2021-07-13 Snap Inc. Systems and methods for chat with audio and video elements
CN108882042A (en) * 2017-05-10 2018-11-23 Beijing Yuanmei Media Technology Co., Ltd. Live-stream highlight clipping and playback method
US20200404309A1 (en) * 2018-09-30 2020-12-24 Beijing Microlive Vision Technology Co., Ltd. Video watermark adding method and apparatus, and electronic device and storage medium
US11930202B2 (en) * 2018-09-30 2024-03-12 Beijing Microlive Vision Technology Co., Ltd. Method and apparatus for video watermarking, and storage medium
US11095810B2 (en) * 2019-01-07 2021-08-17 Hanwha Techwin Co., Ltd. Setting system for a camera and control method thereof
US20200221018A1 (en) * 2019-01-07 2020-07-09 Hanwha Techwin Co., Ltd. Setting system for a camera and control method thereof
US11323377B2 (en) * 2019-04-09 2022-05-03 Charter Communications Operating, Llc Dynamic prioritization of data flows
CN111475565A (en) * 2020-04-21 2020-07-31 北京邮电大学 Visual target historical geographic information data playback system and method

Similar Documents

Publication Publication Date Title
US20110126250A1 (en) System and method for account-based storage and playback of remotely recorded video data
US10992966B2 (en) Mobile phone as a police body camera over a cellular network
US10347102B2 (en) Method and system for surveillance camera arbitration of uplink consumption
US10157526B2 (en) System and method for a security system
US7949730B2 (en) System and method for remote data acquisition and distribution
US9860490B2 (en) Network video recorder system
US9167213B2 (en) Apparatus and method for video display and control for portable device
US7719571B2 (en) Wireless video surveillance system and method with DVR-based querying
US20150381536A1 (en) Method and system for prompt video-data message transfer to personal devices
US10645459B2 (en) Devices, systems, and methods for remote video retrieval
WO2010045404A2 (en) Network video surveillance system and recorder
KR20130050374A (en) System and method for controllably viewing digital video streams captured by surveillance cameras
US20110096139A1 (en) System and Method for Providing Secure Video Visitation
CN104159086A (en) Digital video monitoring platform for a provincial road network
US20150124109A1 (en) Apparatus and method for hosting a live camera at a given geographical location
US20090204689A1 (en) Method and apparatus for remote surveillance of a premises
US20220415147A1 (en) Devices, systems, and methods for remote video retrieval
JP2010183131A (en) Camera interface device and image communication system
US20230419801A1 (en) Event detection, event notification, data retrieval, and associated devices, systems, and methods
KR101842020B1 (en) Remote management system, remote management method, and monitoring server
WILL et al. About this Manual
KR20210037304A (en) Display video providing device
Karanja Mobile Video Streaming Surveillance System with SMS Alert
JP2012249216A (en) Image recording device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION