US20140147099A1 - Video headphones platform methods, apparatuses and media - Google Patents

Video headphones platform methods, apparatuses and media

Info

Publication number
US20140147099A1
US20140147099A1
Authority
US
United States
Prior art keywords
video
vhp
user
audio
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/092,059
Inventor
Stephen Chase
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Soundsight Ip LLC
Original Assignee
Stephen Chase
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Stephen Chase filed Critical Stephen Chase
Priority to US14/092,059
Publication of US20140147099A1
Assigned to SOUNDSIGHT, LLC reassignment SOUNDSIGHT, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHASE, STEPHEN, GORILOVSKY, DMITRY, SOUNDSIGHT MOBILE LLC
Assigned to SOUNDSIGHT IP, LLC reassignment SOUNDSIGHT IP, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SOUNDSIGHT, LLC
Abandoned legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00Details of transducers, loudspeakers or microphones
    • H04R1/02Casings; Cabinets; Supports therefor; Mountings therein
    • H04R1/028Casings; Cabinets; Supports therefor; Mountings therein associated with devices performing functions other than acoustics, e.g. electric candles
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/27Server based end-user applications
    • H04N21/274Storing end-user multimedia data in response to end-user request, e.g. network recorder
    • H04N21/2743Video hosting of uploaded data from client
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4223Cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2201/00Details of transducers, loudspeakers or microphones covered by H04R1/00 but not provided for in any of its subgroups
    • H04R2201/10Details of earpieces, attachments therefor, earphones or monophonic headphones covered by H04R1/10 but not provided for in any of its subgroups
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R5/00Stereophonic arrangements
    • H04R5/033Headphones for stereophonic communication

Definitions

  • VHP VIDEO HEADPHONES PLATFORM METHODS, APPARATUSES AND MEDIA
  • the present disclosure is directed generally to video headphones.
  • Headphones may obtain an audio signal from an audio source (e.g., a portable music player) either via an audio cable or via a wireless receiver.
  • FIG. 1 shows a logic flow diagram illustrating a world view sharing (WVS) component in one embodiment of the VHP.
  • FIG. 2 shows a data flow diagram in one embodiment of the VHP.
  • FIG. 3 shows a block diagram illustrating an exemplary VHP coordinator in one embodiment of the VHP.
  • FIGS. 4 to 20 illustrate additional exemplary embodiments of the VHP.
  • the VHP empowers users to record, edit and share their world view with their social networks.
  • a user may record a video through a video camera embedded in the user's headphones. Audio may also be recorded through one or more microphones, which are installed into the headphone cable control unit and/or located underneath the video camera. Once the user finishes recording the video, the user may add music, apply filters, make audio and/or video adjustments, provide a description of the video, and/or the like. After the user finishes editing the video, the video may be shared with the user's social networks.
  • FIG. 1 shows a logic flow diagram illustrating a world view sharing (WVS) component in one embodiment of the VHP.
  • a request to share a world view video may be received at 101.
  • a world view video may be a video showing what may be seen and/or heard by a user (i.e., the view from the user's eye level).
  • the request to share a world view video may be received based on activation of a record button on video headphones (e.g., the record button may be integrated into a headphone cable) by the user.
  • the request to share a world view video may be received based on activation of a record button of a VHP mobile app (e.g., running on the user's smart phone) by the user.
  • video headphones may be headphones with an embedded and/or attached video camera.
  • video headphones may include features such as HD premium quality sound, pivoting ear pads for multi-angle recording, one or more HD (e.g., 1080p) video cameras, one or more dynamic microphones, audio and/or video controls (e.g., record video, adjust volume, skip forward and/or back, mute) integrated into a headphone cable, and/or the like.
  • the video headphones may include a video camera on one of the earpieces, and the video camera may have a microphone (e.g., located underneath the video camera).
  • the video headphones may include two video cameras, one on each earpiece, and each video camera may have a microphone.
  • a video camera, and its accompanying microphone if the video camera has one, may pivot and/or swivel (e.g., 360 degrees in any direction) to allow the user to record video in any direction.
  • a user may angle the video camera up to record a video of a bird in a tree.
  • the user may point one video camera straight ahead and the other video camera towards the back to record a split screen of what is in front and behind the user.
  • the user may position both video cameras in such a way (e.g., both video cameras pointing in the same direction) as to allow three dimensional (3D) video recording.
  • the video headphones may be connected (e.g., via a headphone cable, via a wireless technology capable of transmitting HD video) to the user's client (e.g., a smart phone, a portable media player, a tablet) and may record and transmit the video (e.g., including video and/or audio) to the client (e.g., via the VHP mobile app).
  • the video may be stored locally on the client.
  • the video may be stored remotely (e.g., on a remote server).
  • the video may be of any desired length.
  • a minimum and/or a maximum length for the video (e.g., 15 seconds) may be specified.
  • a desired video segment may be determined at 115.
  • the user may trim the video by dragging a video selection widget of the VHP mobile app to select a video segment of a predetermined size (e.g., a 10 second video segment).
  • the user may trim the video by expanding and/or shrinking the size of a video selection widget of the VHP mobile app to select a desired video segment.
  • the user may combine multiple videos (e.g., the current video and one or more previously recorded videos) into one video. The video may be trimmed to the desired video segment at 120.
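The trimming behavior described above amounts to simple interval arithmetic: a selection window of a predetermined size is dragged along the timeline and clamped to the video's bounds. The sketch below is illustrative only; the function name and the 10-second default are assumptions, not details from the disclosure.

```python
def trim_window(video_length, selection_start, segment_length=10.0):
    """Clamp a fixed-size selection window to the video's bounds.

    Returns the (start, end) times, in seconds, of the trimmed segment.
    """
    # The window can never be longer than the video itself.
    segment_length = min(segment_length, video_length)
    # Keep the window fully inside [0, video_length].
    start = max(0.0, min(selection_start, video_length - segment_length))
    return start, start + segment_length

# Dragging the widget past the end snaps the window back inside the video.
print(trim_window(60.0, 55.0))  # (50.0, 60.0)
print(trim_window(8.0, 0.0))    # (0.0, 8.0) -- video shorter than the window
```

The same clamping logic also covers the expand/shrink selection widget: only the `segment_length` argument changes.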
  • the VHP may facilitate audio selection at 130.
  • the user may select audio (e.g., music) from the user's audio collection (e.g., by pressing a “My Music” button of the VHP mobile app).
  • the user may select a music album and/or track (e.g., a song) and/or a playlist from the user's music collection.
  • the user may also utilize an audio selection widget to select the desired portion of the selected audio to be added to the video. For example, the length of the selected audio portion may be set to be equal to the length of the video.
  • the VHP may suggest audio to the user that matches the actions in the video well.
  • the VHP may suggest songs and/or playlists and/or audio portions whose tempo matches the actions in the video.
  • the user may select audio from the VHP's audio collection (e.g., by pressing a “VHP Library” button of the VHP mobile app).
  • the user may purchase music albums and/or songs and/or audio effects via the VHP mobile app.
  • the user may select audio and/or a portion of the audio from the VHP's audio collection in a similar manner as described above with regard to the user's audio collection.
  • the user may select audio from the VHP's audio collection based on suggestions from the VHP made in a similar manner as described above with regard to the user's audio collection.
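One plausible reading of the tempo-matching suggestion above is a nearest-BPM search over the user's library. The sketch below assumes each track carries a `bpm` field and that the video's action tempo has already been estimated; both the data shape and the example catalogue are illustrative assumptions.

```python
def suggest_audio(tracks, video_tempo_bpm, max_delta=10.0):
    """Suggest tracks whose tempo is within max_delta BPM of the
    estimated tempo of the action in the video, closest first."""
    candidates = [t for t in tracks if abs(t["bpm"] - video_tempo_bpm) <= max_delta]
    return sorted(candidates, key=lambda t: abs(t["bpm"] - video_tempo_bpm))

# A hypothetical user library with per-track tempo metadata.
library = [
    {"title": "Track A", "bpm": 128},
    {"title": "Track B", "bpm": 90},
    {"title": "Track C", "bpm": 124},
]
# A fast-action video estimated at ~125 BPM:
print([t["title"] for t in suggest_audio(library, 125)])  # ['Track C', 'Track A']
```

The same ranking could equally be applied to playlists or to portions of a song, as the bullets above contemplate.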
  • the VHP may facilitate desired audio adjustments at 140.
  • the user may adjust the audio by speeding up and/or slowing down the audio.
  • the user may utilize “Slow Down” and/or “Speed Up” buttons of the VHP mobile app to adjust audio speed (e.g., in discrete chunks, such as 2× or 3× faster or slower; continuously, such as based on the length of time that a button is pressed).
  • the length of the selected audio portion may vary based on the speed of the audio. For example, if the user speeds up a song, the user may be able to select a longer portion of the song.
  • the user may choose to have an entire song play during the duration (e.g., 2 minutes) of the video, and the speed at which the song is played back may be adjusted accordingly (e.g., so that the entire song is played back in 2 minutes) by the VHP.
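Fitting an entire song into the video's duration, as described above, reduces to one division: the playback speed factor is the song length over the video length. A minimal sketch (the function name is an assumption):

```python
def playback_speed(song_length_s, video_length_s):
    """Speed factor needed to fit the whole song into the video.

    A factor above 1.0 speeds the song up; below 1.0 slows it down.
    """
    if video_length_s <= 0:
        raise ValueError("video length must be positive")
    return song_length_s / video_length_s

# A 3-minute song squeezed into the 2-minute video from the example above
# plays 1.5x faster:
print(playback_speed(180.0, 120.0))  # 1.5
```

This also explains the earlier bullet about selectable portion length varying with speed: at a 2× factor, twice as many seconds of audio fit into the same video.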
  • the user may adjust the audio by auto tuning the audio.
  • the user may utilize an “Auto Tune” button of the VHP mobile app.
  • the user may adjust the audio by adding sound effects to the audio.
  • the user may utilize a sound effects selection widget of the VHP mobile app.
  • the audio may be added to the video at 145.
  • the added audio may replace audio recorded in the video.
  • the added audio may be combined with audio recorded in the video.
  • the user may post artist credit (e.g., via a “Post Artist Credit” button of the VHP mobile app) for audio (e.g., a song) being used in the video.
  • the artist credit may scroll in a specified location (e.g., across the bottom of the video).
  • the VHP may determine desired video effects at 155.
  • the user may select desired video effects (e.g., sepia filter, black and white filter, monochromatic filter, a light filter, a frame around the video, speed up and/or slow down the video) via a video effect selection widget of the VHP mobile app.
  • the user may wish to add various video elements (e.g., a video start element, a video end element, a transition element, a comments element) to the video.
  • the user may wish to insert a comment into the video after an amusing scene in the video.
  • the desired video effects may be added to the video at 160.
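Color filters such as the sepia effect listed above are per-pixel transforms. The sketch below uses the commonly published sepia weight matrix (the coefficients are a standard convention, not values from the disclosure) applied to plain (r, g, b) tuples:

```python
def sepia(pixel):
    """Apply the common sepia tone matrix to one (r, g, b) pixel."""
    r, g, b = pixel
    return (
        min(255, int(0.393 * r + 0.769 * g + 0.189 * b)),
        min(255, int(0.349 * r + 0.686 * g + 0.168 * b)),
        min(255, int(0.272 * r + 0.534 * g + 0.131 * b)),
    )

def apply_filter(frame, effect):
    """Apply a per-pixel effect to a frame (a list of pixel rows)."""
    return [[effect(px) for px in row] for row in frame]

# Mid-gray shifts toward warm brown tones:
print(sepia((100, 100, 100)))  # (135, 120, 93)
```

A black-and-white or monochromatic filter would follow the same shape with a different per-pixel function passed to `apply_filter`.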
  • a description of the video may be obtained from the user at 165.
  • the description may include a title.
  • the description may include the user's description for the video.
  • the description may have a minimum and/or a maximum size.
  • a title may have to be at least 1 character long and no more than 50 characters long.
  • the user's description for the video may have to be no more than 148 characters.
  • the minimum and/or the maximum size for the description may correspond to those specified by social networks on which the video may be shared (e.g., a maximum of 140 characters for Twitter).
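The size rules above (1-50 character title, 148-character description, per-network caps) can be checked before sharing. Twitter's 140-character limit comes from the text; the Facebook limit and the function shape below are illustrative assumptions.

```python
# Per-network description limits (Twitter's 140 is from the disclosure;
# the Facebook value is an assumption for illustration).
LIMITS = {"twitter": 140, "facebook": 5000}

def validate_description(title, body, networks):
    """Check the title/description rules against each target network."""
    if not 1 <= len(title) <= 50:
        return False, "title must be 1-50 characters"
    if len(body) > 148:
        return False, "description must be at most 148 characters"
    for network in networks:
        if len(body) > LIMITS.get(network, 148):
            return False, f"description too long for {network}"
    return True, "ok"

print(validate_description("My ride", "Biking downhill.", ["twitter"]))  # (True, 'ok')
```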
  • the video may be added to the user's video library at 170.
  • the video may be stored (e.g., on the client, on a remote server) and added to the user's VHP profile. If the user chooses to share the video with other VHP users, the other VHP users may comment on the video, mark the video as favorite, forward the video to others, and/or the like. The user may also see how many times the video has been viewed, post comments, and/or the like.
  • the video may be shared on the user's social networks (e.g., Facebook, Tumblr, Twitter, Instagram, Pinterest, Vimeo, YouTube).
  • social networks on which the user wishes to share the video may be determined at 180.
  • the user may select desired social networks via appropriate buttons of the VHP mobile app and press a “Share Now” button to initiate sharing on the selected social networks.
  • the user may specify default desired social networks (e.g., via the user's profile settings) so that the default desired social networks are preselected for the user, and the user may press a “Share Now” button to initiate sharing on the selected social networks.
  • the video may be shared on the selected social networks at 185 (e.g., via appropriate APIs provided by the social networks).
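The selection logic at 180 — profile defaults preselected, explicit picks added, opt-outs removed before "Share Now" — can be sketched as set arithmetic. The function name and argument shapes below are assumptions:

```python
def networks_to_share(default_networks, selected, deselected=()):
    """Resolve the final set of networks to share on: profile defaults
    are preselected, and the user may add or remove networks before
    pressing the "Share Now" button."""
    result = set(default_networks) | set(selected)
    return sorted(result - set(deselected))

# Defaults from the user's profile, plus one explicit pick, minus one opt-out:
print(networks_to_share({"facebook", "twitter"}, {"vimeo"}, {"twitter"}))
# ['facebook', 'vimeo']
```

Each resolved network would then be handed to that network's own API, as the bullet above notes.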
  • FIG. 2 shows a data flow diagram in one embodiment of the VHP.
  • FIG. 2 provides an example of how data may flow to, through, and/or from the VHP.
  • a user 202 may provide a command to initiate video recording 221 to the client 210 (e.g., a smart phone, a portable media player, a tablet).
  • the user may provide the command to initiate video recording by pressing a record button on the user's video headphones 206.
  • the user may provide the command to initiate video recording by pressing a record button of a VHP mobile app on the client.
  • the client may send a video data request 225 to the video headphones.
  • the video data request may be in XML format and may include data such as a command to initiate video recording, video parameters (e.g., video resolution, video aspect ratio, audio quality), video recording length, and/or the like.
  • the video headphones may begin and/or end video recording based on the video data request.
  • the video headphones may send a video data response 229 to the client.
  • the video data response may be in XML format and may include data such as the recorded video (e.g., including video and/or audio), video information (e.g., date and/or time of recording, location of the video), and/or the like.
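The disclosure says only that the video data request "may be in XML format" and lists the kinds of fields it may carry; the element names in the sketch below are illustrative assumptions.

```python
import xml.etree.ElementTree as ET

def build_video_data_request(resolution, aspect_ratio, max_length_s):
    """Build an XML video data request of the kind described above.

    Element names are hypothetical; the disclosure specifies only the
    field categories (command, video parameters, recording length).
    """
    root = ET.Element("video_data_request")
    ET.SubElement(root, "command").text = "initiate_recording"
    params = ET.SubElement(root, "video_parameters")
    ET.SubElement(params, "resolution").text = resolution
    ET.SubElement(params, "aspect_ratio").text = aspect_ratio
    ET.SubElement(root, "recording_length").text = str(max_length_s)
    return ET.tostring(root, encoding="unicode")

print(build_video_data_request("1080p", "16:9", 15))
```

The video data response 229 would be parsed the same way in reverse, pulling out the recorded video reference and its date/time/location metadata.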
  • the client may output an adjustments request 233 to the user.
  • the adjustments request may prompt the user to trim the video, to add audio to the video, to adjust audio in the video, to add video effects to the video, to provide a description of the video, to share the video, and/or the like.
  • the adjustments request may be output via a GUI of the VHP mobile app.
  • the user may input an adjustments response 237 into the client.
  • the adjustments response may indicate whether and/or how the user wishes to trim the video, to add audio to the video, to adjust audio in the video, to add video effects to the video, to describe the video, to share the video, and/or the like.
  • the adjustments response may be input via the GUI of the VHP mobile app.
  • the client may send an audio data request 241 to a VHP server 214.
  • the VHP server may store songs available from the VHP (e.g., in an audio data store 330 c).
  • the audio data request may be in XML format and may include data such as the user's identifier and/or password, an identifier of a requested song and/or album and/or playlist, audio parameters (e.g., audio format, audio quality in kbps), a payment method, and/or the like.
  • the VHP server may send an audio data response 245 to the client.
  • the audio data response may be in XML format and may include data such as the requested song and/or album and/or playlist, audio parameters, a payment confirmation, and/or the like.
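On the client side, the XML audio data response would be parsed back into the fields the bullet lists. The tag names in this sketch are assumptions; only the field categories (requested song, audio parameters, payment confirmation) come from the text.

```python
import xml.etree.ElementTree as ET

# An illustrative audio data response; tag names are hypothetical.
RESPONSE = """
<audio_data_response>
  <song id="song-123" title="Example Song"/>
  <audio_parameters format="mp3" quality_kbps="256"/>
  <payment_confirmation>confirmed</payment_confirmation>
</audio_data_response>
"""

def parse_audio_data_response(xml_text):
    """Extract the song, audio parameters, and payment confirmation."""
    root = ET.fromstring(xml_text)
    return {
        "song_id": root.find("song").get("id"),
        "format": root.find("audio_parameters").get("format"),
        "paid": root.find("payment_confirmation").text == "confirmed",
    }

print(parse_audio_data_response(RESPONSE))
# {'song_id': 'song-123', 'format': 'mp3', 'paid': True}
```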
  • the client may add the obtained audio to the video and/or adjust the obtained audio and/or the video based on user instructions.
  • the client may send a share request 249 to a social network 218.
  • the share request may include the video and instruct the social network to post the video via the user's social network account.
  • the share request may be sent via the social network's API command and may include data such as the user's identifier and/or password on the social network, the video, the description of the video, video information (e.g., date and/or time of recording, location of the video obtained via the client's GPS), and/or the like.
  • the social network may send a share response 253 to the client.
  • the share response may be sent via the social network's API command and may indicate whether the video was shared on the social network successfully.
  • the client may provide a video output 257 to the user.
  • the video output may inform the user whether the user's video has been stored (e.g., in a videos data store 330 d), added to the user's profile, shared on one or more social networks, and/or the like.
  • FIG. 3 shows a block diagram illustrating an exemplary VHP coordinator in one embodiment of the VHP.
  • the VHP coordinator facilitates the operation of the VHP via a computer system (e.g., one or more cloud computing systems, grid computing systems, virtualized computer systems, mainframe computers, servers, clients, nodes, desktops, mobile devices such as smart phones, cellular phones, tablets, personal digital assistants (PDAs), and/or the like, embedded computers, dedicated computers, a system on a chip (SOC)).
  • the VHP coordinator may receive, obtain, aggregate, process, generate, store, retrieve, send, delete, input, output, and/or the like data (including program data and program instructions); may execute program instructions; may communicate with computer systems, with nodes, with users, and/or the like.
  • the VHP coordinator may comprise a standalone computer system, a distributed computer system, a node in a computer network (i.e., a network of computer systems organized in a topology), a network of VHP coordinators, and/or the like.
  • the VHP coordinator may include various elements (e.g., processor, system bus, memory, input/output devices).
  • the VHP coordinator and/or the various VHP coordinator elements may be organized in any number of ways (i.e., using any number and configuration of computer systems, computer networks, nodes, VHP coordinator elements, and/or the like) to facilitate VHP operation.
  • VHP coordinator computer systems may communicate among each other in any number of ways to facilitate VHP operation.
  • the term “user” refers generally to people and/or computer systems that interact with the VHP;
  • the term “server” refers generally to a computer system, a program, and/or a combination thereof that handles requests and/or responds to requests from clients via a computer network;
  • client refers generally to a computer system, a program, a user, and/or a combination thereof that generates requests and/or handles responses from servers via a computer network;
  • node refers generally to a server, to a client, and/or to an intermediary computer system, program, and/or a combination thereof that facilitates transmission of and/or handling of requests and/or responses.
  • the VHP coordinator includes a processor 301 that executes program instructions (e.g., VHP program instructions).
  • the processor may be a general purpose microprocessor (e.g., a central processing unit (CPU)), a dedicated microprocessor (e.g., a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a network processor, and/or the like), an external processor, a plurality of processors (e.g., working in parallel, distributed, and/or the like), a microcontroller (e.g., for an embedded system), and/or the like.
  • the processor may be implemented using integrated circuits (ICs), application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), and/or the like.
  • the processor may comprise one or more cores, may include embedded elements (e.g., a coprocessor such as a math coprocessor, a cryptographic coprocessor, a physics coprocessor, and/or the like, registers, cache memory, software), may be synchronous (e.g., using a clock signal) or asynchronous (e.g., without a central clock), and/or the like.
  • the processor may be an AMD FX processor, an AMD Opteron processor, an AMD Geode LX processor, an Intel Core i7 processor, an Intel Xeon processor, an Intel Atom processor, an ARM Cortex processor, an IBM PowerPC processor, and/or the like.
  • the processor may be connected to system memory 305 via a system bus 303.
  • the system bus may interconnect these and/or other elements of the VHP coordinator via electrical, electronic, optical, wireless, and/or the like communication links (e.g., the system bus may be integrated into a motherboard that interconnects VHP coordinator elements and provides power from a power supply).
  • the system bus may comprise one or more control buses, address buses, data buses, memory buses, peripheral buses, and/or the like.
  • the system bus may be a parallel bus, a serial bus, a daisy chain design, a hub design, and/or the like.
  • the system bus may comprise a front-side bus, a back-side bus, AMD's HyperTransport, Intel's QuickPath Interconnect, a peripheral component interconnect (PCI) bus, an accelerated graphics port (AGP) bus, a PCI Express bus, a low pin count (LPC) bus, a universal serial bus (USB), and/or the like.
  • the system memory may comprise registers, cache memory (e.g., level one, level two, level three), read only memory (ROM) (e.g., BIOS, flash memory), random access memory (RAM) (e.g., static RAM (SRAM), dynamic RAM (DRAM), error-correcting code (ECC) memory), and/or the like.
  • the system memory may be discrete, external, embedded, integrated into a CPU, and/or the like.
  • the processor may access, read from, write to, store in, erase, modify, and/or the like, the system memory in accordance with program instructions (e.g., VHP program instructions) executed by the processor.
  • the system memory may facilitate accessing, storing, retrieving, modifying, deleting, and/or the like data (e.g., VHP data) by the processor.
  • input/output devices 310 may be connected to the processor and/or to the system memory, and/or to one another via the system bus.
  • the input/output devices may include one or more graphics devices 311.
  • the processor may make use of the one or more graphic devices in accordance with program instructions (e.g., VHP program instructions) executed by the processor.
  • a graphics device may be a video card that may obtain (e.g., via a connected video camera), process (e.g., render a frame), output (e.g., via a connected monitor, television, and/or the like), and/or the like graphical (e.g., multimedia, video, image, text) data (e.g., VHP data).
  • a video card may be connected to the system bus via an interface such as PCI, AGP, PCI Express, USB, PC Card, ExpressCard, and/or the like.
  • a video card may use one or more graphics processing units (GPUs), for example, by utilizing AMD's CrossFireX and/or NVIDIA's SLI technologies.
  • a video card may be connected via an interface (e.g., video graphics array (VGA), digital video interface (DVI), Mini-DVI, Micro-DVI, high-definition multimedia interface (HDMI), DisplayPort, Thunderbolt, composite video, S-Video, component video, and/or the like) to one or more displays (e.g., cathode ray tube (CRT), liquid crystal display (LCD), touchscreen, and/or the like) that display graphics.
  • a video card may be an AMD Radeon HD 6990, an ATI Mobility Radeon HD 5870, an AMD FirePro V9800P, an AMD Radeon E6760 MXM V3.0 Module, an NVIDIA GeForce GTX 590, an NVIDIA GeForce GTX 580M, an Intel HD Graphics 3000, and/or the like.
  • a graphics device may be a video capture board that may obtain (e.g., via coaxial cable), process (e.g., overlay with other graphical data), capture, convert (e.g., between different formats, such as MPEG2 to H.264), and/or the like graphical data.
  • a video capture board may be and/or include a TV tuner, may be compatible with a variety of broadcast signals (e.g., NTSC, PAL, ATSC, QAM), may be a part of a video card, and/or the like.
  • a video capture board may be an ATI All-in-Wonder HD, a Hauppauge ImpactVBR 01381, a Hauppauge WinTV-HVR-2250, a Hauppauge Colossus 01414, and/or the like.
  • a graphics device may be discrete, external, embedded, integrated into a CPU, and/or the like.
  • a graphics device may operate in combination with other graphics devices (e.g., in parallel) to provide improved capabilities, data throughput, color depth, and/or the like.
  • the input/output devices may include one or more audio devices 313.
  • the processor may make use of the one or more audio devices in accordance with program instructions (e.g., VHP program instructions) executed by the processor.
  • an audio device may be a sound card that may obtain (e.g., via a connected microphone), process, output (e.g., via connected speakers), and/or the like audio data (e.g., VHP data).
  • a sound card may be connected to the system bus via an interface such as PCI, PCI Express, USB, PC Card, ExpressCard, and/or the like.
  • a sound card may be connected via an interface (e.g., tip sleeve (TS), tip ring sleeve (TRS), RCA, TOSLINK, optical) to one or more amplifiers, speakers (e.g., mono, stereo, surround sound), subwoofers, digital musical instruments, and/or the like.
  • a sound card may be an Intel AC'97 integrated codec chip, an Intel HD Audio integrated codec chip, a Creative Sound Blaster X-Fi Titanium HD, a Creative Sound Blaster X-Fi Go! Pro, a Creative Sound Blaster Recon 3D, a Turtle Beach Riviera, a Turtle Beach Amigo II, and/or the like.
  • An audio device may be discrete, external, embedded, integrated into a motherboard, and/or the like. An audio device may operate in combination with other audio devices (e.g., in parallel) to provide improved capabilities, data throughput, audio quality, and/or the like.
  • the input/output devices may include one or more network devices 315.
  • the processor may make use of the one or more network devices in accordance with program instructions (e.g., VHP program instructions) executed by the processor.
  • a network device may be a network card that may obtain (e.g., via a Category 5 Ethernet cable), process, output (e.g., via a wireless antenna), and/or the like network data (e.g., VHP data).
  • a network card may be connected to the system bus via an interface such as PCI, PCI Express, USB, FireWire, PC Card, ExpressCard, and/or the like.
  • a network card may be a wired network card (e.g., 10/100/1000, optical fiber), a wireless network card (e.g., Wi-Fi 802.11a/b/g/n/ac/ad, Bluetooth, Near Field Communication (NFC), TransferJet), a modem (e.g., dialup telephone-based, asymmetric digital subscriber line (ADSL), cable modem, power line modem, wireless modem based on cellular protocols such as high speed packet access (HSPA), evolution-data optimized (EV-DO), global system for mobile communications (GSM), worldwide interoperability for microwave access (WiMax), long term evolution (LTE), and/or the like, satellite modem, FM radio modem, radio-frequency identification (RFID) modem, infrared (IR) modem), and/or the like.
  • a network card may be an Intel EXPI9301CT, an Intel EXPI9402PT, a LINKSYS USB300M, a BUFFALO WLI-UC-G450, a Rosewill RNX-MiniN1, a TRENDnet TEW-623PI, a Rosewill RNX-N180UBE, an ASUS USB-BT211, a MOTOROLA SB6120, a U.S.
  • a network device may be discrete, external, embedded, integrated into a motherboard, and/or the like.
  • a network device may operate in combination with other network devices (e.g., in parallel) to provide improved data throughput, redundancy, and/or the like.
  • protocols such as link aggregation control protocol (LACP) may be used to combine multiple network devices.
  • a network device may be used to connect to a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a personal area network, the Internet, an intranet, a Bluetooth network, an NFC network, a Wi-Fi network, a cellular network, and/or the like.
  • the input/output devices may include one or more peripheral devices 317 .
  • the processor may make use of the one or more peripheral devices in accordance with program instructions (e.g., VHP program instructions) executed by the processor.
  • a peripheral device may be a digital camera, a video camera, a webcam, an electronically moveable pan tilt zoom (PTZ) camera, a monitor, a touchscreen display, active shutter 3D glasses, head-tracking 3D glasses, a remote control, an audio line-in, an audio line-out, a microphone, headphones, speakers, a subwoofer, a router, a hub, a switch, a firewall, an antenna, a keyboard, a mouse, a trackpad, a trackball, a digitizing tablet, a stylus, a joystick, a gamepad, a game controller, a force-feedback device, a laser, sensors (e.g., proximity sensor, rangefinder, ambient temperature sensor, ambient light sensor, humidity sensor, an accelerometer), and/or the like.
  • a peripheral device may be connected to the system bus via an interface such as PCI, PCI Express, USB, FireWire, VGA, DVI, Mini-DVI, Micro-DVI, HDMI, DisplayPort, Thunderbolt, composite video, S-Video, component video, PC Card, ExpressCard, serial port, parallel port, PS/2, TS, TRS, RCA, TOSLINK, network connection (e.g., wired such as Ethernet, optical fiber, and/or the like, wireless such as Wi-Fi, Bluetooth, NFC, cellular, and/or the like), a connector of another input/output device, and/or the like.
  • a peripheral device may be discrete, external, embedded, integrated (e.g., into a processor, into a motherboard), and/or the like.
  • a peripheral device may operate in combination with other peripheral devices (e.g., in parallel) to provide the VHP coordinator with a variety of input, output and processing capabilities.
  • the input/output devices may include one or more storage devices 319 .
  • the processor may access, read from, write to, store in, erase, modify, and/or the like a storage device in accordance with program instructions (e.g., VHP program instructions) executed by the processor.
  • a storage device may facilitate accessing, storing, retrieving, modifying, deleting, and/or the like data (e.g., VHP data) by the processor.
  • the processor may access data from the storage device directly via the system bus.
  • the processor may access data from the storage device by instructing the storage device to transfer the data to the system memory and accessing the data from the system memory.
  • a storage device may be a hard disk drive (HDD), a solid-state drive (SSD), a floppy drive using diskettes, an optical disk drive (e.g., compact disk (CD-ROM) drive, CD-Recordable (CD-R) drive, CD-Rewriteable (CD-RW) drive, digital versatile disc (DVD-ROM) drive, DVD-R drive, DVD-RW drive, Blu-ray disk (BD) drive) using an optical medium, a magnetic tape drive using a magnetic tape, a memory card (e.g., a USB flash drive, a compact flash (CF) card, a secure digital extended capacity (SDXC) card), a network attached storage (NAS), a direct-attached storage (DAS), a storage area network (SAN), other processor-readable physical mediums, and/or the like.
  • a storage device may be connected to the system bus via an interface such as PCI, PCI Express, USB, FireWire, PC Card, ExpressCard, integrated drive electronics (IDE), serial advanced technology attachment (SATA), external SATA (eSATA), small computer system interface (SCSI), serial attached SCSI (SAS), fibre channel (FC), network connection (e.g., wired such as Ethernet, optical fiber, and/or the like; wireless such as Wi-Fi, Bluetooth, NFC, cellular, and/or the like), and/or the like.
  • a storage device may be discrete, external, embedded, integrated (e.g., into a motherboard, into another storage device), and/or the like.
  • a storage device may operate in combination with other storage devices to provide improved capacity, data throughput, data redundancy, and/or the like.
  • protocols such as redundant array of independent disks (RAID) (e.g., RAID 0 (striping), RAID 1 (mirroring), RAID 5 (striping with distributed parity), hybrid RAID), just a bunch of drives (JBOD), and/or the like may be used.
  • virtual and/or physical drives may be pooled to create a storage pool.
  • an SSD cache may be used with a HDD to improve speed.
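As a rough illustration of the striping idea behind RAID 0 mentioned above, the following Python sketch splits a byte stream round-robin across two simulated drives and reassembles it. The chunk size, drive representation, and function names are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical sketch of RAID 0 (striping): data is split into fixed-size
# chunks that alternate across drives, so accesses can proceed in parallel.

CHUNK = 4  # stripe unit size in bytes (illustrative)

def stripe(data: bytes, num_drives: int = 2):
    """Distribute data round-robin across simulated drives."""
    drives = [bytearray() for _ in range(num_drives)]
    for i in range(0, len(data), CHUNK):
        drives[(i // CHUNK) % num_drives] += data[i:i + CHUNK]
    return drives

def unstripe(drives, total_len: int) -> bytes:
    """Reassemble the original byte stream from the striped drives."""
    out = bytearray()
    offsets = [0] * len(drives)
    i = 0
    while len(out) < total_len:
        d = i % len(drives)
        out += drives[d][offsets[d]:offsets[d] + CHUNK]
        offsets[d] += CHUNK
        i += 1
    return bytes(out)

payload = b"VHP video data stream!"
drives = stripe(payload)
assert unstripe(drives, len(payload)) == payload
```

Note that RAID 0 improves throughput but provides no redundancy; mirroring (RAID 1) or parity (RAID 5) would be needed for that.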
  • VHP memory 320 contains processor-operable (e.g., accessible) VHP data stores 330 .
  • Data stores 330 comprise data that may be used (e.g., by the VHP) via the VHP coordinator. Such data may be organized using one or more data formats such as a database (e.g., a relational database with database tables, an object-oriented database, a graph database, a hierarchical database), a flat file (e.g., organized into a tabular format), a binary file (e.g., a GIF file, an MPEG-4 file), a structured file (e.g., an HTML file, an XML file), a text file, and/or the like.
  • data may be organized using one or more data structures such as an array, a queue, a stack, a set, a linked list, a map, a tree, a hash, a record, an object, a directed graph, and/or the like.
  • data stores may be organized in any number of ways (i.e., using any number and configuration of data formats, data structures, VHP coordinator elements, and/or the like) to facilitate VHP operation.
  • VHP data stores may comprise data stores 330 a - d implemented as one or more databases.
  • a users data store 330 a may be a collection of database tables that include fields such as UserID, UserName, UserPreferences, UserVideos, UserSocialNetworks, and/or the like.
  • a clients data store 330 b may be a collection of database tables that include fields such as ClientID, ClientName, ClientDeviceType, ClientScreenResolution, and/or the like.
  • An audio data store 330 c may be a collection of database tables that include fields such as AudioID, AudioAlbum, AudioPlaylist, AudioFormat, AudioQuality, AudioPrice, and/or the like.
  • a videos data store 330 d may be a collection of database tables that include fields such as VideoID, VideoTitle, VideoDescription, VideoResolution, VideoEffects, VideoSharingSettings, and/or the like.
  • the VHP coordinator may use data stores 330 to keep track of inputs, parameters, settings, variables, records, outputs, and/or the like.
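As a minimal sketch of how the users data store 330 a might be realized as a relational database table, the following uses SQLite via the Python standard library. The field names follow the disclosure; the column types and sample values are assumptions:

```python
# Hypothetical sketch of the users data store 330a as a relational table.
# Field names come from the disclosure; types and values are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE users (
        UserID INTEGER PRIMARY KEY,
        UserName TEXT,
        UserPreferences TEXT,
        UserVideos TEXT,
        UserSocialNetworks TEXT
    )"""
)
conn.execute(
    "INSERT INTO users (UserName, UserPreferences, UserVideos, UserSocialNetworks) "
    "VALUES (?, ?, ?, ?)",
    ("alice", '{"quality": "1080p"}', "video1,video2", "examplenet"),
)
row = conn.execute(
    "SELECT UserID, UserName FROM users WHERE UserName = ?", ("alice",)
).fetchone()
print(row)  # (1, 'alice')
```

The clients, audio, and videos data stores 330 b-d could be laid out analogously, one table per store.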
  • VHP memory 320 contains processor-operable (e.g., executable) VHP components 340 .
  • Components 340 comprise program components (including program instructions and any associated data stores) that are executed (e.g., by the VHP) via the VHP coordinator (i.e., via the processor) to transform VHP inputs into VHP outputs.
  • the various components and their subcomponents, capabilities, applications, and/or the like may be organized in any number of ways (i.e., using any number and configuration of components, subcomponents, capabilities, applications, VHP coordinator elements, and/or the like) to facilitate VHP operation.
  • the various components and their subcomponents, capabilities, applications, and/or the like may communicate among each other in any number of ways to facilitate VHP operation.
  • the various components and their subcomponents, capabilities, applications, and/or the like may be combined, integrated, consolidated, split up, distributed, and/or the like in any number of ways to facilitate VHP operation.
  • a single or multiple instances of the various components and their subcomponents, capabilities, applications, and/or the like may be instantiated on each of a single VHP coordinator node, across multiple VHP coordinator nodes, and/or the like.
  • program components may be developed using one or more programming languages, techniques, tools, and/or the like such as an assembly language, Ada, BASIC, C, C++, C#, COBOL, Fortran, Java, LabVIEW, Lisp, Mathematica, MATLAB, OCaml, PL/I, Smalltalk, Visual Basic for Applications (VBA), HTML, XML, CSS, JavaScript, JavaScript Object Notation (JSON), PHP, Perl, Ruby, Python, Asynchronous JavaScript and XML (AJAX), Simple Object Access Protocol (SOAP), SSL, ColdFusion, Microsoft .NET, Apache modules, Adobe Flash, Adobe AIR, Microsoft Silverlight, Windows PowerShell, batch files, Tcl, graphical user interface (GUI) toolkits, SQL, database adapters, web application programming interfaces (APIs), application server extensions, integrated development environments (IDEs), libraries (e.g., object libraries, class libraries, remote libraries), remote procedure calls (RPCs), Common Object Request Broker Architecture (CORBA), and/or the like.
  • components 340 may include an operating environment component 340 a .
  • the operating environment component may facilitate operation of the VHP via various subcomponents.
  • the operating environment component may include an operating system subcomponent.
  • the operating system subcomponent may provide an abstraction layer that facilitates the use of, communication among, common services for, interaction with, security of and/or the like of various VHP coordinator elements, components, data stores, and/or the like.
  • the operating system subcomponent may facilitate execution of program instructions (e.g., VHP program instructions) by the processor by providing process management capabilities.
  • the operating system subcomponent may facilitate the use of multiple processors, the execution of multiple processes, multitasking, and/or the like.
  • the operating system subcomponent may facilitate the use of memory by the VHP.
  • the operating system subcomponent may allocate and/or free memory, facilitate memory addressing, provide memory segmentation and/or protection, provide virtual memory capability, facilitate caching, and/or the like.
  • the operating system subcomponent may include a file system (e.g., File Allocation Table (FAT), New Technology File System (NTFS), Hierarchical File System Plus (HFS+), Universal Disk Format (UDF), Linear Tape File System (LTFS)) to facilitate storage, retrieval, deletion, aggregation, processing, generation, and/or the like of data.
  • the operating system subcomponent may facilitate operation of and/or processing of data for and/or from input/output devices.
  • the operating system subcomponent may include one or more device drivers, interrupt handlers, file systems, and/or the like that allow interaction with input/output devices.
  • the operating system subcomponent may facilitate operation of the VHP coordinator as a node in a computer network by providing support for one or more communications protocols.
  • the operating system subcomponent may include support for the internet protocol suite (i.e., Transmission Control Protocol/Internet Protocol (TCP/IP)) of network protocols such as TCP, IP, User Datagram Protocol (UDP), Mobile IP, and/or the like.
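As a minimal sketch of the stream-style communication these protocols provide to the VHP, the following Python snippet uses a local socket pair as a stand-in for a real TCP connection between nodes; the payload and names are illustrative, not part of the disclosure:

```python
# Hypothetical sketch: the operating system subcomponent exposes socket-based
# communication; a local socket pair stands in for a TCP connection here.
import socket

a, b = socket.socketpair()   # stand-in for a connection between two nodes
a.sendall(b"VHP data")       # e.g., VHP data sent from one coordinator node
received = b.recv(1024)      # ...and received by another
a.close()
b.close()
print(received)  # b'VHP data'
```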
  • the operating system subcomponent may include support for security protocols (e.g., Wired Equivalent Privacy (WEP), Wi-Fi Protected Access (WPA), WPA2) for wireless computer networks.
  • the operating system subcomponent may facilitate security of the VHP coordinator.
  • the operating system subcomponent may provide services such as authentication, authorization, audit, network intrusion-detection capabilities, firewall capabilities, antivirus capabilities, and/or the like.
  • the operating system subcomponent may facilitate user interaction with the VHP by providing user interface elements that may be used by the VHP to generate a user interface.
  • user interface elements may include widgets (e.g., windows, dialog boxes, scrollbars, menu bars, tabs, ribbons, menus, buttons, text boxes, checkboxes, combo boxes, drop-down lists, list boxes, radio buttons, sliders, spinners, grids, labels, progress indicators, icons, tooltips, and/or the like) that may be used to obtain input from and/or provide output to the user.
  • widgets may be used via a widget toolkit such as Microsoft Foundation Classes (MFC), Apple Cocoa Touch, Java Swing, GTK+, Qt, Yahoo! User Interface Library (YUI), and/or the like.
  • such user interface elements may include sounds (e.g., event notification sounds stored in MP3 file format), animations, vibrations, and/or the like that may be used to inform the user regarding occurrence of various events.
  • the operating system subcomponent may include a user interface such as Windows Aero, Mac OS X Aqua, GNOME Shell, KDE Plasma Workspaces (e.g., Plasma Desktop, Plasma Netbook, Plasma Contour, Plasma Mobile), and/or the like.
  • the operating system subcomponent may comprise a single-user operating system, a multi-user operating system, a single-tasking operating system, a multitasking operating system, a single-processor operating system, a multiprocessor operating system, a distributed operating system, an embedded operating system, a real-time operating system, and/or the like.
  • the operating system subcomponent may comprise an operating system such as UNIX, LINUX, IBM i, Sun Solaris, Microsoft Windows Server, Microsoft DOS, Microsoft Windows 7, Apple Mac OS X, Apple iOS, Android, Symbian, Windows Phone 7, Blackberry QNX, and/or the like.
  • the operating environment component may include a database subcomponent.
  • the database subcomponent may facilitate VHP capabilities such as storage, analysis, retrieval, access, modification, deletion, aggregation, generation, and/or the like of data (e.g., the use of data stores 330 ).
  • the database subcomponent may make use of database languages (e.g., Structured Query Language (SQL), XQuery), stored procedures, triggers, APIs, and/or the like to provide these capabilities.
  • the database subcomponent may comprise a cloud database, a data warehouse, a distributed database, an embedded database, a parallel database, a real-time database, and/or the like.
  • the database subcomponent may comprise a database such as Microsoft SQL Server, Microsoft Access, MySQL, IBM DB2, Oracle Database, and/or the like.
  • the operating environment component may include an information handling subcomponent.
  • the information handling subcomponent may provide the VHP with capabilities to serve, deliver, upload, obtain, present, download, and/or the like a variety of information.
  • the information handling subcomponent may use protocols such as Hypertext Transfer Protocol (HTTP), Hypertext Transfer Protocol Secure (HTTPS), File Transfer Protocol (FTP), Telnet, Secure Shell (SSH), Transport Layer Security (TLS), Secure Sockets Layer (SSL), peer-to-peer (P2P) protocols (e.g., BitTorrent), and/or the like to handle communication of information such as web pages, files, multimedia content (e.g., streaming media), applications, and/or the like.
  • the information handling subcomponent may facilitate the serving of information to users, VHP components, nodes in a computer network, web browsers, and/or the like.
  • the information handling subcomponent may comprise a web server such as Apache HTTP Server, Microsoft Internet Information Services (IIS), Oracle WebLogic Server, Adobe Flash Media Server, Adobe Content Server, and/or the like.
  • a web server may include extensions, plug-ins, add-ons, servlets, and/or the like.
  • these may include Apache modules, IIS extensions, Java servlets, and/or the like.
  • the information handling subcomponent may communicate with the database subcomponent via standards such as Open Database Connectivity (ODBC), Java Database Connectivity (JDBC), ActiveX Data Objects for .NET (ADO.NET), and/or the like.
  • the information handling subcomponent may use such standards to store, analyze, retrieve, access, modify, delete, aggregate, generate, and/or the like data (e.g., data from data stores 330 ) via the database subcomponent.
  • the information handling subcomponent may facilitate presentation of information obtained from users, VHP components, nodes in a computer network, web servers, and/or the like.
  • the information handling subcomponent may comprise a web browser such as Microsoft Internet Explorer, Mozilla Firefox, Apple Safari, Google Chrome, Opera Mobile, Amazon Silk, Nintendo 3DS Internet Browser, and/or the like.
  • a web browser may include extensions, plug-ins, add-ons, applets, and/or the like. For example, these may include Adobe Flash Player, Adobe Acrobat plug-in, Microsoft Silverlight plug-in, Microsoft Office plug-in, Java plug-in, and/or the like.
  • the operating environment component may include a messaging subcomponent.
  • the messaging subcomponent may facilitate VHP message communications capabilities.
  • the messaging subcomponent may use protocols such as Simple Mail Transfer Protocol (SMTP), Internet Message Access Protocol (IMAP), Post Office Protocol (POP), Extensible Messaging and Presence Protocol (XMPP), Real-time Transport Protocol (RTP), Internet Relay Chat (IRC), Skype protocol, AOL's Open System for Communication in Realtime (OSCAR), Messaging Application Programming Interface (MAPI), Facebook API, and/or the like to facilitate VHP message communications.
  • the messaging subcomponent may facilitate message communications such as email, instant messaging, Voice over IP (VoIP), video conferencing, Short Message Service (SMS), web chat, and/or the like.
  • the messaging subcomponent may comprise Microsoft Exchange Server, Microsoft Outlook, Sendmail, IBM Lotus Domino, Gmail, AOL Instant Messenger (AIM), Yahoo Messenger, ICQ, Trillian, Skype, Google Talk, Apple FaceTime, Apple iChat, Facebook Chat, and/or the like.
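As a minimal sketch of the kind of email message the messaging subcomponent might compose for SMTP delivery, the following uses the Python standard library; the addresses and subject text are illustrative, and no mail is actually sent:

```python
# Hypothetical sketch of composing an email message of the kind the messaging
# subcomponent might hand to an SMTP server. Addresses are illustrative.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "vhp@example.com"
msg["To"] = "user@example.com"
msg["Subject"] = "Your world view video is ready to share"
msg.set_content("Your video has been uploaded and synced with audio.")

# smtplib.SMTP("mail.example.com").send_message(msg) would transmit it
# via SMTP to an actual mail server.
print(msg["Subject"])
```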
  • the operating environment component may include a security subcomponent that facilitates VHP security.
  • the security subcomponent may restrict access to the VHP, to one or more services provided by the VHP, to data associated with the VHP (e.g., stored in data stores 330 ), to communication messages associated with the VHP, and/or the like to authorized users. Access may be granted via a login screen, via an API that obtains authentication information, via an authentication token, and/or the like.
  • the user may obtain access by providing a username and/or a password (e.g., a string of characters, a picture password), a personal identification number (PIN), an identification card, a magnetic stripe card, a smart card, a biometric identifier (e.g., a finger print, a voice print, a retina scan, a face scan), a gesture (e.g., a swipe), a media access control (MAC) address, an IP address, and/or the like.
  • the security subcomponent may facilitate access control (e.g., using access-control lists (ACLs)).
  • the security subcomponent may use cryptographic techniques to secure information (e.g., by storing encrypted data), verify message authentication (e.g., via a digital signature), provide integrity checking (e.g., a checksum), and/or the like by facilitating encryption and/or decryption of data.
  • steganographic techniques may be used instead of or in combination with cryptographic techniques.
  • Cryptographic techniques used by the VHP may include symmetric key cryptography using shared keys (e.g., using one or more block ciphers such as triple Data Encryption Standard (DES), Advanced Encryption Standard (AES); stream ciphers such as Rivest Cipher 4 (RC4), Rabbit), asymmetric key cryptography using a public key/private key pair (e.g., using algorithms such as Rivest-Shamir-Adleman (RSA), Digital Signature Algorithm (DSA)), cryptographic hash functions (e.g., using algorithms such as Message-Digest 5 (MD5), Secure Hash Algorithm 2 (SHA-2)), and/or the like.
  • the security subcomponent may comprise a cryptographic system such as Pretty Good Privacy (PGP).
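Two of the cryptographic techniques named above can be sketched with only the Python standard library: a SHA-2 hash for integrity checking and a shared-key HMAC tag for message authentication. The key and data below are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical sketch: SHA-2 for integrity checking and HMAC (shared-key)
# for message authentication, using only the standard library.
import hashlib
import hmac

data = b"world view video frame"
digest = hashlib.sha256(data).hexdigest()           # SHA-2 integrity check

key = b"shared-secret-key"                          # symmetric shared key
tag = hmac.new(key, data, hashlib.sha256).digest()  # authentication tag

# The receiver recomputes the tag with the same key and compares in
# constant time to verify the message was not tampered with.
assert hmac.compare_digest(tag, hmac.new(key, data, hashlib.sha256).digest())
print(len(digest))  # 64 hex characters for SHA-256
```

Symmetric encryption (e.g., AES) and public-key algorithms (e.g., RSA) would typically come from a dedicated cryptographic library rather than the standard library.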
  • the operating environment component may include a virtualization subcomponent that facilitates VHP virtualization capabilities.
  • the virtualization subcomponent may provide support for platform virtualization (e.g., via a virtual machine).
  • Platform virtualization types may include full virtualization, partial virtualization, paravirtualization, and/or the like.
  • platform virtualization may be hardware-assisted (e.g., via support from the processor using technologies such as AMD-V, Intel VT-x, and/or the like).
  • the virtualization subcomponent may provide support for various other virtualized environments such as via operating-system level virtualization, desktop virtualization, workspace virtualization, mobile virtualization, application virtualization, database virtualization, and/or the like.
  • the virtualization subcomponent may provide support for various virtualized resources such as via memory virtualization, storage virtualization, data virtualization, network virtualization, and/or the like.
  • the virtualization subcomponent may comprise VMware software suite (e.g., VMware Server, VMware Workstation, VMware Player, VMware ESX, VMware ESXi, VMware ThinApp, VMware Infrastructure), Parallels software suite (e.g., Parallels Server, Parallels Workstation, Parallels Desktop, Parallels Mobile, Parallels Virtuozzo Containers), Oracle software suite (e.g., Oracle VM Server for SPARC, Oracle VM Server for x86, Oracle VM VirtualBox, Oracle Solaris 10, Oracle Solaris 11), Informatica Data Services, Wine, and/or the like.
  • components 340 may include a user interface component 340 b .
  • the user interface component may facilitate user interaction with the VHP by providing a user interface.
  • the user interface component may include programmatic instructions to obtain input from and/or provide output to the user via physical controls (e.g., physical buttons, switches, knobs, wheels, dials), textual user interface, audio user interface, GUI, voice recognition, gesture recognition, touch and/or multi-touch user interface, messages, APIs, and/or the like.
  • the user interface component may make use of the user interface elements provided by the operating system subcomponent of the operating environment component. For example, the user interface component may make use of the operating system subcomponent's user interface elements via a widget toolkit.
  • the user interface component may make use of information presentation capabilities provided by the information handling subcomponent of the operating environment component.
  • the user interface component may make use of a web browser to provide a user interface via HTML5, Adobe Flash, Microsoft Silverlight, and/or the like.
  • components 340 may include any of the components (e.g., the WVS component 340 c ) described in more detail in preceding figures.
  • This disclosure of VIDEO HEADPHONES PLATFORM METHODS, APPARATUSES AND MEDIA shows various embodiments via which the claimed innovations may be practiced. It is to be understood that these embodiments and the features they describe are a representative sample presented to assist in understanding the claimed innovations, and are not exhaustive and/or exclusive. As such, the various embodiments, implementations, examples, and/or the like are deemed non-limiting throughout this disclosure. Furthermore, alternate undescribed embodiments may be available (e.g., equivalent embodiments). Such alternate embodiments have not been discussed in detail to preserve space and/or reduce repetition.
  • the organizational, logical, physical, functional, topological, and/or the like structures of the VHP coordinator, VHP coordinator elements, VHP data stores, VHP components and their subcomponents, capabilities, applications, and/or the like described in various embodiments throughout this disclosure are not limited to a fixed operating order and/or arrangement; instead, all equivalent operating orders and/or arrangements are contemplated by this disclosure.
  • the VHP coordinator, VHP coordinator elements, VHP data stores, VHP components and their subcomponents, capabilities, applications, and/or the like described in various embodiments throughout this disclosure are not limited to serial execution; instead, any number and/or configuration of threads, processes, instances, services, servers, clients, nodes, and/or the like that execute in parallel, concurrently, simultaneously, synchronously, asynchronously, and/or the like is contemplated by this disclosure.
  • some of the features described in this disclosure may be mutually contradictory, incompatible, inapplicable, and/or the like, and are not present simultaneously in the same embodiment. Accordingly, the various embodiments, implementations, examples, and/or the like are not to be considered limitations on the disclosure as defined by the claims or limitations on equivalents to the claims.
  • With a video headphone as disclosed herein, users will be able to record video through a small camera embedded in the headphones themselves. Audio can also be recorded through microphones, which can be installed in the headphone cable control unit and also located underneath the video camera.
  • audio can be added, music can be selected for the video and filters added.
  • audio and video can be seamlessly synced and the output can be shared to desired social networks.
  • a video headphone as disclosed herein may be configured, with suitable wired interfacing, to connect to an audio source as well as a SmartPhone USB connector.
  • the video headphone may stream audio and video signals to a SmartPhone for storage and later editing and/or uploading through a wire, with audio and video controls on the cable, that connects the video headphone to the smartphone.
  • the video headphone may be worn and/or used by the user such that the camera's view may be adjusted to monitor what may be seen by the user, and its signal output may be combined with the audio source feed, such as that which is monitored by the user, and provided to an external SmartPhone.
  • Specific audio performance desired for the headphone device is that typical in professional use, i.e., a bandwidth of 20 Hz to 20 kHz.
  • Specific video performance desired for the camera attachment to the video headphone device shall permit image motion capture at 25 fps or better, with a resolution adequate and similar to that which is commonly downloaded to SmartPhone devices from various video streaming and downloadable sources, with typical formats such as MP4, FLV, MPEG, etc.
  • Connection 1 may be a Micro-USB to Apple iPhone, iPhone 3G, iPhone 4, iPhone 4S 30-Pin Style Charger Adapter Tip.
  • Connection 2 may be a 3.5 mm Auxiliary Cable Sync Connector.
  • Connection 3 may be a Micro USB Cable.
  • An app for use with a video headphone may:
  • Audio features of a system according to the present invention may include:
  • a system according to the present invention may:

Abstract

A request to record a user's world view video may be received. The world view video may be recorded via video headphones and adjusted based on user instructions. The adjusted world view video may be shared with the user's social network.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is based on, and claims priority to, U.S. Provisional Application No. 61/731,126, filed Nov. 29, 2012, the entire contents of which are fully incorporated herein by reference.
  • This disclosure describes VIDEO HEADPHONES PLATFORM METHODS, APPARATUSES AND MEDIA (hereinafter “VHP”). A portion of the disclosure of this patent document contains material which is subject to copyright and/or mask work protection. The copyright and/or mask work owners have no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserve all copyright and mask work rights whatsoever.
  • FIELD
  • The present disclosure is directed generally to video headphones.
  • BACKGROUND
  • Many different types of headphones currently exist on the market. Some headphones concentrate more on providing optimal sound reproduction (e.g., high fidelity sound reproduction), while others concentrate on portability (e.g., small size, light weight, foldable). Headphones may obtain an audio signal from an audio source (e.g., a portable music player) either via an audio cable or via a wireless receiver.
  • BRIEF DESCRIPTION OF THE FIGURES
  • The accompanying figures and/or appendices illustrate various exemplary embodiments in accordance with the present disclosure.
  • FIG. 1 shows a logic flow diagram illustrating a world view sharing (WVS) component in one embodiment of the VHP.
  • FIG. 2 shows a data flow diagram in one embodiment of the VHP.
  • FIG. 3 shows a block diagram illustrating an exemplary VHP coordinator in one embodiment of the VHP.
  • FIGS. 4 to 20 illustrate additional exemplary embodiments of the VHP.
  • DETAILED DESCRIPTION Introduction
  • The VHP empowers users to record, edit, and share their world view with their social networks. A user may record a video through a video camera embedded in the user's headphones. Audio may also be recorded through one or more microphones installed in the headphone cable control unit and/or located underneath the video camera. Once the user finishes recording the video, the user may add music, apply filters, make audio and/or video adjustments, provide a description of the video, and/or the like. After the user finishes editing the video, the video may be shared with the user's social networks.
  • Detailed Description of the VHP
  • FIG. 1 shows a logic flow diagram illustrating a world view sharing (WVS) component in one embodiment of the VHP. In FIG. 1, a request to share a world view video may be received at 101. For example, a world view video may be a video showing what may be seen and/or heard by a user (i.e., the view from the user's eye level). In one embodiment, the request to share a world view video may be received based on activation of a record button on video headphones (e.g., the record button may be integrated into a headphone cable) by the user. In another embodiment, the request to share a world view video may be received based on activation of a record button of a VHP mobile app (e.g., running on the user's smart phone) by the user.
  • The VHP may facilitate video recording via video headphones at 105. In various embodiments, video headphones may be headphones with an embedded and/or attached video camera. In various implementations, video headphones may include features such as HD premium quality sound, pivoting ear pads for multi-angle recording, one or more HD (e.g., 1080p) video cameras, one or more dynamic microphones, audio and/or video controls (e.g., record video, adjust volume, skip forward and/or back, mute) integrated into a headphone cable, and/or the like. In one implementation, the video headphones may include a video camera on one of the earpieces, and the video camera may have a microphone (e.g., located underneath the video camera). In another implementation, the video headphones may include two video cameras, one on each earpiece, and each video camera may have a microphone. A video camera and/or the accompanying microphone, if the video camera has a microphone, may pivot and/or swivel (e.g., 360 degrees in any direction) to allow the user to record video in any direction. For example, a user may angle the video camera up to record a video of a bird in a tree. In another example, the user may point one video camera straight ahead and the other video camera towards the back to record a split screen of what is in front and behind the user. In yet another example, the user may position both video cameras in such a way (e.g., both video cameras pointing in the same direction) as to allow three dimensional (3D) video recording. The video headphones may be connected (e.g., via a headphone cable, via a wireless technology capable of transmitting HD video) to the user's client (e.g., a smart phone, a portable media player, a tablet) and may record and transmit the video (e.g., including video and/or audio) to the client (e.g., via the VHP mobile app). For example, the video may be stored locally on the client. 
In another example, the video may be stored remotely (e.g., on a remote server). In one implementation, the video may be of any desired length. In another implementation, a minimum and/or a maximum length for the video (e.g., 15 seconds) may be specified.
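A length policy of the kind described in this paragraph can be sketched in a few lines. This is a minimal illustration, not part of the disclosure: the 15-second figure is the example maximum given above, while the 1-second minimum and the function name are assumptions.

```python
# Hypothetical length policy for a recorded world view video.
MIN_LENGTH_S = 1.0   # assumed minimum, in seconds
MAX_LENGTH_S = 15.0  # the 15-second example maximum from the text

def video_length_ok(duration_s: float) -> bool:
    """Check a recorded video's duration against the length policy."""
    return MIN_LENGTH_S <= duration_s <= MAX_LENGTH_S

print(video_length_ok(10.0))  # True: within the allowed range
print(video_length_ok(20.0))  # False: exceeds the 15-second maximum
```

An implementation that allows "any desired length" would simply skip this check.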
  • A determination may be made at 110 whether the user wishes to trim the recorded video. In one embodiment, this determination may be made by prompting the user to indicate whether the user wishes to trim the video (e.g., via a “would you like to trim the video?” prompt). In another embodiment, this determination may be made by displaying a video selection widget via the VHP mobile app and allowing the user to trim the video via the video selection widget if the user chooses to do so.
  • If the user wishes to trim the video, a desired video segment may be determined at 115. In one embodiment, the user may trim the video by dragging a video selection widget of the VHP mobile app to select a video segment of a predetermined size (e.g., a 10 second video segment). In another embodiment, the user may trim the video by expanding and/or shrinking the size of a video selection widget of the VHP mobile app to select a desired video segment. In yet another embodiment, the user may combine multiple videos (e.g., the current video and one or more previously recorded videos) into one video. The video may be trimmed to the desired video segment at 120.
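The fixed-size selection described above amounts to clamping the requested segment so it lies inside the recorded video. The following sketch illustrates this; the function and parameter names are illustrative, not from the disclosure:

```python
def trim_to_segment(video_len_s: float, start_s: float,
                    segment_len_s: float = 10.0) -> tuple:
    """Clamp a fixed-size segment (default 10 s, as in the example
    above) so it lies entirely inside the recorded video."""
    # Pull the start back if the segment would run past the end.
    start = max(0.0, min(start_s, video_len_s - segment_len_s))
    end = min(start + segment_len_s, video_len_s)
    return (start, end)

print(trim_to_segment(60.0, 55.0))  # (50.0, 60.0): pulled back to fit
print(trim_to_segment(8.0, 0.0))    # (0.0, 8.0): clip shorter than 10 s
```

The expanding/shrinking widget variant would pass a user-chosen `segment_len_s` instead of the fixed default.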
  • A determination may be made at 125 whether the user wishes to add audio to the video. In one embodiment, this determination may be made by prompting the user to indicate whether the user wishes to add audio to the video (e.g., via a “would you like to add audio?” prompt). In another embodiment, this determination may be made by facilitating user selection of audio via the VHP mobile app and allowing the user to add audio to the video if the user chooses to do so.
  • If the user wishes to add audio to the video, the VHP may facilitate audio selection at 130. In one embodiment, the user may select audio (e.g., music) from the user's audio collection (e.g., by pressing a “My Music” button of the VHP mobile app). In one implementation, the user may select a music album and/or track (e.g., a song) and/or a playlist from the user's music collection. The user may also utilize an audio selection widget to select the desired portion of the selected audio to be added to the video. For example, the length of the selected audio portion may be set to be equal to the length of the video. In another implementation, the VHP may suggest audio to the user that matches the actions in the video well. For example, the VHP may suggest songs and/or playlists and/or audio portions whose tempo matches the actions in the video. In another embodiment, the user may select audio from the VHP's audio collection (e.g., by pressing a “VHP Library” button of the VHP mobile app). For example, the user may purchase music albums and/or songs and/or audio effects via the VHP mobile app. In one implementation, the user may select audio and/or a portion of the audio from the VHP's audio collection in a similar manner as described above with regard to the user's audio collection. In another implementation, the user may select audio from the VHP's audio collection based on suggestions from the VHP made in a similar manner as described above with regard to the user's audio collection.
  • A determination may be made at 135 whether the user wishes to adjust the audio. In one embodiment, this determination may be made by prompting the user to indicate whether the user wishes to adjust the audio (e.g., via a “would you like to adjust the audio?” prompt). In another embodiment, this determination may be made by facilitating user selection of audio adjustments via the VHP mobile app and allowing the user to adjust the audio if the user chooses to do so.
  • If the user wishes to adjust the audio, the VHP may facilitate desired audio adjustments at 140. In one embodiment, the user may adjust the audio by speeding up and/or slowing down the audio. For example, the user may utilize “Slow Down” and/or “Speed Up” buttons of the VHP mobile app to adjust audio speed (e.g., in discrete increments, such as 2× or 3× faster or slower; continuously, such as based on the length of time that a button is pressed). In this embodiment, the length of the selected audio portion may vary based on the speed of the audio. For example, if the user speeds up a song, the user may be able to select a longer portion of the song. In another example, the user may choose to have an entire song play during the duration (e.g., 2 minutes) of the video, and the speed at which the song is played back may be adjusted accordingly (e.g., so that the entire song is played back in 2 minutes) by the VHP. In another embodiment, the user may adjust the audio by auto tuning the audio. For example, the user may utilize an “Auto Tune” button of the VHP mobile app. In yet another embodiment, the user may adjust the audio by adding sound effects to the audio. For example, the user may utilize a sound effects selection widget of the VHP mobile app.
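The fit-the-whole-song adjustment above reduces to simple arithmetic: the required playback rate is the ratio of the song length to the video length. A minimal sketch (the function name is an assumption):

```python
def playback_rate(song_len_s: float, video_len_s: float) -> float:
    """Rate at which a song must be played back so that it spans
    the video exactly (rate > 1 speeds up, rate < 1 slows down)."""
    if video_len_s <= 0:
        raise ValueError("video length must be positive")
    return song_len_s / video_len_s

# A 4-minute (240 s) song over a 2-minute (120 s) video plays at 2x speed.
print(playback_rate(240.0, 120.0))  # 2.0
```

A 90-second song over the same 2-minute video would instead be slowed to 0.75x.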
  • The audio may be added to the video at 145. In one embodiment, the added audio may replace audio recorded in the video. In another embodiment, the added audio may be combined with audio recorded in the video. In some embodiments, the user may post artist credit (e.g., via a “Post Artist Credit” button of the VHP mobile app) for audio (e.g., a song) being used in the video. For example, the artist credit may scroll in a specified location (e.g., across the bottom of the video).
  • A determination may be made at 150 whether the user wishes to add video effects to the video. In one embodiment, this determination may be made by prompting the user to indicate whether the user wishes to add video effects to the video (e.g., via a “would you like to add video effects?” prompt). In another embodiment, this determination may be made by facilitating user selection of video effects via the VHP mobile app and allowing the user to add video effects to the video if the user chooses to do so.
  • If the user wishes to add video effects to the video, the VHP may determine desired video effects at 155. In one embodiment, the user may select desired video effects (e.g., sepia filter, black and white filter, monochromatic filter, a light filter, a frame around the video, speed up and/or slow down the video) via a video effect selection widget of the VHP mobile app. In another embodiment, the user may wish to add various video elements (e.g., a video start element, a video end element, a transition element, a comments element) to the video. For example, the user may wish to insert a comment into the video after an amusing scene in the video. The desired video effects may be added to the video at 160.
  • A description of the video may be obtained from the user at 165. In one embodiment, the description may include a title. In another embodiment, the description may include the user's description for the video. In some implementations, the description may have a minimum and/or a maximum size. For example, a title may have to be at least 1 character long and no more than 50 characters long. In another example, the user's description for the video may have to be no more than 148 characters. In yet another example, the minimum and/or the maximum size for the description may correspond to those specified by social networks on which the video may be shared (e.g., a maximum of 140 characters for Twitter).
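The size constraints above could be checked as follows. Only the 1-to-50-character title range, the 148-character description limit, and Twitter's 140-character limit come from the text; the dictionary shape and function name are assumptions.

```python
TITLE_MIN, TITLE_MAX = 1, 50
DESCRIPTION_LIMITS = {"twitter": 140, "default": 148}

def description_ok(title: str, description: str,
                   network: str = "default") -> bool:
    """Validate a title and description against per-network limits."""
    limit = DESCRIPTION_LIMITS.get(network, DESCRIPTION_LIMITS["default"])
    return (TITLE_MIN <= len(title) <= TITLE_MAX
            and len(description) <= limit)

print(description_ok("Bird in a tree", "Recorded on my video headphones"))
# True: both fields are within the limits
print(description_ok("", "Missing title", "twitter"))
# False: the title is shorter than the 1-character minimum
```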
  • The video may be added to the user's video library at 170. For example, the video may be stored (e.g., on the client, on a remote server) and added to the user's VHP profile. If the user chooses to share the video with other VHP users, the other VHP users may comment on the video, mark the video as favorite, forward the video to others, and/or the like. The user may also see how many times the video has been viewed, post comments, and/or the like.
  • A determination may be made at 175 whether the user wishes to share the video via one or more social networks (e.g., Facebook, Tumblr, Twitter, Instagram, Pinterest, Vimeo, YouTube). In one embodiment, this determination may be made by prompting the user to indicate whether the user wishes to share the video (e.g., via a “would you like to share the video?” prompt). In another embodiment, this determination may be made by facilitating user selection of social networks on which to share the video via the VHP mobile app and allowing the user to share the video if the user chooses to do so.
  • If the user wishes to share the video, social networks on which the user wishes to share the video may be determined at 180. In one embodiment, the user may select desired social networks via appropriate buttons of the VHP mobile app and press a “Share Now” button to initiate sharing on the selected social networks. In another embodiment, the user may specify default desired social networks (e.g., via the user's profile settings) so that the default desired social networks are preselected for the user, and the user may press a “Share Now” button to initiate sharing on the selected social networks. The video may be shared on the selected social networks at 185 (e.g., via appropriate APIs provided by the social networks).
  • FIG. 2 shows a data flow diagram in one embodiment of the VHP. FIG. 2 provides an example of how data may flow to, through, and/or from the VHP. In FIG. 2, a user 202 may provide a command to initiate video recording 221 to the client 210 (e.g., a smart phone, a portable media player, a tablet). In one embodiment, the user may provide the command to initiate video recording by pressing a record button on the user's video headphones 206. In another embodiment, the user may provide the command to initiate video recording by pressing a record button of a VHP mobile app on the client.
  • The client may send a video data request 225 to the video headphones. For example, the video data request may be in XML format and may include data such as a command to initiate video recording, video parameters (e.g., video resolution, video aspect ratio, audio quality), video recording length, and/or the like. The video headphones may begin and/or end video recording based on the video data request.
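A video data request of the kind described might be serialized as below. The disclosure only says the request may be in XML format and lists the kinds of data it may carry; the element names and default values here are hypothetical.

```python
import xml.etree.ElementTree as ET

def build_video_data_request(command="start_recording",
                             resolution="1080p",
                             aspect_ratio="16:9",
                             audio_quality="320kbps",
                             max_length_s=15):
    """Build an illustrative XML video data request for the headphones."""
    root = ET.Element("video_data_request")
    ET.SubElement(root, "command").text = command
    ET.SubElement(root, "video_resolution").text = resolution
    ET.SubElement(root, "video_aspect_ratio").text = aspect_ratio
    ET.SubElement(root, "audio_quality").text = audio_quality
    ET.SubElement(root, "recording_length_seconds").text = str(max_length_s)
    return ET.tostring(root, encoding="unicode")

print(build_video_data_request())
```

The video data response could be parsed symmetrically with `ET.fromstring` on the client.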
  • The video headphones may send a video data response 229 to the client. For example, the video data response may be in XML format and may include data such as the recorded video (e.g., including video and/or audio), video information (e.g., date and/or time of recording, location of the video), and/or the like.
  • The client may output an adjustments request 233 to the user. In various embodiments, the adjustments request may prompt the user to trim the video, to add audio to the video, to adjust audio in the video, to add video effects to the video, to provide a description of the video, to share the video, and/or the like. For example, the adjustments request may be output via a GUI of the VHP mobile app.
  • The user may input an adjustments response 237 into the client. In various embodiments, the adjustments response may indicate whether and/or how the user wishes to trim the video, to add audio to the video, to adjust audio in the video, to add video effects to the video, to describe the video, to share the video, and/or the like. For example, the adjustments response may be input via the GUI of the VHP mobile app.
  • If the user wishes to add audio (e.g., from the VHP's audio collection) to the video, the client may send an audio data request 241 to a VHP server 214. For example, the VHP server may store songs available from the VHP (e.g., in an audio data store 330 c). The audio data request may be in XML format and may include data such as the user's identifier and/or password, an identifier of a requested song and/or album and/or playlist, audio parameters (e.g., audio format, audio quality in kbps), a payment method, and/or the like.
  • The VHP server may send an audio data response 245 to the client. For example, the audio data response may be in XML format and may include data such as the requested song and/or album and/or playlist, audio parameters, a payment confirmation, and/or the like. The client may add the obtained audio to the video and/or adjust the obtained audio and/or the video based on user instructions.
  • If the user wishes to share the video, the client may send a share request 249 to a social network 218. The share request may include the video and instruct the social network to post the video via the user's social network account. For example, the share request may be sent via the social network's API command and may include data such as the user's identifier and/or password on the social network, the video, the description of the video, video information (e.g., date and/or time of recording, location of the video obtained via the client's GPS), and/or the like. The social network may send a share response 253 to the client. For example, the share response may be sent via the social network's API command and may indicate whether the video was shared on the social network successfully.
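A share request along these lines might assemble a payload such as the following before calling a social network's API. The key names are assumptions; each network's actual API defines its own fields.

```python
def build_share_request(user_id, video_path, description,
                        recorded_at=None, location=None):
    """Assemble the data a share request might carry to a social network."""
    payload = {
        "user": user_id,
        "video": video_path,
        "description": description,
    }
    if recorded_at is not None:
        payload["recorded_at"] = recorded_at  # date/time of recording
    if location is not None:
        payload["location"] = location  # e.g., from the client's GPS
    return payload

req = build_share_request("user42", "clip.mp4", "My world view",
                          recorded_at="2012-11-29T10:00:00Z")
print(sorted(req))  # ['description', 'recorded_at', 'user', 'video']
```

The share response would then indicate whether the post succeeded, as described above.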
  • The client may provide a video output 257 to the user. For example, the video output may inform the user whether the user's video has been stored (e.g., in a videos data store 330 d), added to the user's profile, shared on one or more social networks, and/or the like.
  • Detailed Description of the VHP Coordinator
  • FIG. 3 shows a block diagram illustrating an exemplary VHP coordinator in one embodiment of the VHP. The VHP coordinator facilitates the operation of the VHP via a computer system (e.g., one or more cloud computing systems, grid computing systems, virtualized computer systems, mainframe computers, servers, clients, nodes, desktops, mobile devices such as smart phones, cellular phones, tablets, personal digital assistants (PDAs), and/or the like, embedded computers, dedicated computers, a system on a chip (SOC)). For example, the VHP coordinator may receive, obtain, aggregate, process, generate, store, retrieve, send, delete, input, output, and/or the like data (including program data and program instructions); may execute program instructions; may communicate with computer systems, with nodes, with users, and/or the like. In various embodiments, the VHP coordinator may comprise a standalone computer system, a distributed computer system, a node in a computer network (i.e., a network of computer systems organized in a topology), a network of VHP coordinators, and/or the like. It is to be understood that the VHP coordinator and/or the various VHP coordinator elements (e.g., processor, system bus, memory, input/output devices) may be organized in any number of ways (i.e., using any number and configuration of computer systems, computer networks, nodes, VHP coordinator elements, and/or the like) to facilitate VHP operation. Furthermore, it is to be understood that the various VHP coordinator computer systems, VHP coordinator computer networks, VHP coordinator nodes, VHP coordinator elements, and/or the like may communicate among each other in any number of ways to facilitate VHP operation. 
As used in this disclosure, the term “user” refers generally to people and/or computer systems that interact with the VHP; the term “server” refers generally to a computer system, a program, and/or a combination thereof that handles requests and/or responds to requests from clients via a computer network; the term “client” refers generally to a computer system, a program, a user, and/or a combination thereof that generates requests and/or handles responses from servers via a computer network; the term “node” refers generally to a server, to a client, and/or to an intermediary computer system, program, and/or a combination thereof that facilitates transmission of and/or handling of requests and/or responses.
  • The VHP coordinator includes a processor 301 that executes program instructions (e.g., VHP program instructions). In various embodiments, the processor may be a general purpose microprocessor (e.g., a central processing unit (CPU)), a dedicated microprocessor (e.g., a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a network processor, and/or the like), an external processor, a plurality of processors (e.g., working in parallel, distributed, and/or the like), a microcontroller (e.g., for an embedded system), and/or the like. The processor may be implemented using integrated circuits (ICs), application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), and/or the like. In various implementations, the processor may comprise one or more cores, may include embedded elements (e.g., a coprocessor such as a math coprocessor, a cryptographic coprocessor, a physics coprocessor, and/or the like, registers, cache memory, software), may be synchronous (e.g., using a clock signal) or asynchronous (e.g., without a central clock), and/or the like. For example, the processor may be an AMD FX processor, an AMD Opteron processor, an AMD Geode LX processor, an Intel Core i7 processor, an Intel Xeon processor, an Intel Atom processor, an ARM Cortex processor, an IBM PowerPC processor, and/or the like.
  • The processor may be connected to system memory 305 via a system bus 303. The system bus may interconnect these and/or other elements of the VHP coordinator via electrical, electronic, optical, wireless, and/or the like communication links (e.g., the system bus may be integrated into a motherboard that interconnects VHP coordinator elements and provides power from a power supply). In various embodiments, the system bus may comprise one or more control buses, address buses, data buses, memory buses, peripheral buses, and/or the like. In various implementations, the system bus may be a parallel bus, a serial bus, a daisy chain design, a hub design, and/or the like. For example, the system bus may comprise a front-side bus, a back-side bus, AMD's HyperTransport, Intel's QuickPath Interconnect, a peripheral component interconnect (PCI) bus, an accelerated graphics port (AGP) bus, a PCI Express bus, a low pin count (LPC) bus, a universal serial bus (USB), and/or the like. The system memory, in various embodiments, may comprise registers, cache memory (e.g., level one, level two, level three), read only memory (ROM) (e.g., BIOS, flash memory), random access memory (RAM) (e.g., static RAM (SRAM), dynamic RAM (DRAM), error-correcting code (ECC) memory), and/or the like. The system memory may be discrete, external, embedded, integrated into a CPU, and/or the like. The processor may access, read from, write to, store in, erase, modify, and/or the like, the system memory in accordance with program instructions (e.g., VHP program instructions) executed by the processor. The system memory may facilitate accessing, storing, retrieving, modifying, deleting, and/or the like data (e.g., VHP data) by the processor.
  • In various embodiments, input/output devices 310 may be connected to the processor and/or to the system memory, and/or to one another via the system bus.
  • In some embodiments, the input/output devices may include one or more graphics devices 311. The processor may make use of the one or more graphics devices in accordance with program instructions (e.g., VHP program instructions) executed by the processor. In one implementation, a graphics device may be a video card that may obtain (e.g., via a connected video camera), process (e.g., render a frame), output (e.g., via a connected monitor, television, and/or the like), and/or the like graphical (e.g., multimedia, video, image, text) data (e.g., VHP data). A video card may be connected to the system bus via an interface such as PCI, AGP, PCI Express, USB, PC Card, ExpressCard, and/or the like. A video card may use one or more graphics processing units (GPUs), for example, by utilizing AMD's CrossFireX and/or NVIDIA's SLI technologies. A video card may be connected via an interface (e.g., video graphics array (VGA), digital video interface (DVI), Mini-DVI, Micro-DVI, high-definition multimedia interface (HDMI), DisplayPort, Thunderbolt, composite video, S-Video, component video, and/or the like) to one or more displays (e.g., cathode ray tube (CRT), liquid crystal display (LCD), touchscreen, and/or the like) that display graphics. For example, a video card may be an AMD Radeon HD 6990, an ATI Mobility Radeon HD 5870, an AMD FirePro V9800P, an AMD Radeon E6760 MXM V3.0 Module, an NVIDIA GeForce GTX 590, an NVIDIA GeForce GTX 580M, an Intel HD Graphics 3000, and/or the like. In another implementation, a graphics device may be a video capture board that may obtain (e.g., via coaxial cable), process (e.g., overlay with other graphical data), capture, convert (e.g., between different formats, such as MPEG2 to H.264), and/or the like graphical data. A video capture board may be and/or include a TV tuner, may be compatible with a variety of broadcast signals (e.g., NTSC, PAL, ATSC, QAM), may be a part of a video card, and/or the like. For example, a video capture board may be an ATI All-in-Wonder HD, a Hauppauge ImpactVBR 01381, a Hauppauge WinTV-HVR-2250, a Hauppauge Colossus 01414, and/or the like. A graphics device may be discrete, external, embedded, integrated into a CPU, and/or the like. A graphics device may operate in combination with other graphics devices (e.g., in parallel) to provide improved capabilities, data throughput, color depth, and/or the like.
  • In some embodiments, the input/output devices may include one or more audio devices 313. The processor may make use of the one or more audio devices in accordance with program instructions (e.g., VHP program instructions) executed by the processor. In one implementation, an audio device may be a sound card that may obtain (e.g., via a connected microphone), process, output (e.g., via connected speakers), and/or the like audio data (e.g., VHP data). A sound card may be connected to the system bus via an interface such as PCI, PCI Express, USB, PC Card, ExpressCard, and/or the like. A sound card may be connected via an interface (e.g., tip sleeve (TS), tip ring sleeve (TRS), RCA, TOSLINK, optical) to one or more amplifiers, speakers (e.g., mono, stereo, surround sound), subwoofers, digital musical instruments, and/or the like. For example, a sound card may be an Intel AC'97 integrated codec chip, an Intel HD Audio integrated codec chip, a Creative Sound Blaster X-Fi Titanium HD, a Creative Sound Blaster X-Fi Go! Pro, a Creative Sound Blaster Recon 3D, a Turtle Beach Riviera, a Turtle Beach Amigo II, and/or the like. An audio device may be discrete, external, embedded, integrated into a motherboard, and/or the like. An audio device may operate in combination with other audio devices (e.g., in parallel) to provide improved capabilities, data throughput, audio quality, and/or the like.
  • In some embodiments, the input/output devices may include one or more network devices 315. The processor may make use of the one or more network devices in accordance with program instructions (e.g., VHP program instructions) executed by the processor. In one implementation, a network device may be a network card that may obtain (e.g., via a Category 5 Ethernet cable), process, output (e.g., via a wireless antenna), and/or the like network data (e.g., VHP data). A network card may be connected to the system bus via an interface such as PCI, PCI Express, USB, FireWire, PC Card, ExpressCard, and/or the like. A network card may be a wired network card (e.g., 10/100/1000, optical fiber), a wireless network card (e.g., Wi-Fi 802.11a/b/g/n/ac/ad, Bluetooth, Near Field Communication (NFC), TransferJet), a modem (e.g., dialup telephone-based, asymmetric digital subscriber line (ADSL), cable modem, power line modem, wireless modem based on cellular protocols such as high speed packet access (HSPA), evolution-data optimized (EV-DO), global system for mobile communications (GSM), worldwide interoperability for microwave access (WiMax), long term evolution (LTE), and/or the like, satellite modem, FM radio modem, radio-frequency identification (RFID) modem, infrared (IR) modem), and/or the like. For example, a network card may be an Intel EXPI9301CT, an Intel EXPI9402PT, a LINKSYS USB300M, a BUFFALO WLI-UC-G450, a Rosewill RNX-MiniN1, a TRENDnet TEW-623PI, a Rosewill RNX-N180UBE, an ASUS USB-BT211, a MOTOROLA SB6120, a U.S. Robotics USR5686G, a Zoom 5697-00-00F, a TRENDnet TPL-401E2K, a D-Link DHP-W306AV, a StarTech ET91000SC, a Broadcom BCM20791, a Broadcom InConcert BCM4330, a Broadcom BCM4360, an LG VL600, a Qualcomm MDM9600, a Toshiba TC35420 TransferJet device, and/or the like. A network device may be discrete, external, embedded, integrated into a motherboard, and/or the like. 
A network device may operate in combination with other network devices (e.g., in parallel) to provide improved data throughput, redundancy, and/or the like. For example, protocols such as link aggregation control protocol (LACP) based on IEEE 802.3AD-2000 or IEEE 802.1AX-2008 standards may be used. A network device may be used to connect to a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a personal area network, the Internet, an intranet, a Bluetooth network, an NFC network, a Wi-Fi network, a cellular network, and/or the like.
  • In some embodiments, the input/output devices may include one or more peripheral devices 317. The processor may make use of the one or more peripheral devices in accordance with program instructions (e.g., VHP program instructions) executed by the processor. In various implementations, a peripheral device may be a digital camera, a video camera, a webcam, an electronically moveable pan tilt zoom (PTZ) camera, a monitor, a touchscreen display, active shutter 3D glasses, head-tracking 3D glasses, a remote control, an audio line-in, an audio line-out, a microphone, headphones, speakers, a subwoofer, a router, a hub, a switch, a firewall, an antenna, a keyboard, a mouse, a trackpad, a trackball, a digitizing tablet, a stylus, a joystick, a gamepad, a game controller, a force-feedback device, a laser, sensors (e.g., proximity sensor, rangefinder, ambient temperature sensor, ambient light sensor, humidity sensor, an accelerometer, a gyroscope, a motion sensor, an olfaction sensor, a biosensor, a chemical sensor, a magnetometer, a radar, a sonar, a location sensor such as global positioning system (GPS), Galileo, GLONASS, and/or the like), a printer, a fax, a scanner, a copier, a card reader, and/or the like. A peripheral device may be connected to the system bus via an interface such as PCI, PCI Express, USB, FireWire, VGA, DVI, Mini-DVI, Micro-DVI, HDMI, DisplayPort, Thunderbolt, composite video, S-Video, component video, PC Card, ExpressCard, serial port, parallel port, PS/2, TS, TRS, RCA, TOSLINK, network connection (e.g., wired such as Ethernet, optical fiber, and/or the like, wireless such as Wi-Fi, Bluetooth, NFC, cellular, and/or the like), a connector of another input/output device, and/or the like. A peripheral device may be discrete, external, embedded, integrated (e.g., into a processor, into a motherboard), and/or the like. 
A peripheral device may operate in combination with other peripheral devices (e.g., in parallel) to provide the VHP coordinator with a variety of input, output and processing capabilities.
  • In some embodiments, the input/output devices may include one or more storage devices 319. The processor may access, read from, write to, store in, erase, modify, and/or the like a storage device in accordance with program instructions (e.g., VHP program instructions) executed by the processor. A storage device may facilitate accessing, storing, retrieving, modifying, deleting, and/or the like data (e.g., VHP data) by the processor. In one implementation, the processor may access data from the storage device directly via the system bus. In another implementation, the processor may access data from the storage device by instructing the storage device to transfer the data to the system memory and accessing the data from the system memory. In various embodiments, a storage device may be a hard disk drive (HDD), a solid-state drive (SSD), a floppy drive using diskettes, an optical disk drive (e.g., compact disk (CD-ROM) drive, CD-Recordable (CD-R) drive, CD-Rewriteable (CD-RW) drive, digital versatile disc (DVD-ROM) drive, DVD-R drive, DVD-RW drive, Blu-ray disk (BD) drive) using an optical medium, a magnetic tape drive using a magnetic tape, a memory card (e.g., a USB flash drive, a compact flash (CF) card, a secure digital extended capacity (SDXC) card), a network attached storage (NAS), a direct-attached storage (DAS), a storage area network (SAN), other processor-readable physical mediums, and/or the like. A storage device may be connected to the system bus via an interface such as PCI, PCI Express, USB, FireWire, PC Card, ExpressCard, integrated drive electronics (IDE), serial advanced technology attachment (SATA), external SATA (eSATA), small computer system interface (SCSI), serial attached SCSI (SAS), fibre channel (FC), network connection (e.g., wired such as Ethernet, optical fiber, and/or the like; wireless such as Wi-Fi, Bluetooth, NFC, cellular, and/or the like), and/or the like. 
A storage device may be discrete, external, embedded, integrated (e.g., into a motherboard, into another storage device), and/or the like. A storage device may operate in combination with other storage devices to provide improved capacity, data throughput, data redundancy, and/or the like. For example, protocols such as redundant array of independent disks (RAID) (e.g., RAID 0 (striping), RAID 1 (mirroring), RAID 5 (striping with distributed parity), hybrid RAID), just a bunch of drives (JBOD), and/or the like may be used. In another example, virtual and/or physical drives may be pooled to create a storage pool. In yet another example, an SSD cache may be used with an HDD to improve speed.
  • Together and/or separately the system memory 305 and the one or more storage devices 319 may be referred to as memory 320 (i.e., physical memory).
  • VHP memory 320 contains processor-operable (e.g., accessible) VHP data stores 330. Data stores 330 comprise data that may be used (e.g., by the VHP) via the VHP coordinator. Such data may be organized using one or more data formats such as a database (e.g., a relational database with database tables, an object-oriented database, a graph database, a hierarchical database), a flat file (e.g., organized into a tabular format), a binary file (e.g., a GIF file, an MPEG-4 file), a structured file (e.g., an HTML file, an XML file), a text file, and/or the like. Furthermore, data may be organized using one or more data structures such as an array, a queue, a stack, a set, a linked list, a map, a tree, a hash, a record, an object, a directed graph, and/or the like. In various embodiments, data stores may be organized in any number of ways (i.e., using any number and configuration of data formats, data structures, VHP coordinator elements, and/or the like) to facilitate VHP operation. For example, VHP data stores may comprise data stores 330 a-d implemented as one or more databases. A users data store 330 a may be a collection of database tables that include fields such as UserID, UserName, UserPreferences, UserVideos, UserSocialNetworks, and/or the like. A clients data store 330 b may be a collection of database tables that include fields such as ClientID, ClientName, ClientDeviceType, ClientScreenResolution, and/or the like. An audio data store 330 c may be a collection of database tables that include fields such as AudioID, AudioAlbum, AudioPlaylist, AudioFormat, AudioQuality, AudioPrice, and/or the like. A videos data store 330 d may be a collection of database tables that include fields such as VideoID, VideoTitle, VideoDescription, VideoResolution, VideoEffects, VideoSharingSettings, and/or the like. The VHP coordinator may use data stores 330 to keep track of inputs, parameters, settings, variables, records, outputs, and/or the like.
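As an illustrative sketch only (the disclosure does not mandate a particular database engine), the data stores 330 a-d described above could be realized as relational tables, e.g., in SQLite; the field names follow the disclosure, while the column types and sample row are assumptions:

```python
import sqlite3

# Hypothetical realization of data stores 330a-330d as SQLite tables.
# Field names follow the disclosure; column types and the sample row
# are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users   (UserID INTEGER PRIMARY KEY, UserName TEXT,
                      UserPreferences TEXT, UserVideos TEXT,
                      UserSocialNetworks TEXT);
CREATE TABLE clients (ClientID INTEGER PRIMARY KEY, ClientName TEXT,
                      ClientDeviceType TEXT, ClientScreenResolution TEXT);
CREATE TABLE audio   (AudioID INTEGER PRIMARY KEY, AudioAlbum TEXT,
                      AudioPlaylist TEXT, AudioFormat TEXT,
                      AudioQuality TEXT, AudioPrice REAL);
CREATE TABLE videos  (VideoID INTEGER PRIMARY KEY, VideoTitle TEXT,
                      VideoDescription TEXT, VideoResolution TEXT,
                      VideoEffects TEXT, VideoSharingSettings TEXT);
""")
conn.execute("INSERT INTO videos (VideoTitle, VideoResolution) VALUES (?, ?)",
             ("My world view", "1280x720"))
row = conn.execute("SELECT VideoTitle FROM videos WHERE VideoID = 1").fetchone()
```

The same fields could equally be organized as a flat file, object store, or any of the other formats enumerated above.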
  • VHP memory 320 contains processor-operable (e.g., executable) VHP components 340. Components 340 comprise program components (including program instructions and any associated data stores) that are executed (e.g., by the VHP) via the VHP coordinator (i.e., via the processor) to transform VHP inputs into VHP outputs. It is to be understood that the various components and their subcomponents, capabilities, applications, and/or the like may be organized in any number of ways (i.e., using any number and configuration of components, subcomponents, capabilities, applications, VHP coordinator elements, and/or the like) to facilitate VHP operation. Furthermore, it is to be understood that the various components and their subcomponents, capabilities, applications, and/or the like may communicate among each other in any number of ways to facilitate VHP operation. For example, the various components and their subcomponents, capabilities, applications, and/or the like may be combined, integrated, consolidated, split up, distributed, and/or the like in any number of ways to facilitate VHP operation. In another example, a single or multiple instances of the various components and their subcomponents, capabilities, applications, and/or the like may be instantiated on each of a single VHP coordinator node, across multiple VHP coordinator nodes, and/or the like.
  • In various embodiments, program components may be developed using one or more programming languages, techniques, tools, and/or the like such as an assembly language, Ada, BASIC, C, C++, C#, COBOL, Fortran, Java, LabVIEW, Lisp, Mathematica, MATLAB, OCaml, PL/I, Smalltalk, Visual Basic for Applications (VBA), HTML, XML, CSS, JavaScript, JavaScript Object Notation (JSON), PHP, Perl, Ruby, Python, Asynchronous JavaScript and XML (AJAX), Simple Object Access Protocol (SOAP), SSL, ColdFusion, Microsoft .NET, Apache modules, Adobe Flash, Adobe AIR, Microsoft Silverlight, Windows PowerShell, batch files, Tcl, graphical user interface (GUI) toolkits, SQL, database adapters, web application programming interfaces (APIs), application server extensions, integrated development environments (IDEs), libraries (e.g., object libraries, class libraries, remote libraries), remote procedure calls (RPCs), Common Object Request Broker Architecture (CORBA), and/or the like.
  • In some embodiments, components 340 may include an operating environment component 340 a. The operating environment component may facilitate operation of the VHP via various subcomponents.
  • In some implementations, the operating environment component may include an operating system subcomponent. The operating system subcomponent may provide an abstraction layer that facilitates the use of, communication among, common services for, interaction with, security of and/or the like of various VHP coordinator elements, components, data stores, and/or the like.
  • In some embodiments, the operating system subcomponent may facilitate execution of program instructions (e.g., VHP program instructions) by the processor by providing process management capabilities. For example, the operating system subcomponent may facilitate the use of multiple processors, the execution of multiple processes, multitasking, and/or the like.
  • In some embodiments, the operating system subcomponent may facilitate the use of memory by the VHP. For example, the operating system subcomponent may allocate and/or free memory, facilitate memory addressing, provide memory segmentation and/or protection, provide virtual memory capability, facilitate caching, and/or the like. In another example, the operating system subcomponent may include a file system (e.g., File Allocation Table (FAT), New Technology File System (NTFS), Hierarchical File System Plus (HFS+), Universal Disk Format (UDF), Linear Tape File System (LTFS)) to facilitate storage, retrieval, deletion, aggregation, processing, generation, and/or the like of data.
  • In some embodiments, the operating system subcomponent may facilitate operation of and/or processing of data for and/or from input/output devices. For example, the operating system subcomponent may include one or more device drivers, interrupt handlers, file systems, and/or the like that allow interaction with input/output devices.
  • In some embodiments, the operating system subcomponent may facilitate operation of the VHP coordinator as a node in a computer network by providing support for one or more communications protocols. For example, the operating system subcomponent may include support for the internet protocol suite (i.e., Transmission Control Protocol/Internet Protocol (TCP/IP)) of network protocols such as TCP, IP, User Datagram Protocol (UDP), Mobile IP, and/or the like. In another example, the operating system subcomponent may include support for security protocols (e.g., Wired Equivalent Privacy (WEP), Wi-Fi Protected Access (WPA), WPA2) for wireless computer networks. In yet another example, the operating system subcomponent may include support for virtual private networks (VPNs).
  • In some embodiments, the operating system subcomponent may facilitate security of the VHP coordinator. For example, the operating system subcomponent may provide services such as authentication, authorization, audit, network intrusion-detection capabilities, firewall capabilities, antivirus capabilities, and/or the like.
  • In some embodiments, the operating system subcomponent may facilitate user interaction with the VHP by providing user interface elements that may be used by the VHP to generate a user interface. In one implementation, such user interface elements may include widgets (e.g., windows, dialog boxes, scrollbars, menu bars, tabs, ribbons, menus, buttons, text boxes, checkboxes, combo boxes, drop-down lists, list boxes, radio buttons, sliders, spinners, grids, labels, progress indicators, icons, tooltips, and/or the like) that may be used to obtain input from and/or provide output to the user. For example, such widgets may be used via a widget toolkit such as Microsoft Foundation Classes (MFC), Apple Cocoa Touch, Java Swing, GTK+, Qt, Yahoo! User Interface Library (YUI), and/or the like. In another implementation, such user interface elements may include sounds (e.g., event notification sounds stored in MP3 file format), animations, vibrations, and/or the like that may be used to inform the user regarding occurrence of various events. For example, the operating system subcomponent may include a user interface such as Windows Aero, Mac OS X Aqua, GNOME Shell, KDE Plasma Workspaces (e.g., Plasma Desktop, Plasma Netbook, Plasma Contour, Plasma Mobile), and/or the like.
  • In various embodiments the operating system subcomponent may comprise a single-user operating system, a multi-user operating system, a single-tasking operating system, a multitasking operating system, a single-processor operating system, a multiprocessor operating system, a distributed operating system, an embedded operating system, a real-time operating system, and/or the like. For example, the operating system subcomponent may comprise an operating system such as UNIX, LINUX, IBM i, Sun Solaris, Microsoft Windows Server, Microsoft DOS, Microsoft Windows 7, Apple Mac OS X, Apple iOS, Android, Symbian, Windows Phone 7, Blackberry QNX, and/or the like.
  • In some implementations, the operating environment component may include a database subcomponent. The database subcomponent may facilitate VHP capabilities such as storage, analysis, retrieval, access, modification, deletion, aggregation, generation, and/or the like of data (e.g., the use of data stores 330). The database subcomponent may make use of database languages (e.g., Structured Query Language (SQL), XQuery), stored procedures, triggers, APIs, and/or the like to provide these capabilities. In various embodiments the database subcomponent may comprise a cloud database, a data warehouse, a distributed database, an embedded database, a parallel database, a real-time database, and/or the like. For example, the database subcomponent may comprise a database such as Microsoft SQL Server, Microsoft Access, MySQL, IBM DB2, Oracle Database, and/or the like.
  • In some implementations, the operating environment component may include an information handling subcomponent. The information handling subcomponent may provide the VHP with capabilities to serve, deliver, upload, obtain, present, download, and/or the like a variety of information. The information handling subcomponent may use protocols such as Hypertext Transfer Protocol (HTTP), Hypertext Transfer Protocol Secure (HTTPS), File Transfer Protocol (FTP), Telnet, Secure Shell (SSH), Transport Layer Security (TLS), Secure Sockets Layer (SSL), peer-to-peer (P2P) protocols (e.g., BitTorrent), and/or the like to handle communication of information such as web pages, files, multimedia content (e.g., streaming media), applications, and/or the like.
  • In some embodiments, the information handling subcomponent may facilitate the serving of information to users, VHP components, nodes in a computer network, web browsers, and/or the like. For example, the information handling subcomponent may comprise a web server such as Apache HTTP Server, Microsoft Internet Information Services (IIS), Oracle WebLogic Server, Adobe Flash Media Server, Adobe Content Server, and/or the like. Furthermore, a web server may include extensions, plug-ins, add-ons, servlets, and/or the like. For example, these may include Apache modules, IIS extensions, Java servlets, and/or the like. In some implementations, the information handling subcomponent may communicate with the database subcomponent via standards such as Open Database Connectivity (ODBC), Java Database Connectivity (JDBC), ActiveX Data Objects for .NET (ADO.NET), and/or the like. For example, the information handling subcomponent may use such standards to store, analyze, retrieve, access, modify, delete, aggregate, generate, and/or the like data (e.g., data from data stores 330) via the database subcomponent.
  • In some embodiments, the information handling subcomponent may facilitate presentation of information obtained from users, VHP components, nodes in a computer network, web servers, and/or the like. For example, the information handling subcomponent may comprise a web browser such as Microsoft Internet Explorer, Mozilla Firefox, Apple Safari, Google Chrome, Opera Mobile, Amazon Silk, Nintendo 3DS Internet Browser, and/or the like. Furthermore, a web browser may include extensions, plug-ins, add-ons, applets, and/or the like. For example, these may include Adobe Flash Player, Adobe Acrobat plug-in, Microsoft Silverlight plug-in, Microsoft Office plug-in, Java plug-in, and/or the like.
  • In some implementations, the operating environment component may include a messaging subcomponent. The messaging subcomponent may facilitate VHP message communications capabilities. The messaging subcomponent may use protocols such as Simple Mail Transfer Protocol (SMTP), Internet Message Access Protocol (IMAP), Post Office Protocol (POP), Extensible Messaging and Presence Protocol (XMPP), Real-time Transport Protocol (RTP), Internet Relay Chat (IRC), Skype protocol, AOL's Open System for Communication in Realtime (OSCAR), Messaging Application Programming Interface (MAPI), Facebook API, and/or the like to facilitate VHP message communications. The messaging subcomponent may facilitate message communications such as email, instant messaging, Voice over IP (VoIP), video conferencing, Short Message Service (SMS), web chat, and/or the like. For example, the messaging subcomponent may comprise Microsoft Exchange Server, Microsoft Outlook, Sendmail, IBM Lotus Domino, Gmail, AOL Instant Messenger (AIM), Yahoo Messenger, ICQ, Trillian, Skype, Google Talk, Apple FaceTime, Apple iChat, Facebook Chat, and/or the like.
  • In some implementations, the operating environment component may include a security subcomponent that facilitates VHP security. In some embodiments, the security subcomponent may restrict access to the VHP, to one or more services provided by the VHP, to data associated with the VHP (e.g., stored in data stores 330), to communication messages associated with the VHP, and/or the like to authorized users. Access may be granted via a login screen, via an API that obtains authentication information, via an authentication token, and/or the like. For example, the user may obtain access by providing a username and/or a password (e.g., a string of characters, a picture password), a personal identification number (PIN), an identification card, a magnetic stripe card, a smart card, a biometric identifier (e.g., a finger print, a voice print, a retina scan, a face scan), a gesture (e.g., a swipe), a media access control (MAC) address, an IP address, and/or the like. Various security models such as access-control lists (ACLs), capability-based security, hierarchical protection domains, and/or the like may be used to control access. For example, the security subcomponent may facilitate digital rights management (DRM), network intrusion detection, firewall capabilities, and/or the like.
  • In some embodiments, the security subcomponent may use cryptographic techniques to secure information (e.g., by storing encrypted data), verify message authentication (e.g., via a digital signature), provide integrity checking (e.g., a checksum), and/or the like by facilitating encryption and/or decryption of data. Furthermore, steganographic techniques may be used instead of or in combination with cryptographic techniques. Cryptographic techniques used by the VHP may include symmetric key cryptography using shared keys (e.g., using one or more block ciphers such as triple Data Encryption Standard (DES), Advanced Encryption Standard (AES); stream ciphers such as Rivest Cipher 4 (RC4), Rabbit), asymmetric key cryptography using a public key/private key pair (e.g., using algorithms such as Rivest-Shamir-Adleman (RSA), Digital Signature Algorithm (DSA)), cryptographic hash functions (e.g., using algorithms such as Message-Digest 5 (MD5), Secure Hash Algorithm 2 (SHA-2)), and/or the like. For example, the security subcomponent may comprise a cryptographic system such as Pretty Good Privacy (PGP).
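As a minimal sketch of the integrity-checking and message-authentication capabilities named above, the following uses SHA-2 (SHA-256) for a checksum and an HMAC over a shared symmetric key; the key handling and message content are illustrative assumptions, not the VHP's specified implementation:

```python
import hashlib
import hmac
import secrets

# Illustrative sketch only (not the VHP's specified implementation):
# integrity checking via a SHA-256 checksum, and message authentication
# via an HMAC computed over a shared symmetric key.
key = secrets.token_bytes(32)            # shared key; provisioning is assumed
data = b"world view video frame"

digest = hashlib.sha256(data).hexdigest()           # integrity checksum
tag = hmac.new(key, data, hashlib.sha256).digest()  # authentication tag

# Receiver side: recompute the tag and compare in constant time.
ok = hmac.compare_digest(tag, hmac.new(key, data, hashlib.sha256).digest())
```

Asymmetric signatures (e.g., RSA, DSA) or a full system such as PGP could serve the same verification role where no shared key exists.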
  • In some implementations, the operating environment component may include a virtualization subcomponent that facilitates VHP virtualization capabilities. In some embodiments, the virtualization subcomponent may provide support for platform virtualization (e.g., via a virtual machine). Platform virtualization types may include full virtualization, partial virtualization, paravirtualization, and/or the like. In some implementations, platform virtualization may be hardware-assisted (e.g., via support from the processor using technologies such as AMD-V, Intel VT-x, and/or the like). In some embodiments, the virtualization subcomponent may provide support for various other virtualized environments such as via operating-system level virtualization, desktop virtualization, workspace virtualization, mobile virtualization, application virtualization, database virtualization, and/or the like. In some embodiments, the virtualization subcomponent may provide support for various virtualized resources such as via memory virtualization, storage virtualization, data virtualization, network virtualization, and/or the like. For example, the virtualization subcomponent may comprise VMware software suite (e.g., VMware Server, VMware Workstation, VMware Player, VMware ESX, VMware ESXi, VMware ThinApp, VMware Infrastructure), Parallels software suite (e.g., Parallels Server, Parallels Workstation, Parallels Desktop, Parallels Mobile, Parallels Virtuozzo Containers), Oracle software suite (e.g., Oracle VM Server for SPARC, Oracle VM Server for x86, Oracle VM VirtualBox, Oracle Solaris 10, Oracle Solaris 11), Informatica Data Services, Wine, and/or the like.
  • In some embodiments, components 340 may include a user interface component 340 b. The user interface component may facilitate user interaction with the VHP by providing a user interface. In various implementations, the user interface component may include programmatic instructions to obtain input from and/or provide output to the user via physical controls (e.g., physical buttons, switches, knobs, wheels, dials), textual user interface, audio user interface, GUI, voice recognition, gesture recognition, touch and/or multi-touch user interface, messages, APIs, and/or the like. In some implementations, the user interface component may make use of the user interface elements provided by the operating system subcomponent of the operating environment component. For example, the user interface component may make use of the operating system subcomponent's user interface elements via a widget toolkit. In some implementations, the user interface component may make use of information presentation capabilities provided by the information handling subcomponent of the operating environment component. For example, the user interface component may make use of a web browser to provide a user interface via HTML5, Adobe Flash, Microsoft Silverlight, and/or the like.
  • In some embodiments, components 340 may include any of the components WVS 340 c described in more detail in preceding figures.
  • The Embodiments of the VHP
  • The entirety of this disclosure (including the written description, figures, claims, abstract, appendices, and/or the like) for VIDEO HEADPHONES PLATFORM METHODS, APPARATUSES AND MEDIA shows various embodiments via which the claimed innovations may be practiced. It is to be understood that these embodiments and the features they describe are a representative sample presented to assist in understanding the claimed innovations, and are not exhaustive and/or exclusive. As such, the various embodiments, implementations, examples, and/or the like are deemed non-limiting throughout this disclosure. Furthermore, alternate undescribed embodiments may be available (e.g., equivalent embodiments). Such alternate embodiments have not been discussed in detail to preserve space and/or reduce repetition. That alternate embodiments have not been discussed in detail is not to be considered a disclaimer of such alternate undescribed embodiments, and no inference should be drawn regarding such alternate undescribed embodiments relative to those discussed in detail in this disclosure. It is to be understood that such alternate undescribed embodiments may be utilized without departing from the spirit and/or scope of the disclosure. For example, the organizational, logical, physical, functional, topological, and/or the like structures of various embodiments may differ. In another example, the organizational, logical, physical, functional, topological, and/or the like structures of the VHP coordinator, VHP coordinator elements, VHP data stores, VHP components and their subcomponents, capabilities, applications, and/or the like described in various embodiments throughout this disclosure are not limited to a fixed operating order and/or arrangement; instead, all equivalent operating orders and/or arrangements are contemplated by this disclosure.
In yet another example, the VHP coordinator, VHP coordinator elements, VHP data stores, VHP components and their subcomponents, capabilities, applications, and/or the like described in various embodiments throughout this disclosure are not limited to serial execution; instead, any number and/or configuration of threads, processes, instances, services, servers, clients, nodes, and/or the like that execute in parallel, concurrently, simultaneously, synchronously, asynchronously, and/or the like is contemplated by this disclosure. Furthermore, it is to be understood that some of the features described in this disclosure may be mutually contradictory, incompatible, inapplicable, and/or the like, and are not present simultaneously in the same embodiment. Accordingly, the various embodiments, implementations, examples, and/or the like are not to be considered limitations on the disclosure as defined by the claims or limitations on equivalents to the claims.
  • This disclosure includes innovations not currently claimed. Applicant reserves all rights in such currently unclaimed innovations including the rights to claim such innovations and to file additional provisional applications, nonprovisional applications, continuation applications, continuation-in-part applications, divisional applications, and/or the like. It is to be understood that while some embodiments of the VHP discussed in this disclosure have been directed to world view sharing video headphones, the innovations described in this disclosure may be readily applied to a wide variety of other fields and/or applications.
  • Further Disclosures
  • With a video headphone as disclosed herein, users will be able to record video through a small camera embedded in the headphones themselves. Audio can also be recorded through microphones, which can be installed in the headphone cable control unit and located underneath the video camera.
  • Once a user has finished recording video, audio can be added, music can be selected for the video, and filters can be applied. After the editing process has been completed, audio and video can be seamlessly synced and the output can be shared to desired social networks.
  • A video headphone as disclosed herein may be configured, with suitable wired interfacing, to connect to an audio source as well as to a SmartPhone USB connector.
  • The video headphone may stream audio and video signals to a SmartPhone for storage and later editing and/or uploading through a wire, with audio and video controls on the cable, that connects the video headphone to the smartphone.
  • The video headphone may be worn and/or used such that the camera's view may be adjusted to monitor what is seen by the user, and its signal output may be combined with the audio source feed (such as that which is monitored by the user) and provided to an external SmartPhone.
  • The audio performance desired for the headphone device is that typical of professional use, i.e., a bandwidth of 20 Hz to 20 kHz. The video performance desired for the camera attachment to the video headphone device shall permit motion capture at 25 fps or better, with a resolution adequate for and similar to that which is commonly downloaded to SmartPhone devices from various video streaming and downloadable sources, in typical formats such as MP4, FLV, MPEG, etc.
  • Packaging may include three connection types from headphone to various inputs of smartphone models. Connection 1 may be a Micro-USB to Apple iPhone, iPhone 3G, iPhone 4, iPhone 4S 30-Pin Style Charger Adapter Tip. Connection 2 may be a 3.5 mm Auxiliary Cable Sync Connector. Connection 3 may be a Micro USB Cable.
  • An app for use with a video headphone may:
      • Add filters to videos (e.g., sepia filter, black and white filter, monochromatic filter).
      • Record video (e.g., of any length, or for up to 15 seconds).
      • Provide the user with the ability to edit video by shortening timeline.
      • Add various frames around video.
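The sepia filter mentioned above can be sketched as a per-pixel color transform; the weighting matrix below is the commonly cited sepia matrix and is an assumption, as the disclosure does not specify filter coefficients:

```python
# Per-pixel sepia transform over (R, G, B) tuples, using the commonly
# cited sepia weighting matrix (an assumption; the disclosure does not
# give coefficients). Results are clamped to the 0-255 range.
def sepia(pixels):
    out = []
    for r, g, b in pixels:
        tr = min(255, int(0.393 * r + 0.769 * g + 0.189 * b))
        tg = min(255, int(0.349 * r + 0.686 * g + 0.168 * b))
        tb = min(255, int(0.272 * r + 0.534 * g + 0.131 * b))
        out.append((tr, tg, tb))
    return out

white_sepia = sepia([(255, 255, 255)])  # [(255, 255, 238)]
```

A black-and-white or monochromatic filter would follow the same pattern with a different per-pixel weighting.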
  • Audio features of a system according to the present invention may include:
      • Ability to import songs from a user's playlist into a video recording.
        • Ability to store the user's playlist locally on the user's device or on a web server.
        • Uncompressed formats such as WAV, AIFF, AU, or raw header-less PCM, and formats with lossless compression, such as FLAC, Monkey's Audio (filename extension APE), WavPack (filename extension WV), TTA, ATRAC Advanced Lossless, Apple Lossless (filename extension m4a), MPEG-4 SLS, MPEG-4 ALS, MPEG-4 DST, Windows Media Audio Lossless (WMA Lossless), and Shorten (SHN), and/or the like audio formats may be supported.
      • Ability to sync the audio and video recordings from any point in the timeline to record the final output.
      • Ability to upload the final video to Facebook, Tumblr, Twitter, Instagram, Pinterest, Vimeo, or YouTube.
      • Ability to change audio playback speed (2× or 3× faster or slower) via a button.
      • Ability to apply an auto-tune effect to the audio via a single button.
      • Ability to post artist credit for the song being used via a single button; the artist credit will scroll across the video.
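The 2×/3× speed control listed above can be illustrated with naive resampling of mono PCM samples; a production implementation would filter and/or preserve pitch, so this sketch only shows the timeline arithmetic:

```python
# Naive mono-PCM speed change: keep every Nth sample to speed up, repeat
# each sample to slow down. This only illustrates the timeline arithmetic;
# it does not preserve pitch or apply any anti-aliasing filter.
def change_speed(samples, factor):
    if factor >= 1:                      # e.g., 2 or 3 -> 2x/3x faster
        return samples[::int(factor)]
    step = round(1 / factor)             # e.g., 0.5 -> each sample repeated twice
    return [s for s in samples for _ in range(step)]

samples = [0, 1, 2, 3, 4, 5]
fast = change_speed(samples, 2)    # [0, 2, 4]
slow = change_speed(samples, 0.5)  # [0, 0, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
```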
  • A system according to the present invention may:
      • Provide a user with the ability to create a profile, including username, location, sex, age, and email address.
      • Provide the ability to store user profile data at a central web database server via a REST API.
      • Provide audio/video sync software to time music to actions in the video.
      • Provide playlist suggestions of songs from the user's playlist that have the same tempo as the actions in the video.
      • Provide the ability for the user to pay for audio and video features (e.g., video filters and audio enhancements).
      • Provide the ability to combine two videos into one, edit the music, and make a single video.
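The tempo-matching playlist suggestion above can be sketched as follows; the BPM estimate from action timestamps, the song titles, and the BPM metadata are illustrative assumptions, since the disclosure does not specify how tempo is measured:

```python
# Estimate the "action tempo" of a video from action timestamps (seconds)
# and suggest playlist songs whose BPM is within a tolerance. Song titles
# and BPM values are hypothetical; how BPM metadata is obtained is not
# specified by the disclosure.
def action_tempo_bpm(timestamps):
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return 60.0 / (sum(gaps) / len(gaps))   # mean action interval -> BPM

def suggest_songs(playlist, timestamps, tolerance_bpm=5.0):
    target = action_tempo_bpm(timestamps)
    return [title for title, bpm in playlist
            if abs(bpm - target) <= tolerance_bpm]

playlist = [("Song A", 118.0), ("Song B", 140.0), ("Song C", 122.0)]
picks = suggest_songs(playlist, [0.0, 0.5, 1.0, 1.5, 2.0])  # 0.5 s gaps -> 120 BPM
```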

Claims (5)

1. A processor-implemented method to share a world view video, comprising:
receiving via a processor a request to record a user's world view video;
recording via video headphones the world view video;
adjusting via the processor the world view video based on user instructions; and
sharing via the processor the adjusted world view video with the user's social network.
2. The method of claim 1, wherein the video headphones comprise audio headphones with an embedded video camera and a microphone.
3. The method of claim 1, wherein the adjusting the world view video comprises trimming the world view video.
4. The method of claim 1, wherein the adjusting the world view video comprises adding audio to the world view video.
5. The method of claim 1, wherein the adjusting the world view video comprises adding video effects to the world view video.
US14/092,059 2012-11-29 2013-11-27 Video headphones platform methods, apparatuses and media Abandoned US20140147099A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/092,059 US20140147099A1 (en) 2012-11-29 2013-11-27 Video headphones platform methods, apparatuses and media

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261731126P 2012-11-29 2012-11-29
US14/092,059 US20140147099A1 (en) 2012-11-29 2013-11-27 Video headphones platform methods, apparatuses and media

Publications (1)

Publication Number Publication Date
US20140147099A1 true US20140147099A1 (en) 2014-05-29

Family

ID=50773384

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/092,088 Active US10652640B2 (en) 2012-11-29 2013-11-27 Video headphones, system, platform, methods, apparatuses and media
US14/092,059 Abandoned US20140147099A1 (en) 2012-11-29 2013-11-27 Video headphones platform methods, apparatuses and media
US16/845,836 Abandoned US20200245048A1 (en) 2012-11-29 2020-04-10 Video headphones, system, platform, methods, apparatuses and media

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/092,088 Active US10652640B2 (en) 2012-11-29 2013-11-27 Video headphones, system, platform, methods, apparatuses and media

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/845,836 Abandoned US20200245048A1 (en) 2012-11-29 2020-04-10 Video headphones, system, platform, methods, apparatuses and media

Country Status (6)

Country Link
US (3) US10652640B2 (en)
EP (2) EP3637419A3 (en)
JP (2) JP6622588B2 (en)
KR (1) KR20150092220A (en)
CN (1) CN105027206A (en)
WO (1) WO2014085610A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150172238A1 (en) * 2013-12-18 2015-06-18 Lutebox Ltd. Sharing content on devices with reduced user actions
CN106341756A (en) * 2016-08-29 2017-01-18 北海爱飞数码科技有限公司 Customized intelligent sound box
CN108419069A (en) * 2018-03-13 2018-08-17 中浙信科技咨询有限公司 A kind of 3D light path acquisition systems based on digital camera equipment
WO2018187189A1 (en) * 2017-04-08 2018-10-11 Ryan Fuller Magnetic camera coupling system
US11177975B2 (en) 2016-06-13 2021-11-16 At&T Intellectual Property I, L.P. Movable smart device for appliances
US11546460B2 (en) 2020-10-27 2023-01-03 Rakuten Group, Inc. Terminal device, advertisement display method, and computer readable medium storing terminal program
US11711463B2 (en) 2020-10-27 2023-07-25 Rakuten Group, Inc. Telephone advertisement system, telephone advertisement method, and computer readable medium storing telephone advertisement program

Families Citing this family (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9152993B2 (en) * 2011-05-27 2015-10-06 Sean Werner Systems and methods for a website application for the purpose of trading, bartering, swapping, or exchanging personal property through a social networking environment
US9116545B1 (en) * 2012-03-21 2015-08-25 Hayes Solos Raffle Input detection
US11229995B2 (en) 2012-05-31 2022-01-25 Black & Decker Inc. Fastening tool nail stop
US9827658B2 (en) 2012-05-31 2017-11-28 Black & Decker Inc. Power tool having latched pusher assembly
US20180048750A1 (en) * 2012-06-15 2018-02-15 Muzik, Llc Audio/video wearable computer system with integrated projector
US10110805B2 (en) * 2012-12-06 2018-10-23 Sandisk Technologies Llc Head mountable camera system
US10061349B2 (en) * 2012-12-06 2018-08-28 Sandisk Technologies Llc Head mountable camera system
US9459768B2 (en) * 2012-12-12 2016-10-04 Smule, Inc. Audiovisual capture and sharing framework with coordinated user-selectable audio and video effects filters
US9666234B2 (en) * 2013-01-30 2017-05-30 Windense Ltd. Motion picture processing system
US9826275B2 (en) * 2013-02-27 2017-11-21 Comcast Cable Communications, Llc Enhanced content interface
US9544682B2 (en) * 2013-06-05 2017-01-10 Echostar Technologies L.L.C. Apparatus, method and article for providing audio of different programs
US9219967B2 (en) 2013-11-25 2015-12-22 EchoStar Technologies, L.L.C. Multiuser audiovisual control
EP2851001A3 (en) * 2014-12-03 2015-04-22 Sensirion AG Wearable electronic device
US9686602B2 (en) * 2015-06-15 2017-06-20 Uneo Inc. Green headphone
US9583142B1 (en) 2015-07-10 2017-02-28 Musically Inc. Social media platform for creating and sharing videos
CN106358104A (en) * 2015-07-15 2017-01-25 视讯联合科技股份有限公司 Headset with concealed video screen and various sensors
USD801347S1 (en) 2015-07-27 2017-10-31 Musical.Ly, Inc Display screen with a graphical user interface for a sound added video making and sharing app
USD801348S1 (en) 2015-07-27 2017-10-31 Musical.Ly, Inc Display screen with a graphical user interface for a sound added video making and sharing app
USD788137S1 (en) * 2015-07-27 2017-05-30 Musical.Ly, Inc Display screen with animated graphical user interface
US20180095653A1 (en) * 2015-08-14 2018-04-05 Martin Hasek Device, method and graphical user interface for handwritten interaction
WO2017113307A1 (en) * 2015-12-31 2017-07-06 深圳市柔宇科技有限公司 Head-mounted display and method for adjusting camera thereof
US9939843B2 (en) * 2016-01-05 2018-04-10 360fly, Inc. Apparel-mountable panoramic camera systems
CN107371076A (en) * 2016-05-11 2017-11-21 中兴通讯股份有限公司 A kind of earphone and terminal device with projecting function
WO2018027237A1 (en) * 2016-08-05 2018-02-08 Sportscastr.Live Llc Systems, apparatus, and methods for scalable low-latency viewing of broadcast digital content streams of live events
US10349259B2 (en) * 2016-09-23 2019-07-09 Apple Inc. Broadcasting a device state in a wireless communication network
US11298080B2 (en) * 2016-11-11 2022-04-12 Sony Mobile Communications Inc. Reproduction terminal and reproduction method
JP2019536395A (en) 2016-11-13 2019-12-12 エンボディーヴィーアール、インコーポレイテッド System and method for capturing an image of the pinna and using the pinna image to characterize human auditory anatomy
US10701506B2 (en) 2016-11-13 2020-06-30 EmbodyVR, Inc. Personalized head related transfer function (HRTF) based on video capture
USD845975S1 (en) * 2017-01-13 2019-04-16 Koninklijke Philips N.V. Display screen with animated graphical user interface
USD831685S1 (en) * 2017-01-13 2018-10-23 Koninklijke Philips N.V. Display screen with animated graphical user interface
US20200045094A1 (en) * 2017-02-14 2020-02-06 Bluejay Technologies Ltd. System for Streaming
GB201702386D0 (en) 2017-02-14 2017-03-29 Bluejay Tech Ltd System for streaming
CN108632695B (en) 2017-03-16 2020-11-13 广州思脉时通讯科技有限公司 Earphone set
US10091549B1 (en) * 2017-03-30 2018-10-02 Rovi Guides, Inc. Methods and systems for recommending media assets based on the geographic location at which the media assets are frequently consumed
CN107124544A (en) * 2017-03-31 2017-09-01 北京疯景科技有限公司 Panorama shooting device and method
WO2019036533A1 (en) * 2017-08-16 2019-02-21 Veritaz Inc. Personal display headset for mitigating user access to disallowed resources
CN111373364B (en) * 2017-09-25 2023-06-20 深圳传音通讯有限公司 Earphone for intercepting audio files and control method thereof
US11722832B2 (en) 2017-11-14 2023-08-08 Sony Corporation Signal processing apparatus and method, and program
JP7068480B2 (en) * 2018-02-22 2022-05-16 ライン プラス コーポレーション Computer programs, audio playback devices and methods
CN110490578A (en) * 2018-05-14 2019-11-22 天津科技大学 NFC safety payment system based on LabVIEW and fingerprint recognition
US10467981B1 (en) 2018-06-13 2019-11-05 Dell Products, Lp Method and apparatus for providing interface between dedicated discrete graphics processing unit and head mounted display using type-C universal standard bus
USD955403S1 (en) 2019-02-14 2022-06-21 Pax Labs, Inc. Display screen or portion thereof with graphical user interface
US11786694B2 (en) 2019-05-24 2023-10-17 NeuroLight, Inc. Device, method, and app for facilitating sleep
USD956773S1 (en) 2019-08-20 2022-07-05 Pax Labs, Inc. Display screen or portion thereof with graphical user interface
USD899445S1 (en) * 2019-09-05 2020-10-20 Canopy Growth Corporation Display screen or portion thereof with transitional graphical user interface
USD894914S1 (en) * 2019-09-05 2020-09-01 Canopy Growth Corporation Display screen or portion thereof with transitional graphical user interface
USD945390S1 (en) * 2019-09-24 2022-03-08 Beijing Xiaomi Mobile Software Co., Ltd. Mobile phone with animated graphical user interface
US10873852B1 (en) 2020-04-10 2020-12-22 Avila Technology, LLC POOFster: a secure mobile text message and object sharing application, system, and method for same
US11151229B1 (en) 2020-04-10 2021-10-19 Avila Technology, LLC Secure messaging service with digital rights management using blockchain technology
US11356792B2 (en) 2020-06-24 2022-06-07 International Business Machines Corporation Selecting a primary source of text to speech based on posture
US11778408B2 (en) 2021-01-26 2023-10-03 EmbodyVR, Inc. System and method to virtually mix and audition audio content for vehicles
US20220407772A1 (en) * 2021-06-16 2022-12-22 Hewlett-Packard Development Company, L.P. Configuration profiles

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6091546A (en) * 1997-10-30 2000-07-18 The Microoptical Corporation Eyeglass interface system
USD448015S1 (en) * 2000-05-30 2001-09-18 Marie Lapalme Video camera headset
US20040041904A1 (en) * 2002-09-03 2004-03-04 Marie Lapalme Method and apparatus for telepresence
US20060090135A1 (en) * 2002-06-20 2006-04-27 Takahito Fukuda Job guiding system
US20060109350A1 (en) * 2004-11-24 2006-05-25 Ming-Hsiang Yeh Glasses type audio-visual recording apparatus
US20060239648A1 (en) * 2003-04-22 2006-10-26 Kivin Varghese System and method for marking and tagging wireless audio and video recordings
US7365766B1 (en) * 2000-08-21 2008-04-29 Marie Lapalme Video-assisted apparatus for hearing impaired persons
US20090247245A1 (en) * 2004-12-14 2009-10-01 Andrew Strawn Improvements in or Relating to Electronic Headset Devices and Associated Electronic Devices
US20110085041A1 (en) * 2009-10-04 2011-04-14 Michael Rogler Kildevaeld Stably aligned portable image capture and projection
US20120096357A1 (en) * 2010-10-15 2012-04-19 Afterlive.tv Inc Method and system for media selection and sharing
US8188937B1 (en) * 1999-09-06 2012-05-29 Shimadzu Corporation Body mounting type display system
US20120207308A1 (en) * 2011-02-15 2012-08-16 Po-Hsun Sung Interactive sound playback device
US20120237050A1 (en) * 2011-03-16 2012-09-20 Panasonic Corporation Sound pickup device
US20120252483A1 (en) * 2011-01-04 2012-10-04 Qualcomm Incorporated Camera enabled headset for navigation
US20120287284A1 (en) * 2011-05-10 2012-11-15 Kopin Corporation Headset computer that uses motion and voice commands to control information display and remote devices
US20130033610A1 (en) * 2011-08-05 2013-02-07 Calvin Osborn Wearable video camera with dual rotatable imagine devices
US20130162632A1 (en) * 2009-07-20 2013-06-27 Real Time Companies, LLC Computer-Aided System for 360° Heads Up Display of Safety/Mission Critical Data
US20130202274A1 (en) * 2011-12-02 2013-08-08 Eric Chan Video camera band and system
US8593570B2 (en) * 2008-11-07 2013-11-26 Looxcie, Inc. Video recording camera headset
US20140254939A1 (en) * 2011-11-24 2014-09-11 Ntt Docomo, Inc. Apparatus and method for outputting information on facial expression
US9953034B1 (en) * 2012-04-17 2018-04-24 Google Llc System and method for sharing trimmed versions of digital media items

Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07170479A (en) * 1993-10-15 1995-07-04 Matsushita Electric Ind Co Ltd Video camera
JP3460380B2 (en) * 1995-04-28 2003-10-27 ソニー株式会社 Video camera adapter and video camera device
WO2002080027A1 (en) * 2001-03-29 2002-10-10 British Telecommunications Public Limited Company Image processing
JP2004221666A (en) * 2003-01-09 2004-08-05 Canon Inc Imaging recording apparatus
JP2005172851A (en) * 2003-12-05 2005-06-30 Sony Corp Image display apparatus
CN1286316C (en) * 2003-12-12 2006-11-22 乐金电子(沈阳)有限公司 Device for synthesizing and editing audio and video based on TV set
JP4737496B2 (en) * 2004-07-06 2011-08-03 ソニー株式会社 REPRODUCTION SYSTEM, REPRODUCTION DEVICE AND METHOD, RECORDING MEDIUM, AND PROGRAM
US20060204214A1 (en) * 2005-03-14 2006-09-14 Microsoft Corporation Picture line audio augmentation
US8045727B2 (en) * 2005-09-30 2011-10-25 Atmel Corporation Headset power management
JP2007121694A (en) * 2005-10-28 2007-05-17 Sharp Corp Mobile terminal device, headphone and reproducing control method of mobile terminal device
GB2443990B (en) * 2005-11-26 2009-01-28 Wolfson Microelectronics Plc Audio device
CN100501568C (en) * 2005-12-30 2009-06-17 鸿富锦精密工业(深圳)有限公司 Camera earphone module and portable electronic apparatus
US20070162855A1 (en) * 2006-01-06 2007-07-12 Kelly Hawk Movie authoring
US8868465B2 (en) * 2006-01-13 2014-10-21 Yahoo! Inc. Method and system for publishing media content
JP2011172273A (en) * 2006-05-08 2011-09-01 Mieko Tsuyusaki Mobile system
US8805164B2 (en) * 2006-05-24 2014-08-12 Capshore, Llc Method and apparatus for creating a custom track
JP2008067219A (en) * 2006-09-08 2008-03-21 Sony Corp Imaging apparatus and imaging method
DE102008010040A1 (en) * 2008-02-20 2009-08-27 Kirner, Markus A. Display arrangement for displaying e.g. film, of mobile telephone, has foldable video spectacles wiredly or wirelessly connected with mobile communication device to display image data on mobile communication device
US20090281435A1 (en) * 2008-05-07 2009-11-12 Motorola, Inc. Method and apparatus for robust heart rate sensing
KR101511193B1 (en) * 2009-02-27 2015-04-10 파운데이션 프로덕션, 엘엘씨 Headset-based telecommunications platform
US8860865B2 (en) * 2009-03-02 2014-10-14 Burning Moon, Llc Assisted video creation utilizing a camera
US8975514B2 (en) * 2010-01-07 2015-03-10 Zipbuds, LLC. Cable organization assemblies
US20130278631A1 (en) * 2010-02-28 2013-10-24 Osterhout Group, Inc. 3d positioning of augmented reality information
JP6016322B2 (en) * 2010-03-19 2016-10-26 ソニー株式会社 Information processing apparatus, information processing method, and program
CN201682601U (en) * 2010-05-17 2010-12-22 济南铁诺运通机械电子有限公司 Laser positioning headset camera
FR2963682B1 (en) * 2010-08-04 2012-09-21 St Microelectronics Rousset METHOD OF DETECTING OBJECT BY MEANS OF PROXIMITY SENSOR
PL392068A1 (en) * 2010-08-05 2012-02-13 Zuza Pictures Spółka Z Ograniczoną Odpowiedzialnością Portable video-telecommunication device, method for data transmission, especially audio/video data and their application
US20120204225A1 (en) * 2011-02-08 2012-08-09 Activepath Ltd. Online authentication using audio, image and/or video
US20140013271A1 (en) * 2012-07-05 2014-01-09 Research In Motion Limited Prioritization of multitasking applications in a mobile device interface


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Pigear, Earphone Camera PV500 Series, http://pigear.com/store/Earphone-Camera-PV500-Series-p530.html (hereinafter "Pigear"). *
Pigear, http://pigear.com/store/Earphone-Camera-PV500-Series-p530.html, 2006. *
Singaspy Tonga, TKS-523.D Earphone Camera, http://www.singaspytonga.com/TKS-523.D.html, 1983. *


Also Published As

Publication number Publication date
JP6622588B2 (en) 2019-12-18
JP2016509380A (en) 2016-03-24
CN105027206A (en) 2015-11-04
EP2926338B1 (en) 2019-09-18
EP2926338A1 (en) 2015-10-07
US20140161412A1 (en) 2014-06-12
KR20150092220A (en) 2015-08-12
EP2926338A4 (en) 2016-07-27
EP3637419A3 (en) 2020-07-22
EP2926338B8 (en) 2020-03-04
US20200245048A1 (en) 2020-07-30
US10652640B2 (en) 2020-05-12
EP3637419A2 (en) 2020-04-15
WO2014085610A1 (en) 2014-06-05
JP2020096354A (en) 2020-06-18

Similar Documents

Publication Publication Date Title
US20200245048A1 (en) Video headphones, system, platform, methods, apparatuses and media
US11019261B2 (en) Video headphones, systems, helmets, methods and video content files
US9245227B2 (en) Expert answer platform methods, apparatuses and media
US10728349B2 (en) Tru torrent platform methods, apparatuses and media
US9735973B2 (en) Expert answer platform methods, apparatuses and media
US20130246327A1 (en) Expert answer platform methods, apparatuses and media
US9558577B2 (en) Rhythmic mosaic generation methods, apparatuses and media
US20140016147A1 (en) Mosaic generating platform methods, apparatuses and media
US20140185958A1 (en) Mosaic generating platform methods, apparatuses and media
US20140365393A1 (en) Transportation capacity augmentation program methods, apparatuses and media
US11068282B2 (en) Apparatuses, methods and systems for persisting values in a computing environment
US20160117786A1 (en) Residential pipeline platform methods, apparatuses, and media
US10592196B2 (en) Mosaic generating platform methods, apparatuses and media
US20230185588A1 (en) System for simultaneous recording of the pixels of a screen and of accessibility data
US10127000B2 (en) Mosaic generating platform methods, apparatuses and media
WO2023111681A1 (en) Device usage measurement engine apparatuses, methods, systems and media
WO2014062919A1 (en) Rhythmic mosaic generation methods, apparatuses and media
WO2014011198A1 (en) Mosaic generating platform methods, apparatuses and media

Legal Events

Date Code Title Description
STCC Information on status: application revival

Free format text: WITHDRAWN ABANDONMENT, AWAITING EXAMINER ACTION

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

AS Assignment

Owner name: SOUNDSIGHT IP, LLC, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SOUNDSIGHT, LLC;REEL/FRAME:050774/0844

Effective date: 20190101

Owner name: SOUNDSIGHT, LLC, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHASE, STEPHEN;GORILOVSKY, DMITRY;SOUNDSIGHT MOBILE LLC;REEL/FRAME:050774/0672

Effective date: 20181111

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION