US20160344926A1 - Camera Tap Switch - Google Patents
Camera Tap Switch
- Publication number
- US20160344926A1 (application US15/230,485)
- Authority
- US
- United States
- Prior art keywords
- camera
- taps
- computing device
- motion
- functionality
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H04N5/23216
- H04N23/62 — Control of parameters via user interfaces
- G06F1/1694 — Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
- G06F1/1686 — Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated camera
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- H04N23/50 — Constructional details
- H04N23/51 — Housings
- H04N23/60 — Control of cameras or camera modules
- H04N23/80 — Camera processing pipelines; Components thereof
- H04N5/2252
- H04N5/2254
- G06F2200/1636 — Sensing arrangement for detection of a tap gesture on the housing
Definitions
- Buttons are not always an appealing feature to add to hardware, such as a camera. This is particularly true when the hardware or camera has a small form factor. Additional physical buttons can crowd the hardware or camera and can lead to an unaesthetic appearance. Further, crowded buttons increase the likelihood that a user will inadvertently press the wrong button.
- a wearable camera that can be worn by a user.
- the wearable camera includes an accelerometer that can be used to detect camera motion.
- Input can be provided to the camera in the form of one or more taps which have an associated motion profile, as sensed by the accelerometer.
- the tap or taps can be mapped to camera functionality to activate the functionality.
- the camera includes a microphone which detects sound around the camera.
- the microphone can be used to sense a noise profile associated with the tap or taps that are received by the camera.
- the noise profile can be used, together with the motion profile, to confirm the input as a tap input.
- FIG. 1 is an example camera device in accordance with one or more embodiments.
- FIG. 2 illustrates an example camera device in accordance with one or more embodiments.
- FIG. 3 illustrates an example camera device in accordance with one or more embodiments.
- FIG. 4 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- FIG. 5 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- FIG. 6 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- FIG. 7 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- FIG. 8 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- FIG. 9 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- a wearable camera that can be worn by a user.
- the wearable camera includes an accelerometer that can be used to detect camera motion.
- Input can be provided to the camera in the form of one or more taps which have an associated motion profile, as sensed by the accelerometer.
- the tap or taps can be mapped to camera functionality to activate the functionality. Any suitable type of functionality can be mapped to the tap or taps.
- the camera includes a microphone which detects sound around the camera.
- the microphone can be used to sense a noise profile associated with the tap or taps that are received by the camera.
- the noise profile can be used, together with the motion profile, to confirm the input as a tap input. This can help to disambiguate various other types of input that might be received by the camera. For example, the user may be wearing the camera and may jump up and down. The jump may have a motion profile that is similar to that of a tap. By looking for a noise profile associated with a tap when a tap-like motion profile is received, the camera can confirm whether or not the received input is a tap.
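The confirmation step described above — accepting a tap only when a tap-like motion profile coincides with a tap-like noise profile — can be sketched as follows. This is a minimal illustration, not the patent's implementation: the threshold values, the sample representation, and the `max_offset` coincidence window are all assumptions made for the example.

```python
def looks_like_tap(accel_samples, audio_samples,
                   accel_threshold=2.5, audio_threshold=0.6,
                   max_offset=3):
    """Return True only when a sharp motion spike coincides with a sound spike."""
    motion_peaks = [i for i, a in enumerate(accel_samples)
                    if abs(a) >= accel_threshold]
    noise_peaks = [i for i, s in enumerate(audio_samples)
                   if abs(s) >= audio_threshold]
    # A jump may produce a tap-like motion spike, but without the
    # characteristic sound of a finger striking the housing there is
    # no nearby noise peak, so the input is rejected.
    return any(abs(m - n) <= max_offset
               for m in motion_peaks for n in noise_peaks)
```

With these illustrative thresholds, a motion spike accompanied by a coincident audio spike confirms a tap, while the same motion spike in near-silence (e.g., a jump) is rejected.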
- the camera can be worn in any suitable location.
- the camera can be worn on a user's head such as, by way of example and not limitation, a hat-mounted camera, glasses-mounted camera, headband-mounted camera, helmet-mounted camera, and the like.
- the camera can be worn on locations other than the user's head.
- the camera can be configured to be mounted on the user's clothing.
- a wearable camera that is mountable on a user's clothing
- the camera is designed to be unobtrusive and user-friendly insofar as it is mounted away from the user's face so as not to interfere with the user's view.
- the camera includes a housing and a clip mounted to the housing to enable the camera to be clipped onto the user's clothing.
- the camera is designed to be lightweight, with its weight balanced toward the user when clipped to the user's clothing.
- the camera includes a replay mode.
- When the replay mode is selected, the camera automatically captures image data, such as video or still images, and saves the image data to a memory buffer.
- the size of the memory buffer can be set by the user to determine how much image data is to be collected. Once the memory buffer is full, the older image data is erased to make room for currently-captured image data.
- a record button can be activated which saves the image data from the beginning of the memory buffer and continues recording until the user presses the record button again. In this manner, if an event occurs, the user is assured of capturing the event from a time t−x, where x is the length, in time, of the memory buffer.
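The replay-mode behavior just described — a fixed-size buffer that discards its oldest frames, with recording preserved from the start of the buffer — can be sketched as below. The class and method names are hypothetical, and the capacity is counted in frames as a stand-in for the user-set buffer length.

```python
from collections import deque

class ReplayBuffer:
    """Circular pre-record buffer keeping only the newest `capacity` frames."""
    def __init__(self, capacity):
        self.frames = deque(maxlen=capacity)  # oldest frames fall off automatically
        self.recording = None

    def capture(self, frame):
        if self.recording is not None:
            self.recording.append(frame)      # record mode: keep everything
        else:
            self.frames.append(frame)         # replay mode: overwrite oldest

    def start_record(self):
        # Recording begins from the start of the buffer (time t - x),
        # so the buffered frames are preserved ahead of the live ones.
        self.recording = list(self.frames)

    def stop_record(self):
        saved, self.recording = self.recording, None
        return saved
```

Pressing "record" thus yields the buffered pre-event frames followed by everything captured until "record" is pressed again.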
- Example Environment describes an example environment in which the various embodiments can be utilized.
- Replay Functionality describes an example replay mode in accordance with one or more embodiments.
- Dual Encoding describes an embodiment in which captured image data can be dual encoded in accordance with one or more embodiments.
- Photo Log describes an example photo log in accordance with one or more embodiments.
- a section entitled “Camera Tap Switch” describes a camera tap switch in accordance with one or more embodiments.
- FIG. 1 illustrates a schematic of a camera device 100 in accordance with one or more embodiments.
- the camera device 100 includes a lens 102 having a focal length that is suitable for covering a scene to be pictured.
- a mechanical device may be included with the lens 102 to enable auto or manual focusing of the lens.
- the camera device 100 may be a fixed focus device in which no mechanical assembly is included to move the lens 102 .
- a sensor 104 having a sensing surface (not shown) is also included to convert an image formed by the incoming light on the sensing surface of the sensor 104 into a digital format.
- the sensor 104 may include a charge-coupled device (CCD) or complementary metal oxide semiconductor (CMOS) image sensor for scanning the incoming light and creating a digital picture.
- CCD charge-coupled device
- CMOS complementary metal oxide semiconductor
- Other technologies or devices may be used so long as the used device is capable of converting an image formed by the incoming light on a sensing surface into digital form.
- these image detection devices determine the effects of light on tiny light sensitive devices and record the changes in a digital format.
- the camera device 100 may include other components such as a battery or power source and other processor components that are required for a processor to operate. However, to avoid obfuscating the teachings, these well-known components are being omitted.
- the camera device 100 does not include a view finder or a preview display. In other embodiments, however, a preview display may be provided.
- the techniques described herein can be used in any type of camera, and are particularly effective in small, highly portable cameras, such as those implemented in mobile telephones and other portable user equipment.
- the camera device 100 includes hardware or software for making and receiving phone calls. Alternately, the camera device 100 can be a dedicated, stand-alone camera.
- the camera device 100 further includes a motion detector 108 that can include an accelerometer and, in some embodiments, a gyroscope.
- the accelerometer is used for determining the direction of gravity and acceleration in any direction.
- the gyroscope may also be used either in addition to the accelerometer or instead of the accelerometer.
- the gyroscope can provide information about how the rotational angle of the camera device 100 changes over time. Using the rotational-angle information, the angle through which the camera device 100 has been rotated may be calculated. Any other type of sensor may be used to detect the camera's motion.
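As a simple illustration of how a rotation angle can be derived from gyroscope output, angular-rate samples can be integrated over time. The sampling interval, units, and function name below are assumptions made for the example.

```python
def rotation_angle(angular_rates, dt):
    """Integrate gyroscope angular-rate samples (degrees/second)
    over a fixed sampling interval dt (seconds)."""
    return sum(rate * dt for rate in angular_rates)

# Four samples of 90 deg/s, each covering 0.25 s, imply a 90-degree rotation.
```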
- input can be provided to the camera in the form of one or more taps which have an associated motion profile, as sensed by the accelerometer.
- the tap or taps can be mapped to camera functionality to activate the functionality. Any suitable type of functionality can be mapped to the tap or taps.
- the camera includes a microphone which detects sound around the camera.
- the microphone can be used to sense a noise profile associated with the tap or taps that are received by the camera.
- the noise profile can be used, together with the motion profile, to confirm the input as a tap input. This can help to disambiguate various other types of input that might be received by the camera, as noted above and below.
- an input/output (I/O) port 114 for connecting the camera device 100 to an external device, including a general purpose computer.
- the I/O port 114 may be used for enabling the external device to configure the camera device 100 or to upload/download data.
- the I/O port 114 may also be used for streaming video or pictures from the camera device 100 to the external device.
- the I/O port may also be used for powering the camera device 100 or charging a rechargeable battery (not shown) in the camera device 100 .
- the camera device 100 may also include an antenna 118 that is coupled to a transmitter/receiver (Tx/Rx) module 116 .
- the Tx/Rx module 116 is coupled to a processor 106 .
- the antenna 118 may be fully or partly exposed outside the body of the camera device 100 . However, in another embodiment, the antenna 118 may be fully encapsulated within the body of the camera device 100 .
- the Tx/Rx module 116 may be configured for Wi-Fi transmission/reception, Bluetooth transmission/reception or both. In another embodiment, the Tx/Rx module 116 may be configured to use a proprietary protocol for transmission/reception of the radio signals.
- any radio transmission or data transmission standard may be used so long as the used standard is capable of transmitting/receiving digital data and control signals.
- the Tx/Rx module 116 is a low power module with a transmission range of less than ten feet. In another embodiment, the Tx/Rx module 116 is a low power module with a transmission range of less than five feet. In other embodiments, the transmission range may be configurable using control signals received by the camera device 100 either via the I/O port 114 or via the antenna 118 .
- the camera device 100 further includes a processor 106 .
- the processor 106 is coupled to the sensor 104 and the motion detector 108 .
- the processor 106 may also be coupled to storage 110 , which, in one embodiment, is external to the processor 106 .
- the storage 110 may be used for storing programming instructions for controlling and operating other components of the camera device 100 .
- the storage 110 may also be used for storing captured media (e.g., pictures and/or videos). In another embodiment, the storage 110 may be a part of the processor 106 itself.
- the processor 106 may include an image processor 112 .
- the image processor 112 may be a hardware component or may also be a software module that is executed by the processor 106 . It may be noted that the processor 106 and/or the image processor 112 may reside in different chips. For example, multiple chips may be used to implement the processor 106 .
- the image processor 112 may be a Digital Signal Processor (DSP).
- DSP Digital Signal Processor
- the image processor can be configured as a processing module, that is a computer program executable by a processor.
- the image processor 112 is used to process a raw image received from the sensor 104 based, at least in part, on the input received from the motion detector 108.
- Other components such as Image Signal Processor (ISP) may be used for image processing.
- ISP Image Signal Processor
- the storage 110 is configured to store both raw (unmodified image) and the corresponding modified image.
- the storage 110 can include a memory buffer, such as a flash memory buffer, that can be used as a circular buffer to facilitate capturing image data when the camera is set to a replay mode that is supported by replay module 120 .
- the replay module 120 can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. When the replay mode is selected, the camera automatically captures image data, such as video or still images, and saves the image data to the memory buffer.
- the size of the memory buffer can be set by the user to determine how much image data is to be collected.
- a record button can be activated which saves the image data from the beginning of the memory buffer and continues recording until the user presses the record button again. In this manner, if an event occurs, the user is assured of capturing the event from a time t−x, where x is the length, in time, of the memory buffer.
- a processor buffer (not shown) may also be used to store the image data.
- the pictures can be downloaded to the external device via the I/O port 114 or via the wireless channels using the antenna 118 .
- both unmodified and modified images are downloaded to the external device when the external device sends a command to download images from the camera device 100.
- the camera device 100 may be configured to start capturing a series of images at a selected interval.
- a raw image from the sensor 104 is inputted to an image processor (such as an ISP) for image processing or blur detection.
- once image processing is applied to the image outputted by the image processor, the modified image is encoded.
- the image encoding is typically performed to compress the image data.
- the camera device 100 may not include the components for processing the image captured by the sensor 104 .
- the camera device 100 may include programming instructions to transmit the raw image after extracting the image from the sensor 104 to a cloud based processing system that is connected to the camera device 100 via the Internet or a local area network.
- the cloud based system is configured to receive the raw image and process the image or images as described above and below.
- the encoded image is then either stored in a selected cloud based storage or the image is sent back to the camera device 100 or to any other device according to a user configuration.
- the use of a cloud based image processing system can reduce a need for incorporating several image processing components in each camera device, thus making a camera device lighter, more energy efficient and cheaper.
- the camera device 100 may send either a raw image or the image processed through an image processor to another device, e.g., a mobile phone or a computer.
- the image may be transmitted to the mobile phone (or a computer) for further processing via Wi-Fi, Bluetooth or any other type of networking protocol that is suitable for transmitting digital data from one device to another device.
- the mobile device receives the image or images, according to one or more embodiments described herein, the produced image may be saved to local storage on the device, transferred for storage in a cloud based storage system, or transmitted to another device, according to user or system configurations.
- the native image processing system in the camera device 100 may produce images and/or videos in a non-standard format. For example, a 1200×1500 pixel image may be produced. This may be done by cropping, scaling, or using an image sensor with a non-standard resolution. Since methods for transforming images into a selected standard resolution are well known, they are not discussed further here.
- Various embodiments described above and below can be implemented utilizing a computer-readable storage medium that includes instructions that enable a processing unit to implement one or more aspects of the disclosed methods, as well as a system configured to implement one or more aspects of the disclosed methods.
- By “computer-readable storage medium” is meant all statutory forms of media. Accordingly, non-statutory forms of media, such as carrier waves and signals per se, are not intended to be covered by the term “computer-readable storage medium”.
- camera device 100 can assume any suitable form of wearable camera.
- the camera can be worn in any suitable location relative to a user.
- the camera can be worn on a user's head such as, by way of example and not limitation, a hat-mounted camera, glasses-mounted camera, headband-mounted camera, helmet-mounted camera, and the like.
- the camera can be worn on locations other than the user's head.
- the camera can be configured to be mounted on the user's clothing or other items carried by a user, such as a backpack, purse, briefcase, and the like.
- a wearable camera is described in the context of a camera that is mountable on the user's clothing. It is to be appreciated and understood, however, that other types of non-clothing mountable, wearable cameras can be utilized without departing from the spirit and scope of the claimed subject matter.
- FIG. 2 illustrates an example camera device 200 in a front elevational view
- FIG. 3 illustrates the camera device 200 in a side elevational view
- the camera device 200 includes a housing 202 that contains the components described in FIG. 1 .
- a camera lens 204 (FIG. 2)
- a fastening device 300 (FIG. 3)
- the fastening device 300 includes a prong 302 with a body having a thumb-engageable portion 304 .
- the body extends along an axis away from the thumb-engageable portion 304 toward a distal terminus 306 .
- a spring mechanism formed by the body or separate from and internal relative to the body, enables prong 302 to be opened responsive to pressure being applied to the thumb-engageable portion 304 .
- a piece of clothing can be inserted into area 308 .
- when the thumb-engageable portion 304 is released, the clothing is clamped in place by the prong 302, thereby securely mounting the camera device on a piece of clothing.
- the camera device can be mounted, as described above, on a necktie, blouse, shirt, pocket, and the like.
- camera device 200 can include a number of input buttons shown generally at 310 .
- the input buttons can include, by way of example and not limitation, an input button to take a still picture, an input button to initiate the replay mode, an input button to initiate a video capture mode, and an input button to enable the user to adjust the buffer size that is utilized during the replay mode. It is to be appreciated and understood that the various input buttons can be located anywhere on the camera device 200 .
- the camera device 200 can be manufactured in any shape and size suitable and sufficient to accommodate the above-described components of the camera device 100.
- the housing 202 of the camera device may be made of a metal molding, a synthetic material molding or a combination thereof. In other embodiments, any suitable type of material may be used to provide a durable and strong outer shell for typical portable device use.
- the fastening device 300 can comprise any suitable type of fastening device.
- the fastening device may be a simple slip-on clip, a crocodile clip, a hook, a Velcro fastener, or a magnet or a piece of metal to receive a magnet.
- the camera device 200 may be affixed permanently or semi-permanently to another object using the fastening device 300 .
- any of the functions described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), or a combination of these implementations.
- the terms “module,” “functionality,” “component” and “logic” as used herein generally represent software, firmware, hardware, or a combination thereof.
- the module, functionality, or logic represents program code that performs specified tasks when executed on a processor (e.g., CPU or CPUs).
- the program code can be stored in one or more computer readable memory devices.
- the camera device 200 may include a computer-readable medium that may be configured to maintain instructions that cause the camera's software and associated hardware to perform operations.
- the instructions function to configure the camera's software and associated hardware to perform the operations and in this way result in transformation of the software and associated hardware to perform functions.
- the instructions may be provided by the computer-readable medium to the camera device through a variety of different configurations.
- One such configuration of a computer-readable medium is a signal bearing medium and thus is configured to transmit the instructions (e.g., as a carrier wave) to the camera device, such as via a network.
- the computer-readable medium may also be configured as a computer-readable storage medium and thus is not a signal bearing medium. Examples of a computer-readable storage medium include a random-access memory (RAM), read-only memory (ROM), an optical disc, flash memory, hard disk memory, and other memory devices that may use magnetic, optical, and other techniques to store instructions and other data.
- camera device 200 includes a replay mode.
- when the replay mode is selected, as by the user pressing an input button associated with initiating the replay mode, the camera automatically captures image data, such as video or still images, and saves the image data to a memory buffer.
- the memory buffer is a circular buffer that saves an amount of image data, for example video data.
- when the memory buffer is full of image data, it deletes the oldest image data to make room for newly recorded image data. This continues until either the user exits the replay mode or presses a button associated with initiating video capture, i.e., the “record” button.
- the size of the memory buffer can be set by the user to determine how much image data is to be collected.
- the user might set the length of the memory buffer to correspond to 5 seconds, 30 seconds, 1 minute, 2 minutes, and longer.
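A user-selected buffer length in seconds translates directly into a byte budget once a video bitrate is assumed. The 4 Mbit/s figure below is purely illustrative; the patent does not specify a bitrate or encoding.

```python
def buffer_size_bytes(seconds, bitrate_bps):
    """Bytes needed to buffer `seconds` of video captured at `bitrate_bps`."""
    return seconds * bitrate_bps // 8

# Two minutes of video at an assumed 4 Mbit/s:
# 120 * 4_000_000 // 8 = 60,000,000 bytes (roughly 60 MB of flash).
```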
- the memory buffer comprises flash memory.
- a pointer is used to designate where, in flash memory, the beginning of the captured video data occurs, e.g., the beginning of the last 2 minutes of video data prior to entering the “record” mode.
- the video data captured during replay mode and “record” mode can be written to an alternate storage location.
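The pointer mechanism described above — a fixed flash region written circularly, with a pointer designating where the beginning of the captured video occurs — might look like the sketch below. The class is hypothetical, and real flash management concerns (erase blocks, wear leveling) are omitted.

```python
class FlashRing:
    """Fixed flash region used as a ring; `start` points at the oldest frame."""
    def __init__(self, capacity):
        self.slots = [None] * capacity
        self.write = 0        # next slot to write
        self.start = 0        # pointer to the beginning of the captured video
        self.count = 0

    def append(self, frame):
        self.slots[self.write] = frame
        self.write = (self.write + 1) % len(self.slots)
        if self.count < len(self.slots):
            self.count += 1
        else:
            self.start = self.write   # oldest frame was just overwritten

    def read_from_start(self):
        """Frames in capture order, beginning at the `start` pointer."""
        return [self.slots[(self.start + i) % len(self.slots)]
                for i in range(self.count)]
```

Entering "record" mode then amounts to reading out from `start` rather than erasing and rewriting the buffered data.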
- FIG. 4 is a flow diagram that describes steps in a method in accordance with one or more embodiments.
- the method can be performed in connection with any suitable hardware, software, firmware, or combination thereof.
- the method is performed by a suitably-configured camera device such as the one described above.
- Step 400 receives input associated with a replay mode.
- This step can be performed in any suitable way. For example, in at least some embodiments, this step can be performed by receiving input from the user via a suitable input device on the camera device.
- Responsive to receiving the input associated with the replay mode, step 402 captures image data and saves the image data to a memory buffer.
- Step 404 ascertains whether the buffer is full. If the buffer is not full, the method returns to step 402 and continues to capture image data and save image data to the memory buffer. If, on the other hand, the buffer is full, step 406 deletes the oldest image data in the memory buffer and returns to step 402 to capture subsequent image data.
- FIG. 5 is a flow diagram that describes steps in another method in accordance with one or more embodiments.
- the method which allows a user to set the camera device's memory buffer size, can be performed in connection with any suitable hardware, software, firmware, or combination thereof.
- the method is performed by a suitably-configured camera device such as the one described above.
- Step 500 receives input to set a memory buffer size.
- the step can be performed in any suitable way.
- the step can be performed by receiving user input by way of a suitably-configured input mechanism such as a button on the camera device. Responsive to receiving this input, step 502 sets the memory buffer size.
- Step 504 receives input associated with a replay mode.
- This step can be performed in any suitable way. For example, in at least some embodiments, this step can be performed by receiving input from the user via a suitable input device on the camera device.
- step 506 captures image data and saves the image data to a memory buffer.
- Step 508 ascertains whether the buffer is full. If the buffer is not full, the method returns to step 506 and continues to capture image data and save image data to the memory buffer. If, on the other hand, the buffer is full, step 510 deletes the oldest image data in the memory buffer and returns to step 506 to capture subsequent image data.
- FIG. 6 is a flow diagram that describes steps in another method in accordance with one or more embodiments.
- the method can be performed in connection with any suitable hardware, software, firmware, or combination thereof.
- the method is performed by a suitably-configured camera device such as the one described above.
- Step 600 captures image data and saves the image data to a memory buffer.
- the step can be performed in any suitable way.
- the step can be performed as described in connection with FIG. 4 or 5 .
- Step 602 receives input to enter the camera device's record mode. This step can be performed, for example, by receiving user input by way of a “record” button.
- step 604 saves image data from the beginning of the memory buffer.
- This step can be performed in any suitable way.
- the step can be performed by setting a pointer to point to the beginning of the memory buffer.
- Step 606 saves currently captured image data in addition to the image data from the beginning of the memory buffer. This step can be performed until the user presses the “record” button once more.
- the camera device's processor 106 (FIG. 1 ) is configured to encode image data at different levels of resolution.
- the camera device can encode image data at a low level of resolution and at a high level of resolution as well. Any suitable levels of resolution can be utilized.
- the low level of resolution is Quarter-VGA (e.g., 320 ⁇ 240) and the high level of resolution is 720p (e.g., 1280 ⁇ 720).
- Encoding image data at different resolution levels can enhance the user's experience insofar as giving the user various options for transferring the saved image data.
- the captured image data can be streamed to a device such as a smart phone.
- the captured image data can also be transferred to a network device such as a laptop or desktop computer.
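A sketch of how the two encodings might be selected per destination. The resolution pairs come from the examples above (Quarter-VGA and 720p); the destination names and the selection rule are assumptions made for illustration.

```python
# Hypothetical resolution table; the text names QVGA and 720p as examples.
ENCODINGS = {
    "low":  (320, 240),    # Quarter-VGA, suited to streaming to a phone
    "high": (1280, 720),   # 720p, suited to transfer to a laptop or desktop
}

def pick_encoding(destination):
    """Choose which of the two encoded streams to send to a given device."""
    return ENCODINGS["low"] if destination == "phone" else ENCODINGS["high"]
```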
- Photo log refers to a feature that enables a user to log their day in still photos at intervals of their own choosing. So, for example, if the user wishes to photo log their day at every 3 minutes, they can provide input to the camera device so that every 3 minutes the camera automatically takes a still photo and saves it. At the end of the day, the user will have documented their day with a number of different still photos.
- In one or more embodiments, the photo log feature can work in concert with the replay mode described above. Specifically, the camera device's processor can process portions of the captured video data at defined intervals to provide the still photos. This can be performed in any suitable way. For example, the camera device's processor can read predefined areas of the camera's photosensor and process the read areas into the still photos. In at least some embodiments, the photo format is a square format, so that its aspect ratio is different from that of the video data.
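The interval-sampling idea behind the photo log can be sketched as follows; the frame rate, interval, and function name are assumptions for illustration only:

```python
# Given video captured at a known frame rate, keep one still frame every
# `interval_s` seconds (the example in the text uses 3-minute intervals).
def photo_log(frames, fps, interval_s):
    step = int(fps * interval_s)   # number of frames between consecutive stills
    return frames[::step]

# Tiny illustration: 30 fps video, one still every 2 seconds.
frames = list(range(300))          # 10 seconds of "video"
stills = photo_log(frames, fps=30, interval_s=2)
print(stills)                      # [0, 60, 120, 180, 240]
```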
- In one or more embodiments, input can be received by the camera in the form of one or more taps which have an associated motion profile, as sensed by the accelerometer. The tap can occur at any suitable location on the camera's housing. Alternately, a dedicated tap area may be provided and may have characteristics to facilitate detection. These characteristics may include, by way of example and not limitation, being formed from a material that produces a richer or more identifiable profile. The tap or taps can be mapped to camera functionality to activate the functionality. Any suitable type of functionality can be mapped to the tap or taps. For example, in at least some embodiments, a tap or taps can be mapped to functionality that enables video taken by the camera to be bookmarked.
- FIG. 7 is a flow diagram that describes steps in another method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method is performed by a suitably-configured camera device such as the one described above.
- Step 700 receives one or more motion inputs.
- Step 702 ascertains whether the one or more motion inputs are a tap or taps. This step can be performed in any suitable way. In at least some embodiments, the step is performed through the use of an accelerometer. Specifically, a tap or taps can have an associated profile or profiles as represented by accelerometer data. These profiles can be stored on the camera device in storage, such as storage 110. When a motion input is received, the accelerometer can detect the motion and produce associated accelerometer data. The camera's processor can then analyze the accelerometer data associated with the motion input and look for a particular stored profile associated with the tap or taps. If an input is received that has a profile that matches or resembles the stored tap profile(s) to a certain degree, the processor can ascertain that the input is a tap.
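One simple way such profile matching might look in practice is sketched below. The thresholds and spike heuristic are invented for illustration and are not the patent's algorithm:

```python
# A tap typically shows up in accelerometer data as a short, sharp spike,
# whereas sustained motion (walking, running) produces a long run of high
# readings. This toy classifier compares peak magnitude and spike duration.
def is_tap(samples, peak_threshold=2.0, max_spike_samples=5):
    above = [i for i, s in enumerate(samples) if abs(s) > peak_threshold]
    if not above:
        return False                       # nothing exceeded the peak threshold
    # A tap's spike is brief; sustained motion is not.
    return (above[-1] - above[0]) < max_spike_samples

tap = [0.1, 0.2, 3.5, 0.3, 0.1]            # sharp, short spike
walk = [2.5] * 50                           # sustained high acceleration
print(is_tap(tap))     # True
print(is_tap(walk))    # False
```

A real implementation would compare the incoming window against stored tap profiles (as in storage 110 above) rather than fixed thresholds.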
- Step 704 accesses a camera functionality that is associated with the input that was received. Any suitable type of functionality can be accessed. One example of such functionality is functionality that enables video, currently being taken by the camera, to be bookmarked.
- Step 706 activates the camera functionality. This step can be performed in any suitable way. For example, where the camera functionality bookmarks video taken by the camera, the step can be performed by actually bookmarking the video data.
- In at least some embodiments, different combinations of taps can be mapped to different camera functionality. For example, a single tap might be mapped to a first functionality, a double tap might be mapped to a second functionality, and a triple tap might be mapped to a third functionality. Alternately or additionally, different tap patterns can be mapped to different functionalities. For example, two taps in rapid succession followed by a third somewhat delayed tap might be mapped to one functionality, while a single tap followed by two taps in rapid succession might be mapped to another, different functionality.
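The tap-to-functionality mapping can be illustrated with a simple dispatch table. The mapped functions are hypothetical — the patent names only bookmarking as a concrete example:

```python
# Hypothetical camera functionalities; only bookmarking comes from the text.
def bookmark_video():
    return "bookmark"

def take_still():
    return "still"

def toggle_record():
    return "record"

TAP_MAP = {
    1: bookmark_video,   # single tap -> first functionality
    2: take_still,       # double tap -> second functionality
    3: toggle_record,    # triple tap -> third functionality
}

def dispatch(tap_count):
    handler = TAP_MAP.get(tap_count)
    return handler() if handler else None   # unmapped counts are ignored

print(dispatch(1))   # bookmark
print(dispatch(2))   # still
```

Timing-based patterns (e.g., tap-tap-pause-tap) would key the table on inter-tap gaps as well as counts, but the lookup-and-dispatch structure is the same.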
- FIG. 8 is a flow diagram that describes steps in another method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method is performed by a suitably-configured camera device such as the one described above.
- Step 800 receives one or more motion inputs.
- Step 802 ascertains whether the one or more motion inputs are a tap or taps. This step can be performed in any suitable way. In at least some embodiments, this step is performed through the use of an accelerometer. Specifically, a tap or taps can have an associated profile or profiles as represented by accelerometer data. These profiles can be stored on the camera device in storage, such as storage 110. When a motion input is received, the accelerometer can detect the motion and produce associated accelerometer data. The camera's processor can then analyze the accelerometer data associated with the motion input and look for a particular stored profile associated with the tap or taps. If an input is received that has a profile that matches or resembles the stored tap profile(s) to a certain degree, the processor can ascertain that the input is a tap.
- Step 804 maps the tap or taps to a particular functionality. Specifically, in this embodiment, different combinations of taps can be mapped to different functionalities, respectively. Accordingly, this step ascertains which of a number of different functionalities the tap or taps are associated with.
- Step 806 accesses the camera functionality that is associated with the input that was received. As noted above, any suitable type of functionality can be accessed.
- Step 808 activates the camera functionality. This step can be performed in any suitable way.
- Further, in at least some embodiments, the camera includes a microphone which detects sound around the camera. The microphone can be used to sense or create a noise profile associated with motion input that is received by the camera. The noise profile can be used, together with the motion profile, to confirm whether or not the motion input is a tap input. This can help to disambiguate various other types of input that might be received by the camera. For example, the user may be wearing the camera and may jump up and down. The jump may have a motion profile that is similar to that of a tap. However, the jump also has a noise profile. By comparing both profiles with those stored for a tap, the camera can confirm whether or not the received motion input is a tap. That is, although the motion profile of the jump may be similar to that of a tap, if the noise profile of the jump is different from that of a tap, the camera can conclude that the motion input is not, in fact, a tap.
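The two-profile confirmation can be sketched as follows. The placeholder classifiers and thresholds are assumptions for illustration, not the patent's method:

```python
# A motion input is accepted as a tap only if BOTH its motion profile and its
# noise profile resemble the stored tap profiles.
def matches_motion_profile(accel_window):
    # Placeholder classifier: tap-like motion has a high peak acceleration.
    return max(abs(s) for s in accel_window) > 2.0

def matches_noise_profile(audio_window):
    # A tap against the housing produces a short, loud click; a jump does not.
    return max(abs(s) for s in audio_window) > 0.8

def confirm_tap(accel_window, audio_window):
    return matches_motion_profile(accel_window) and matches_noise_profile(audio_window)

tap_accel, tap_audio = [0.1, 3.2, 0.1], [0.0, 0.9, 0.0]
jump_accel, jump_audio = [0.2, 3.0, 0.2], [0.1, 0.2, 0.1]  # tap-like motion, no click

print(confirm_tap(tap_accel, tap_audio))     # True
print(confirm_tap(jump_accel, jump_audio))   # False  (disambiguated by sound)
```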
- FIG. 9 is a flow diagram that describes steps in another method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method is performed by a suitably-configured camera device such as the one described above.
- Step 900 receives one or more motion inputs.
- Step 902 receives noise input associated with the motion input. In at least some embodiments, the step can be performed using the camera's microphone.
- Step 904 ascertains whether the one or more motion inputs are a tap or taps.
- This step can be performed in any suitable way. In at least some embodiments, this step is performed through the use of an accelerometer and a microphone. Specifically, a tap or taps have a particular motion profile that is represented by accelerometer data and stored on the camera device. Further, the tap or taps have a particular noise profile that is also stored on the camera device.
- When a motion input is received, the camera's processor can analyze the accelerometer data and the noise input, derive motion and noise profiles from the data, and then look for a stored tap profile that matches both the motion and the noise. If an input is received whose motion and noise profiles match or resemble the stored tap profiles to a certain degree, the processor can ascertain that the input is a tap.
- Step 906 maps the tap or taps to a particular functionality. Specifically, in some embodiments, different combinations of taps are mapped to different functionalities respectively. Accordingly, this step can ascertain which of a number of different functionalities the tap or taps are associated with. Alternately, a single camera functionality, such as bookmarking a video, may be associated with tap input. In this case, the tap or taps are mapped to the single functionality.
- Step 908 accesses the camera functionality that is associated with the input that was received. As noted above, any suitable type of functionality can be accessed. Step 910 activates the camera functionality. This step can be performed in any suitable way.
- Thus, various embodiments provide a wearable camera that can be worn by a user. The wearable camera includes an accelerometer that can be used to detect camera motion. Input can be provided to the camera in the form of one or more taps which have an associated motion profile, as sensed by the accelerometer. The tap or taps can be mapped to camera functionality to activate the functionality. In at least some embodiments, the camera includes a microphone which detects sound around the camera. The microphone can be used to sense a noise profile associated with the tap or taps that are received by the camera. The noise profile can be used, together with the motion profile, to confirm the input as a tap input.
Abstract
Various embodiments provide a wearable camera that can be worn by a user. The wearable camera includes an accelerometer that can be used to detect camera motion. Input can be provided to the camera in the form of one or more taps which have an associated motion profile, as sensed by the accelerometer. The tap or taps can be mapped to camera functionality to activate the functionality.
Description
- This application is a continuation of U.S. patent application Ser. No. 14/716,355, filed May 19, 2015, entitled "Camera Tap Switch", which is a continuation of U.S. patent application Ser. No. 13/871,905, filed Apr. 26, 2013, entitled "Camera Tap Switch", the entire disclosures of which are hereby incorporated by reference herein in their entirety.
- Physical buttons are not always an appealing feature to add to hardware, such as a camera. This is particularly true when the hardware or camera has a small form factor. Additional physical buttons can create a crowding situation on the hardware or camera and can lead to an unaesthetic appearance. Further, crowded buttons increase the likelihood that a user will inadvertently press the wrong button.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter.
- Various embodiments provide a wearable camera that can be worn by a user. The wearable camera includes an accelerometer that can be used to detect camera motion. Input can be provided to the camera in the form of one or more taps which have an associated motion profile, as sensed by the accelerometer. The tap or taps can be mapped to camera functionality to activate the functionality.
- In at least some embodiments, different combinations of taps can be mapped to different camera functionality. Further, in at least some embodiments, the camera includes a microphone which detects sound around the camera. The microphone can be used to sense a noise profile associated with the tap or taps that are received by the camera. The noise profile can be used, together with the motion profile, to confirm the input as a tap input.
- The detailed description references the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.
-
FIG. 1 is an example camera device in accordance with one or more embodiments. -
FIG. 2 illustrates an example camera device in accordance with one or more embodiments. -
FIG. 3 illustrates an example camera device in accordance with one or more embodiments. -
FIG. 4 is a flow diagram that describes steps in a method in accordance with one or more embodiments. -
FIG. 5 is a flow diagram that describes steps in a method in accordance with one or more embodiments. -
FIG. 6 is a flow diagram that describes steps in a method in accordance with one or more embodiments. -
FIG. 7 is a flow diagram that describes steps in a method in accordance with one or more embodiments. -
FIG. 8 is a flow diagram that describes steps in a method in accordance with one or more embodiments. -
FIG. 9 is a flow diagram that describes steps in a method in accordance with one or more embodiments. - Overview
- Various embodiments provide a wearable camera that can be worn by a user. The wearable camera includes an accelerometer that can be used to detect camera motion. Input can be provided to the camera in the form of one or more taps which have an associated motion profile, as sensed by the accelerometer. The tap or taps can be mapped to camera functionality to activate the functionality. Any suitable type of functionality can be mapped to the tap or taps.
- In at least some embodiments, different combinations of taps can be mapped to different camera functionality. Further, in at least some embodiments, the camera includes a microphone which detects sound around the camera. The microphone can be used to sense a noise profile associated with the tap or taps that are received by the camera. The noise profile can be used, together with the motion profile, to confirm the input as a tap input. This can help to disambiguate various other types of input that might be received by the camera. For example, the user may be wearing the camera and may jump up and down. The jump may have a motion profile that is similar to that of a tap. By looking for a noise profile associated with a tap when a tap-like motion profile is received, the camera can confirm whether or not the received input is a tap.
- The camera can be worn in any suitable location. For example, the camera can be worn on a user's head such as, by way of example and not limitation, a hat-mounted camera, glasses-mounted camera, headband-mounted camera, helmet-mounted camera, and the like. Alternately or additionally, the camera can be worn on locations other than the user's head. For example, the camera can be configured to be mounted on the user's clothing.
- Various other embodiments provide a wearable camera that is mountable on a user's clothing. The camera is designed to be unobtrusive and user-friendly insofar as it is mounted away from the user's face so as not to interfere with their view. In at least some embodiments, the camera includes a housing and a clip mounted to the housing to enable the camera to be clipped onto the user's clothing. The camera is designed to be lightweight, with its weight balanced toward the user when clipped to the user's clothing.
- In one or more embodiments, the camera includes a replay mode. When the replay mode is selected, the camera automatically captures image data, such as video or still images, and saves the image data to a memory buffer. In at least some embodiments, the size of the memory buffer can be set by the user to determine how much image data is to be collected. Once the memory buffer is full, the older image data is erased to make room for currently-captured image data. If an event occurs that the user wishes to memorialize through video or still images, a record button can be activated which saves the image data from the beginning of the memory buffer and continues recording until the user presses the record button again. In this manner, if an event occurs, the user is assured of capturing the event from a time t-x, where x is the length of the memory buffer, in time.
- In the discussion that follows, a section entitled "Example Environment" describes an example environment in which the various embodiments can be utilized. Next, a section entitled "Replay Functionality" describes an example replay mode in accordance with one or more embodiments. Following this, a section entitled "Dual Encoding" describes an embodiment in which captured image data can be dual encoded in accordance with one or more embodiments. Next, a section entitled "Photo Log" describes an example photo log in accordance with one or more embodiments. Following this, a section entitled "Camera Tap Switch" describes a camera tap switch in accordance with one or more embodiments.
- Consider now an example environment in which various embodiments can be practiced.
- Example Environment
-
FIG. 1 illustrates a schematic of a camera device 100 in accordance with one or more embodiments. The camera device 100 includes a lens 102 having a focal length that is suitable for covering a scene to be pictured. In one embodiment, a mechanical device may be included with the lens 102 to enable auto or manual focusing of the lens. In another embodiment, the camera device 100 may be a fixed focus device in which no mechanical assembly is included to move the lens 102. A sensor 104 having a sensing surface (not shown) is also included to convert an image formed by the incoming light on the sensing surface of the sensor 104 into a digital format. The sensor 104 may include a charge-coupled device (CCD) or complementary metal oxide semiconductor (CMOS) image sensor for scanning the incoming light and creating a digital picture. Other technologies or devices may be used so long as the used device is capable of converting an image formed by the incoming light on a sensing surface into the digital form. Typically, these image detection devices determine the effects of light on tiny light sensitive devices and record the changes in a digital format. - It should be appreciated that the
camera device 100 may include other components such as a battery or power source and other processor components that are required for a processor to operate. However, to avoid obfuscating the teachings, these well-known components are being omitted. In one embodiment, thecamera device 100 does not include a view finder or a preview display. In other embodiments, however, a preview display may be provided. The techniques described herein can be used in any type of camera, and are particularly effective in small, highly portable cameras, such as those implemented in mobile telephones and other portable user equipment. Thus, in one embodiment, thecamera device 100 includes hardware or software for making and receiving phone calls. Alternately, thecamera device 100 can be a dedicated, stand-alone camera. - In at least some embodiments, the
camera device 100 further includes amotion detector 108 that can include an accelerometer and, in some embodiments, a gyroscope. The accelerometer is used for determining the direction of gravity and acceleration in any direction. The gyroscope may also be used either in addition to the accelerometer or instead of the accelerometer. The gyroscope can provide information about how the rotational angle of thecamera device 100 changes over time. Any other type of sensor may be used to detect the camera's motion. Using the rotational angle, an angle of rotation of thecamera device 100 may be calculated, if thecamera device 100 is rotated. In at least some embodiments, input can be provided to the camera in the form of one or more taps which have an associated motion profile, as sensed by the accelerometer. The tap or taps can be mapped to camera functionality to activate the functionality. Any suitable type of functionality can be mapped to the tap or taps. - In at least some embodiments, different combinations of taps can be mapped to different camera functionality. Further, in at least some embodiments, the camera includes a microphone which detects sound around the camera. The microphone can be used to sense a noise profile associated with the tap or taps that are received by the camera. The noise profile can be used, together with the motion profile, to confirm the input as a tap input. This can help to disambiguate various other types of input that might be received by the camera, as noted above and below.
- Further included is an input/output (I/O)
port 114 for connecting thecamera device 100 to an external device, including a general purpose computer. The I/O port 114 may be used for enabling the external device to configure thecamera device 100 or to upload/download data. In one embodiment, the I/O port 114 may also be used for streaming video or pictures from thecamera device 100 to the external device. In one embodiment, the I/O port may also be used for powering thecamera device 100 or charging a rechargeable battery (not shown) in thecamera device 100. - The
camera device 100 may also include anantenna 118 that is coupled to a transmitter/receiver (Tx/Rx)module 116. The Tx/Rx module 116 is coupled to aprocessor 106. Theantenna 118 may be fully or partly exposed outside the body of thecamera device 100. However, in another embodiment, theantenna 118 may be fully encapsulated within the body of thecamera device 100. The Tx/Rx module 116 may be configured for Wi-Fi transmission/reception, Bluetooth transmission/reception or both. In another embodiment, the Tx/Rx module 116 may be configured to use a proprietary protocol for transmission/reception of the radio signals. In yet another embodiment, any radio transmission or data transmission standard may be used so long as the used standard is capable of transmitting/receiving digital data and control signals. In one embodiment, the Tx/Rx module 116 is a low power module with a transmission range of less than ten feet. In another embodiment, the Tx/Rx module 116 is a low power module with a transmission range of less than five feet. In other embodiments, the transmission range may be configurable using control signals received by thecamera device 100 either via the I/O port 114 or via theantenna 118. - The
camera device 100 further includes a processor 106. The processor 106 is coupled to the sensor 104 and the motion detector 108. The processor 106 may also be coupled to storage 110, which, in one embodiment, is external to the processor 106. The storage 110 may be used for storing programming instructions for controlling and operating other components of the camera device 100. The storage 110 may also be used for storing captured media (e.g., pictures and/or videos). In another embodiment, the storage 110 may be a part of the processor 106 itself. - In one embodiment, the
processor 106 may include animage processor 112. Theimage processor 112 may be a hardware component or may also be a software module that is executed by theprocessor 106. It may be noted that theprocessor 106 and/or theimage processor 112 may reside in different chips. For example, multiple chips may be used to implement theprocessor 106. In one example, theimage processor 112 may be a Digital Signal Processor (DSP). The image processor can be configured as a processing module, that is a computer program executable by a processor. In at least some embodiments, theprocessor 112 is used to process a raw image received from thesensor 104 based, at least in part, on the input received from themotion detector 108. Other components such as Image Signal Processor (ISP) may be used for image processing. - In one embodiment, the
storage 110 is configured to store both the raw (unmodified) image and the corresponding modified image. In one or more embodiments, the storage 110 can include a memory buffer, such as a flash memory buffer, that can be used as a circular buffer to facilitate capturing image data when the camera is set to a replay mode that is supported by replay module 120. The replay module 120 can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. When the replay mode is selected, the camera automatically captures image data, such as video or still images, and saves the image data to the memory buffer. In at least some embodiments, the size of the memory buffer can be set by the user to determine how much image data is to be collected. If an event occurs that the user wishes to memorialize through video or still images, a record button can be activated which saves the image data from the beginning of the memory buffer and continues recording until the user presses the record button again. In this manner, if an event occurs, the user is assured of capturing the event from a time t-x, where x is the length of the memory buffer, in time. - A processor buffer (not shown) may also be used to store the image data. The pictures can be downloaded to the external device via the I/
O port 114 or via the wireless channels using theantenna 118. In one embodiment, both unmodified and modified images are downloaded to the external device when the external device sends a command to download images from thecamera device 110. In one embodiment, thecamera device 100 may be configured to start capturing a series of images at a selected interval. - In one embodiment, a raw image from the
sensor 104 is inputted to an image processor (such as an ISP) for image processing or blur detection. After image processing is applied to the image outputted by the image processor, the modified image is encoded. The image encoding is typically performed to compress the image data. - In an example embodiment, the
camera device 100 may not include the components for processing the image captured by thesensor 104. Instead, thecamera device 100 may include programming instructions to transmit the raw image after extracting the image from thesensor 104 to a cloud based processing system that is connected to thecamera device 100 via the Internet or a local area network. The cloud based system is configured to receive the raw image and process the image or images as described above and below. The encoded image is then either stored in a selected cloud based storage or the image is sent back to thecamera device 100 or to any other device according to a user configuration. The use of a cloud based image processing system can reduce a need for incorporating several image processing components in each camera device, thus making a camera device lighter, more energy efficient and cheaper. - In another example embodiment, instead of a cloud based image processing, the
camera device 100 may send either a raw image or the image processed through an image processor to another device, e.g., a mobile phone or a computer. The image may be transmitted to the mobile phone (or a computer) for further processing via Wi-Fi, Bluetooth or any other type of networking protocol that is suitable for transmitting digital data from one device to another device. After the mobile device receives the image or images, according to one or more embodiments described herein, the produced image may be saved to local storage on the device, transferred for storage in a cloud based storage system, or transmitted to another device, according to user or system configurations. - In one embodiment, the native image processing system in the
camera device 100 may produce images and/or videos in a non-standard format. For example, a 1200×1500 pixel image may be produced. This may be done by cropping, scaling, or using an image sensor with a non-standard resolution. Since methods for transforming images in a selected standard resolution are well-known, there will be no further discussion on this topic. - Various embodiments described above and below can be implemented utilizing a computer-readable storage medium that includes instructions that enable a processing unit to implement one or more aspects of the disclosed methods as well as a system configured to implement one or more aspects of the disclosed methods. By “computer-readable storage medium” is meant all statutory forms of media. Accordingly, non-statutory forms of media such as carrier waves and signals per se are not intended to be covered by the term “computer-readable storage medium”.
- As noted above,
camera device 100 can assume any suitable form of wearable camera. The camera can be worn in any suitable location relative to a user. For example, the camera can be worn on a user's head such as, by way of example and not limitation, a hat-mounted camera, glasses-mounted camera, headband-mounted camera, helmet-mounted camera, and the like. Alternately or additionally, the camera can be worn on locations other than the user's head. For example, the camera can be configured to be mounted on the user's clothing or other items carried by a user, such as a backpack, purse, briefcase, and the like.
- Moving on to
FIGS. 2 and 3 , consider the following.FIG. 2 illustrates anexample camera device 200 in a front elevational view, whileFIG. 3 illustrates thecamera device 200 in a side elevational view. Thecamera device 200 includes ahousing 202 that contains the components described inFIG. 1 . Also illustrated is a camera lens 204 (FIG. 2 ) and a fastening device 300 (FIG. 3 ) in the form of a clip that operates in a manner that is similar to a clothespin. Specifically, thefastening device 300 includes aprong 302 with a body having a thumb-engageable portion 304. The body extends along an axis away from the thumb-engageable portion 304 toward adistal terminus 306. A spring mechanism, formed by the body or separate from and internal relative to the body, enablesprong 302 to be opened responsive to pressure being applied to the thumb-engageable portion 304. When opened, a piece of clothing can be inserted intoarea 308. When the thumb-engageable portion 304 is released, the clothing is clamped in place by theprong 302 thereby securely mounting the camera device on a piece of clothing For example, the camera device can be mounted, as described above, on a necktie, blouse, shirt, pocket, and the like. - In addition,
camera device 200 can include a number of input buttons shown generally at 310. The input buttons can include, by way of example and not limitation, an input button to take a still picture, an input button to initiate the replay mode, an input button to initiate a video capture mode, and an input button to enable the user to adjust the buffer size that is utilized during the replay mode. It is to be appreciated and understood that the various input buttons can be located anywhere on thecamera device 200. - It may be noted that even though the
camera device 200 is shown to have a particular shape, the camera device 100 can be manufactured in any shape and size suitable and sufficient to accommodate the above-described components of the camera device 100. The housing 202 of the camera device may be made of a metal molding, a synthetic material molding or a combination thereof. In other embodiments, any suitable type of material may be used to provide a durable and strong outer shell for typical portable device use. - In addition, the
fastening device 300 can comprise any suitable type of fastening device. For example, the fastening device may be a simple slip-on clip, a crocodile clip, a hook, a Velcro or a magnet or a piece of metal to receive a magnet. Thecamera device 200 may be affixed permanently or semi-permanently to another object using thefastening device 300. - Generally, any of the functions described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), or a combination of these implementations. The terms “module,” “functionality,” “component” and “logic” as used herein generally represent software, firmware, hardware, or a combination thereof In the case of a software implementation, the module, functionality, or logic represents program code that performs specified tasks when executed on a processor (e.g., CPU or CPUs). The program code can be stored in one or more computer readable memory devices. The features of the techniques described below are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
- For example, the
camera device 200 may include a computer-readable medium that may be configured to maintain instructions that cause the camera's software and associated hardware to perform operations. Thus, the instructions function to configure the camera's software and associated hardware to perform the operations and in this way result in transformation of the software and associated hardware to perform functions. The instructions may be provided by the computer-readable medium to the camera device through a variety of different configurations. - One such configuration of a computer-readable medium is signal bearing medium and thus is configured to transmit the instructions (e.g., as a carrier wave) to the camera device, such as via a network. The computer-readable medium may also be configured as a computer-readable storage medium and thus is not a signal bearing medium. Examples of a computer-readable storage medium include a random-access memory (RAM), read-only memory (ROM), an optical disc, flash memory, hard disk memory, and other memory devices that may use magnetic, optical, and other techniques to store instructions and other data.
- Having considered an example operating environment in accordance with one or more embodiments, consider now a discussion of replay functionality and other features that can be provided by the camera device.
- Replay Functionality
- As noted above,
camera device 200 includes a replay mode. When the replay mode is selected, as by the user pressing an input button associated with initiating the replay mode, the camera automatically captures image data, such as video or still images, and saves the image data to a memory buffer. In one or more embodiments, the memory buffer is a circular buffer that saves an amount of image data, for example video data. When the memory buffer is full of image data, it deletes the oldest image data to make room for newly recorded image data. This continues until either the user exits the replay mode or presses a button associated with initiating video capture, i.e., the “record” button. - In at least some embodiments, the size of the memory buffer can be set by the user to determine how much image data is to be collected. As an example, the user might set the length of the memory buffer to correspond to 5 seconds, 30 seconds, 1 minute, 2 minutes, or longer.
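The circular-buffer behavior described above, in which the newest image data overwrites the oldest once capacity is reached, can be sketched in a few lines. The following Python sketch is illustrative only; the ReplayBuffer class and its frame-based capacity are assumptions for exposition, not part of the disclosure:

```python
from collections import deque

class ReplayBuffer:
    """Fixed-capacity circular buffer: once full, the oldest frame is
    discarded to make room for each newly captured frame."""
    def __init__(self, seconds, fps):
        # Capacity expressed in frames, e.g. 2 minutes of video at 30 fps.
        self.frames = deque(maxlen=seconds * fps)

    def capture(self, frame):
        # A deque with maxlen drops its oldest entry automatically when full.
        self.frames.append(frame)

buf = ReplayBuffer(seconds=2, fps=3)  # tiny 6-frame capacity for illustration
for i in range(10):
    buf.capture(i)
print(list(buf.frames))  # only the newest 6 frames remain
```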
- Assume now that an event occurs that the user wishes to memorialize through video or still images. Assume also that the user has initiated the replay mode so that video data is currently being buffered in the memory buffer. By pressing the “record” button, the video data is saved from the beginning of the memory buffer, and recording continues until the user presses the record button again. In this manner, if an event occurs, the user is assured of capturing the event from a time t−x, where x is the length, in time, of the memory buffer. So, for example, if the user initially set the memory buffer to capture 2 minutes' worth of video data, by pressing the “record” button, the last 2 minutes of video data will be recorded in addition to the currently recorded video data.
- In one or more embodiments, the memory buffer comprises flash memory. When the user presses the “record” button and the camera device is in replay mode, a pointer is used to designate where, in flash memory, the beginning of the captured video data occurs, e.g., the beginning of the last 2 minutes of video data prior to entering the “record” mode. In other embodiments, the video data captured during replay mode and “record” mode can be written to an alternate storage location.
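The pointer mechanism just described, where pressing “record” designates the oldest buffered frame as the beginning of the saved clip, can be illustrated with an array-backed sketch. The BufferedRecorder class below is a hypothetical illustration under assumed names, not the patented implementation:

```python
class BufferedRecorder:
    """Sketch: mark where the pre-record buffer begins when 'record' is pressed."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = [None] * capacity
        self.write_pos = 0        # next slot to write
        self.count = 0            # frames currently held
        self.record_start = None  # pointer set when 'record' is pressed

    def capture(self, frame):
        self.data[self.write_pos] = frame
        self.write_pos = (self.write_pos + 1) % self.capacity
        self.count = min(self.count + 1, self.capacity)

    def press_record(self):
        # The pointer designates the oldest buffered frame: the saved clip
        # is considered to start there, so the last `capacity` frames are kept.
        self.record_start = (self.write_pos - self.count) % self.capacity
        return self.record_start

rec = BufferedRecorder(capacity=4)
for f in "abcdef":        # six captures into a four-frame buffer
    rec.capture(f)
start = rec.press_record()
print(rec.data[start])    # the oldest retained frame
```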
-
FIG. 4 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method is performed by a suitably-configured camera device such as the one described above. - Step 400 receives input associated with a replay mode. This step can be performed in any suitable way. For example, in at least some embodiments, this step can be performed by receiving input from the user via a suitable input device on the camera device. Responsive to receiving the input associated with the replay mode,
step 402 captures image data and saves the image data to a memory buffer. Step 404 ascertains whether the buffer is full. If the buffer is not full, the method returns to step 402 and continues to capture image data and save image data to the memory buffer. If, on the other hand, the buffer is full, step 406 deletes the oldest image data in the memory buffer and returns to step 402 to capture subsequent image data. - This process continues until either the user presses the “record” button or exits the replay mode.
-
FIG. 5 is a flow diagram that describes steps in another method in accordance with one or more embodiments. The method, which allows a user to set the camera device's memory buffer size, can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method is performed by a suitably-configured camera device such as the one described above. - Step 500 receives input to set a memory buffer size. The step can be performed in any suitable way. For example, in at least some embodiments, the step can be performed by receiving user input by way of a suitably-configured input mechanism such as a button on the camera device. Responsive to receiving this input, step 502 sets the memory buffer size.
- Step 504 receives input associated with a replay mode. This step can be performed in any suitable way. For example, in at least some embodiments, this step can be performed by receiving input from the user via a suitable input device on the camera device. Responsive to receiving the input associated with the replay mode,
step 506 captures image data and saves the image data to a memory buffer. Step 508 ascertains whether the buffer is full. If the buffer is not full, the method returns to step 506 and continues to capture image data and save image data to the memory buffer. If, on the other hand, the buffer is full, step 510 deletes the oldest image data in the memory buffer and returns to step 506 to capture subsequent image data. - This process continues until either the user presses the “record” button or exits the replay mode.
-
FIG. 6 is a flow diagram that describes steps in another method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method is performed by a suitably-configured camera device such as the one described above. - Step 600 captures image data and saves the image data to a memory buffer. The step can be performed in any suitable way. For example, the step can be performed as described in connection with
FIG. 4 or FIG. 5. Step 602 receives input to enter the camera device's record mode. This step can be performed, for example, by receiving user input by way of a “record” button. Responsive to receiving the input to enter record mode, step 604 saves image data from the beginning of the memory buffer. This step can be performed in any suitable way. For example, the step can be performed by setting a pointer to point to the beginning of the memory buffer. Step 606 saves currently captured image data in addition to the image data from the beginning of the memory buffer. This step can be performed until the user presses the “record” button once more. - Having considered an example replay mode and how it can be implemented with a suitably-configured camera device, consider now aspects of a dual encoding process.
- Dual Encoding
- In one or more embodiments, the camera device's processor 106 (FIG. 1) is configured to encode image data at different levels of resolution. For example, the camera device can encode image data at a low level of resolution as well as at a high level of resolution. Any suitable levels of resolution can be utilized. In at least some embodiments, the low level of resolution is Quarter-VGA (e.g., 320×240) and the high level of resolution is 720p (e.g., 1280×720).
- Encoding image data at different resolution levels can enhance the user's experience by giving the user various options to transfer the saved image data. For example, at lower resolution levels, the captured image data can be streamed to a device such as a smart phone. Alternately or additionally, at higher resolution levels, when the user has Wi-Fi accessibility, they can transfer the image data to a network device such as a laptop or desktop computer.
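One way the two encode levels named above could be selected per transfer target can be sketched as follows; the resolution table and the mapping from target device to profile are assumptions for illustration only, not part of the disclosure:

```python
# Hypothetical resolution table matching the two levels named in the text.
PROFILES = {
    "low": (320, 240),    # Quarter-VGA, suited to streaming to a phone
    "high": (1280, 720),  # 720p, suited to Wi-Fi transfer to a computer
}

def pick_profile(target):
    """Choose an encode profile for a transfer target (illustrative mapping)."""
    return PROFILES["low"] if target == "phone" else PROFILES["high"]

print(pick_profile("phone"))   # low-resolution stream for a smart phone
print(pick_profile("laptop"))  # high-resolution transfer over Wi-Fi
```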
- Having considered a dual encoding scenario, consider now aspects of a photo log that can be constructed using the principles described above.
- Photo Log
- Photo log refers to a feature that enables a user to log their day in still photos at intervals of their own choosing. So, for example, if the user wishes to photo log their day at every 3 minutes, they can provide input to the camera device so that every 3 minutes the camera automatically takes a still photo and saves it. At the end of the day, the user will have documented their day with a number of different still photos.
- In at least some embodiments, the photo log feature can work in concert with the replay mode described above. For example, if the user has entered the replay mode by causing image data to be captured and saved to the memory buffer, the camera device's processor can process portions of the captured video data at defined intervals to provide the still photos. This can be performed in any suitable way. For example, the camera device's processor can read predefined areas of the camera's photosensor and process the read areas into the still photos. In some instances the photo format is a square format, so that the aspect ratio is different from the aspect ratio of the video data.
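Deriving stills at a user-chosen interval from buffered video can be sketched as a simple sampling step. The photo_log function below is an illustrative assumption (the disclosure does not specify this mechanism); it merely selects one frame per interval from a sequence of buffered frames:

```python
def photo_log(frames, fps, interval_seconds):
    """Extract one still every `interval_seconds` from buffered video frames."""
    step = fps * interval_seconds
    return frames[::step]   # every step-th frame becomes a still photo

# 30 seconds of video at 2 fps, with one still taken every 10 seconds.
frames = list(range(60))
stills = photo_log(frames, fps=2, interval_seconds=10)
print(stills)
```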
- Having considered an example photo log feature, consider now how this feature can be used in connection with the camera embodiments described below.
- Accelerometer-Based Camera Tap Switch
- As noted above, input can be received by the camera in the form of one or more taps which have an associated motion profile, as sensed by the accelerometer. The tap can occur at any suitable location on the camera's housing. In at least some embodiments, a dedicated tap area may be provided and may have characteristics to facilitate detection. These characteristics may include, by way of example and not limitation, being formed from a material that produces a richer or more identifiable profile. The tap or taps can be mapped to camera functionality to activate the functionality. Any suitable type of functionality can be mapped to the tap or taps. For example, in at least some embodiments, a tap or taps can be mapped to functionality that enables video taken by the camera to be bookmarked. For example, assume that a user is wearing the camera and has placed the camera into a photo log mode that automatically captures video. At some point, something interesting may occur in the video that the user wishes to bookmark. At this point, if the user taps the camera housing, the camera can detect the tap and take steps to bookmark the video.
-
FIG. 7 is a flow diagram that describes steps in another method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method is performed by a suitably-configured camera device such as the one described above. - Step 700 receives one or more motion inputs. Step 702 ascertains whether the one or more motion inputs are a tap or taps, respectively. This step can be performed in any suitable way. In at least some embodiments, the step is performed through the use of an accelerometer. Specifically, a tap or taps can have an associated profile or profiles as represented by accelerometer data. These profiles can be stored on the camera device in storage, such as
storage 110. When the motion input is received, the accelerometer can detect the motion and produce associated accelerometer data. The camera's processor can then analyze the accelerometer data associated with the motion input and look for a particular stored profile associated with the tap or taps. If an input is received that has a profile that matches or resembles the stored tap profile(s) to a certain degree, the processor can ascertain that the input is a tap. - Responsive to the input(s) being a tap or taps, step 704 accesses a camera functionality that is associated with the input that was received. As noted above, any suitable type of functionality can be accessed. In at least some embodiments, such functionality is functionality that enables video, currently being taken by the camera, to be bookmarked. Step 706 activates the camera functionality. This step can be performed in any suitable way. For example, where the camera functionality bookmarks video taken by the camera, the step can be performed by actually bookmarking the video data.
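Matching incoming accelerometer data against a stored tap profile, as described above, can be sketched with a simple tolerance comparison. The is_tap function, the sample values, and the tolerance are all illustrative assumptions; an actual implementation would use whatever similarity measure the camera's processor applies:

```python
def is_tap(samples, profile, tolerance=0.2):
    """Compare a window of accelerometer magnitudes against a stored tap
    profile; a match within `tolerance` at every sample counts as a tap."""
    if len(samples) != len(profile):
        return False
    return all(abs(s - p) <= tolerance for s, p in zip(samples, profile))

# Hypothetical stored tap profile: a sharp spike followed by quick decay.
TAP_PROFILE = [0.1, 2.5, 0.8, 0.2]

print(is_tap([0.1, 2.4, 0.9, 0.2], TAP_PROFILE))  # close match: a tap
print(is_tap([0.5, 0.6, 0.5, 0.6], TAP_PROFILE))  # e.g. walking: not a tap
```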
- In at least some embodiments, different combinations of taps can be mapped to different camera functionality. Thus, a single tap might be mapped to a first functionality, a double tap might be mapped to a second functionality, a triple tap might be mapped to a third functionality, and so on. Further, different tap patterns can be mapped to different functionalities. For example, two taps in rapid succession followed by a third somewhat delayed tap might be mapped to one functionality while a single tap followed by two taps in rapid succession might be mapped to another different functionality.
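The mapping of tap counts and timing patterns to distinct functionalities can be sketched by encoding inter-tap intervals as a pattern string and looking it up in a table. The pattern encoding, the 0.3-second "rapid" threshold, and the action names below are all hypothetical illustrations, not values from the disclosure:

```python
def classify_pattern(tap_times, rapid=0.3):
    """Turn tap timestamps into a pattern string: 'S' for a tap following
    quickly after the previous one, 'L' for a delayed tap."""
    pattern = "T"  # the first tap
    for prev, cur in zip(tap_times, tap_times[1:]):
        pattern += "S" if (cur - prev) <= rapid else "L"
    return pattern

# Hypothetical mapping of tap patterns to camera functionalities.
ACTIONS = {
    "T": "bookmark_video",      # single tap
    "TS": "take_still_photo",   # double tap
    "TSL": "start_recording",   # two rapid taps, then a delayed third
    "TLS": "stop_recording",    # one tap, then two taps in rapid succession
}

taps = [0.0, 0.2, 1.0]          # two rapid taps, then a delayed third
print(ACTIONS.get(classify_pattern(taps)))
```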
-
FIG. 8 is a flow diagram that describes steps in another method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method is performed by a suitably-configured camera device such as the one described above. - Step 800 receives one or more motion inputs. Step 802 ascertains whether the one or more motion inputs are a tap or taps, respectively. This step can be performed in any suitable way. In at least some embodiments, this step is performed through the use of an accelerometer. Specifically, a tap or taps can have an associated profile or profiles as represented by accelerometer data. These profiles can be stored on the camera device in storage, such as
storage 110. When the motion input is received, the accelerometer can detect the motion and produce associated accelerometer data. The camera's processor can then analyze the accelerometer data associated with the motion input and look for a particular stored profile associated with the tap or taps. If an input is received that has a profile that matches or resembles the stored tap profile(s) to a certain degree, the processor can ascertain that the input is a tap. - Step 804 maps the tap or taps to a particular functionality. Specifically, in this embodiment, different combinations of taps can be mapped to different functionalities, respectively. Accordingly, this step ascertains which of a number of different functionalities the tap or taps are associated with.
- Responsive to ascertaining which functionality is associated with the tap or taps, step 806 accesses the camera functionality that is associated with the input that was received. As noted above, any suitable type of functionality can be accessed. Step 808 activates the camera functionality. This step can be performed in any suitable way.
- Further, in at least some embodiments, the camera includes a microphone which detects sound around the camera. The microphone can be used to sense or create a noise profile associated with motion input that is received by the camera. The noise profile can be used, together with the motion profile, to confirm whether or not the motion input is a tap input. This can help to disambiguate various other types of input that might be received by the camera. For example, the user may be wearing the camera and may jump up and down. The jump may have a motion profile that is similar to that of a tap. However, the jump also has a noise profile. By capturing the noise profile of the jump and comparing it with noise profiles of known taps that can be stored on the camera, the camera can confirm whether or not the received motion input is a tap. For example, while the motion profile of the jump may be similar to that of a tap, if the noise profile of the jump is different from that of the tap, the camera can conclude that the motion input is not, in fact, a tap.
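The jump-versus-tap disambiguation described above reduces to requiring that both the motion profile and the noise profile resemble stored tap profiles. The confirm_tap function and its similarity-score inputs are illustrative assumptions; the scores stand in for whatever profile-comparison measure the camera applies:

```python
def confirm_tap(motion_score, noise_score, threshold=0.8):
    """A motion input is confirmed as a tap only when BOTH its motion
    profile and its noise profile resemble stored tap profiles closely
    enough (scores are hypothetical similarity values in [0, 1])."""
    return motion_score >= threshold and noise_score >= threshold

# A jump can mimic a tap's motion profile but not its noise profile.
print(confirm_tap(motion_score=0.9, noise_score=0.9))  # genuine tap
print(confirm_tap(motion_score=0.9, noise_score=0.3))  # likely a jump
```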
-
FIG. 9 is a flow diagram that describes steps in another method in accordance with one or more embodiments. The method can be performed in connection with any suitable hardware, software, firmware, or combination thereof. In at least some embodiments, the method is performed by a suitably-configured camera device such as the one described above. - Step 900 receives one or more motion inputs. Step 902 receives noise input associated with the motion input. The step can be performed using the camera's microphone. Step 904 ascertains whether the one or more motion inputs are a tap or taps, respectively. This step can be performed in any suitable way. In at least some embodiments, this step is performed through the use of an accelerometer and a microphone. Specifically, a tap or taps have a particular motion profile that is represented by accelerometer data and stored on the camera device. Further, the tap or taps have a particular noise profile that is also stored on the camera device. When the motion input is received, the camera's processor can analyze the accelerometer data and the noise input that is received and derive motion and noise profiles from the data, and then look for a stored tap profile for a match (both motion and noise). If an input is received that has a motion profile and noise profile that matches or resembles the tap profile to a certain degree, the processor can ascertain that the input is a tap.
- Step 906 maps the tap or taps to a particular functionality. Specifically, in some embodiments, different combinations of taps are mapped to different functionalities respectively. Accordingly, this step can ascertain which of a number of different functionalities the tap or taps are associated with. Alternately, a single camera functionality, such as bookmarking a video, may be associated with tap input. In this case, the tap or taps are mapped to the single functionality.
- Responsive to ascertaining which functionality is associated with the tap or taps, step 908 accesses the camera functionality that is associated with the input that was received. As noted above, any suitable type of functionality can be accessed. Step 910 activates the camera functionality. This step can be performed in any suitable way.
- Various embodiments provide a wearable camera that can be worn by a user. The wearable camera includes an accelerometer that can be used to detect camera motion. Input can be provided to the camera in the form of one or more taps which have an associated motion profile, as sensed by the accelerometer. The tap or taps can be mapped to camera functionality to activate the functionality.
- In at least some embodiments, different combinations of taps can be mapped to different camera functionality. Further, in at least some embodiments, the camera includes a microphone which detects sound around the camera. The microphone can be used to sense a noise profile associated with the tap or taps that are received by the camera. The noise profile can be used, together with the motion profile, to confirm the input as a tap input.
- Although the embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the various embodiments defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the various embodiments.
Claims (20)
1. A computing device comprising:
a housing;
a camera lens configured to enable capture of image data;
a processor configured to:
detect one or more taps to the housing of the computing device; and
responsive to detection of the one or more taps to the housing of the computing device, activate a camera functionality of the camera lens that is associated with the one or more taps.
2. The computing device of claim 1, wherein different tap patterns are mapped to different camera functionalities of the camera lens.
3. The computing device of claim 1, wherein the computing device further comprises a microphone configured to sense a noise profile associated with the one or more taps, and wherein the processor is configured to detect the one or more taps to the housing based at least in part on the noise profile associated with the one or more taps.
4. The computing device of claim 1, wherein the computing device further comprises a motion detector configured to receive one or more motion inputs corresponding to the one or more taps, and wherein the processor is further configured to detect the one or more taps to the housing based at least in part on the one or more motion inputs corresponding to the one or more taps.
5. The computing device of claim 4, wherein the motion detector comprises an accelerometer.
6. The computing device of claim 4, wherein the motion detector comprises an accelerometer, and wherein the processor is further configured to ascertain whether the one or more motion inputs are one or more taps by analyzing the accelerometer data to ascertain whether the motion profile associated with the one or more motion inputs matches or resembles one or more stored tap profiles.
7. The computing device of claim 1, wherein the computing device comprises a wearable device.
8. The computing device of claim 7, further comprising a fastening device on the housing, the fastening device configured to enable the wearable device to be worn by a user.
9. The computing device of claim 1, wherein the computing device is configured to enter a camera mode that enables video to be automatically captured by the camera lens, and wherein the camera functionality comprises functionality enabling the video that is automatically captured by the camera lens to be bookmarked.
10. A method comprising:
receiving, with a device, one or more motion inputs;
ascertaining whether the one or more motion inputs are one or more taps to a housing of the device; and
responsive to ascertaining that the one or more motion inputs are one or more taps to the housing of the device, activating a camera functionality that is associated with the one or more taps.
11. The method of claim 10, wherein the camera functionality comprises functionality enabling automatically captured video to be bookmarked.
12. The method of claim 10, wherein said ascertaining is performed using an accelerometer on the device.
13. The method of claim 10, wherein said ascertaining is performed by:
detecting the one or more motion inputs using an accelerometer;
producing associated accelerometer data; and
analyzing the accelerometer data to ascertain whether the motion profile associated with the one or more motion inputs matches or resembles one or more stored tap profiles.
14. The method of claim 10, wherein different tap patterns are mapped to different camera functionalities.
15. The method of claim 10, wherein the device comprises a wearable device.
16. The method of claim 15, wherein the wearable device is configured to be worn on a user's clothing.
17. The method of claim 15, wherein the wearable device is configured to be worn on a location other than a user's clothing.
18. A method comprising:
capturing, with a camera of a device, image data at predefined time intervals;
saving the image data to a circular memory buffer;
determining whether the circular memory buffer is full; and
responsive to determining that the circular memory buffer is full, deleting the oldest image data.
19. The method of claim 18, wherein the device comprises a wearable device.
20. The method of claim 19, wherein the wearable device is configured to be worn on a user's clothing.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/230,485 US20160344926A1 (en) | 2013-04-26 | 2016-08-08 | Camera Tap Switch |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/871,905 US9066007B2 (en) | 2013-04-26 | 2013-04-26 | Camera tap switch |
US14/716,355 US9444996B2 (en) | 2013-04-26 | 2015-05-19 | Camera tap switch |
US15/230,485 US20160344926A1 (en) | 2013-04-26 | 2016-08-08 | Camera Tap Switch |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/716,355 Continuation US9444996B2 (en) | 2013-04-26 | 2015-05-19 | Camera tap switch |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160344926A1 true US20160344926A1 (en) | 2016-11-24 |
Family
ID=50842345
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/871,905 Active US9066007B2 (en) | 2013-04-26 | 2013-04-26 | Camera tap switch |
US14/716,355 Active US9444996B2 (en) | 2013-04-26 | 2015-05-19 | Camera tap switch |
US15/230,485 Abandoned US20160344926A1 (en) | 2013-04-26 | 2016-08-08 | Camera Tap Switch |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/871,905 Active US9066007B2 (en) | 2013-04-26 | 2013-04-26 | Camera tap switch |
US14/716,355 Active US9444996B2 (en) | 2013-04-26 | 2015-05-19 | Camera tap switch |
Country Status (5)
Country | Link |
---|---|
US (3) | US9066007B2 (en) |
EP (1) | EP2965173A1 (en) |
KR (1) | KR102155397B1 (en) |
CN (1) | CN105493003A (en) |
WO (1) | WO2014176298A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160344924A1 (en) * | 2015-05-08 | 2016-11-24 | Albert Tsai | System and Method for Preserving Video Clips from a Handheld Device |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9282244B2 (en) | 2013-03-14 | 2016-03-08 | Microsoft Technology Licensing, Llc | Camera non-touch switch |
US9066007B2 (en) * | 2013-04-26 | 2015-06-23 | Skype | Camera tap switch |
US9693010B2 (en) * | 2014-03-11 | 2017-06-27 | Sony Corporation | Method, electronic device, and server for generating digitally processed pictures |
US20160142684A1 (en) * | 2014-11-15 | 2016-05-19 | Robert Gruder | Camera apparatus and method |
EP3065024A1 (en) * | 2015-03-06 | 2016-09-07 | Universidad de Sevilla | Wearable camera apparatus with selectable processing of image content |
GB2539387B (en) * | 2015-06-09 | 2021-04-14 | Oxford Metrics Plc | Motion capture system |
US11190681B1 (en) * | 2015-07-10 | 2021-11-30 | Snap Inc. | Systems and methods for DSP fast boot |
US20180249056A1 (en) * | 2015-08-18 | 2018-08-30 | Lg Electronics Inc. | Mobile terminal and method for controlling same |
US9298283B1 (en) | 2015-09-10 | 2016-03-29 | Connectivity Labs Inc. | Sedentary virtual reality method and systems |
JP6819042B2 (en) | 2016-01-14 | 2021-01-27 | ソニー株式会社 | Imaging control device, imaging control method, program |
JPWO2018155296A1 (en) * | 2017-02-23 | 2019-12-12 | パナソニックIpマネジメント株式会社 | Optical device |
US10127423B1 (en) | 2017-07-06 | 2018-11-13 | Hand Held Products, Inc. | Methods for changing a configuration of a device for reading machine-readable code |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9066007B2 (en) * | 2013-04-26 | 2015-06-23 | Skype | Camera tap switch |
US8374498B2 (en) | 2006-09-29 | 2013-02-12 | Microscan Systems, Inc. | Systems and/or devices for camera-based inspections |
JP5023663B2 (en) * | 2006-11-07 | 2012-09-12 | ソニー株式会社 | Imaging apparatus and imaging method |
US20080180537A1 (en) | 2006-11-14 | 2008-07-31 | Uri Weinberg | Camera system and methods |
US8199220B2 (en) | 2006-12-06 | 2012-06-12 | Samsung Electronics Co., Ltd. | Method and apparatus for automatic image management |
US8276098B2 (en) | 2006-12-22 | 2012-09-25 | Apple Inc. | Interactive image thumbnails |
US7783133B2 (en) | 2006-12-28 | 2010-08-24 | Microvision, Inc. | Rotation compensation and image stabilization system |
JP4653123B2 (en) | 2007-01-09 | 2011-03-16 | 富士フイルム株式会社 | Image acquisition apparatus and image acquisition method |
US20100037139A1 (en) | 2007-01-12 | 2010-02-11 | Norbert Loebig | Apparatus for Processing Audio and/or Video Data and Method to be run on said Apparatus |
GB2456587A (en) | 2007-03-13 | 2009-07-22 | James Bircumshaw | A multi-functional body worn camera |
EP1983740A1 (en) | 2007-04-16 | 2008-10-22 | STMicroelectronics (Research & Development) Limited | Image stabilisation method and apparatus |
JP2010525400A (en) * | 2007-04-17 | 2010-07-22 | ブラック ラピッド インコーポレイテッド | Improved camera transport system and method |
US20080260291A1 (en) | 2007-04-17 | 2008-10-23 | Nokia Corporation | Image downscaling by binning |
ATE501594T1 (en) | 2007-06-27 | 2011-03-15 | Panasonic Corp | IMAGING DEVICE, METHOD, SYSTEM INTEGRATED CIRCUIT AND PROGRAM |
JP2009110351A (en) | 2007-10-31 | 2009-05-21 | Sony Corp | Cradle and electronic appliance |
JP4458151B2 (en) | 2007-11-06 | 2010-04-28 | ソニー株式会社 | Automatic imaging apparatus, automatic imaging control method, image display system, image display method, display control apparatus, display control method |
US8538376B2 (en) | 2007-12-28 | 2013-09-17 | Apple Inc. | Event-based modes for electronic devices |
GB2457466A (en) | 2008-02-13 | 2009-08-19 | Sight Digital Ltd X | Multi-channel digital video recorder (DVR) with user selection of MPEG codec recording parameters |
US20100027663A1 (en) | 2008-07-29 | 2010-02-04 | Qualcomm Incorporated | Intelligent frame skipping in video coding based on similarity metric in compressed domain |
US8026913B2 (en) | 2008-07-29 | 2011-09-27 | International Business Machines Corporation | Image capture and buffering in a virtual world |
US8130278B2 (en) | 2008-08-01 | 2012-03-06 | Omnivision Technologies, Inc. | Method for forming an improved image using images with different resolutions |
JP4796104B2 (en) | 2008-08-29 | 2011-10-19 | シャープ株式会社 | Imaging apparatus, image analysis apparatus, external light intensity calculation method, image analysis method, imaging program, image analysis program, and recording medium |
US7778023B1 (en) | 2008-09-19 | 2010-08-17 | Victor Mohoney | Docking system for MP3 players and other portable electronic devices |
US8957835B2 (en) | 2008-09-30 | 2015-02-17 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
US20100107126A1 (en) | 2008-10-28 | 2010-04-29 | Hulu Llc | Method and apparatus for thumbnail selection and editing |
US8526779B2 (en) | 2008-11-07 | 2013-09-03 | Looxcie, Inc. | Creating and editing video recorded by a hands-free video recording device |
US8593570B2 (en) | 2008-11-07 | 2013-11-26 | Looxcie, Inc. | Video recording camera headset |
KR101737829B1 (en) | 2008-11-10 | 2017-05-22 | 삼성전자주식회사 | Motion Input Device For Portable Device And Operation Method using the same |
US8289162B2 (en) | 2008-12-22 | 2012-10-16 | Wimm Labs, Inc. | Gesture-based user interface for a wearable portable device |
KR101532610B1 (en) | 2009-01-22 | 2015-06-30 | 삼성전자주식회사 | A digital photographing device, a method for controlling a digital photographing device, a computer-readable storage medium |
US8482520B2 (en) | 2009-01-30 | 2013-07-09 | Research In Motion Limited | Method for tap detection and for interacting with and a handheld electronic device, and a handheld electronic device configured therefor |
US8159363B2 (en) | 2009-02-16 | 2012-04-17 | Research In Motion Limited | Using gravity to direct a rotatable camera in a handheld electronic device |
US20100208370A1 (en) | 2009-02-19 | 2010-08-19 | Ying-Kuo Chang | Camera lens assembly |
JP5259464B2 (en) * | 2009-03-13 | 2013-08-07 | オリンパスイメージング株式会社 | Imaging apparatus and mode switching method thereof |
US8016492B2 (en) * | 2009-06-16 | 2011-09-13 | Colin James Pyle | Wrist or arm strap with hinged mount for camera |
KR101051235B1 (en) * | 2009-07-13 | 2011-07-21 | 박원일 | Wearable portable crime prevention CCTV monitoring device worn on clothing |
GB2473235B (en) | 2009-09-04 | 2012-02-29 | Hock Thiam Saw | Portable electric device charging connector arrangement |
US20110064129A1 (en) | 2009-09-16 | 2011-03-17 | Broadcom Corporation | Video capture and generation at variable frame rates |
US9106275B2 (en) | 2009-09-24 | 2015-08-11 | Blackberry Limited | Accelerometer tap detection to initiate NFC communication |
US8411050B2 (en) * | 2009-10-14 | 2013-04-02 | Sony Computer Entertainment America | Touch interface having microphone to determine touch impact strength |
KR101642400B1 (en) | 2009-12-03 | 2016-07-25 | 삼성전자주식회사 | Digital photographing apparatus, method for controlling the same, and recording medium storing program to execute the method |
US8687070B2 (en) | 2009-12-22 | 2014-04-01 | Apple Inc. | Image capture device having tilt and/or perspective correction |
US8744875B2 (en) | 2009-12-23 | 2014-06-03 | Mindray Ds Usa, Inc. | Systems and methods for synchronizing data of a patient monitor and a portable sensor module |
CN102118560A (en) | 2009-12-30 | 2011-07-06 | 深圳富泰宏精密工业有限公司 | Photographic system and method |
US8270827B2 (en) | 2010-01-12 | 2012-09-18 | Tse Jr Kenneth K | Camera lens accessory holder |
JP5457217B2 (en) * | 2010-02-02 | 2014-04-02 | オリンパスイメージング株式会社 | camera |
EP2362593B1 (en) * | 2010-02-05 | 2012-06-20 | Research In Motion Limited | Communications system including aggregation server for determining updated metadata of e-mail messages and related methods |
US8322215B2 (en) | 2010-09-13 | 2012-12-04 | Itron, Inc. | Accelerometer based removal and inversion tamper detection and tap switch feature |
CA2754841C (en) | 2010-10-12 | 2016-11-01 | Research In Motion Limited | Method and apparatus for image orientation indication and correction |
TW201224628A (en) | 2010-12-03 | 2012-06-16 | Hon Hai Prec Ind Co Ltd | Image obtaining device, projector, and method thereof |
KR101769818B1 (en) * | 2010-12-23 | 2017-08-21 | 엘지전자 주식회사 | Mobile terminal and operation control method thereof |
US20120263430A1 (en) | 2011-03-31 | 2012-10-18 | Noah Spitzer-Williams | Bookmarking moments in a recorded video using a recorded human action |
US9052876B2 (en) * | 2011-05-13 | 2015-06-09 | Symbol Technologies, Llc | Peer-to-peer event-time secured link establishment |
JP5868618B2 (en) | 2011-06-14 | 2016-02-24 | オリンパス株式会社 | Information processing apparatus, image processing system, and program |
US20130004153A1 (en) * | 2011-07-01 | 2013-01-03 | Mckee Charles P | Mounting apparatus for a portable video capture device |
US8515241B2 (en) | 2011-07-07 | 2013-08-20 | Gannaway Web Holdings, Llc | Real-time video editing |
US20130014585A1 (en) | 2011-07-13 | 2013-01-17 | P.I. Engineering, Inc. | Accelerometer-based touch pad for timing swimming and other competitive events |
US20130201344A1 (en) | 2011-08-18 | 2013-08-08 | Qualcomm Incorporated | Smart camera for taking pictures automatically |
US8743069B2 (en) | 2011-09-01 | 2014-06-03 | Google Inc. | Receiving input at a computing device |
US9065967B2 (en) | 2011-09-13 | 2015-06-23 | Verizon Patent And Licensing Inc. | Method and apparatus for providing device angle image correction |
US8941560B2 (en) | 2011-09-21 | 2015-01-27 | Google Inc. | Wearable computer with superimposed controls and instructions for external device |
US20130094697A1 (en) * | 2011-10-13 | 2013-04-18 | Fuji Xerox Co., Ltd. | Capturing, annotating, and sharing multimedia tips |
US9354779B2 (en) | 2012-03-12 | 2016-05-31 | Microsoft Technology Licensing, Llc | Providing theme variations in a user interface |
US9894781B2 (en) | 2012-06-06 | 2018-02-13 | Apple Inc. | Notched display layers |
CN202838971U (en) | 2012-08-30 | 2013-03-27 | 西安隆美尔臣电子科技有限责任公司 | Voice and video recording pen |
US9032130B2 (en) | 2012-09-12 | 2015-05-12 | Blackberry Limited | Dock for data transfer to and from portable electronic device |
US8907927B2 (en) | 2012-09-13 | 2014-12-09 | Sap Portals Israel Ltd | Camera based hover detection for touch-based mobile devices |
US9019431B2 (en) | 2012-09-28 | 2015-04-28 | Digital Ally, Inc. | Portable video and imaging system |
US9093849B2 (en) | 2013-01-07 | 2015-07-28 | Superior Communications, Inc. | Universal charging dock with a wall mount |
US20140211031A1 (en) | 2013-01-30 | 2014-07-31 | Microsoft Corporation | Auto picture alignment correction |
US20140270688A1 (en) | 2013-03-14 | 2014-09-18 | Microsoft Corporation | Personal Video Replay |
US9282244B2 (en) | 2013-03-14 | 2016-03-08 | Microsoft Technology Licensing, Llc | Camera non-touch switch |
US8979398B2 (en) | 2013-04-16 | 2015-03-17 | Microsoft Technology Licensing, Llc | Wearable camera |
US20140333828A1 (en) | 2013-05-10 | 2014-11-13 | Microsoft Corporation | Portable camera dock |
US20140354880A1 (en) | 2013-06-03 | 2014-12-04 | Microsoft Corporation | Camera with Hall Effect Switch |
TWI510811B (en) | 2013-09-13 | 2015-12-01 | Quanta Comp Inc | Head mounted system |
- 2013
  - 2013-04-26: US application US 13/871,905 filed (granted as US9066007B2, status Active)
- 2014
  - 2014-04-23: WO application PCT/US2014/035061 filed (published as WO2014176298A1, Application Filing)
  - 2014-04-23: KR application KR1020157033832A filed (granted as KR102155397B1, IP Right Grant)
  - 2014-04-23: EP application EP14727300.7A filed (published as EP2965173A1, Ceased)
  - 2014-04-23: CN application CN201480023554.1A filed (published as CN105493003A, Pending)
- 2015
  - 2015-05-19: US application US 14/716,355 filed (granted as US9444996B2, status Active)
- 2016
  - 2016-08-08: US application US 15/230,485 filed (published as US20160344926A1, Abandoned)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9066007B2 (en) * | 2013-04-26 | 2015-06-23 | Skype | Camera tap switch |
US9444996B2 (en) * | 2013-04-26 | 2016-09-13 | Microsoft Technology Licensing, Llc | Camera tap switch |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160344924A1 (en) * | 2015-05-08 | 2016-11-24 | Albert Tsai | System and Method for Preserving Video Clips from a Handheld Device |
US9848120B2 (en) * | 2015-05-08 | 2017-12-19 | Fast Model Technology Llc | System and method for preserving video clips from a handheld device |
Also Published As
Publication number | Publication date |
---|---|
EP2965173A1 (en) | 2016-01-13 |
US9066007B2 (en) | 2015-06-23 |
WO2014176298A1 (en) | 2014-10-30 |
US9444996B2 (en) | 2016-09-13 |
KR20160004364A (en) | 2016-01-12 |
US20160021300A1 (en) | 2016-01-21 |
KR102155397B1 (en) | 2020-09-11 |
US20140320687A1 (en) | 2014-10-30 |
CN105493003A (en) | 2016-04-13 |
Similar Documents
Publication | Title |
---|---|
US9444996B2 (en) | Camera tap switch |
US9516227B2 (en) | Camera non-touch switch |
US20140354880A1 (en) | Camera with Hall Effect Switch |
US10020024B2 (en) | Smart gallery and automatic music video creation from a set of photos |
US20140270688A1 (en) | Personal Video Replay |
US10750116B2 (en) | Automatically curating video to fit display time |
US9538083B2 (en) | Motion blur avoidance |
US20140317480A1 (en) | Automatic music video creation from a set of photos |
US9451178B2 (en) | Automatic insertion of video into a photo story |
US9503644B2 (en) | Using image properties for processing and editing of multiple resolution images |
US20140333828A1 (en) | Portable camera dock |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:039359/0760
Effective date: 20141014

Owner name: MICROSOFT CORPORATION, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHAU, SEANG Y.;LUONG, ALDA YUK YING;WILLIAMS, JASON;AND OTHERS;SIGNING DATES FROM 20130418 TO 20130423;REEL/FRAME:039359/0763
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |