US20040197096A1 - Method and apparatus for controlling the depth of field using multiple user interface markers - Google Patents
- Publication number
- US20040197096A1 (application US10/406,767)
- Authority
- US
- United States
- Prior art keywords
- objects
- imaging device
- digital imaging
- scene
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/635—Region indicators; Field of view indicators
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/958—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
- H04N23/959—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
Abstract
A digital imaging device that allows the user to mark multiple objects with UI markers is disclosed. The digital imaging device automatically adjusts the focus and aperture to include the marked objects in the depth of field of the digital imaging device.
Description
- The field of this invention relates to digital cameras and more specifically to a digital camera that allows a user to control the depth of field by setting multiple user interface (UI) markers.
- Digital imaging devices typically have many automatic settings. For example, many digital cameras automatically set the focus, the aperture, and the exposure time. Some digital cameras allow the user to override or assist the automatic settings. For example, some digital cameras allow the user to set the area in the image that will be used by the camera to determine the focus. One way this is done is by centering the area in the viewfinder and pressing the shutter button halfway down. Once the camera has focused, the user re-frames the shot and presses the shutter button the rest of the way down. Other cameras allow the user to mark an object to be used as the focus area; for example, see U.S. Pat. No. 5,187,585, “Image sensing apparatus with settable focus detection area,” which is hereby incorporated by reference. Yet other cameras divide the viewfinder into regions or zones and allow the user to designate one or more zones as the area used to determine focus.
- Some digital cameras also allow the user to set the aperture of the camera. This allows the user some control over the depth of field and works well when only one object is in the scene. When the user wishes to make sure two objects that are at different distances from the camera are in focus at the same time, the user must try to set the focus point between the two objects and adjust the depth of field to include both objects. Some cameras allow the user to focus on a first object and then refocus on a second object, and the camera will adjust the focus between the two objects and adjust the depth of field to include both (see the Canon EOS Rebel S camera). Unfortunately, the user must remember which objects were selected.
- Therefore there is a need for a digital imaging device that offers an easier way to adjust the depth of field in the image.
- A digital imaging device that allows the user to mark multiple objects with UI markers is disclosed. The digital imaging device automatically adjusts the focus and aperture to include the marked objects in the depth of field of the digital imaging device.
- Other aspects and advantages of the present invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the invention.
- FIG. 1 is a block diagram of a digital imaging system in accordance with an example embodiment of the present invention.
- FIG. 2 is a flow chart for adjusting the depth of field to include two objects in accordance with an example embodiment of the present invention.
- FIG. 3 is a diagram showing one example embodiment for the UI markers used to mark objects in a scene in accordance with an example embodiment of the present invention.
- An electronic block diagram of a typical digital imaging device is shown in FIG. 1. Digital cameras today typically contain a photo-sensor (102) for capturing images; a display area (104) for displaying the captured images and controlling the digital camera; a storage area (116) for storing the captured images; memory (108) for temporary manipulation of the captured images and for running the firmware of the camera; a processor (110) for controlling the camera; and some type of user interface (UI) controls (106). Some digital cameras also include a microphone (114) for capturing audio clips along with the digital images. Some digital cameras include a speaker (118) and a digital signal processor (DSP 112). The UI controls (106) on digital cameras may include physical controls like buttons, rocker switches, and a keyboard, as well as virtual controls shown in the display area. The digital images, video clips, and audio clips captured by the digital camera may be stored in memory (108) or may be moved to the storage area (116). Today the memory and the storage area are typically different types of devices. The memory is typically fast volatile memory and the storage area is typically slower non-volatile memory. In the future, as the speed of non-volatile memory increases, all the memory may be of the non-volatile type. Digital imaging devices typically have an input/output (I/O) channel (122). This I/O channel may be, for example, a USB bus, a SCSI bus, an IR link, FireWire, or a parallel link. The I/O channel is used to connect the digital imaging device to other computer systems or networks. Some digital imaging devices connect to other computer systems using a camera dock. Digital cameras may also contain a wireless link (120) to the Internet, for example a cellular-telephone link.
- Some digital cameras have more than one display area; for example, a camera may have an LCD display on the back of the camera and a micro display used as a viewfinder. Both the LCD and the micro display can be used to display a real-time view of the scene viewed by the camera. This also allows the camera to display additional information in the displays as the user frames the picture. One type of information that may be displayed is a marker to show where, in the field of view, the camera is focusing. Some cameras allow the user to set or pick an object to use for focusing; for example, see U.S. Pat. No. 5,187,585, “Image sensing apparatus with settable focus detection area.” In the '585 patent, an object can be designated, and the camera will use that object as the area to focus on, even if the object is moving. There are a number of well-known methods to designate an object. One way is to center the object in the viewfinder and then activate a UI control. Some cameras use the S1 position of the shutter button as the UI control (typically, the S1 position is when the shutter button has been pressed halfway down). Other cameras have UI controls that are different from the shutter button.
- Once an object has been designated, some cameras mark that object in the display area. There are many types of markers that can be used, for example square brackets can be shown surrounding the designated object. When the camera is moved to reframe the scene, the marker stays centered on the designated object. In this way, the designated object is tracked as its position shifts in the field of view. When an object has been designated and marked, the user can easily keep track of the focus area of the camera.
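The marker-tracking behavior described above can be sketched in a deliberately simplified form: each frame, the marker snaps to the detected object position nearest its last known location. The detection step, all function names, and the coordinates below are illustrative assumptions, not details from the patent:

```python
# Toy sketch of keeping a UI marker centered on a designated object.
# Real cameras would use feature or template tracking; here we assume
# a per-frame list of candidate object positions is already available.

def track_marker(marker_xy, detections):
    """Return the detected position closest to the marker's previous position."""
    def dist2(p):
        # Squared Euclidean distance to the marker's last position.
        return (p[0] - marker_xy[0]) ** 2 + (p[1] - marker_xy[1]) ** 2
    return min(detections, key=dist2)

# Marker was at (100, 80); the object moved slightly between frames,
# so the marker follows it to the nearby detection.
new_pos = track_marker((100, 80), [(320, 40), (104, 83), (20, 200)])
```

In this sketch the marker stays on the designated object as the scene is reframed, which is the behavior the description attributes to the camera.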
- In some cases the user wishes to make sure two different areas or objects in the scene are in focus at the same time. When the two areas of the scene are at different distances from the camera, the camera must focus between the two objects and adjust the depth of field to include both objects. For a given focal length, the depth of field is determined by a relationship between the distance to the object and the current aperture of the lens. As the aperture size is reduced, the depth of field increases. The camera can use its current focus distance and its current aperture size to determine when two designated objects are in focus.
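The relationship above (smaller aperture, larger depth of field) can be sketched numerically. The hyperfocal-distance approximation below is a standard illustrative model, not a formula given in the patent, and the focal length, f-numbers, and circle of confusion are assumed values:

```python
# Illustrative depth-of-field sketch using the hyperfocal approximation.
# All distances are in millimeters; coc_mm is the circle of confusion.

def hyperfocal(focal_mm, f_number, coc_mm=0.03):
    """Approximate hyperfocal distance."""
    return focal_mm**2 / (f_number * coc_mm) + focal_mm

def dof_limits(focus_mm, focal_mm, f_number, coc_mm=0.03):
    """Near and far limits of acceptable focus for a given focus distance."""
    h = hyperfocal(focal_mm, f_number, coc_mm)
    near = h * focus_mm / (h + (focus_mm - focal_mm))
    if focus_mm >= h:
        far = float("inf")  # everything beyond the near limit is acceptably sharp
    else:
        far = h * focus_mm / (h - (focus_mm - focal_mm))
    return near, far

# Reducing the aperture size (raising the f-number) widens the depth of field:
near8, far8 = dof_limits(3000, 50, 8)   # f/8, focused at 3 m
near2, far2 = dof_limits(3000, 50, 2)   # f/2, focused at 3 m
assert (far8 - near8) > (far2 - near2)
```

Given the focus distance and aperture, a camera can evaluate these limits against the measured object distances, which is how it "can use its current focus distance and its current aperture size to determine when two designated objects are in focus."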
- Some cameras today allow the user to designate two objects, and the camera will focus between them and try to adjust the depth of field to include both objects. However, the camera does not mark the objects in the viewfinder or display area, so it is difficult for the user to make sure that the objects they want are the objects included in the depth of field.
- In one example embodiment of the current invention, a scene would be viewed in an electronic display (200). The user would designate a first object in the scene. The camera would mark the object with a UI marker (202). Once an object has been designated and marked, the camera would track the position of that object. The user would designate a second object and the camera would mark the second object with a UI marker (204). The camera would determine the distance to each object (206). The distance may be determined when each object is first designated or the distance may be determined after both objects have been designated. Once two objects have been marked, the camera will focus the lens at a distance between the two objects (208) and adjust the depth of field to try to include both objects (210). The camera will determine when both objects are in focus and indicate this condition to the user (212).
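The FIG. 2 flow above (designate, measure distances 206, focus between 208, adjust depth of field 210) could be sketched as follows. The focus-distance rule (a harmonic mean), the f-stop ladder, and the hyperfocal math are illustrative assumptions, not the patent's prescribed method:

```python
# Sketch of the FIG. 2 flow: given two measured object distances,
# pick a focus distance between them, then stop down the aperture
# until both objects fall inside the depth of field.

def dof_limits(focus_mm, focal_mm, f_number, coc_mm=0.03):
    """Near/far limits of acceptable focus (hyperfocal approximation)."""
    h = focal_mm**2 / (f_number * coc_mm) + focal_mm
    near = h * focus_mm / (h + (focus_mm - focal_mm))
    far = float("inf") if focus_mm >= h else h * focus_mm / (h - (focus_mm - focal_mm))
    return near, far

def include_both(d1_mm, d2_mm, focal_mm=50, stops=(2, 2.8, 4, 5.6, 8, 11, 16, 22)):
    """Return (focus distance, f-number) placing both objects in the DOF, or None."""
    near_obj, far_obj = sorted((d1_mm, d2_mm))
    # Harmonic mean weights the nearer object more heavily (step 208).
    focus = 2 * near_obj * far_obj / (near_obj + far_obj)
    for f_number in stops:                  # widest aperture first (step 210)
        near, far = dof_limits(focus, focal_mm, f_number)
        if near <= near_obj and far >= far_obj:
            return focus, f_number          # both objects in focus (step 212)
    return None                             # cannot cover both; indicate failure

result = include_both(2000, 4000)           # objects at 2 m and 4 m
```

If no aperture covers both distances, the camera would fall through to the failure indication of step 212 rather than silently picking one object.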
- FIG. 3 shows an example embodiment for UI markers (302) marking two objects in a scene.
- In one example embodiment the camera will indicate when both objects are in focus by using an indicator displayed in the electronic display, for example a light turning green. In the preferred embodiment, when both objects are in focus the camera will change the color of both UI markers surrounding the two designated objects. For example, when the user designates the first object, the UI marker would be one color (for example black); after the second object has been designated and marked, both UI markers would change to a second color (for example green) when both objects were included in the depth of field. If one or both objects are not in focus, the UI marker may change to another color. For example, if one object is in focus but the second object is not quite in focus, one marker may be green and the second marker may be yellow. In one example embodiment of the current invention, the color of the UI markers would indicate how close to focus the two designated objects were. For example, when both objects were not close to focus, the UI markers could be red; when the two objects were close to being in focus, the UI markers could be yellow; and when both designated objects were in focus, the two UI markers could be green. Other methods could be used to indicate the focus condition of the two objects; for example, the brightness of the UI markers could change as a function of the focus condition of the two designated objects.
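A marker-coloring rule like the one just described could be sketched as below. The "how far outside the depth of field" metric and the yellow threshold are assumptions chosen for illustration; the patent only specifies the color semantics (red/yellow/green):

```python
# Sketch: map a marked object's focus condition to a UI marker color.
# close_mm is an assumed threshold for "close to being in focus".

def marker_color(distance_mm, dof_near_mm, dof_far_mm, close_mm=200):
    """Green inside the DOF, yellow when nearly inside, red otherwise."""
    if dof_near_mm <= distance_mm <= dof_far_mm:
        return "green"
    # How far outside the depth of field the object sits.
    miss = max(dof_near_mm - distance_mm, distance_mm - dof_far_mm)
    return "yellow" if miss <= close_mm else "red"

# Three objects against a depth of field spanning 2 m to 4 m:
colors = [marker_color(d, 2000, 4000) for d in (2500, 4100, 5000)]
```

Brightness, rather than color, could be driven by the same `miss` value to implement the alternative indication the paragraph mentions.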
- In another example embodiment for the current invention, the user would mark two objects in a scene. Then the user would manually change the focus and aperture settings until the UI markers indicated that both objects were included in the depth of field.
- The foregoing description of the present invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed, and other modifications and variations may be possible in light of the above teachings. The embodiment was chosen and described in order to best explain the principles of the invention and its practical application to thereby enable others skilled in the art to best utilize the invention in various embodiments and various modifications as are suited to the particular use contemplated. It is intended that the appended claims be construed to include other alternative embodiments of the invention except insofar as limited by the prior art.
Claims (22)
1. A method comprising:
displaying a scene on an electronic display;
marking a first object in the scene with a first UI marker;
focusing a lens on the first object;
marking a second object in the scene with a second UI marker;
focusing the lens on the second object;
focusing the lens at a distance between the first and second objects;
adjusting an aperture setting to adjust a depth of field to include both objects in the depth of field.
2. The method of claim 1 further comprising:
indicating when both objects are in focus.
3. The method of claim 1 where the lens is automatically focused between the two objects.
4. The method of claim 1 further comprising:
changing the color of the UI markers to indicate the focus condition of the two marked objects.
5. The method of claim 1 further comprising:
changing the brightness of the UI markers to indicate the focus condition of the two marked objects.
6. The method of claim 1 where the UI markers are displayed as a set of square brackets that enclose the designated objects.
7. The method of claim 1 where the objects are designated to be marked by centering the objects in the display, and then activating a UI control.
8. The method of claim 1 where the first object and the second object are different areas of a single object.
9. The method of claim 1 where the electronic display is a viewfinder in a camera.
10. The method of claim 1 where the electronic display is a display on the back of a camera.
11. A digital imaging device, comprising:
an image sensor;
a lens configured to focus a scene onto the image sensor;
an aperture that can adjust the amount of light allowed through the lens;
an electronic display configured to display the scene focused onto the image sensor;
at least one UI control configured to allow user input into the digital imaging device;
a processor configured to monitor the at least one UI control;
the processor configured to detect a first object in the scene when the processor detects input from the at least one UI control;
the processor configured to display a UI marker on the display that identifies the first object;
the processor configured to detect and mark a second object in the scene when a second user input is detected;
the processor configured to determine the distance to the first and second objects;
the processor configured to focus the lens at a distance between the two objects and to adjust the aperture to cause the depth of field to include both objects.
12. The digital imaging device of claim 11 where the processor is also configured to indicate when both objects are included in the depth of field.
13. The digital imaging device of claim 11 where the UI markers are displayed as a set of square brackets that enclose the objects.
14. The digital imaging device of claim 11 where the UI markers are changed to indicate the focus condition of the two marked objects.
15. The digital imaging device of claim 14 where the color of the UI markers are changed to indicate the focus condition of the two marked objects.
16. The digital imaging device of claim 11 where the objects are designated to be marked by centering the objects in the display, and then activating the UI control.
17. The digital imaging device of claim 11 where the display is a viewfinder.
18. The digital imaging device of claim 11 where the display is a display on the back of the digital imaging device.
19. A digital imaging device, comprising:
a means for capturing an image of a scene;
a means for displaying the scene;
a means for designating objects in the scene;
a means for tracking the designated objects;
a means for displaying at least two UI markers that identify the designated objects;
a means for determining the distance to each designated object when two objects have been designated;
a means for including both designated objects in the depth of field of the digital imaging device.
20. The digital imaging device of claim 19 where the digital imaging device is also configured to indicate when both objects are included in the depth of field.
21. The digital imaging device of claim 19 where the UI markers are displayed as a set of square brackets that enclose the objects.
22. A method comprising:
displaying a scene on an electronic display;
marking a first object in the scene with a first UI marker;
determining the distance to the first object;
marking a second object in the scene with a second UI marker;
determining the distance to the second object;
automatically focusing the lens at a distance between the first and second objects;
adjusting an aperture setting to adjust a depth of field to include both objects in the depth of field.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/406,767 US6801717B1 (en) | 2003-04-02 | 2003-04-02 | Method and apparatus for controlling the depth of field using multiple user interface markers |
JP2004094446A JP2004310086A (en) | 2003-04-02 | 2004-03-29 | Method and device for controlling depth of field by using plural user interface markers |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/406,767 US6801717B1 (en) | 2003-04-02 | 2003-04-02 | Method and apparatus for controlling the depth of field using multiple user interface markers |
Publications (2)
Publication Number | Publication Date |
---|---|
US6801717B1 US6801717B1 (en) | 2004-10-05 |
US20040197096A1 true US20040197096A1 (en) | 2004-10-07 |
Family
ID=33029720
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/406,767 Expired - Lifetime US6801717B1 (en) | 2003-04-02 | 2003-04-02 | Method and apparatus for controlling the depth of field using multiple user interface markers |
Country Status (2)
Country | Link |
---|---|
US (1) | US6801717B1 (en) |
JP (1) | JP2004310086A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1902342A1 (en) * | 2005-07-11 | 2008-03-26 | Nokia Corporation | Improved system and method for exhibiting image focus information on a viewfinder |
US20090231454A1 (en) * | 2008-03-12 | 2009-09-17 | Canon Kabushiki Kaisha | Imaging apparatus and its control method |
US20130182167A1 (en) * | 2011-07-15 | 2013-07-18 | Hilti Aktiengesellschaft | Method and device for detecting an object in a substrate |
US8558923B2 (en) | 2010-05-03 | 2013-10-15 | Canon Kabushiki Kaisha | Image capturing apparatus and method for selective real time focus/parameter adjustment |
JP2018097380A (en) * | 2018-02-01 | 2018-06-21 | キヤノン株式会社 | Imaging device and control method of the same |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7038690B2 (en) * | 2001-03-23 | 2006-05-02 | Microsoft Corporation | Methods and systems for displaying animated graphics on a computing device |
WO2004072725A1 (en) * | 2003-02-12 | 2004-08-26 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Adjustment of an image recorder with dynamic measuring fields |
US7515816B2 (en) * | 2004-12-10 | 2009-04-07 | Casio Computer Co., Ltd. | Imaging apparatus having a focus function |
JP3817566B2 (en) * | 2005-01-24 | 2006-09-06 | キヤノン株式会社 | Imaging apparatus, imaging method, imaging program, and storage medium |
US9910341B2 (en) * | 2005-01-31 | 2018-03-06 | The Invention Science Fund I, Llc | Shared image device designation |
US9489717B2 (en) | 2005-01-31 | 2016-11-08 | Invention Science Fund I, Llc | Shared image device |
US9124729B2 (en) | 2005-01-31 | 2015-09-01 | The Invention Science Fund I, Llc | Shared image device synchronization or designation |
US9325781B2 (en) | 2005-01-31 | 2016-04-26 | Invention Science Fund I, Llc | Audio sharing |
US9451200B2 (en) | 2005-06-02 | 2016-09-20 | Invention Science Fund I, Llc | Storage access technique for captured data |
US9967424B2 (en) | 2005-06-02 | 2018-05-08 | Invention Science Fund I, Llc | Data storage usage protocol |
US10003762B2 (en) | 2005-04-26 | 2018-06-19 | Invention Science Fund I, Llc | Shared image devices |
US20070222865A1 (en) | 2006-03-15 | 2007-09-27 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Enhanced video/still image correlation |
US9819490B2 (en) | 2005-05-04 | 2017-11-14 | Invention Science Fund I, Llc | Regional proximity for shared image device(s) |
US9621749B2 (en) | 2005-06-02 | 2017-04-11 | Invention Science Fund I, Llc | Capturing selected image objects |
GB2431804B (en) * | 2005-10-31 | 2011-04-13 | Hewlett Packard Development Co | Image capture device and method of capturing an image |
JP4943769B2 (en) * | 2006-08-15 | 2012-05-30 | 富士フイルム株式会社 | Imaging apparatus and in-focus position search method |
CN101472064A (en) * | 2007-12-25 | 2009-07-01 | 鸿富锦精密工业(深圳)有限公司 | Filming system and method for processing scene depth |
US20090180771A1 (en) * | 2008-01-14 | 2009-07-16 | Ming-Chang Liu | Camera for shooting like a professional |
US7844174B2 (en) * | 2008-07-31 | 2010-11-30 | Fuji Xerox Co., Ltd. | System and method for manual selection of multiple evaluation points for camera control |
US8451683B2 (en) * | 2009-04-03 | 2013-05-28 | Exxonmobil Upstream Research Company | Method for determining the fluid/pressure distribution of hydrocarbon reservoirs from 4D seismic data |
JP5249146B2 (en) * | 2009-07-03 | 2013-07-31 | Fujifilm Corporation | Imaging control apparatus and method, and program |
JP2012160864A (en) * | 2011-01-31 | 2012-08-23 | Sanyo Electric Co Ltd | Imaging apparatus |
JP5848507B2 (en) * | 2011-03-08 | 2016-01-27 | キヤノン株式会社 | Image capturing apparatus and method with tracking function |
US9204047B2 (en) * | 2011-04-08 | 2015-12-01 | Nokia Technologies Oy | Imaging |
US8983176B2 (en) * | 2013-01-02 | 2015-03-17 | International Business Machines Corporation | Image selection and masking using imported depth information |
US9449234B2 (en) | 2014-03-31 | 2016-09-20 | International Business Machines Corporation | Displaying relative motion of objects in an image |
US9196027B2 (en) | 2014-03-31 | 2015-11-24 | International Business Machines Corporation | Automatic focus stacking of captured images |
US9300857B2 (en) | 2014-04-09 | 2016-03-29 | International Business Machines Corporation | Real-time sharpening of raw digital images |
US10419658B1 (en) | 2014-07-20 | 2019-09-17 | Promanthan Brains LLC, Series Point only | Camera optimizing for several directions of interest |
WO2016056089A1 (en) * | 2014-10-08 | 2016-04-14 | Hitachi Maxell, Ltd. | Camera, and image-capturing method |
US10009536B2 (en) * | 2016-06-12 | 2018-06-26 | Apple Inc. | Applying a simulated optical effect based on data received from multiple camera sensors |
CN107277378A (en) * | 2017-08-04 | 2017-10-20 | Vivo Mobile Communication Co., Ltd. | Image capturing method and mobile terminal |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4982217A (en) * | 1987-01-12 | 1991-01-01 | Canon Kabushiki Kaisha | Camera with automatic focusing apparatus |
US5103254A (en) * | 1990-05-29 | 1992-04-07 | Eastman Kodak Company | Camera with subject highlighting and motion detection |
US5187585A (en) * | 1989-08-19 | 1993-02-16 | Canon Kabushiki Kaisha | Image sensing apparatus with settable focus detection area |
US5412487A (en) * | 1991-11-27 | 1995-05-02 | Hitachi, Ltd. | Video camera and apparatus for extracting an object |
US5473369A (en) * | 1993-02-25 | 1995-12-05 | Sony Corporation | Object tracking apparatus |
US5526049A (en) * | 1992-02-29 | 1996-06-11 | Samsung Electronics Co., Ltd. | Circuit for automatically focusing a camera based on changes in luminance signals derived from optical signals received by the camera |
US5532782A (en) * | 1994-02-17 | 1996-07-02 | Nikon Corporation | Camera having a depth priority operation mode |
US5552823A (en) * | 1992-02-15 | 1996-09-03 | Sony Corporation | Picture processing apparatus with object tracking |
US5610653A (en) * | 1992-02-07 | 1997-03-11 | Abecassis; Max | Method and system for automatically tracking a zoomed video image |
US5739857A (en) * | 1990-02-08 | 1998-04-14 | Canon Kabushiki Kaisha | Image pickup device with settable image detecting region |
US5740477A (en) * | 1994-04-15 | 1998-04-14 | Asahi Kogaku Kogyo Kabushiki Kaisha | Multi-point object distance measuring device |
US5745175A (en) * | 1995-10-02 | 1998-04-28 | Flashpoint Technologies, Inc. | Method and system for providing automatic focus control for a still digital camera |
US6081670A (en) * | 1999-03-05 | 2000-06-27 | Lifetouch National School Studios Inc. | Depth-of-field indicator for a camera |
- 2003-04-02: US application US10/406,767 granted as US6801717B1 (status: not active, Expired - Lifetime)
- 2004-03-29: JP application JP2004094446A published as JP2004310086A (status: active, Pending)
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1902342A1 (en) * | 2005-07-11 | 2008-03-26 | Nokia Corporation | Improved system and method for exhibiting image focus information on a viewfinder |
EP1902342A4 (en) * | 2005-07-11 | 2011-01-12 | Nokia Corp | Improved system and method for exhibiting image focus information on a viewfinder |
US20090231454A1 (en) * | 2008-03-12 | 2009-09-17 | Canon Kabushiki Kaisha | Imaging apparatus and its control method |
US8330850B2 (en) * | 2008-03-12 | 2012-12-11 | Canon Kabushiki Kaisha | Apparatus and method for shooting a moving image and a still image simultaneously |
US8558923B2 (en) | 2010-05-03 | 2013-10-15 | Canon Kabushiki Kaisha | Image capturing apparatus and method for selective real time focus/parameter adjustment |
US20130182167A1 (en) * | 2011-07-15 | 2013-07-18 | Hilti Aktiengesellschaft | Method and device for detecting an object in a substrate |
US9398224B2 (en) * | 2011-07-15 | 2016-07-19 | Hilti Aktiengesellschaft | Method and device for detecting an object in a substrate |
JP2018097380A (en) * | 2018-02-01 | 2018-06-21 | キヤノン株式会社 | Imaging device and control method of the same |
Also Published As
Publication number | Publication date |
---|---|
JP2004310086A (en) | 2004-11-04 |
US6801717B1 (en) | 2004-10-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6801717B1 (en) | Method and apparatus for controlling the depth of field using multiple user interface markers |
TWI549501B (en) | Imaging device and control method thereof |
US7505679B2 (en) | Image-taking apparatus | |
US7466356B2 (en) | Method and apparatus for setting a marker on an object and tracking the position of the object | |
US7973848B2 (en) | Method and apparatus for providing composition information in digital image processing device | |
US8139136B2 (en) | Image pickup apparatus, control method of image pickup apparatus and image pickup apparatus having function to detect specific subject | |
US7668451B2 (en) | System for and method of taking image | |
JP2006211139A (en) | Imaging apparatus | |
US9131138B2 (en) | Photographing apparatus | |
KR20130031207A (en) | Image capturing apparatus and control method thereof | |
CN103227902A (en) | Imaging device, display control method, and program | |
US20100020198A1 (en) | Digital Still Camera and Method of Controlling Same | |
JP2010045625A (en) | Imaging apparatus | |
KR20100055938A (en) | Method and apparatus for displaying scene information, and digital photographing apparatus thereof | |
US9177395B2 (en) | Display device and display method for providing image display in first color mode and second color mode | |
EP2200275B1 (en) | Method and apparatus of displaying portrait on a display | |
JP4842919B2 (en) | Display device, photographing device, and display method | |
JP7380675B2 (en) | Image processing device, image processing method, program, imaging device | |
JP2005223658A (en) | Digital camera | |
JP2008054031A (en) | Digital camera and display control method | |
JP2006086758A (en) | Electronic camera | |
JP2009081823A (en) | Imaging apparatus and imaging control method | |
JP4442344B2 (en) | Digital camera | |
JP2009033386A (en) | Photographing device and method | |
JPH08205021A (en) | Image input device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HOFER, GREGORY V.;REEL/FRAME:013894/0288; Effective date: 20030325 |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
FPAY | Fee payment | Year of fee payment: 4 |
REMI | Maintenance fee reminder mailed | |
FPAY | Fee payment | Year of fee payment: 8 |
FPAY | Fee payment | Year of fee payment: 12 |