CN101141567B - Image capturing and displaying apparatus and image capturing and displaying method - Google Patents


Info

Publication number
CN101141567B
CN101141567B (application CN200710153620A)
Authority
CN
China
Prior art keywords
image, user, display, image acquisition, state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 200710153620
Other languages
Chinese (zh)
Other versions
CN101141567A (en)
Inventor
佐古曜一郎
鹤田雅明
伊藤大二
飞鸟井正道
海老泽观
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from Japanese patent application JP2006261975A (published as JP2008083289A)
Application filed by Sony Corp
Publication of CN101141567A
Application granted
Publication of CN101141567B


Abstract

An image capturing and displaying apparatus is disclosed. The image capturing and displaying apparatus includes an image capturing section, a display section, a user's information obtaining section, and a control section. The image capturing section captures an image such that a direction in which a user sees a subject is a direction of the subject. The display section is disposed in front of eyes of the user and displays the image captured by the image capturing section. The user's information obtaining section obtains information about a motion and a physical situation of the user. The control section controls an operation of the image capturing section or an operation of the display section corresponding to information obtained by the user's information obtaining section.

Description

Image acquisition and display device and image acquisition and display method
Cross-reference to related applications
The present invention contains subject matter related to Japanese patent application JP 2006-244685, filed with the Japan Patent Office on September 8, 2006, and Japanese patent application JP 2006-261975, filed on September 27, 2006, the entire contents of which are incorporated herein by reference.
Technical field
The present invention relates to an image acquisition and display device and an image acquisition and display method that are configured to capture an image of a subject in the direction in which the user sees the subject, and to display the captured image in front of the user's eyes when he or she wears the device as, for example, a glasses-type mounting unit or a helmet-type mounting unit.
Background technology
Japanese Patent Application Laid-Open Nos. HEI 8-126031, HEI 9-27970 and HEI 9-185009 propose various devices that use a glasses-type or helmet-type mounting unit having a display section that is disposed in front of the user's eyes and displays an image.
Summary of the invention
However, the devices of the related art do not control the image acquisition operation and the display operation from the particular viewpoint of assisting the user's vision and extending his or her visual ability.
In view of the foregoing, it is desirable to assist the user's vision and extend his or her visual ability. It is also desirable to control the display operation and the image acquisition operation appropriately so that the user's vision is assisted and his or her visual ability is extended in accordance with the user's situation (for example, his or her wishes, visual state, physical condition, and so forth). It is further desirable to control the display operation and the image acquisition operation appropriately so that these operations are performed in accordance with external circumstances (for example, the surrounding environment, the subject, the date and time, the position, and so forth).
According to an embodiment of the present invention, there is provided an image acquisition and display device. The device includes an image acquisition section, a display section, a user's information obtaining section, and a control section. The image acquisition section captures an image such that the direction in which the user sees a subject is the direction of the subject. The display section is disposed in front of the user's eyes and displays the image captured by the image acquisition section. The user's information obtaining section obtains information about the user's motion and physical condition. The control section controls the operation of the image acquisition section or the operation of the display section in accordance with the information obtained by the user's information obtaining section.
According to an embodiment of the present invention, there is provided an image acquisition and display method for an image acquisition and display device. The device includes an image acquisition section and a display section. The image acquisition section captures an image such that the direction in which the user sees a subject is the direction of the subject. The display section is disposed in front of the user's eyes and displays the image captured by the image acquisition section. Information about the user's motion or physical condition is obtained. The operation of the image acquisition section or the operation of the display section is controlled in accordance with the obtained information.
According to an embodiment of the present invention, there is provided an image acquisition and display device. The device includes an image acquisition section, a display section, an external information obtaining section, and a control section. The image acquisition section captures an image such that the direction in which the user sees a subject is the direction of the subject. The display section is disposed in front of the user's eyes and displays the image captured by the image acquisition section. The external information obtaining section obtains external information. The control section controls the operation of the image acquisition section or the operation of the display section in accordance with the information obtained by the external information obtaining section.
According to an embodiment of the present invention, there is provided an image acquisition and display method for an image acquisition and display device. The device includes an image acquisition section and a display section. The image acquisition section captures an image such that the direction in which the user sees a subject is the direction of the subject. The display section is disposed in front of the user's eyes and displays the image captured by the image acquisition section. External information is obtained. The operation of the image acquisition section or the operation of the display section is controlled in accordance with the obtained information.
According to the embodiments of the present invention, when the user wears the image acquisition and display device as, for example, a glasses-type or helmet-type mounting unit, he or she sees the display section disposed in front of his or her eyes. When the display section displays the image captured by the image acquisition section, the user can see, on the display section, an image of the scene captured in his or her normal visual direction.
In this case, although the user sees the scene in his or her normal visual direction through the image acquisition and display device of the embodiment, what he or she sees is the image displayed on the display section, as the scene of his or her normal field of view. When the display mode of the image displayed on the display section is changed in accordance with the user's situation, such as his or her wishes, visual state, or physical condition, the user's vision can be assisted or his or her visual ability can be extended.
When a telescopic (distant-view) image is displayed, for example, the user can see a distant scene that he or she normally cannot see. When a user with weak eyesight reads a book or a newspaper, if the display section magnifies the characters, he or she can see them clearly.
In other words, when the operation of the image acquisition section and the display mode of the display section are controlled in accordance with the user's situation, a visual situation in which the user feels comfortable can be provided.
Likewise, according to the embodiments of the present invention, when the user wears the image acquisition and display device as, for example, a glasses-type or helmet-type mounting unit, he or she sees the display section disposed in front of his or her eyes. When the display section displays the image captured by the image acquisition section, the user can see, on the display section, an image of the scene captured in his or her normal visual direction.
In this case, although the user sees the scene in his or her normal visual direction through the image acquisition and display device of the embodiment, what he or she sees is the image displayed on the display section, as the scene of his or her normal field of view. When the display mode of the image displayed on the display section is changed in accordance with external circumstances, such as the surrounding conditions or the situation of the subject, the user's vision can be assisted or his or her visual ability can be extended.
When a telescopic image is displayed, for example, the user can see a distant scene that he or she normally cannot see. When a user with weak eyesight reads a book or a newspaper, if the display section magnifies the characters and adjusts their brightness and contrast, he or she can see them clearly.
In other words, when the operation of the image acquisition section and the display mode of the displayed image are controlled in accordance with the external information, a visual situation in which the user feels comfortable or interested can be provided.
According to the embodiments of the present invention, the image captured by the image acquisition section, that is, the image captured in the user's visual direction as the subject direction, is displayed by the display section disposed in front of the user. When the operation of the image acquisition section or the operation of the display section is controlled in accordance with information about the user's motion or physical condition, his or her visual ability can be substantially assisted and extended.
Since the display mode is changed by controlling the image acquisition section or the display section in accordance with the user's wishes, or with a situation determined from information representing his or her motion or physical condition, no operational burden is placed on the user. In addition, since the image acquisition section and the display section are controlled appropriately, the image acquisition and display device is highly user-friendly.
Furthermore, since the display section can be placed in a through state, that is, made transparent or translucent, instead of displaying the image captured by the image acquisition section, the user can live without any problem while wearing the image acquisition and display device. Thus the benefits of the embodiments of the present invention can be obtained effectively in the user's everyday life.
According to the embodiments of the present invention, the image captured by the image acquisition section, that is, the image captured in the user's visual direction as the subject direction, is displayed by the display section disposed in front of the user. When the operation of the image acquisition section or the operation of the display section is controlled in accordance with external information, his or her visual ability can be substantially assisted and extended.
Since the display mode is changed by controlling the image acquisition section or the display section in accordance with the surrounding environment, the type of subject, its situation, and so forth, as determined from the external information, no operational burden is placed on the user. In addition, since the image acquisition section and the display section are controlled appropriately, the image acquisition and display device is highly user-friendly.
Furthermore, since the display section can be placed in the through state, that is, made transparent or translucent, instead of displaying the image captured by the image acquisition section, the user can live without any problem while wearing the image acquisition and display device. Thus the benefits of the embodiments of the present invention can be obtained effectively in the user's everyday life.
These and other objects, features and advantages of the present invention will become more apparent from the following detailed description of the best mode embodiments thereof, as illustrated in the accompanying drawings.
Description of drawings
Fig. 1 is a schematic diagram showing an exemplary appearance of an image acquisition and display device according to an embodiment of the present invention;
Fig. 2 is a block diagram showing an image acquisition and display device according to a first embodiment of the present invention;
Figs. 3A, 3B and 3C are schematic diagrams showing a through state, a normally captured image display state, and a telescopic image display state, respectively, according to an embodiment of the present invention;
Figs. 4A and 4B are schematic diagrams showing a through state and a wide-angle zoom image display state, respectively, according to an embodiment of the present invention;
Figs. 5A and 5B are schematic diagrams showing a normally captured image display state/through state and a magnified image display state, respectively, according to an embodiment of the present invention;
Figs. 6A and 6B are schematic diagrams showing a normally captured image display state/through state and an adjusted image display state, respectively, according to an embodiment of the present invention;
Figs. 7A and 7B are schematic diagrams showing a normally captured image display state/through state and a captured image display state with increased infrared sensitivity, respectively, according to an embodiment of the present invention;
Figs. 8A and 8B are schematic diagrams showing a normally captured image display state/through state and a captured image display state with increased ultraviolet sensitivity, respectively, according to an embodiment of the present invention;
Figs. 9A, 9B and 9C are schematic diagrams showing a through state, a two-split image display state, and a four-split image display state, respectively, according to an embodiment of the present invention;
Fig. 10 is a flow chart showing a control process according to the first embodiment of the present invention;
Figs. 11A and 11B are flow charts showing a monitor display start trigger determination process according to the first embodiment of the present invention;
Figs. 12A and 12B are flow charts showing an image control trigger determination process according to the first embodiment of the present invention;
Figs. 13A and 13B are flow charts showing an image control trigger determination process according to the first embodiment of the present invention;
Fig. 14 is a flow chart showing an image control trigger determination process according to the first embodiment of the present invention;
Fig. 15 is a flow chart showing an image control trigger determination process according to the first embodiment of the present invention;
Fig. 16 is a flow chart showing an image control trigger determination process according to the first embodiment of the present invention;
Figs. 17A and 17B are flow charts showing an image control trigger determination process according to the first embodiment of the present invention;
Figs. 18A and 18B are flow charts showing a monitor display completion trigger determination process according to the first embodiment of the present invention;
Fig. 19 is a flow chart showing a monitor display completion trigger determination process according to the first embodiment of the present invention;
Fig. 20 is a block diagram showing an image acquisition and display device according to a second embodiment of the present invention;
Figs. 21A and 21B are schematic diagrams showing an unadjusted image display state and an adjusted image display state, respectively, according to the second embodiment of the present invention;
Figs. 22A and 22B are schematic diagrams showing a highlighted image display state according to the second embodiment of the present invention;
Fig. 23 is a flow chart showing an image control trigger determination process according to the second embodiment of the present invention;
Figs. 24A and 24B are flow charts showing an image control trigger determination process according to the second embodiment of the present invention;
Fig. 25 is a flow chart showing an image control trigger determination process according to the second embodiment of the present invention;
Fig. 26 is a flow chart showing an image control trigger determination process according to the second embodiment of the present invention;
Fig. 27 is a flow chart showing an image control trigger determination process according to the second embodiment of the present invention;
Figs. 28A and 28B are flow charts showing an image control trigger determination process according to the second embodiment of the present invention;
Figs. 29A and 29B are flow charts showing an image control trigger determination process according to the second embodiment of the present invention; and
Figs. 30A and 30B are flow charts showing an image control trigger determination process according to the second embodiment of the present invention.
Embodiment
Next, an image acquisition and display device and an image acquisition and display method according to a first embodiment of the present invention will be described in the following order.
[1. Exemplary appearance of the image acquisition and display device]
[2. Exemplary structure of the image acquisition and display device]
[3. Exemplary display images]
[4. Determination of the user's situation]
[5. Exemplary operations]
[6. Effects, modifications and extensions of the first embodiment]
[1. Exemplary appearance of the image acquisition and display device]
Fig. 1 shows an exemplary appearance of the image acquisition and display device 1 according to the first embodiment of the present invention, configured as a glasses-type display camera. The image acquisition and display device 1 has a semicircular mounting unit that extends around the user's head from both temporal regions to the occipital region. As shown in Fig. 1, the user wears the image acquisition and display device 1 by hanging predetermined portions of it on both auricles of his or her ears.
In the worn state shown in Fig. 1, a pair of display sections 2 for the left eye and the right eye are disposed immediately in front of the user's eyes, that is, at the positions of the lenses of ordinary glasses. The display sections 2 are composed of, for example, liquid crystal panels. By controlling their transmissivity, the display sections 2 can be placed in a through state as shown in Fig. 1, that is, made transparent or translucent. When the display sections 2 are in the through state, the device does not interfere with the user's normal life even if he or she wears the image acquisition and display device 1 continuously like glasses.
In the state in which the user wears the image acquisition and display device 1, an image acquisition lens 3a is disposed facing forward, so that it captures an image of a subject in the user's visual direction as the subject direction.
In addition, a light emitting section 4a that illuminates the image acquisition direction of the image acquisition lens 3a is provided. The light emitting section 4a is composed of, for example, an LED (light emitting diode).
A pair of earphone speakers 5a, which can be inserted into the user's left and right ear holes in the worn state of the image acquisition and display device 1, are also provided (only the left earphone speaker 5a is shown in Fig. 1).
In addition, microphones 6a and 6b that collect external sound are disposed to the right of the right-eye display section 2 and to the left of the left-eye display section 2.
Fig. 1 is merely an example, and many structures by which the user can wear the image acquisition and display device 1 are conceivable. In general, it suffices that the image acquisition and display device 1 is formed as a glasses-type or helmet-type mounting unit and that, at least according to this embodiment, the display sections 2 are disposed immediately in front of the user's eyes and the image acquisition direction of the image acquisition lens 3a is the user's visual direction, that is, the user's forward direction; the structure is not limited to that shown in Fig. 1. In addition, although the structure shown has two display sections 2 corresponding to the user's two eyes, a single display section 2 may be provided for one eye.
Likewise, the earphone speakers 5a need not be left and right stereo speakers; a single earphone speaker may be provided for one of the user's ears. Similarly, only one of the microphones 6a and 6b may be provided. Moreover, the image acquisition and display device 1 may be configured without any microphone or earphone speaker.
Likewise, the image acquisition and display device 1 may be configured without the light emitting section 4a.
[2. Exemplary structure of the image acquisition and display device]
Fig. 2 shows an exemplary internal structure of the image acquisition and display device 1.
A system controller 10 is composed of a microcomputer that includes, for example, a CPU (central processing unit), a ROM (read only memory), a RAM (random access memory), a nonvolatile memory section and an interface section. The system controller 10 is a control section that controls all the parts of the image acquisition and display device 1.
The system controller 10 controls each part of the image acquisition and display device 1 in accordance with the user's situation. In other words, the system controller 10 operates according to an operating program that detects and determines the user's situation, and operates and controls each part in accordance with the determined result. Thus, functionally, the system controller 10 has a user situation determining function 10a that determines the user's situation and an operation control function 10b that controls and commands each part in accordance with the determined result, as shown in Fig. 2.
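As a rough, purely illustrative sketch of this division of labor, the two functions of the system controller 10 might be modeled as below. The patent specifies no implementation; every name, situation label and command in this sketch is a hypothetical placeholder.

```python
# Hypothetical sketch of system controller 10: a user-situation
# determining function (10a) feeding an operation control function (10b).
# Situations, triggers and commands are illustrative, not from the patent.

def determine_user_situation(user_info):
    """Function 10a: infer the user's situation from sensor-derived info."""
    if user_info.get("gaze_fixed") and user_info.get("squinting"):
        return "wants_magnification"
    if user_info.get("walking"):
        return "moving"
    return "normal"

def control_operations(situation):
    """Function 10b: issue commands toward the image acquisition control
    section and the display control section for the determined situation."""
    if situation == "wants_magnification":
        return {"capture": "zoom_in", "display": "monitor"}
    if situation == "moving":
        return {"capture": "standby", "display": "through"}
    return {"capture": "normal", "display": "monitor"}

def control_cycle(user_info):
    """One pass of the controller: determine the situation, then control."""
    return control_operations(determine_user_situation(user_info))
```

The point of the split is that 10a only interprets sensor data, while 10b only maps a determined situation to commands, mirroring the two functional blocks named in the text.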
The image acquisition and display device 1 includes, as the structure for capturing an image in front of the user, an image acquisition section 3, an image acquisition control section 11 and a captured image signal processing section 15.
The image acquisition section 3 includes a lens system having the image acquisition lens 3a (shown in Fig. 1), an aperture, a zoom lens, a focus lens and so forth; a drive system that causes the lens system to perform focusing and zoom operations; and a solid-state image sensor array that detects the light for the captured image obtained by the lens system, converts the light into electricity and generates a captured image signal corresponding to the electricity. The solid-state image sensor array is, for example, a CCD (charge coupled device) sensor array or a CMOS (complementary metal oxide semiconductor) sensor array.
The captured image signal processing section 15 includes a sample-hold/AGC (automatic gain control) circuit, which adjusts the gain of the signal obtained by the solid-state image sensor array in the image acquisition section 3 and trims the waveform of the signal, and a video A/D converter. The captured image signal processing section 15 thereby obtains the captured image signal as digital data, and performs white balance processing, brightness processing, color signal processing, vibration correction processing and so forth on the captured image signal.
The image acquisition control section 11 controls the operations of the image acquisition section 3 and the captured image signal processing section 15 in accordance with commands received from the system controller 10. For example, the image acquisition control section 11 turns the operations of the image acquisition section 3 and the captured image signal processing section 15 on and off. In addition, the image acquisition control section 11 controls the image acquisition section 3 (through motors) to perform an autofocus operation, an automatic exposure adjustment operation, an aperture adjustment operation, a zoom operation and so forth.
In addition, the image acquisition control section 11 includes a timing generator. With timing signals generated by the timing generator, the image acquisition control section 11 controls the solid-state image sensor array and the sample-hold/AGC circuit and video A/D converter of the captured image signal processing section 15. The image acquisition control section 11 can also use the timing signals to change the frame rate of the captured image.
In addition, the image acquisition control section 11 controls the image acquisition sensitivity and the signal processing of the solid-state image sensor array and the captured image signal processing section 15. To control the image acquisition sensitivity, the image acquisition control section 11 controls, for example, the gain of the signal read from the solid-state image sensor array, the black level setting, various coefficients of the digital processing of the captured image signal, the correction amount of the vibration correction processing, and so forth. With regard to the image acquisition sensitivity adjustment, the image acquisition control section 11 can perform an overall sensitivity adjustment independent of wavelength band, as well as a sensitivity adjustment for a particular wavelength band such as the infrared or ultraviolet region. A wavelength-specific sensitivity adjustment can be performed by inserting a wavelength filter into the image acquisition lens system, or by performing a wavelength filtering computation on the captured image signal. In these cases, the image acquisition control section 11 can control the sensitivity by inserting a wavelength filter and/or by designating filter computation coefficients.
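The two kinds of sensitivity adjustment described above, an overall gain/black-level adjustment and a wavelength-specific filter computation, can be sketched as follows. The helper names and the clamping to an 8-bit range are assumptions for illustration only, not details from the patent.

```python
def adjust_sensitivity(samples, gain=1.0, black_level=0):
    """Overall sensitivity adjustment: subtract a black level and apply
    a gain to each sensor reading, clamping to the 8-bit range 0..255."""
    return [max(0, min(255, round((s - black_level) * gain))) for s in samples]

def apply_wavelength_filter(bands, coeffs):
    """Wavelength-specific adjustment: weight each band (e.g. infrared,
    visible, ultraviolet) by a designated filter computation coefficient."""
    return {name: value * coeffs.get(name, 1.0) for name, value in bands.items()}
```

For instance, designating a coefficient above 1.0 for an "ir" band would correspond to raising infrared sensitivity, as in the infrared display state of Figs. 7A and 7B.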
As the structure for displaying data to the user, the image acquisition and display device 1 includes the display sections 2, a display image processing section 12, a display driving section 13 and a display control section 14.
The captured image signal, captured by the image acquisition section 3 and then processed by the captured image signal processing section 15, is supplied to the display image processing section 12. The display image processing section 12 is, for example, a so-called video processor, and performs various types of display processing on the supplied captured image signal. For example, the display image processing section 12 can perform brightness level adjustment, color correction, contrast adjustment, sharpness (edge enhancement) adjustment and so forth on the captured image signal. In addition, the display image processing section 12 can generate a magnified image in which a part of the captured image signal is magnified, or a reduced image; separate images for split display; combine images; generate character images and graphic images; and superimpose a generated image on the captured image. In other words, the display image processing section 12 can perform various types of processing on the digital image signal serving as the captured image signal.
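Two of the listed operations, brightness level adjustment and magnified-image generation, can be sketched on a toy grayscale image. This is an illustration only; the patent does not describe the processing at this level, and the nearest-neighbour method is an assumption.

```python
def adjust_brightness(pixels, delta):
    """Brightness level adjustment on an 8-bit grayscale image given
    as a list of rows, clamping each pixel to 0..255."""
    return [[max(0, min(255, p + delta)) for p in row] for row in pixels]

def magnify_2x(pixels):
    """Nearest-neighbour 2x magnification, a stand-in for the
    'magnified image' generation mentioned in the text."""
    out = []
    for row in pixels:
        wide = [p for p in row for _ in (0, 1)]  # double each column
        out.append(wide)
        out.append(list(wide))                   # double each row
    return out
```

A magnification of this kind is what would let the display sections enlarge the characters of a book or newspaper, as in the example given earlier.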
The display driving section 13 is composed of a pixel driving circuit that displays the image signal supplied from the display image processing section 12 on the display sections 2, which are, for example, liquid crystal displays. In other words, the display driving section 13 applies driving signals based on the image signal to the pixels arranged in a matrix in the display sections 2 at predetermined horizontal/vertical driving timings, causing the display sections 2 to display the image. In addition, the display driving section 13 can control the transmissivity of each pixel to place the display sections 2 in the through state.
The display control section 14 controls the processing and operation of the display image processing section 12 and the operation of the display driving section 13 in accordance with commands received from the system controller 10. In other words, the display control section 14 causes the display image processing section 12 to perform the various types of processing described above, and controls the display driving section 13 to switch the display sections 2 between the through state and the image display state.
In the following description, the state in which the display sections 2 are transparent or translucent is referred to as the "through state", and the operation (and state) in which the display sections 2 display an image is referred to as the "monitor display state".
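The switch between these two states amounts to either driving the pixels with the image signal or raising the panel transmissivity. A minimal sketch, with a hypothetical class and attribute names not taken from the patent:

```python
class DisplayDriver:
    """Sketch of the display driving section 13: toggles the display
    sections 2 between the monitor display state and the through state."""

    def __init__(self):
        self.state = "through"
        self.transmissivity = 1.0  # fully transparent

    def show_image(self):
        """Enter the monitor display state: pixels driven by the image signal."""
        self.state = "monitor"
        self.transmissivity = 0.0

    def go_through(self):
        """Enter the through state: panel made transparent or translucent."""
        self.state = "through"
        self.transmissivity = 1.0
```

Starting in the through state matches the text's premise that the worn device should not interfere with normal life until a display is requested.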
The image acquisition and display device 1 further includes a sound input section 6, a sound signal processing section 16 and a sound output section 5.
The sound input section 6 includes the microphones 6a and 6b shown in Fig. 1, and a microphone amplifier section that processes the sound signals obtained by the microphones 6a and 6b.
The sound signal processing section 16 includes, for example, an A/D converter, a digital signal processor and a D/A converter. The sound signal processing section 16 converts the sound signal supplied from the sound input section 6 into digital data, and performs processing such as volume adjustment, sound quality adjustment and acoustic effects under the control of the system controller 10. It then converts the resulting sound signal into an analog signal and supplies it to the sound output section 5. The sound signal processing section 16 is not limited to a structure that performs digital signal processing; it may perform signal processing with an analog amplifier and an analog filter.
The sound output section 5 includes the pair of earphone speakers 5a shown in Fig. 1 and an amplifier circuit for the earphone speakers 5a.
The sound input section 6, the sound signal processing section 16 and the sound output section 5 allow the user to hear external sound through the image acquisition and display device 1.
The sound output section 5 may be structured as a so-called bone conduction speaker.
In addition, the image acquisition and display device 1 includes a lighting section 4 and a lighting control section 18. The lighting section 4 includes the light emitting section 4a shown in Fig. 1 (for example, a light emitting diode) and a lighting circuit that causes the light emitting section 4a to emit light. The lighting control section 18 causes the lighting section 4 to perform a light emitting operation in accordance with commands supplied from the system controller 10.
Since the light emitting section 4a of the lighting section 4 is disposed so as to illuminate the forward direction, the lighting section 4 performs the lighting operation in the user's visual direction.
As structures that obtain user's information, the image capturing and displaying apparatus 1 includes a visual sensor 19, an acceleration sensor 20, a gyroscope 21, a biological sensor 22, and an input section 17.
The visual sensor 19 detects information about the user's vision. The visual sensor 19 is a sensor that can detect visual information about the user, such as the line of sight direction, focal distance, pupil dilation, fundus pattern, and eyelid opening/closing.
The acceleration sensor 20 and the gyroscope 21 output signals corresponding to the user's motion. They are sensors that detect motions of the user's head, neck, whole body, arms, legs, and so forth.
The biological sensor 22 detects the user's biological information. The biological sensor 22 is a sensor that detects, for example, heart rate information, pulse information, perspiration information, brain wave information, galvanic skin response (GSR), body temperature, blood pressure, respiratory activity information, and so forth of the user. The detection signals of the biological sensor 22 serve as information with which a tense state, an excited state, a calm state, a drowsy state, a comfortable state, an uncomfortable state, and so forth can be determined.
The input section 17 is a section with which the user manually inputs information. Formed in the input section 17 is, for example, a switch with which the user can input his or her eyesight information.
With the visual sensor 19, the acceleration sensor 20, the gyroscope 21, the biological sensor 22, and the input section 17, information about the motion or physical situation of the user who wears the image capturing and displaying apparatus 1 is obtained as user's information and supplied to the system controller 10.
In the process of the user's situation determination function 10a, the system controller 10 determines the user's wish or situation corresponding to the obtained user's information. In the process of the operation control function 10b, the system controller 10 controls the image capturing operation and the display operation corresponding to the determined user's wish or situation. In other words, the system controller 10 commands the image capturing control section 11 to control the operation of the captured image signal processing section 15, and commands the display control section 14 to control the operations of the display image processing section 12 and the display driving section 13.
The visual sensor 19, the acceleration sensor 20, the gyroscope 21, the biological sensor 22, and the input section 17 have been exemplified as structures that obtain user's information in the image capturing and displaying apparatus 1. However, the image capturing and displaying apparatus 1 does not need to include all of these sensors. In addition, the image capturing and displaying apparatus 1 may include other sensors, such as a sensor that detects the user's voice and a sensor that detects the motion of the user's lips.
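The flow of user's information from the sensors to the system controller 10 can be pictured as a simple aggregation step. The following is a minimal sketch, not part of the patent itself; all names (`UserInfo`, `read_sensors`) and the fixed values are hypothetical illustrations of how readings from the sensors described above might be bundled before the user's situation determination function 10a examines them.

```python
from dataclasses import dataclass

@dataclass
class UserInfo:
    """One snapshot of user's information (hypothetical record)."""
    gaze_direction: str      # e.g. "center", "down" (visual sensor 19)
    focal_distance_m: float  # focal distance of the eyes (visual sensor 19)
    head_accel: float        # acceleration signal (acceleration sensor 20)
    head_angular_vel: float  # angular velocity signal (gyroscope 21)
    pulse_bpm: float         # pulse information (biological sensor 22)

def read_sensors() -> UserInfo:
    # In the real apparatus these values would come from the hardware;
    # fixed values stand in for sensor readings here.
    return UserInfo("down", 0.3, 0.1, 0.05, 72.0)

info = read_sensors()
# The system controller would pass this record to the user's
# situation determination function 10a.
print(info.gaze_direction, info.pulse_bpm)
```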
[3. Exemplary display images]
The system controller 10 controls the image capturing operation and the display operation corresponding to the user's wish or situation. As a result, the user recognizes various display modes on the display section 2. Fig. 3A to Fig. 3C through Fig. 9A to Fig. 9C exemplify various display modes.
Fig. 3A shows a state in which the display section 2 is in the through state. In other words, in this state, the display section 2 is a simple transparent plate-shaped member, and the user can see the scene of his or her field of view through the transparent display section 2.
Fig. 3B shows a state in which the image captured by the image capturing section 3 is displayed on the display section 2 operating in the monitor display state. In the state shown in Fig. 3A, the image capturing section 3, the captured image signal processing section 15, the display image processing section 12, and the display driving section 13 operate so that they normally display the captured image on the display section 2. In this case, the captured image (normally captured image) displayed on the display section 2 is almost the same as the image that appears on the display section 2 in the through state. In other words, in this state, the user sees his or her normal field of view as a captured image.
Fig. 3C shows a state in which the system controller 10 causes the image capturing section 3, through the image capturing control section 11, to capture a telephoto image, and the telephoto image is displayed on the display section 2.
In contrast, when the system controller 10 causes the image capturing section 3 to capture a wide-angle image through the image capturing control section 11, a short-distance wide-angle image is displayed on the display section 2 (not shown). Although the image capturing section 3 performs the telephoto and wide-angle controls by driving the zoom lens of the image capturing section 3, the captured image signal processing section 15 may instead perform these controls by signal processing.
Fig. 4A shows a state in which the display section 2 is in the through state, for example while the user is reading a newspaper.
Fig. 4B shows a so-called wide-angle zoom state. In other words, Fig. 4B shows a state in which a short-focal-distance zoom image is captured and displayed on the display section 2 so that, for example, the characters of the newspaper are enlarged.
Fig. 5A shows a state in which the display section 2 displays a normally captured image or the display section 2 is in the through state.
At this point, when the system controller 10 commands the display image processing section 12, through the display control section 14, to perform an image enlargement process, the enlarged image shown in Fig. 5B is displayed on the display section 2.
Fig. 6A shows a state in which the display section 2 displays a normally captured image or the display section 2 is in the through state. In particular, Fig. 6A shows a state in which the user is reading a newspaper or a book. In this case, it is assumed that since the surroundings are dim, the user cannot see the characters of the newspaper or the like with the normally captured image or with the display section 2 in the through state.
In this case, the system controller 10 commands the image capturing control section 11 (the image capturing section 3 and the captured image signal processing section 15) to increase the image capturing sensitivity, and/or causes the display control section 14 (the display image processing section 12 and the display driving section 13) to increase the brightness and adjust the contrast and sharpness, so that an image sharper than that shown in Fig. 6A is displayed on the display section 2, as shown in Fig. 6B. Instead, when the system controller 10 causes the lighting section 4 to perform the lighting operation, a sharp image can likewise be displayed on the display section 2.
Fig. 7A shows a state in which the display section 2 displays a normally captured image or the display section 2 is in the through state. In this case, the user is in a dark bedroom where a child is sleeping. Since the user is in the dark room, he or she cannot clearly see the child with the normally captured image or with the display section 2 in the through state.
At this point, when the system controller 10 commands the image capturing control section 11 (the image capturing section 3 and the captured image signal processing section 15) to increase the infrared image capturing sensitivity, the captured infrared image shown in Fig. 7B is displayed on the display section 2, so that the user can see the child's sleeping face and so forth.
Fig. 8A shows a state in which the display section 2 displays a normally captured image or the display section 2 is in the through state.
When the system controller 10 commands the image capturing control section 11 (the image capturing section 3 and the captured image signal processing section 15) to increase the ultraviolet image capturing sensitivity, a captured image having an ultraviolet component, as shown in Fig. 8B, is displayed on the display section 2.
Fig. 9A shows a state in which the display section 2 is in the through state.
When the system controller 10 commands the display control section 14 (the display image processing section 12 and the display driving section 13) to separately display an image, or to separately display an image and a partially enlarged image, the image shown in Fig. 9B can be displayed on the display section 2. In other words, the screen of the display section 2 is separated into areas AR1 and AR2, where the area AR1 is in the through state or the normal image display state and the area AR2 is in the enlarged image display state.
Fig. 9C shows another exemplary separate display. In this case, the screen of the display section 2 is separated into areas AR1, AR2, AR3, and AR4, and these areas display frames of an image captured at intervals of a predetermined time period. The system controller 10 causes the display image processing section 12 to extract one frame from the captured image signal at intervals of 0.5 seconds and to display the extracted frames in the order of the areas AR1, AR2, AR3, AR4, AR1, AR2, and so forth. In this case, images are separately displayed on the display section 2 in a so-called strobe display mode.
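The assignment of frames to the four areas in the Fig. 9C example follows directly from the description above: one frame is taken every 0.5 seconds and the areas are used cyclically. The following is a minimal sketch of that schedule, with hypothetical function and parameter names (times in seconds); it illustrates the cycling, not the apparatus's actual implementation.

```python
def strobe_area_schedule(duration_s: float, interval_s: float = 0.5, areas: int = 4):
    """Return (capture_time, area_index) pairs for the separate display.

    One frame is extracted every `interval_s` seconds and shown in the
    areas AR1..AR4 in cyclic order, as in the Fig. 9C example.
    """
    schedule = []
    t, i = 0.0, 0
    while t < duration_s:
        schedule.append((round(t, 1), i % areas))  # area 0 -> AR1, 1 -> AR2, ...
        t += interval_s
        i += 1
    return schedule

# Frames at 0.0, 0.5, 1.0, ... go to AR1, AR2, AR3, AR4, AR1, ...
print(strobe_area_schedule(3.0))
```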
Various types of display images have been exemplified above. In this embodiment, various display modes can be accomplished by controlling the processes and operations of the image capturing section 3, the captured image signal processing section 15, the display image processing section 12, and the display driving section 13.
For example, many types of display modes are contemplated, such as a telephoto display mode, a wide-angle display mode, a display mode in which the range is changed from the telephoto display mode to the wide-angle display mode, an image enlargement display mode, an image reduction display mode, a variable frame rate display mode (an image captured at a high frame rate or the like), a high brightness display mode, a low brightness display mode, a variable contrast display mode, a variable sharpness display mode, an increased sensitivity captured image display mode, an increased infrared sensitivity captured image display mode, an increased ultraviolet sensitivity captured image display mode, an image effect display mode (such as a mosaic image, a brightness inverted image, a soft focus image, an image with a partially highlighted portion of the screen, an image with a variable color atmosphere, and so forth), a slow display mode, a frame-by-frame display mode, a separate display mode in which these display modes are combined, a separate display mode in which the through state and a captured image are combined, a strobe display mode, a still image display mode in which one frame of the captured image is displayed, and so forth.
[4. Determination of user's situation]
As described above, as structures that obtain user's information, the image capturing and displaying apparatus 1 according to this embodiment includes the visual sensor 19, the acceleration sensor 20, the gyroscope 21, the biological sensor 22, and the input section 17.
The visual sensor 19 detects information about the user's vision. The visual sensor 19 may include, for example, an image capturing section that is disposed near one of the display sections 2 and captures an image of the user's eyes. The system controller 10 acquires the image of the user's eyes captured by this image capturing section. The user's situation determination function 10a analyzes the image and detects the line of sight direction, focal distance, pupil dilation, fundus pattern, eyelid opening/closing, and so forth corresponding to the analysis result. Thus, the user's situation determination function 10a can determine the user's situation and wish corresponding to the detected result.
Instead, the visual sensor 19 may include a light emitting section that is disposed near one of the display sections 2 and emits light toward the user's eyes, and a light receiving section that receives light reflected from the eyes. By detecting, for example, the thickness of the lenses of the user's eyes with a signal corresponding to the received light, the focal distance of the user's eyes can be detected.
By detecting the line of sight direction of the user's eyes, the system controller 10 can determine the portion of the image displayed on the display section 2 on which the user is focusing.
In addition, the system controller 10 can recognize the line of sight direction of the user's eyes as an operation input. For example, when the user moves his or her line of sight to the left and to the right, the system controller 10 can recognize these motions as predetermined operation inputs to the image capturing and displaying apparatus 1.
By detecting the focal distance of the user's eyes, the system controller 10 can determine whether the scene on which the user is focusing is far away or nearby. The system controller 10 can perform zoom control, enlargement control, reduction control, and so forth corresponding to the determined result. For example, when the user sees a distant scene, the system controller 10 can perform the telephoto display operation.
When the pupil dilation of the user is detected in the through state, the brightness of the surroundings can be determined. When the pupil dilation is detected in the monitor display state, it can be determined that the user feels that the displayed image is glaring, and so forth. The brightness, the image capturing sensitivity, and so forth can be adjusted corresponding to the determined result.
When the fundus pattern of the user is detected, the user can be authenticated corresponding to the detected result. Since the fundus pattern is unique to each user, the user who wears the image capturing and displaying apparatus 1 can be identified, and the image capturing and displaying apparatus 1 can be controlled corresponding to the identified result. Instead, the system controller 10 may perform the monitor display control only for a predetermined user.
When the opening/closing operations of the user's eyelids are detected, the glare and eye fatigue of the user can be determined. In addition, the opening/closing operations of the eyelids can be recognized as deliberate operation inputs of the user. For example, when the user has performed the opening/closing operations of the eyelids three times, these motions can be determined as a predetermined operation input.
The acceleration sensor 20 and the gyroscope 21 output signals corresponding to the motion of the user. The acceleration sensor 20 detects, for example, motion in a linear direction. The gyroscope 21 suitably detects motion and vibration of a rotational system.
Depending on the positions at which the acceleration sensor 20 and the gyroscope 21 are disposed in the image capturing and displaying apparatus 1, they can detect the motion of the whole body of the user or the motion of each part of his or her body.
When the acceleration sensor 20 and the gyroscope 21 are disposed in the glasses-type image capturing and displaying apparatus 1 shown in Fig. 1, that is, when the acceleration sensor 20 and the gyroscope 21 detect the motion of the user's head, the information of the acceleration sensor 20 serves as acceleration information about the motion of the user's head or whole body. In this case, the information of the gyroscope 21 serves as angular velocity and vibration information about the motion of the user's head or whole body.
Thus, the motion of the user's head moved from his or her neck can be detected. For example, a state in which the user looks up and a state in which he or she looks down can be determined. When the user looks down, it can be determined that he or she is seeing a nearby subject, for example reading a book or the like. In contrast, when the user looks up, it can be determined that he or she is seeing a distant subject.
When the system controller 10 has detected a motion of the user's head moved from his or her neck, the system controller 10 can recognize it as a deliberate motion of the user. For example, if the user shakes his or her neck twice to the left, the system controller 10 can determine this motion as a predetermined operation input.
Corresponding to the information of the acceleration sensor 20 and the gyroscope 21, it can be determined whether the user is in a stop state (non-walking state), a walking state, or a running state. In addition, the acceleration sensor 20 and the gyroscope 21 can detect a change from a standing state to a sitting state, or a change from a sitting state to a standing state.
When the acceleration sensor 20 and the gyroscope 21 are separated from the head mounting unit and disposed at an arm or a leg of the user, they can detect the motion of only the arm or the leg.
The biological sensor 22 detects, for example, heart rate information (heart rate), pulse information (pulse rate), perspiration information, brain wave information (for example, information of alpha waves, beta waves, theta waves, and delta waves), galvanic skin response, body temperature, blood pressure, respiratory activity (for example, respiratory rate, depth, and lung capacity), and so forth as the user's biological information. The system controller 10 can determine whether the user is in a tense state, an excited state, an emotionally calm state, a comfortable state, or an uncomfortable state corresponding to the detected information.
Whether the user has put on the image capturing and displaying apparatus 1 can be determined corresponding to the detected biological information. For example, when the user has not put on the image capturing and displaying apparatus 1, the system controller 10 may control the image capturing and displaying apparatus 1 so that it operates in a standby state in which only biological information is detected. When the system controller 10 has detected, corresponding to the detected biological information, that the user has put on the image capturing and displaying apparatus 1, the system controller 10 can turn on the power of the image capturing and displaying apparatus 1. In contrast, when the user has taken off the image capturing and displaying apparatus 1, the system controller 10 can restore the image capturing and displaying apparatus 1 to the standby state.
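The standby/power-on behavior described here amounts to a small state machine keyed to whether the biological sensor 22 currently detects a living body. A hedged sketch follows, with hypothetical names (`PowerState`, `has_pulse`); the actual determination in the apparatus would combine pulse, brain waves, galvanic skin response, and so forth rather than a single flag.

```python
from enum import Enum

class PowerState(Enum):
    STANDBY = "standby"   # only biological information is detected
    ON = "on"             # normal operation

def next_power_state(state: PowerState, has_pulse: bool) -> PowerState:
    # Putting the apparatus on (a living body is detected) turns the
    # power on; taking it off restores the standby state.
    if state is PowerState.STANDBY and has_pulse:
        return PowerState.ON
    if state is PowerState.ON and not has_pulse:
        return PowerState.STANDBY
    return state

state = PowerState.STANDBY
state = next_power_state(state, has_pulse=True)   # user puts the apparatus on
print(state)  # PowerState.ON
```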
In addition, the information detected by the biological sensor 22 can be used to authenticate the user (identify the user who has put on the image capturing and displaying apparatus 1).
When the biological sensor 22 is disposed in the mounting frame of the glasses-type image capturing and displaying apparatus 1, the foregoing information is detected at a side head portion or a rear head portion of the user. Instead, the biological sensor 22 may be separated from the mounting frame of the image capturing and displaying apparatus 1 and disposed at a predetermined position of the user's body.
The input section 17 is a section with which the user can input his or her eyesight information. When the user inputs his or her eyesight information, for example the degree of eyesight and information about nearsightedness, farsightedness, astigmatism, presbyopia, and so forth, the system controller 10 can control the display of images corresponding to the user's eyesight.
[5. Exemplary operations]
In the image capturing and displaying apparatus 1 of this embodiment according to the present invention, the system controller 10 controls the image capturing operation and the display operation corresponding to the user's information detected by the visual sensor 19, the acceleration sensor 20, the gyroscope 21, the biological sensor 22, and the input section 17. Thus, the display section 2 performs a display operation corresponding to the user's wish and situation, so that the user's vision is assisted and extended.
Next, various exemplary operations under the control of the system controller 10 will be described.
Fig. 10 shows a control process of the operation control function 10b of the system controller 10.
When the image capturing and displaying apparatus 1 is initially turned on, the flow advances to step F101. In step F101, the system controller 10 controls the display control section 14 to cause the display section 2 to become the through state.
While the display section 2 is in the through state, the flow advances to step F102. In step F102, the system controller 10 determines whether a monitor display state start trigger has occurred. The monitor display state start trigger occurs when the system controller 10 determines, with the user's situation determination function 10a, that the user has started the monitor display state corresponding to the determined user's wish or situation. The system controller 10 determines whether the monitor display state start trigger has occurred corresponding to the user's operation, the user's deliberate motion (a motion recognized as an operation), or the user's unintentional motion or situation (including the recognition of the user). Specific examples will be described later.
When the determined result denotes that the monitor display state start trigger has occurred, the flow advances to step F103. In step F103, the system controller 10 performs the monitor display start control. In other words, the system controller 10 commands the image capturing control section 11 to cause the image capturing section 3 and the captured image signal processing section 15 to perform the normal image capturing operation. In addition, the system controller 10 commands the display control section 14 to cause the display image processing section 12 and the display driving section 13 to cause the display section 2 to display the captured image signal as a normally captured image.
In this process, the through state shown in Fig. 3A is switched to the monitor display state for a normally captured image shown in Fig. 3B.
While the display section 2 is displaying the normally captured image (which is the same as the scene the user sees in the through state), the flow advances to step F104. In step F104, the system controller 10 monitors whether an image control trigger has occurred. In step F105, the system controller 10 monitors whether a monitor display state completion trigger has occurred.
The image control trigger occurs when the system controller 10 determines that the display image mode in the monitor display state needs to be changed corresponding to the user's wish or situation determined by the user's situation determination function 10a. The monitor display state completion trigger occurs when the system controller 10 determines that the monitor display state needs to be completed and switched to the through state corresponding to the user's wish or situation determined by the user's situation determination function 10a. The system controller 10 determines whether the monitor display state completion trigger has occurred corresponding to the user's operation, the user's deliberate motion (a motion recognized as an operation), or the user's unintentional motion or situation (the user's physical situation, the recognition of the user, and so forth). Specific examples will be described later.
When the determined result denotes that the image control trigger has occurred, the flow advances from step F104 to step F106. In step F106, the system controller 10 controls the display operation for the captured image. In other words, the system controller 10 commands the image capturing control section 11 and the display control section 14 to cause the display section 2 to display an image in the display mode corresponding to the user's wish or situation at that point.
After the system controller 10 has controlled the display mode in step F106, the flow returns to step F104 or step F105, in which the system controller 10 monitors whether a trigger has occurred.
When the determined result denotes that the monitor display state completion trigger has occurred, the flow returns from step F105 to step F101. In step F101, the system controller 10 commands the image capturing control section 11 to complete the image capturing operation, and commands the display control section 14 to cause the display section 2 to become the through state.
While the user wears the image capturing and displaying apparatus 1 and its power is on, the operation control function 10b of the system controller 10 performs the control process shown in Fig. 10.
In this process, the monitor display start control is performed corresponding to the determined result of whether the monitor display state start trigger has occurred. The display mode is controlled corresponding to the determined result of whether the image control trigger has occurred. The monitor display stop control and the through state control are performed corresponding to the determined result of whether the monitor display state completion trigger has occurred. Specific examples of the trigger determinations and controls will be described later with reference to Fig. 11A and Fig. 11B through Fig. 19A and Fig. 19B.
Fig. 11A and Fig. 11B through Fig. 19A and Fig. 19B show exemplary processes of the user's situation determination function 10a of the system controller 10. It is assumed that these processes are performed in parallel with the process of the operation control function 10b shown in Fig. 10. These parallel processes are performed, for example, as interrupt processes, so that while the system controller 10 is performing the process shown in Fig. 10, the detection processes shown in Fig. 11A and Fig. 11B through Fig. 19A and Fig. 19B are periodically performed. The programs of the processes shown in Fig. 11A and Fig. 11B through Fig. 19A and Fig. 19B may be built into the program that performs the process shown in Fig. 10. Instead, these programs may be separate programs that are periodically called. In other words, the structure of these programs is not limited to a particular structure.
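The control process of Fig. 10 (steps F101 to F106) is a three-way loop: wait for a start trigger in the through state, then watch for image control triggers and a completion trigger in the monitor display state. The following is a minimal sketch of that structure only; the trigger events are hypothetical stand-ins for the determinations of the user's situation determination function 10a.

```python
def control_process(triggers):
    """Replay the Fig. 10 flow over a list of trigger events.

    Each event is one of "start", "image", "complete", or None,
    standing in for the trigger determinations of function 10a.
    Returns the log of control actions performed.
    """
    log = ["through"]            # F101: display section 2 -> through state
    monitoring = False
    for event in triggers:
        if not monitoring:
            if event == "start":         # F102 -> F103: monitor display start
                log.append("monitor_display_start")
                monitoring = True
        else:
            if event == "image":         # F104 -> F106: change display mode
                log.append("change_display_mode")
            elif event == "complete":    # F105 -> F101: back to through state
                log.append("through")
                monitoring = False
    return log

print(control_process(["start", "image", None, "complete"]))
# ['through', 'monitor_display_start', 'change_display_mode', 'through']
```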
With reference to Fig. 11A and Fig. 11B, exemplary processes of determining whether the monitor display state start trigger, which causes the through state to be switched to the monitor display state, has occurred will be described.
Fig. 11A and Fig. 11B show exemplary processes of detecting the user's motion as a monitor display start operation.
In step F200 shown in Fig. 11A, the system controller 10 monitors the information (an acceleration signal or an angular velocity signal) detected by the acceleration sensor 20 or the gyroscope 21.
It is assumed that predetermined motions, such as shaking the neck up and down twice, shaking the neck from side to side once, and rotating the neck, are defined as operations with which the user commands the image capturing and displaying apparatus 1 to operate in the monitor display state. When the system controller 10 has determined, corresponding to the information detected by the acceleration sensor 20 and/or the gyroscope 21, that the user has performed a motion denoting that he or she commands the image capturing and displaying apparatus 1 to start the monitor display state, the flow advances from step F201 to step F202. In step F202, the system controller 10 determines that the monitor display state start trigger for the captured image signal has occurred.
When the determined result in step F202 denotes that the monitor display state start trigger has occurred, the flow advances from step F102 to step F103 shown in Fig. 10. In step F103, the system controller 10 controls the display section 2 to start the display operation for the captured image.
Other examples of the user's predetermined motions that are detected corresponding to the information of the acceleration sensor 20 and/or the gyroscope 21 and with which the user commands the image capturing and displaying apparatus 1 to operate in the monitor display state include jumping, waving the hands, shaking the arms, shaking the legs, and so forth.
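One way to picture the determination in steps F200 to F202 is as a pattern check over recent motion samples: has the defined neck motion (for example, shaking the neck up and down twice) occurred within a short window? The sketch below is illustrative only; the threshold value, the window of samples, and all function names are assumptions, not values from the patent.

```python
def count_nods(vertical_accel, threshold=1.5):
    """Count downward nod peaks in a sequence of vertical
    acceleration samples (a stand-in for acceleration sensor 20)."""
    nods, above = 0, False
    for a in vertical_accel:
        if a > threshold and not above:
            nods += 1          # rising edge past the threshold = one nod
            above = True
        elif a <= threshold:
            above = False
    return nods

def start_trigger_occurred(vertical_accel, required_nods=2):
    # Two nods within the sampled window are treated as the
    # predetermined monitor-display start operation (steps F201/F202).
    return count_nods(vertical_accel) >= required_nods

samples = [0.1, 2.0, 0.2, 0.1, 1.8, 0.3]  # two distinct peaks
print(start_trigger_occurred(samples))     # True
```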
Fig. 11B shows an exemplary process of determining whether the monitor display state start trigger has occurred corresponding to the information of the visual sensor 19.
In step F210, the system controller 10 analyzes the information detected by the visual sensor 19. When an image capturing section that captures an image of the user's eyes is disposed as the visual sensor 19, the system controller 10 analyzes the image captured by the image capturing section.
Assuming that a specific motion in which the user blinks three times in succession is defined as an operation with which the user commands the image capturing and displaying apparatus 1 to operate in the monitor display state, the system controller 10 monitors this motion by analyzing the captured image.
When the system controller 10 has detected that the user has blinked three times in succession, the flow advances from step F211 to step F212. In step F212, the system controller 10 determines that the monitor display state start trigger for the captured image signal has occurred.
When the determined result in step F212 denotes that the monitor display state start trigger has occurred, the flow advances from step F102 to step F103 shown in Fig. 10. In step F103, the system controller 10 controls the display section 2 to start the display operation for the captured image in the monitor display state.
Other examples of the user's motions that are detected corresponding to the information detected by the visual sensor 19 and with which the user commands the image capturing and displaying apparatus 1 to operate in the monitor display state include rotating the eyeballs, moving them from side to side twice, moving them up and down twice, and so forth.
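The blink-based trigger of Fig. 11B can likewise be pictured as counting eyelid-closure events in the analyzed sequence of eye images. A sketch under stated assumptions: each element of `eye_open` stands for one analyzed frame from the visual sensor 19, and three closures within the window are taken as the defined operation (steps F210 to F212); the names and the window form are hypothetical.

```python
def blink_count(eye_open):
    """Count blinks (open -> closed transitions) in a sequence of
    per-frame eye states from the visual sensor 19 analysis."""
    blinks = 0
    prev_open = True
    for is_open in eye_open:
        if prev_open and not is_open:
            blinks += 1
        prev_open = is_open
    return blinks

def blink_trigger_occurred(eye_open, required=3):
    # Three blinks in the analyzed window are treated as the command
    # to operate in the monitor display state.
    return blink_count(eye_open) >= required

frames = [True, False, True, False, True, False, True]  # three blinks
print(blink_trigger_occurred(frames))  # True
```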
Besides these exemplary processes of switching the image capturing and displaying apparatus 1 from the through state to the monitor display state corresponding to the user's deliberate motions, there may be other processes.
To switch the through state to the monitor display state, for example, a switch may be provided, and the display state may be switched corresponding to the operation of the switch.
When the user has input eyesight information from the input section 17, the system controller 10 may determine that the monitor display state start trigger has occurred.
Instead, when the user has put on the image capturing and displaying apparatus 1, the system controller 10 may determine that the monitor display state start trigger has occurred. Since the system controller 10 can determine whether the user has put on the image capturing and displaying apparatus 1 corresponding to the information detected by the biological sensor 22, when the biological sensor 22 has detected, for example, a pulse, brain waves, a galvanic skin response, or the like, the system controller 10 can determine that the monitor display state start trigger has occurred. In this case, as soon as the user has put on the image capturing and displaying apparatus 1, it operates in the monitor display state.
Instead, the image capturing and displaying apparatus 1 may start operating in the monitor display state only when a predetermined user has put it on. The user can be identified corresponding to the fundus pattern detected by the visual sensor 19 and the signals detected by the biological sensor 22. When the fundus patterns and biological information of the users who use the image capturing and displaying apparatus 1 have been registered, the system controller 10 can determine whether a predetermined user has put on the image capturing and displaying apparatus 1.
Thus, when a predetermined user puts on the image capturing and displaying apparatus 1, the system controller 10 authenticates him or her. When the image capturing and displaying apparatus 1 has identified the predetermined user, the system controller 10 determines that the monitor display state start trigger has occurred and controls the image capturing and displaying apparatus 1 to operate in the monitor display state.
When the functions of the image capturing and displaying apparatus 1 are permitted only for a predetermined user, such personal authentication may be added to the conditions with which it is determined whether the monitor display state start trigger has occurred.
When the captured image is displayed on the display section 2 corresponding to the foregoing monitor display start trigger, as shown in Fig. 9B, the area AR1 on the screen of the display section 2 may be kept in the through state, and the captured image may be displayed in the area AR2 as a part of the screen.
Next, with reference to figure 12A and Figure 12 B to Figure 17 A and Figure 17 B, describe as the processing in the step F shown in Figure 10 104, determined whether to occur the exemplary processes that image control triggers.
Fig. 12A shows an exemplary process of controlling a zoom operation in accordance with the movement of the user's line of sight.
At step F300 shown in Fig. 12A, the system controller 10 analyzes the information detected by the visual sensor 19. For example, when an image capturing section that captures an image of the user's eyes is provided as the visual sensor 19, the system controller 10 analyzes the captured image.
When the system controller 10 detects that the user's line of sight has moved downward, the flow advances from step F301 to step F302. At step F302, the system controller 10 determines that a magnified (wide zoom) display switching trigger has occurred.
When the determined result at step F302 indicates that the wide zoom switching trigger has occurred, the flow advances from step F104 shown in Fig. 10 to step F106. At step F106, the system controller 10 commands the image capturing control section 11 to perform a magnifying operation. As a result, the display section 2 displays an image as shown in Fig. 4B.
When the user's line of sight moves downward, he or she is likely reading a newspaper or a book, or looking at something very close to the eyes. Thus, by magnifying the image, the apparatus provides an appropriate display for a near-sighted or presbyopic user.
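The trigger decision of steps F300 to F302 can be sketched as follows. This is an illustrative Python sketch, not part of the patent disclosure; the function name and return values are invented for explanation.

```python
def detect_display_switch_trigger(gaze_direction):
    """Mirror of steps F300-F302: a downward line of sight is taken
    to mean the user is looking at something close (e.g. reading),
    so a magnified-display switching trigger is deemed to occur."""
    if gaze_direction == "down":
        return "magnify"   # trigger a magnified (wide zoom) display
    return None            # no trigger; keep the current display mode
```

In the apparatus itself, the gaze direction would be estimated from the eye image captured by the visual sensor 19 rather than passed in as a string.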
Fig. 12B shows an exemplary process of controlling a zoom operation in accordance with the motion of the user's neck (head) and the focal distance of his or her eyes.
At step F310 shown in Fig. 12B, the system controller 10 analyzes the information detected by the visual sensor 19, and detects, from the analyzed result, the focal distance and the line-of-sight direction of the user's eyes. At step F311, the system controller 10 monitors the information (an acceleration signal and an angular velocity signal) detected by the acceleration sensor 20 and the gyroscope 21, and determines the motion of the user's neck from the detected information.
Thereafter, at steps F312 and F313, the system controller 10 determines, from the detected results regarding the focal distance of the user's eyes and the orientation of his or her neck, whether the user is looking at a nearby position or at a distant position.
When the system controller 10 determines that the user is looking at a nearby position (in particular, at his or her hands), the flow advances from step F312 to step F314. At step F314, the system controller 10 determines that a magnified (wide zoom) display switching trigger has occurred. At step F316, the system controller 10 calculates an appropriate zoom magnification ratio corresponding to the focal distance of the user's eyes and the orientation of his or her neck (head).
When the system controller 10 determines that the user is looking at a distant position, the flow advances from step F313 to step F315. At step F315, the system controller 10 determines that a telephoto zoom display switching trigger has occurred. At step F316, the system controller 10 calculates an appropriate zoom magnification ratio corresponding to the focal distance of the user's eyes and the orientation of his or her neck (head).
After the processes of steps F314 and F316, or of steps F315 and F316, have been performed, the flow advances from step F104 shown in Fig. 10 to step F106. At step F106, the system controller 10 commands the image capturing control section 11 to perform a zoom operation with the calculated magnification ratio.
As a result, the display section 2 displays the magnified image shown in Fig. 4B or the telephoto image shown in Fig. 3C, corresponding to the scene the user is looking at.
Such an operation serves as a function that assists a near-sighted or far-sighted user.
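The decision and calculation of steps F312 through F316 can be sketched as below. The thresholds, caps, and ratio formulas are invented solely for illustration; the patent only states that an "appropriate" ratio is calculated from the eye focal distance and the neck orientation.

```python
def zoom_command(focal_distance_m, head_tilt_deg):
    """Sketch of steps F312-F316.

    focal_distance_m: estimated focus distance of the user's eyes.
    head_tilt_deg: downward tilt of the neck (positive = looking down).
    """
    NEAR = 0.5   # below this, the user is taken to be looking at hand range
    FAR = 10.0   # beyond this, the user is taken to be looking far away
    if focal_distance_m < NEAR and head_tilt_deg > 20:
        # magnified display for near work; a stronger tilt gives a larger ratio
        return ("magnify", min(4.0, 1.0 + head_tilt_deg / 30.0))
    if focal_distance_m > FAR:
        # telephoto zoom; a farther focus gives a larger ratio, capped at 8x
        return ("telephoto", min(8.0, focal_distance_m / FAR * 2.0))
    return ("none", 1.0)
```

The returned command and ratio would then be handed to the image capturing control section 11, corresponding to step F106.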
In Fig. 12A and Fig. 12B, processes in which the displayed image is changed by a zoom operation have been exemplarily described. Alternatively, the system controller 10 may cause the display image processing section 12 to perform an image magnifying process, an image reducing process, and so forth, in accordance with the user's line-of-sight direction, the focal distance of his or her eyes, the orientation of his or her neck, and the like.
Fig. 13A shows an exemplary process performed when information about the user's visual acuity is entered from the input section 17. When information about the user's visual acuity has been entered from the input section 17, the flow advances from step F400 shown in Fig. 13A to step F401. At step F401, the system controller 10 determines that a visual-acuity-based display switching trigger has occurred. At step F402, the system controller 10 calculates a magnification ratio corresponding to the visual acuity value.
After the processes of steps F401 and F402 have been performed, the flow advances from step F104 shown in Fig. 10 to step F106. At step F106, the system controller 10 commands the display image processing section 12 to perform a magnified display operation with the calculated magnification ratio. Through this process, the display section 2 displays a magnified image that corresponds to the user's visual acuity.
Alternatively, the system controller 10 may store, for example, the user's fundus pattern and visual acuity information in an internal memory in advance, in association with each other. By detecting the user's fundus pattern, the system controller 10 identifies him or her, and can then command the display section 2 to display a magnified image corresponding to his or her visual acuity.
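One possible mapping from an entered acuity value to a display magnification, in the spirit of steps F400 to F402, is sketched below. The mapping itself is entirely hypothetical; the patent does not specify a formula, only that a ratio corresponding to the entered value is calculated.

```python
def magnification_for_acuity(acuity):
    """Hypothetical mapping for step F402: normal acuity (1.0 or
    better) gets no magnification; weaker acuity gets a
    proportionally larger image, capped at 4x."""
    if acuity >= 1.0:
        return 1.0
    return min(4.0, 1.0 / acuity)
```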
Fig. 13B shows an exemplary process that helps a user cope with, for example, presbyopia or astigmatism, and with the reduced sensitivity to brightness, blurring, and the like that occur in a dark environment.
At step F410, the system controller 10 monitors the information (an acceleration signal and an angular velocity signal) detected by the acceleration sensor 20 and the gyroscope 21, and determines the user's motion from the detected result. For example, the system controller 10 determines whether the user is in a resting state in which he or she is not walking. At step F411, the system controller 10 analyzes the information detected by the visual sensor 19, and detects, from the analyzed result, the focal distance of the user's eyes, the dilation of his or her pupils, and the narrowing (screwing up) of his or her eyelids (the state of his or her eyelids).
When the determined results indicate that the user is in the resting state and is looking at a nearby position or screwing up his or her eyelids, the flow advances from step F412 to step F413. At step F413, the system controller 10 determines that a presbyopia-coping display trigger or the like has occurred.
When the determined result at step F413 indicates that the trigger has occurred, the flow advances from step F104 shown in Fig. 10 to step F106. In this case, the system controller 10 commands the display driving section 13 to raise the image capturing sensitivity, and commands the captured image signal processing section 15 or the display image processing section 12 to perform processes that increase the brightness and contrast and enhance the edges (sharpness). Through this process, the display section 2 clearly displays an image as shown in Fig. 6B. Thus, this process visually assists a presbyopic user, or a user in a dark place, when he or she reads, for example, a newspaper.
In this case, the system controller 10 may also cause the display section 2 to perform an image magnifying operation, as shown in Fig. 4B.
When the system controller 10 determines, from the dilation of the user's pupils, that he or she is in a dark place, the system controller 10 may control the lighting section 4 to provide illumination.
Fig. 14 shows an exemplary process that copes with the user's vision depending on whether he or she feels comfortable or uncomfortable.
At step F420 shown in Fig. 14, the system controller 10 analyzes the information detected by the visual sensor 19, and detects, from the analyzed result, the dilation of the user's pupils and the blinking (the number of blinks per unit time) of his or her eyes.
At step F421, the system controller 10 checks the information about brain waves, heart rate, amount of perspiration, blood pressure, and so forth detected by the biological sensor 22.
From the information detected by the visual sensor 19 and the biological sensor 22, the system controller 10 determines whether the user feels comfortable or uncomfortable with the image displayed on the display section 2.
When the determined result indicates that the user does not feel comfortable with the displayed image, the flow advances from step F422 to step F423. At step F423, the system controller 10 determines that an image adjustment control trigger has occurred. In this case, the flow advances to step F424. At step F424, the system controller 10 calculates adjustment values that are thought to be comfortable for the user's situation (for example, values for brightness, contrast, sharpness, image capturing sensitivity, illumination brightness, and so forth).
After the processes of steps F423 and F424 have been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In this case, at step F106, the system controller 10 commands the image capturing section 3 to adjust the image capturing sensitivity, and commands the captured image signal processing section 15 or the display image processing section 12 to perform processes that adjust the brightness, contrast, sharpness, and so forth. Through this process, the quality of the image displayed on the display section 2 is adjusted so that the user feels comfortable with it. When the determined result indicates that the user is in a dark place, the system controller 10 may control the lighting section 4 to provide illumination.
When the user feels uncomfortable with the image displayed on the display section 2 because of, for example, visual fatigue, the surrounding brightness, or the condition of his or her eyes, this process provides him or her with a comfortable viewing situation. For example, when the user is in a dark place and cannot see the image clearly, this process provides a crisp image; when the user's eyes are fatigued, it provides a soft image.
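The discomfort detection and adjustment of steps F420 through F424 can be sketched as follows. The sensor thresholds and the adjustment vocabulary are invented for illustration; the patent leaves the concrete criteria to the implementation.

```python
def adjust_for_comfort(blink_rate, pupil_mm, heart_rate):
    """Sketch of steps F420-F424: infer discomfort from visual-sensor
    and biological-sensor readings and return display adjustments,
    or None when no image adjustment control trigger occurs."""
    uncomfortable = blink_rate > 30 or heart_rate > 100  # invented thresholds
    dark = pupil_mm > 6.0                                # dilated pupils -> dark place
    if not uncomfortable and not dark:
        return None
    adjustments = {}
    if uncomfortable:
        adjustments["contrast"] = "soften"    # e.g. eye strain -> soft image
    if dark:
        adjustments["brightness"] = "raise"   # dark place -> brighter image or lighting
    return adjustments
```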
The processes shown in Fig. 12A and Fig. 12B, Fig. 13A and Fig. 13B, and Fig. 14 are performed without any deliberate operation by the user: the system controller 10 determines his or her situation and controls the display image mode accordingly. In contrast, the processes shown in Fig. 15, Fig. 16, and Fig. 17A and Fig. 17B treat a deliberate action of the user as an image control trigger (or as one of the trigger conditions). Next, these processes will be described with reference to Fig. 15, Fig. 16, and Fig. 17A and Fig. 17B.
Fig. 15 shows a process in which the motion of the user's neck (head) is treated as an operation.
At step F500, the system controller 10 monitors the information (an acceleration signal and an angular velocity signal) detected by the acceleration sensor 20 and the gyroscope 21. At step F501, the system controller 10 determines the user's head motion from the detected information. For example, the system controller 10 determines whether the user has tilted his or her head backward twice, tilted it forward twice, or shaken his or her neck to the left twice.
When the system controller 10 detects that the user has tilted his or her head backward twice, the flow advances from step F502 to step F505. At step F505, the system controller 10 determines that an image switching trigger for a 2x telephoto magnification ratio has occurred.
In this case, the flow advances from step F104 shown in Fig. 10 to step F106. At step F106, the system controller 10 commands the image capturing control section 11 to perform a zoom operation with a 2x telephoto magnification ratio. As a result, the display section 2 displays an image with a 2x telephoto magnification ratio, as shown in Fig. 3C.
When the system controller 10 detects that the user has tilted his or her head forward twice, the flow advances from step F503 to step F506. At step F506, the system controller 10 determines that an image switching trigger for a 1/2x magnification ratio has occurred. In this case, the flow advances from step F104 shown in Fig. 10 to step F106. At step F106, the system controller 10 commands the image capturing control section 11 to perform a zoom operation with a 1/2x magnification ratio. As a result, the display section 2 displays an image with a 1/2x magnification ratio.
When the system controller 10 detects that the user has shaken his or her neck to the left twice, the flow advances from step F504 to step F507. At step F507, the system controller 10 determines that an image switching trigger that resets the telephoto magnification ratio has occurred. In this case, the flow advances from step F104 shown in Fig. 10 to step F106. At step F106, the system controller 10 commands the image capturing control section 11 to perform a zoom operation with the standard magnification ratio. As a result, the display section 2 displays an image with the standard magnification ratio.
Because a deliberate motion of the user is determined to be a trigger and the display image mode is switched accordingly, the user is provided with the field of view he or she desires.
Besides the motion of the user's neck, motions of his or her hands, arms, and legs, or of the whole body such as jumping, can also be defined as predetermined operations.
Besides the zoom operation, the user's actions or motions can also be associated with the image magnifying operation shown in Fig. 5B, an image reducing operation, an image capturing sensitivity operation, a capture frame rate selection operation, the increased-infrared-sensitivity capture and display operation shown in Fig. 7B, the increased-ultraviolet-sensitivity capture and display operation shown in Fig. 8B, the split display operation shown in Fig. 9B, the strobe display operation shown in Fig. 9C, and so forth.
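The gesture-to-command mapping of steps F502 through F507 amounts to a lookup table keyed by a recognized motion and its repetition count. The sketch below illustrates this; the table contents follow the three gestures described above, while the function name and data layout are invented.

```python
GESTURE_COMMANDS = {
    # steps F502-F507: deliberate head motions mapped to zoom commands
    ("tilt_back", 2): ("zoom", 2.0),     # head tilted backward twice -> 2x telephoto
    ("tilt_forward", 2): ("zoom", 0.5),  # head tilted forward twice -> 1/2x
    ("shake_left", 2): ("zoom", 1.0),    # neck shaken left twice -> standard ratio
}

def interpret_gesture(gesture, count):
    """Look up a deliberate head motion; unrecognized motions return
    None so that accidental movement does not change the display."""
    return GESTURE_COMMANDS.get((gesture, count))
```

Keeping the mapping in a table also makes it straightforward to add the other motions mentioned above (hands, arms, legs, jumping) as further keys.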
Fig. 16 shows a process in which the user's action is treated as a predetermined control trigger in combination with his or her situation and the surrounding environment.
At step F600 shown in Fig. 16, the system controller 10 analyzes the information detected by the visual sensor 19, and detects, from the analyzed result, the dilation of the user's pupils and his or her blinking. At step F601, the system controller 10 monitors the information (an acceleration signal and an angular velocity signal) detected by the acceleration sensor 20 and the gyroscope 21, and determines whether the user's neck and whole body are still (a non-walking state).
At step F602, the system controller 10 determines whether the user is in the resting state with his or her neck tilted downward, whether the surroundings are dark, and whether he or she has performed a specific action such as blinking three times.
In other words, the system controller 10 determines, from the information detected by the acceleration sensor 20 and the gyroscope 21, whether the user is in the resting state with his or her neck tilted downward. In addition, the system controller 10 determines, from the dilation of the user's pupils, whether he or she is in a dark environment. When these conditions are satisfied, the system controller 10 determines whether the user has blinked three times in succession.
When the user is in the resting state with his or her neck tilted downward and is in a dark environment, he or she is likely reading something in a dark room. When the user deliberately blinks three times in succession in this situation, the system controller 10 determines that he or she wants a brighter and crisper image. Accordingly, when the system controller 10 detects under these circumstances that the user has blinked three times in succession, the flow advances to step F603. At step F603, the system controller 10 determines that an image adjustment control trigger has occurred.
After the process of step F603 has been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In this case, at step F106, the system controller 10 commands the display driving section 13 to raise the image capturing sensitivity, and commands the captured image signal processing section 15 or the display image processing section 12 to perform processes that increase the brightness, enhance the contrast and sharpness, and so forth. Alternatively, the system controller 10 may command the lighting section 4 to provide illumination.
As a result, the user can view the image in a comfortable situation.
In this example, the user's blinking is accepted as an operation only when the conditions on his or her situation and the surrounding environment are satisfied. This is effective because the image will not change even if the user performs the relevant action accidentally.
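The guarded trigger of steps F600 through F603 reduces to a conjunction of conditions, which can be sketched as a single predicate. This is an illustrative sketch; the condition names are invented stand-ins for the sensor determinations described above.

```python
def blink_trigger(stationary, neck_down, dark, blink_count):
    """Sketch of steps F600-F603: the triple blink counts as an
    operation only when the user is still, looking down, and in a
    dark place; otherwise accidental blinks are ignored."""
    return stationary and neck_down and dark and blink_count >= 3
```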
Fig. 17A shows an exemplary process for capturing an image with increased infrared sensitivity as described with reference to Fig. 7B, in which the operation corresponding to the user's action is permitted or prohibited depending on his or her physical situation.
At step F700 shown in Fig. 17A, the system controller 10 monitors the information (an acceleration signal and an angular velocity signal) detected by the acceleration sensor 20 and the gyroscope 21, and determines, from the detected result, the motion of the user's neck or of his or her whole body.
At step F701, the system controller 10 checks the brain waves, heart rate, amount of perspiration, blood pressure, and so forth detected by the biological sensor 22. From the information detected by the biological sensor 22, the system controller 10 determines whether the user is tense or excited.
When the system controller 10 detects a predetermined action (for example, the user shaking his or her neck) that commands the image capturing and displaying apparatus 1 to perform the infrared image capturing operation, the flow advances from step F702 to step F703. At step F703, the system controller 10 determines whether the user is tense or excited.
When the system controller 10 determines that the user is neither tense nor excited, the system controller 10 determines that the user's action is a valid operation. Thereafter, the flow advances to step F704. At step F704, the system controller 10 determines that an image capturing operation trigger for increased infrared sensitivity has occurred.
After the process of step F704 has been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In this case, at step F106, the system controller 10 commands the image capturing section 3 to increase the infrared image capturing sensitivity. As a result, the display section 2 displays an image as shown in Fig. 7B.
In contrast, when the determined result at step F703 indicates that the user is tense or excited, the system controller 10 does not determine that an image capturing operation trigger for increased infrared sensitivity has occurred. In other words, the system controller 10 invalidates the operation corresponding to the user's action.
In this way, the validity of the operation corresponding to the user's action can be determined together with the condition of his or her physical situation. In this case, a special image capturing function such as the increased-infrared-sensitivity capturing operation can be effectively prevented from being used improperly.
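The guard of steps F702 through F704 can be condensed to a small predicate. The sketch is illustrative; how "tense" and "excited" are derived from the biological sensor 22 readings is left open by the patent.

```python
def infrared_trigger_allowed(gesture_detected, tense, excited):
    """Sketch of steps F702-F704: the increased-infrared-sensitivity
    operation is accepted only when the triggering gesture was made
    while the user is neither tense nor excited; otherwise the
    action is invalidated."""
    return gesture_detected and not tense and not excited
```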
Fig. 17B shows an exemplary process for capturing an image with increased ultraviolet sensitivity as described with reference to Fig. 8B.
At step F710 shown in Fig. 17B, the system controller 10 monitors the information (an acceleration signal and an angular velocity signal) detected by the acceleration sensor 20 and the gyroscope 21, and determines, from the detected result, the motion of the user's neck, the motion of his or her whole body, and so forth.
When the system controller 10 detects that the user has performed a predetermined action that commands the image capturing and displaying apparatus 1 to perform the ultraviolet image capturing operation, the flow advances from step F711 to step F712. At step F712, the system controller 10 determines that an image capturing operation trigger for increased ultraviolet sensitivity has occurred.
After the process of step F712 has been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In this case, at step F106, the system controller 10 commands the image capturing section 3 to increase the ultraviolet image capturing sensitivity. As a result, the display section 2 displays an image as shown in Fig. 8B.
Display mode switching triggers and display modes for captured images have been exemplarily described above. It should be noted that other examples are possible.
When the display mode of the display section 2 is switched in response to an image control trigger, as shown in Fig. 9B, an area AR1 of the screen of the display section 2 may remain in the through state or display a normally captured image, while an area AR2 occupying part of the screen displays an image in another display mode. Alternatively, the wide area AR1 may display the image corresponding to the image control trigger. Alternatively, the screen may be divided equally, with the normally captured image and the image corresponding to the image control trigger displayed side by side.
Next, with reference to Fig. 18A, Fig. 18B, Fig. 19A, and Fig. 19B, the triggers detected at step F105 shown in Fig. 10, that is, the triggers that cause the monitor display state for a captured image to be switched to the through state, will be described in detail.
Fig. 18A shows an exemplary process of ending the monitor display state in response to a deliberate action of the user.
At step F800 shown in Fig. 18A, the system controller 10 monitors the information detected by the acceleration sensor 20 and the gyroscope 21, and determines, from the detected information, the motion of the user's neck, the motion of his or her whole body, and so forth.
When the system controller 10 detects that the user has performed a predetermined action that commands the image capturing and displaying apparatus 1 to end the monitor display state, the flow advances from step F801 to step F802. At step F802, the system controller 10 determines that a monitor display completion trigger for the captured image has occurred.
After the process of step F802 has been performed, the flow advances from step F105 shown in Fig. 10 to step F101. In this case, at step F101, the system controller 10 commands the display control section 14 to switch the display state to the through state. As a result, the display section 2 reverts to the through state, as shown in Fig. 3A.
Fig. 18B also shows an exemplary process of ending the monitor display state in response to a deliberate action of the user.
At step F810 shown in Fig. 18B, the system controller 10 analyzes the information detected by the visual sensor 19. When blinking three times in succession is defined as the predetermined user operation that commands the image capturing and displaying apparatus 1 to end the monitor display state, the system controller 10 monitors for this action by analyzing the image.
When the system controller 10 detects that the user has blinked three times in succession, the flow advances from step F811 to step F812. At step F812, the system controller 10 determines that a monitor display completion trigger for the captured image signal has occurred.
After the process of step F812 has been performed, the flow advances from step F105 shown in Fig. 10 to step F101. At step F101, the system controller 10 commands the display control section 14 to switch the display state to the through state. As a result, the display section 2 reverts to the through state, as shown in Fig. 3A.
Through the processes shown in Fig. 18A and Fig. 18B, when the user commands the image capturing and displaying apparatus 1 to operate in the through state, the display section 2 becomes the through state in accordance with his or her wish.
Of course, other types of user actions may be defined that cause the display section 2 to revert to the through state.
Fig. 19A shows an exemplary process of automatically reverting the display state to the through state in response to the user's motion (a motion that is not regarded as an operation).
At step F900 shown in Fig. 19A, the system controller 10 monitors the information detected by the acceleration sensor 20 and the gyroscope 21, and determines the motion of the user's whole body. In particular, the system controller 10 detects whether the user is at rest, walking, or running.
When the system controller 10 determines that the user has started walking or running, the flow advances from step F901 to step F902. At step F902, the system controller 10 determines that a monitor display completion trigger for the captured image signal has occurred.
After the process of step F902 has been performed, the flow returns from step F105 shown in Fig. 10 to step F101. In this case, at step F101, the system controller 10 commands the display control section 14 to switch the display state to the through state. As a result, the display section 2 reverts to the through state, as shown in Fig. 3A.
When the user is walking or running, it is preferable, from the viewpoint of safety, to revert the display state to the through state.
Instead of reverting the display state to the through state, when the user is walking or running, the system controller 10 may command the display control section 14 to switch the display state to a monitor display state for a normally captured image, that is, an image similar to what the user would see in the through state, as shown in Fig. 3B.
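The safety revert of steps F900 through F902 can be sketched as a small state transition. The state names are invented labels for the through state and the monitor display state described above.

```python
def display_state_for_motion(motion, current):
    """Sketch of steps F900-F902: when the accelerometer and
    gyroscope indicate the user has started walking or running,
    any monitor display is ended and the display reverts to the
    through state for safety; otherwise the state is kept."""
    if motion in ("walking", "running") and current != "through":
        return "through"
    return current
```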
Fig. 19B shows an exemplary process of automatically reverting the display state to the through state in accordance with the user's physical situation, so as to prevent the infrared image capturing operation from being used improperly.
At step F910 shown in Fig. 19B, the system controller 10 checks the information about, for example, brain waves, heart rate, amount of perspiration, blood pressure, and so forth detected by the biological sensor 22. From the information detected by the biological sensor 22, the system controller 10 determines whether the user is tense or excited.
While the image capturing operation with increased infrared sensitivity is being performed, the flow advances from step F911 to step F912. At step F912, the system controller 10 determines whether the user is tense or excited.
When the determined result indicates that the user is neither tense nor excited, the system controller 10 permits the image capturing and displaying apparatus 1 to continue the image capturing operation with increased infrared sensitivity. In contrast, when the determined result indicates that the user is tense or excited, the flow advances to step F913. At step F913, the system controller 10 determines that a monitor display completion trigger for the captured image has occurred.
After the process of step F913 has been performed, the flow returns from step F105 shown in Fig. 10 to step F101. In this case, at step F101, the system controller 10 commands the display control section 14 to switch the display state to the through state. In other words, the system controller 10 commands the display control section 14 to end the monitor display state of the image capturing operation with increased infrared sensitivity. As a result, the display section 2 reverts to the through state.
It is preferable to end the image capturing operation with increased infrared sensitivity in accordance with the user's physical situation and revert the display state to the through state, so as to prevent him or her from using the increased-infrared-sensitivity capturing operation improperly.
Instead of reverting the display state to the through state, the image capturing operation with increased infrared sensitivity may simply be ended and a normally captured image displayed.
[6. Effects, modifications, and extensions of the first embodiment]
According to this embodiment, the image captured by the image capturing section 3 disposed in the eyeglass-type or headgear-type mounting unit, that is, an image captured with the direction in which the user sees a subject as the direction of the subject, is displayed on the display section 2 in front of his or her eyes. In this case, the image capturing operation or the display operation is controlled in accordance with information about his or her motion or physical situation. Thus, a situation can be created that substantially assists or extends the user's visual capability.
The image capturing operation of the image capturing section 3, and the display modes realized by the signal processing of the captured image signal processing section 15 and the display image processing section 12, are changed in accordance with the user's wish or situation, which is determined from information about the user's motion or physical situation. Thus, no operational burden is imposed on the user. In addition, because the image capturing and displaying apparatus 1 is controlled appropriately, the user can use it easily.
Furthermore, because the display section 2 can be made transparent or translucent by controlling its transmissivity, the mounting unit does not disturb the user's normal life while he or she wears it. Thus, the benefits of the image capturing and displaying apparatus 1 according to this embodiment can be enjoyed effectively in the user's normal life.
In this embodiment, the image capturing operation of the image capturing section 3 and the display modes realized through the signal processing of the captured image signal processing section 15 and the display image processing section 12 have mainly been described. Additionally, for example, the switching among power-on, power-off, and standby, and the volume and tone of the sound output from the audio output section 5, may be controlled in accordance with the user's action and/or physical situation. For example, the volume may be adjusted in consideration of the user's comfort, based on the information detected by the biological sensor 22.
The appearance and structure of the image capturing and displaying apparatus 1 are not limited to those shown in Fig. 1 and Fig. 2; various modifications can be made.
For example, a storage section that stores the image signal captured by the image capturing section 3, and a transmission section that transmits the image signal to other equipment, may be provided in the image capturing and displaying apparatus 1.
Besides the image capturing section 3, an input section and a receiving section that input an image from external equipment may be provided in the image capturing and displaying apparatus 1 as sources of images to be displayed on the display section 2.
In addition, a character recognition section that recognizes characters contained in an image, and a speech synthesis section that performs speech synthesis processing, may be provided in the image capturing and displaying apparatus 1. When a captured image contains characters, the speech synthesis section can generate a read-aloud speech signal, and the audio output section 5 can output speech corresponding to this signal.
In this embodiment, an example in which the image capturing and displaying apparatus 1 is an eyeglass-type or headgear-type mounting unit has been described. However, as long as the image capturing and displaying apparatus captures an image in the direction of the user's eyes and displays an image in front of his or her eyes, the apparatus may be of any type the user can wear, such as a headphone type, a neckband type, an ear-hook type, and so forth. Alternatively, the image capturing and displaying apparatus 1 may be a unit attached to eyeglasses, a visor, headphones, or the like with an attachment member such as a clip.
(second embodiment)
Next, an image capturing and displaying apparatus and an image capturing and displaying method according to a second embodiment of the present invention will be described in the following order.
[1. Exemplary outward appearance of the image capturing and displaying apparatus]
[2. Exemplary structure of the image capturing and displaying apparatus]
[3. Exemplary display images]
[4. Detection of external information]
[5. Exemplary operations]
[6. Effects, modifications, and extensions of the second embodiment]
[1. Exemplary outward appearance of the image capturing and displaying apparatus]
The exemplary outward appearance of the image capturing and displaying apparatus according to the second embodiment is the same as that according to the first embodiment.
[2. Exemplary structure of the image capturing and displaying apparatus]
Fig. 20 shows an exemplary internal structure of the image capturing and displaying apparatus 101 according to the second embodiment of the present invention.
The system controller 110 comprises a microcomputer that includes, for example, a CPU (central processing unit), a ROM (read-only memory), a RAM (random access memory), a nonvolatile memory section, and an interface section. The system controller 110 is a control section that controls all the sections of the image capturing and displaying apparatus 101.
The system controller 110 controls each section of the image capturing and displaying apparatus 101 in accordance with external situations. In other words, the system controller 110 operates according to an operation program that detects and determines external situations, and controls each section in accordance with the detected and determined situations. Thus, as shown in Fig. 20, the system controller 110 functionally includes an external situation determination function 110a that determines external situations, and an operation control function 110b that controls and commands each section in accordance with the determination results of the external situation determination function 110a.
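The division of labor between the two functions can be illustrated with a minimal sketch, assuming a determination step that reduces sensor readings to a situation, followed by a control step that issues commands. The function names, dictionary keys, thresholds, and command strings below are illustrative assumptions, not details from the specification.

```python
# Sketch of the two functional roles of system controller 110:
# 110a determines the external situation; 110b derives commands from it.
# All names and thresholds are hypothetical.

def determine_external_situation(sensor_readings):
    """Role of function 110a: reduce raw sensor data to a situation."""
    situation = {}
    situation["dark"] = sensor_readings.get("ambient_lux", 1000) < 50
    situation["subject_near"] = sensor_readings.get("distance_m", 10.0) < 1.0
    return situation

def control_operations(situation):
    """Role of function 110b: issue capture/display commands accordingly."""
    commands = []
    if situation["dark"]:
        commands.append("increase_capture_sensitivity")
    if situation["subject_near"]:
        commands.append("zoom_wide")
    return commands
```

For example, a dark room with a nearby subject would yield both a sensitivity command and a zoom command, while normal conditions would yield none.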
The image capturing and displaying apparatus 101 includes an image capturing section 103, an image capturing control section 111, and a captured image signal processing section 115, as a structure for capturing an image in front of the user.
The image capturing section 103 includes a lens system (having the image capturing lens 103a shown in Fig. 1, an aperture, a zoom lens, a focus lens, and so forth); a drive system that causes the lens system to perform focusing and zoom operations; and a solid-state image sensor array that detects the light for a captured image obtained by the lens system, converts the light into electricity, and generates a corresponding captured image signal. The solid-state image sensor array comprises, for example, a CCD (charge-coupled device) sensor array or a CMOS (complementary metal oxide semiconductor) sensor array.
The captured image signal processing section 115 includes a sample-hold/AGC (automatic gain control) circuit and a video A/D converter, the sample-hold/AGC circuit adjusting the gain of the signal obtained by the solid-state image sensor array of the image capturing section 103 and trimming the waveform of the signal. The captured image signal processing section 115 obtains the captured image signal as digital data, and performs white balance processing, brightness processing, color signal processing, vibration correction processing, and so forth on the captured image signal.
The image capturing control section 111 controls the operations of the image capturing section 103 and the captured image signal processing section 115 in accordance with commands received from the system controller 110. The image capturing control section 111, for example, starts and stops the operations of the image capturing section 103 and the captured image signal processing section 115. In addition, the image capturing control section 111 controls (through motors) the image capturing section 103 to perform an auto-focus operation, an automatic exposure adjustment operation, an aperture adjustment operation, a zoom operation, and so forth.
In addition, the image capturing control section 111 includes a timing generator. With a timing signal generated by the timing generator, the image capturing control section 111 controls the solid-state image sensor array and the sample-hold/AGC circuit and video A/D converter of the captured image signal processing section 115. In addition, the image capturing control section 111 can change the frame rate of captured images with the timing signal.
In addition, the image capturing control section 111 controls the image capturing sensitivity and the signal processing of the solid-state image sensor array and the captured image signal processing section 115. To control the image capturing sensitivity, the image capturing control section 111 controls, for example, the gain of the signal read from the solid-state image sensor array, the black level setting, various coefficients of the digital processing of the captured image signal, the correction amount of the vibration correction processing, and so forth. For the image capturing sensitivity adjustment, the image capturing control section 111 can perform an overall sensitivity adjustment regardless of wavelength band, or a sensitivity adjustment specific to a particular band such as the infrared or ultraviolet region (for example, an image can be captured such that a predetermined band is cut). The wavelength-specific sensitivity adjustment can be performed by inserting a wavelength filter into the image capturing lens system, or by performing a wavelength filtering operation on the captured image signal. In these cases, the image capturing control section 111 can control the sensitivity by inserting the wavelength filter and/or designating the filtering coefficients.
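The band-specific sensitivity adjustment described above can be pictured as applying a per-band gain to the signal, where a gain of zero corresponds to cutting the band. This is a minimal sketch under that assumption; the band names and gain values are hypothetical and do not reflect the actual filtering coefficients of the apparatus.

```python
def apply_band_sensitivity(band_energies, band_gains):
    """Multiply each wavelength band's signal by its gain.
    A gain of 0 cuts the band entirely (the 'predetermined band is cut'
    case); a gain above 1 models increased sensitivity for that band."""
    return {band: band_energies[band] * band_gains.get(band, 1.0)
            for band in band_energies}
```

For example, doubling the infrared gain while zeroing the ultraviolet gain corresponds to the increased-infrared-sensitivity capture with the ultraviolet band cut.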
As a structure for displaying data to the user, the image capturing and displaying apparatus 101 includes a display section 102, a display image processing section 112, a display driving section 113, and a display control section 114.
The captured image signal, captured by the image capturing section 103 and processed by the captured image signal processing section 115, is supplied to the display image processing section 112. The display image processing section 112 is, for example, a so-called video processor. The display image processing section 112 performs various types of display processing on the supplied captured image signal, for example intensity level adjustment, color correction, contrast adjustment, and sharpness (edge enhancement) adjustment. In addition, the display image processing section 112 can generate a magnified image in which a part of the captured image signal is enlarged, or a reduced image; highlight a part of the image; separate the image for separate display; combine images; generate character images and graphic images; and superimpose a generated image on the captured image. In other words, the display image processing section 112 can perform various types of processing on the digital image signal serving as the captured image signal.
The display driving section 113 includes a pixel driving circuit that displays the image signal supplied from the display image processing section 112 on the display section 102, which is, for example, a liquid crystal display. In other words, the display driving section 113 applies driving signals based on the image signal, at predetermined horizontal/vertical drive timings, to the pixels formed in a matrix shape in the display section 102, causing the display section 102 to display the image. In addition, the display driving section 113 controls the transmissivity of each pixel to place the display section 102 in the through state.
The display control section 114 controls the processing and operation of the display image processing section 112 and the operation of the display driving section 113 in accordance with commands received from the system controller 110. In other words, the display control section 114 causes the display image processing section 112 to perform the foregoing various types of processing. In addition, the display control section 114 controls the display driving section 113 to switch the display section 102 between the through state and the image display state.
In the following description, the state in which the display section 102 becomes transparent or translucent is referred to as the "through state", and the operation (and state) in which the display section 102 displays an image is referred to as the "monitor display state".
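The two states can be summarized in a small sketch of a driver that switches between them by changing pixel transmissivity, as the display driving section 113 does. The class, its attribute names, and the use of a single transmissivity value for all pixels are simplifying assumptions for illustration only.

```python
class DisplayDriver:
    """Hypothetical sketch of the through / monitor-display switching
    performed by the display driving section 113."""

    def __init__(self):
        # Assume the display starts transparent, as in Fig. 3A.
        self.state = "through"
        self.transmissivity = 1.0

    def set_through(self):
        """Transparent or translucent state: the user sees the scene."""
        self.state = "through"
        self.transmissivity = 1.0

    def set_monitor_display(self):
        """Pixels are driven to show the captured image instead."""
        self.state = "monitor display"
        self.transmissivity = 0.0
```

A translucent variant would simply use an intermediate transmissivity value rather than the two extremes shown here.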
In addition, the image capturing and displaying apparatus 101 includes a sound input section 106, a sound signal processing section 116, and a sound output section 105.
The sound input section 106 includes the microphones 106a and 106b shown in Fig. 1, and a microphone amplifier section that processes the sound signals obtained by the microphones 106a and 106b.
The sound signal processing section 116 includes, for example, an A/D converter, a digital signal processor, and a D/A converter. The sound signal processing section 116 converts the sound signal supplied from the sound input section 106 into digital data, and performs processing such as volume adjustment, tone adjustment, and acoustic effects under the control of the system controller 110. The sound signal processing section 116 converts the resultant sound signal into an analog signal and supplies it to the sound output section 105. The sound signal processing section 116 is not limited to a structure that performs digital signal processing; it may instead perform signal processing with an analog amplifier and analog filters.
The sound output section 105 includes the pair of earphone speakers 105a shown in Fig. 1, and amplifier circuits for the earphone speakers 105a.
The sound input section 106, the sound signal processing section 116, and the sound output section 105 allow the user to hear external sound through the image capturing and displaying apparatus 101.
The sound output section 105 may be structured as a so-called bone (osseous) conduction speaker.
In addition, the image capturing and displaying apparatus 101 includes a speech synthesis section 127. The speech synthesis section 127 synthesizes a voice corresponding to a command issued from the system controller 110, and outputs the synthesized voice signal.
The speech synthesis section 127 outputs the synthesized voice signal to the sound signal processing section 116. The sound signal processing section 116 processes the synthesized voice signal and supplies the resultant signal to the sound output section 105. The sound output section 105 outputs the voice to the user.
The speech synthesis section 127 generates the voice signal of a read-aloud voice that will be described later.
In addition, the image capturing and displaying apparatus 101 includes a lighting section 104 and a lighting control section 118. The lighting section 104 includes the light emitting section 104a shown in Fig. 1 (for example, a light emitting diode) and a lighting circuit that causes the light emitting section 104a to emit light. The lighting control section 118 causes the lighting section 104 to perform a light emitting operation in accordance with a command supplied from the system controller 110.
Since the light emitting section 104a of the lighting section 104 is disposed so as to illuminate forward, the lighting section 104 performs the lighting operation in the user's visual direction.
As a structure for obtaining external information, the image capturing and displaying apparatus 101 includes an ambient environment sensor 119, a capturing target sensor 120, a GPS reception section 121, a date and time counting section 122, an image analysis section 128, and a communication section 126.
The ambient environment sensor 119 is, specifically, a luminance sensor, a temperature sensor, a humidity sensor, a barometric pressure sensor, or the like, and detects information about the surrounding brightness, temperature, humidity, weather, and so forth as the ambient environment of the image capturing and displaying apparatus 101.
The capturing target sensor 120 is a sensor that detects information about the capturing target of the image capturing operation of the image capturing section 103. The capturing target sensor 120 may be a distance measuring sensor that detects information about, for example, the distance from the image capturing and displaying apparatus 101 to the capturing target.
The capturing target sensor 120 may be a sensor such as an infrared sensor, namely a pyroelectric sensor that detects information about the infrared rays of a predetermined wavelength and energy emitted by the capturing target. In this case, the capturing target sensor 120 can detect whether the capturing target is a living body such as a person or an animal.
Alternatively, the capturing target sensor 120 may be one of various types of UV (ultraviolet) sensors that detect information about the ultraviolet rays of a predetermined wavelength and energy emitted by the capturing target. In this case, the capturing target sensor 120 can detect whether the capturing target is a fluorescent material or a phosphor, and can detect the amount of external ultraviolet radiation relevant to protection against sunburn.
The GPS reception section 121 receives radio waves from GPS (global positioning system) satellites, obtains the current position of the image capturing and displaying apparatus 101, and outputs the latitude and longitude information of the current position.
The date and time counting section 122 is a so-called clock section that counts the date and time (year, month, day, hour, minute, and second) and outputs the current date and time information.
The image analysis section 128 analyzes the image captured by the image capturing section 103 and processed by the captured image signal processing section 115. In other words, the image analysis section 128 analyzes the image of the subject and obtains information about the subject contained in the captured image.
The communication section 126 performs data communication with an external apparatus. Examples of the external apparatus include any type of apparatus having an information processing function and a communication function, such as a computer apparatus, a PDA (personal digital assistant), a mobile phone, video equipment, audio equipment, and tuner equipment.
In addition, examples of the external apparatus may include a terminal apparatus and a server apparatus connected to a network such as the Internet.
In addition, examples of the external apparatus may include a contactless communication IC card having a built-in IC chip, a two-dimensional bar code such as a QR code, and a hologram memory, from which the communication section 126 obtains information.
In addition, an example of the external apparatus may be another image capturing and displaying apparatus 101.
The communication section 126 may communicate with a nearby access point compatible with, for example, a wireless LAN system or the Bluetooth system. Alternatively, the communication section 126 may communicate directly with an external apparatus having a corresponding communication function.
The ambient environment sensor 119, the capturing target sensor 120, the GPS reception section 121, the date and time counting section 122, the image analysis section 128, and the communication section 126 obtain external information for the image capturing and displaying apparatus 101, and supply the obtained information to the system controller 110.
Through the processing of the operation control function 110b, the system controller 110 controls the image capturing operation and the display operation in accordance with the external information obtained by the external situation determination function 110a. In other words, the system controller 110 commands the image capturing control section 111 to control the operations of the image capturing section 103 and the captured image signal processing section 115. In addition, the system controller 110 commands the display control section 114 to control the operations of the display image processing section 112 and the display driving section 113.
In this embodiment, the structure for obtaining external information includes the ambient environment sensor 119, the capturing target sensor 120, the GPS reception section 121, the date and time counting section 122, the image analysis section 128, and the communication section 126. However, some of these may be omitted. In addition, other sensors, such as a speech analysis section that detects and analyzes surrounding voices, may be disposed in the image capturing and displaying apparatus 101.
[3. Exemplary display images]
The system controller 110 controls the image capturing operation and the display operation in accordance with the obtained external information. As a result, the user perceives various display modes on the display section 102. Fig. 3A to Fig. 3C through Fig. 9A to Fig. 9C, Fig. 21A and Fig. 21B, and Fig. 22A and Fig. 22B exemplify various display modes.
Fig. 3A shows a state in which the display section 102 is in the through state. In other words, in this state the display section 102 is a simple transparent plate-shaped member, and the user can see the scene in his or her visual field through the transparent display section 102.
Fig. 3B shows a state in which the image captured by the image capturing section 103 is displayed on the display section 102 operating in the monitor display state. The image capturing section 103, the captured image signal processing section 115, the display image processing section 112, and the display driving section 113 operate so as to normally display the captured image on the display section 102 in the state shown in Fig. 3A. In this case, the captured image (normally captured image) displayed on the display section 102 is almost the same as the scene that would appear through the display section 102 in the through state. In other words, in this state the user sees his or her normal visual field as a captured image.
Fig. 3C shows a state in which the system controller 110 causes the image capturing section 103 to capture a telephoto image through the image capturing control section 111.
By contrast, when the system controller 110 causes the image capturing section 103 to capture a wide-angle image through the image capturing control section 111, a short-range wide-angle image (not shown) is displayed on the display section 102. Although the image capturing section 103 performs telephoto and wide-angle control by driving the zoom lens of the image capturing section 103, the captured image signal processing section 115 can also perform these controls through signal processing.
Fig. 4A shows a state in which the display section 102 is in the through state and, for example, the user is reading a newspaper.
Fig. 4B shows a so-called wide-angle zoom state. In other words, Fig. 4B shows a state in which a close-focus zoom image is captured and displayed on the display section 102 so that, for example, the characters in the newspaper are magnified.
Fig. 5A shows a state in which the display section 102 displays a normally captured image or is in the through state.
At this time, when the system controller 110 commands the display image processing section 112, through the display control section 114, to perform image magnification processing, the magnified image shown in Fig. 5B is displayed on the display section 102.
Fig. 6A shows a state in which the display section 102 displays a normally captured image or is in the through state. In particular, Fig. 6A shows a state in which the user is reading a newspaper or a book. In this case, it is assumed that since the surroundings are dim, the user cannot read the characters in the newspaper or the like with the normally captured image or with the display section 102 in the through state.
In this case, the system controller 110 commands the image capturing control section 111 (the image capturing section 103 and the captured image signal processing section 115) to increase the image capturing sensitivity, and/or commands the display control section 114 (the display image processing section 112 and the display driving section 113) to increase the brightness and adjust the contrast and sharpness, so that an image sharper than that shown in Fig. 6A is displayed on the display section 102, as shown in Fig. 6B. Alternatively, when the system controller 110 causes the lighting section 104 to perform the lighting operation, a sharp image can likewise be displayed on the display section 102.
Fig. 7A shows a state in which the display section 102 displays a normally captured image or is in the through state. In this case, the user is in a dark bedroom where a child is sleeping. Since the room is dark, the user cannot clearly see the child with the normally captured image or with the display section 102 in the through state.
At this time, when the system controller 110 commands the image capturing control section 111 (the image capturing section 103 and the captured image signal processing section 115) to increase the infrared image capturing sensitivity, the captured infrared image shown in Fig. 7B is displayed on the display section 102, allowing the user to see the child's sleeping face and so forth.
Fig. 8A shows a state in which the display section 102 displays a normally captured image or is in the through state.
When the system controller 110 commands the image capturing control section 111 (the image capturing section 103 and the captured image signal processing section 115) to increase the ultraviolet image capturing sensitivity, a captured image containing ultraviolet components is displayed on the display section 102, as shown in Fig. 8B.
Fig. 9A shows a state in which the display section 102 is in the through state.
When the system controller 110 commands the display control section 114 (the display image processing section 112 and the display driving section 113) to display images separately, or to display an image and a partially magnified image separately, the image shown in Fig. 9B can be displayed on the display section 102. In other words, the screen of the display section 102 is separated into an area AR1 and an area AR2, where the area AR1 is in the through state or the normal image display state, and the area AR2 is in the magnified image display state.
Fig. 9C shows another exemplary separate display. In this case, the screen of the display section 102 is separated into areas AR1, AR2, AR3, and AR4, which display frames of the image captured at predetermined time intervals. The system controller 110 causes the display image processing section 112 to extract one frame from the captured image signal at 0.5-second intervals, and to display the extracted frames in the order of areas AR1, AR2, AR3, AR4, AR1, AR2, and so on. In this case, the image is displayed on the display section 102 discretely, in a so-called flicker display mode.
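The frame extraction and area rotation of the separate display in Fig. 9C can be sketched as a simple schedule: one frame per 0.5-second interval, assigned to areas AR1 through AR4 in rotation. The function name and its return shape are illustrative assumptions only.

```python
def flicker_schedule(num_frames, interval=0.5,
                     regions=("AR1", "AR2", "AR3", "AR4")):
    """Return (capture_time, area) pairs: one frame extracted per
    interval, displayed in the areas AR1..AR4 in rotating order."""
    return [(i * interval, regions[i % len(regions)])
            for i in range(num_frames)]
```

After the fourth frame the schedule wraps back to area AR1, matching the order AR1, AR2, AR3, AR4, AR1, AR2, ... described above.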
Fig. 21A shows a state in which the display section 102 displays a normally captured image or is in the through state. Since the image shown in Fig. 21A is a scene of a soccer stadium in which sunlit areas and shadow areas adjoin, the image is difficult to see.
The system controller 110 increases the image capturing sensitivity or the display brightness for the pixels of the CCD sensor or CMOS sensor corresponding to the shadow areas, and decreases the image capturing sensitivity or the display brightness for the pixels corresponding to the sunlit areas. As a result, as shown in Fig. 21B, an image in which the influence of sunlight and shadow has been reduced is displayed.
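This per-pixel correction can be pictured as a threshold-based gain: pixels below a brightness threshold (shadow) are amplified, and pixels above it (sunlight) are attenuated. The threshold and gain values below are arbitrary assumptions for illustration, not parameters of the apparatus.

```python
def equalize_sun_shadow(pixels, threshold=128,
                        shadow_gain=1.5, sun_gain=0.75):
    """Raise the level of pixels in shadow and lower the level of
    sunlit pixels, reducing the contrast between the two regions.
    Pixel values are assumed to be 8-bit (0-255)."""
    out = []
    for p in pixels:
        if p < threshold:
            out.append(min(255, int(p * shadow_gain)))  # brighten shadow
        else:
            out.append(int(p * sun_gain))               # dim sunlight
    return out
```

In practice such a correction would operate on a two-dimensional pixel array with smoother region boundaries; the one-dimensional list here is only to keep the sketch short.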
Fig. 22A and Fig. 22B show a state in which an image containing, for example, a bird is displayed such that the bird is highlighted.
When a bird is detected in the image, highlighting the bird prevents the user from missing the bird as a subject.
As processing for highlighting an image, the brightness of the part of interest may be increased. Alternatively, the brightness of the parts other than the part of interest may be decreased. The part of interest may be displayed in color while the other parts are displayed in monochrome. Alternatively, the part of interest may be highlighted with a character image such as a frame, a cursor, or an indicator mark.
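One of the listed highlighting options, dimming everything outside the part of interest, can be sketched on a small grayscale image. The function, its rectangle convention (x0, y0, x1, y1 inclusive), and the dimming factor are assumptions for illustration.

```python
def highlight_region(image, roi, dim_factor=0.5):
    """Dim pixels outside the region of interest (roi), one of the
    highlighting methods listed in the text. `image` is a list of rows
    of grayscale values; roi is (x0, y0, x1, y1), inclusive."""
    x0, y0, x1, y1 = roi
    return [[image[r][c] if (x0 <= c <= x1 and y0 <= r <= y1)
             else int(image[r][c] * dim_factor)
             for c in range(len(image[0]))]
            for r in range(len(image))]
```

The color-versus-monochrome variant would work analogously, replacing the dimming step with a desaturation step for pixels outside the region.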
The foregoing display images are merely exemplary. In other words, various display modes can be achieved by controlling the processing and operations of the image capturing section 103, the captured image signal processing section 115, the display image processing section 112, and the display driving section 113.
For example, many types of display modes are contemplated, such as: a telephoto display mode; a wide-angle display mode; a zoom display mode ranging from telephoto to wide-angle; an image magnification display mode and an image reduction display mode; a variable frame rate display mode (capturing images at a high frame rate or a low frame rate); a high brightness display mode; a low brightness display mode; a variable contrast display mode; a variable sharpness display mode; an increased-sensitivity captured image display mode; an increased-infrared-sensitivity captured image display mode; an increased-ultraviolet-sensitivity captured image display mode; an image display mode in which a predetermined band is cut; image effect display modes (such as a mosaic image, a brightness-reversed image, a soft-focus image, a partially highlighted image, and an image with a varied color atmosphere); a slow display mode; a frame-by-frame display mode; separate display modes combining these display modes; a separate display mode combining the through state and a captured image; a flicker display mode; and a still image display mode that holds one frame of the captured image.
[4. Detection of external information]
As described above, as the structure for obtaining external information, the image capturing and displaying apparatus 101 includes the ambient environment sensor 119, the capturing target sensor 120, the GPS reception section 121, the date and time counting section 122, the image analysis section 128, and the communication section 126.
Examples of the ambient environment sensor 119 include a luminance sensor, a temperature sensor, a humidity sensor, and a barometric pressure sensor.
The luminance sensor detects information about the surrounding brightness of the image capturing and displaying apparatus 101.
The temperature sensor, the humidity sensor, and the barometric pressure sensor obtain information with which the temperature, humidity, atmospheric pressure, and weather can be determined.
Since the ambient environment sensor 119 can determine the surrounding brightness, the outdoor weather situation, and so forth, the system controller 110 can control the image capturing operation and the display operation in accordance with the surrounding brightness and weather situation determined as external information.
The capturing target sensor 120 detects information about the capturing target. The capturing target sensor 120 may be a distance measuring sensor or a pyroelectric sensor, and can obtain the distance to the capturing target or information with which the capturing target can be identified.
When the capturing target sensor 120 has detected the distance to the capturing target, the system controller 110 can control the image capturing operation and the display operation in accordance with the detected distance. When the capturing target sensor 120 has detected that the capturing target is a living body such as a person, the system controller 110 can control the image capturing operation and the display operation in accordance with the capturing target.
The GPS reception section 121 obtains the latitude and longitude information of the current position. When the latitude and longitude of the current position have been detected, information about the current position (information about the vicinity of the current position) can be obtained by referencing a map database or the like. When the image capturing and displaying apparatus 101 includes a recording medium having a relatively large recording capacity, such as an HDD (hard disk drive) or a flash memory (not shown in Fig. 20), that the system controller 110 can access, and the recording medium stores a map database, information about the current position can be obtained.
Even if the image capturing and displaying apparatus 101 has no built-in map database, the image capturing and displaying apparatus 101 can cause the communication section 126 to access, for example, a network server or an apparatus that stores a map database, transmit the latitude and longitude information of the current position to the network server or apparatus, request the network server or apparatus to transmit information about the current position to the communication section 126, and receive the information.
Examples of information about the current position include place names, building names, facility names, store names, and station names.
In addition, examples of information about the current position include information representing building types, such as parking lots, theme parks, concert halls, theaters, movie theaters, and sports grounds.
In addition, examples of information about the current position include the types and names of natural objects, such as seashores, rivers, mountains, mountain tops, forests, lakes, and grasslands.
Examples of more detailed position information include areas within a theme park, spectator areas in a ball park or soccer stadium, and seating areas in a concert hall.
When information about the current position has been obtained, the system controller 110 can control the image capturing operation and the display operation in accordance with the current position, the geographic conditions near the current position, the facilities there, and so forth.
The date and time counting section 122 counts, for example, the year, month, day, hour, minute, and second. With the values counted by the date and time counting section 122, the system controller 110 can recognize the current time, whether it is day or night, the month, the season, and so forth. Thus, the system controller 110 can control the image capturing operation and the display operation in accordance with day or night (time), and in accordance with the current season.
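The day/night and season recognition derived from the counted values can be sketched as a simple classification. The hour boundaries and the month-to-season mapping below are assumptions (a fixed 6:00-18:00 daytime window and northern-hemisphere seasons); the specification does not fix these rules.

```python
def time_context(month, hour):
    """Classify the counted hour as daytime or night and the counted
    month into a season (northern hemisphere assumed)."""
    period = "daytime" if 6 <= hour < 18 else "night"
    seasons = {12: "winter", 1: "winter", 2: "winter",
               3: "spring", 4: "spring", 5: "spring",
               6: "summer", 7: "summer", 8: "summer",
               9: "autumn", 10: "autumn", 11: "autumn"}
    return period, seasons[month]
```

The system controller could then select, for example, an increased-sensitivity capture mode whenever the classification returns "night".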
Image analyzing section 128 can detect relevant various types of information of obtaining target from the image that obtains.
At first, image analyzing section 128 can identify the type that target is obtained in conducts such as people, animal, natural things, building, machine from the image that obtains.For animal, image analyzing section 128 can be discerned and wherein obtain the situation of bird as subject, has wherein obtained kitten as situation of subject etc.For natural things, image analyzing section 128 can be discerned from the image that obtains and go to sea, mountain range, tree, river, lake, sky, the sun, the moon etc.For building, image analyzing section 128 can identify house, building, stadium etc. from the image that obtains.For equipment, image analyzing section 128 can identify conducts such as personal computer, AV (audiovisual) equipment, mobile phone, PDA, IC-card, two-dimensional bar and obtain target from the image that obtains.
When the shape features of various types of acquisition targets have been registered in the image analyzing section 128 in advance, the subject contained in the acquired image can be determined corresponding to the registered features.
In addition, the image analyzing section 128 can detect a motion of the subject, for example a rapid motion of the subject, from the acquired image, for example by detecting differences between successive frames of the image. For example, the image analyzing section 128 can detect a situation in which a fast-moving subject (for example, a player in a sports match or a moving automobile) is being captured.
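The frame-difference idea described above can be sketched as follows. This is a minimal illustration, not the embodiment's actual algorithm; the function name and both threshold values are assumptions.

```python
import numpy as np

DIFF_LEVEL = 30       # per-pixel brightness change regarded as motion (assumed)
MOTION_RATIO = 0.05   # fraction of changed pixels counted as "rapid motion" (assumed)

def detect_rapid_motion(prev_frame, cur_frame):
    """Return True when the difference between two successive grayscale
    frames suggests a fast-moving subject."""
    diff = np.abs(cur_frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving = np.count_nonzero(diff > DIFF_LEVEL)
    return moving / diff.size > MOTION_RATIO
```

A real implementation would also compensate for camera shake of the head-mounted unit before differencing, which this sketch omits.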
In addition, the image analyzing section 128 can determine the surrounding situation by analyzing the image. For example, the image analyzing section 128 can determine the brightness caused by daytime, night, or the weather. In addition, the image analyzing section 128 can identify the intensity of rain, and so forth.
In addition, the image analyzing section 128 can determine, by analyzing the image, a situation in which for example a book or a newspaper is being captured. For example, the image analyzing section 128 can determine such a situation by recognizing characters in the image or by recognizing the shape of a book or a newspaper.
When the image analyzing section 128 has recognized characters, system controller 110 can supply the recognized characters as text data to the speech synthesis section 127.
In addition, when a person is the subject, the image analyzing section 128 can recognize the person from his or her face by analyzing the image. As is well known, a person's face can be registered as personal feature data, namely relative position information of the structural elements of the face. For example, the ratio (Ed/EN) of the distance Ed between the centers of the eyes to the distance EN from the center of the eyes to the nose, and the ratio (Ed/EM) of the distance Ed between the centers of the eyes to the distance EM from the center of the eyes to the mouth, are unique to each person. In addition, these ratios are not affected by wearable items such as a hair style, glasses, and so forth. Moreover, it is known that they do not change as the person ages.
Therefore, when the acquired image contains a person's face, the image analyzing section 128 can detect the foregoing personal feature data by analyzing the image.
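For illustration, the two ratios described above could be computed from facial landmark coordinates as follows. This is only a sketch: the function names are invented, and it assumes landmark positions are already available as (x, y) points, whereas the embodiment leaves the detection method unspecified.

```python
import math

def distance(p, q):
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def personal_feature_data(left_eye, right_eye, nose, mouth):
    """Return the ratios (Ed/EN, Ed/EM) from facial landmarks, where Ed is
    the distance between the eyes, EN the distance from the eye center to
    the nose, and EM the distance from the eye center to the mouth."""
    eye_center = ((left_eye[0] + right_eye[0]) / 2,
                  (left_eye[1] + right_eye[1]) / 2)
    ed = distance(left_eye, right_eye)
    en = distance(eye_center, nose)
    em = distance(eye_center, mouth)
    return ed / en, ed / em
```

Because both values are ratios of distances within the same face, they are invariant to the scale of the image, which is what makes them usable as registration keys in a personal database.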
When the image analyzing section 128 detects personal feature data from the acquired image, and system controller 110 can access a recording medium on which a personal database has been recorded (for example when system controller 110 has an HDD (hard disk drive), a flash memory, or the like), the image analyzing section 128 can obtain the personal information of the subject from the personal database. Even if the image acquisition and display device 101 does not have a built-in personal database, system controller 110 can cause the communications portion 126 to access, for example, a network server or a device having a built-in personal database, request the server or device to send information to the communications portion 126, and receive the particular personal information from it.
When the persons the user has met have been registered in the personal database together with their personal feature data and personal information such as name and affiliation, then when the user meets a particular person (and his or her image has been captured), the image acquisition and display device 101 can retrieve information about that person from the personal database.
When a personal database in which information and personal feature data of celebrities have been registered has been prepared, then when the user meets a celebrity, the image acquisition and display device 101 can retrieve information about that person from the personal database.
The communications portion 126 can obtain various types of external information.
For example, as described above, the communications portion 126 can obtain information retrieved by an external device corresponding to the latitude and longitude information, personal feature data, and so forth sent from the image acquisition and display device 101.
In addition, the communications portion 126 can obtain meteorological information such as weather information, temperature information, and humidity information from an external device.
In addition, the communications portion 126 can obtain facility use information, photography prohibition/permission information, facility guide information, and so forth from an external device.
In addition, the communications portion 126 can obtain identification information of an external device. Examples of the identification information include the device type and the device ID by which the device is identified as a network device in a predetermined communication protocol.
In addition, the communications portion 126 can obtain image data stored in an external device, image data being reproduced or displayed by an external device, and image data being received by an external device.
In the foregoing, the information that the ambient sensor 119, the acquisition target sensor 120, the GPS receiving unit 121, the date and time counting section 122, the image analyzing section 128, and the communications portion 126 can each obtain has been exemplified. Alternatively, a plurality of these sections may detect and determine particular external information.
For example, by combining humidity information and the like obtained by the ambient sensor 119 with weather information received by the communications portion 126, the current weather can be determined more accurately.
In addition, by comparing information about the current location obtained by the GPS receiving unit 121 and the communications portion 126 with information obtained by the image analyzing section 128, the current location and the situation of the acquisition target can be determined more accurately than with the foregoing structure.
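Such combination of sources might be sketched as follows. This is a minimal illustration under stated assumptions: the function name, the 90% humidity threshold, and the category strings are all invented, not values from this embodiment.

```python
def infer_weather(local_humidity, server_weather):
    """Combine a local humidity reading (percent) from the ambient sensor
    with weather information received over the network, if any."""
    if server_weather is not None:
        # A server report for the current area takes priority.
        return server_weather
    # Fall back to the on-device humidity sensor alone.
    return "rain" if local_humidity > 90.0 else "no rain"
```

The design choice here mirrors the text: the networked source is treated as more authoritative when available, while the local sensor keeps the determination working offline.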
[5. Example operations]
In the image acquisition and display device 101 of this embodiment, system controller 110 determines the surrounding environment, the situation of the acquisition target, and so forth corresponding to the external information obtained by the ambient sensor 119, the acquisition target sensor 120, the GPS receiving unit 121, the date and time counting section 122, the image analyzing section 128, and the communications portion 126, and controls the image acquisition operation and the display operation corresponding to the determination results, so as to assist and extend the user's vision.
Next, various example operations performed under the control of system controller 110 will be described.
Figure 10 shows a control process performed by the operation control function 110b of system controller 110.
When the image acquisition and display device 101 is initially turned on, the flow advances to step F101. In step F101, system controller 110 commands the display control section 114 to cause the display part 102 to become the through state.
While the display part 102 is in the through state, in step F102, system controller 110 determines whether a monitor display state start trigger has occurred. A monitor display start switch (not shown) may be arranged in the image acquisition and display device 101; when the user has operated this switch, system controller 110 can determine that a monitor display state start trigger has occurred. Alternatively, as will be described later, system controller 110 can determine that a monitor display state start trigger has occurred corresponding to external information.
When the determination result indicates that a monitor display state start trigger has occurred, the flow advances to step F103. In step F103, system controller 110 performs monitor display start control. In other words, system controller 110 commands the image acquisition control section 111 to cause the image acquisition section 103 and the acquired image signal processing section 115 to perform a normal image acquisition operation. In addition, system controller 110 commands the display control section 114 to cause the display image processing section 112 and the display driving section 113 to make the display part 102 display the acquired image signal as a normally acquired image.
In this process, the through state shown in Fig. 3A is switched to the monitor display state for a normally acquired image shown in Fig. 3B.
While the display part 102 is displaying the normally acquired image (the same scene the user would see in the through state), the flow advances to step F104. In step F104, system controller 110 monitors whether an image control trigger has occurred. In step F105, system controller 110 monitors whether a monitor display state completion trigger has occurred.
When system controller 110 determines that the monitor display mode needs to be changed corresponding to the external situation (surrounding environment, subject, current date and time, current location, and so forth) determined by the external situation determination function 110a, system controller 110 determines in step F104 that an image control trigger has occurred.
When the user has performed a monitor display mode completion operation with a predetermined switch, system controller 110 determines in step F105 that a monitor display state completion trigger has occurred. As with the trigger in step F102, system controller 110 can determine that a monitor display state completion trigger has occurred corresponding to the detected external information.
When the determination result indicates that an image control trigger has occurred, the flow advances from step F104 to step F106. In step F106, system controller 110 controls the display operation for the acquired image. In other words, system controller 110 commands the image acquisition control section 111 and the display control section 114 to make the display part 102 display the image in the display mode corresponding to the external situation at that point in time.
After system controller 110 has controlled the display mode in step F106, the flow advances to step F104 or F105. In step F104 or step F105, system controller 110 monitors whether a trigger has occurred.
When the determination result indicates that a monitor display state completion trigger has occurred, the flow returns from step F105 to step F101. In step F101, system controller 110 commands the image acquisition control section 111 to complete the image acquisition operation, and commands the display control section 114 to cause the display part 102 to become the through state.
While the user is wearing the image acquisition and display device 101 with its power turned on, the operation control function 110b of system controller 110 performs the control process shown in Fig. 10.
In this process, display mode control is performed corresponding to the determination result of whether an image control trigger has occurred in step F104. Concrete examples of trigger determination and control will be described later with reference to Fig. 23 to Fig. 30A and Fig. 30B.
Fig. 23 to Fig. 30A and Fig. 30B show exemplary processes of the external situation determination function 110a of system controller 110. It is assumed that these processes are performed in parallel with the process of the operation control function 110b shown in Fig. 10. These parallel processes may be performed, for example, as interrupt processes in which the detection processes shown in Fig. 23 to Fig. 30A and Fig. 30B are regularly executed while system controller 110 is performing the process shown in Fig. 10. The programs for the processes shown in Fig. 23 to Fig. 30A and Fig. 30B may be built into the program that executes the process shown in Fig. 10, or may be separate programs that are regularly called. In other words, the structure of these programs is not limited to a particular structure.
Fig. 23 to Fig. 30A and Fig. 30B show exemplary processes of determining whether an image control trigger has occurred in step F104 shown in Fig. 10. Fig. 23 shows an exemplary process of determining whether an image control trigger has occurred corresponding to information supplied from the ambient sensor 119 or the image analyzing section 128.
In step F1201 shown in Fig. 23, system controller 110 monitors one or both of the information detected by the ambient sensor 119 and the information obtained by the image analyzing section 128. In this example, the ambient sensor 119 is a luminance sensor, and the image analyzing section 128 analyzes the surrounding brightness corresponding to the acquired image.
System controller 110 determines, corresponding to the information obtained from one or both of the ambient sensor 119 and the image analyzing section 128, whether the surroundings are dark or too bright. For example, the detected brightness is quantified: when the detected brightness is x lux or less, system controller 110 determines that the surroundings are dark; when the detected brightness is y lux or more, system controller 110 determines that the surroundings are too bright.
When being dark around definite result representes, flow process advances to step F 1204 from step F 1202.In step F 1204, system controller 110 is confirmed to have occurred image control and is triggered.
In step F1205, system controller 110 calculates adjustment values corresponding to the current surrounding brightness (darkness). For example, system controller 110 obtains adjustment values for display brightness, contrast, sharpness, image acquisition sensitivity, and so forth, so that the user can comfortably see the surroundings in the acquired image.
When the processes of step F1204 and step F1205 have been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In step F106, system controller 110 commands the image acquisition section 103 to adjust the image acquisition sensitivity, and commands the acquired image signal processing section 115 or the display image processing section 112 to adjust the brightness, contrast, sharpness, and so forth. When these sections have performed these processes, the quality of the image displayed on the display part 102 is adjusted. Thus, even if the surroundings are dark, the user can clearly see the surroundings in the image displayed on the display part 102. For example, the situation in which the surroundings are dark and the image shown in Fig. 6A is displayed on the display part 102 is changed to one in which the user can clearly see the image.
When the determination result indicates that the surroundings are dark, system controller 110 may also control the illumination section 104 to provide illumination.
When the determination result indicates that the surroundings are too bright, the flow advances from step F1203 to step F1206. In step F1206, system controller 110 determines that an image control trigger has occurred.
In step F1207, system controller 110 calculates adjustment values corresponding to the current surrounding brightness. System controller 110 obtains adjustment values for display brightness, contrast, sharpness, image acquisition sensitivity, and so forth, so that the user can comfortably see the surroundings. In this case, since the surroundings are too bright, the user feels glare. Therefore, system controller 110 obtains adjustment values with which the image acquisition sensitivity and the display brightness are decreased.
When the processes of step F1206 and step F1207 have been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In this case, in step F106, system controller 110 commands the image acquisition section 103 to adjust the image acquisition sensitivity, and commands the acquired image signal processing section 115 or the display image processing section 112 to adjust the brightness, contrast, sharpness, and so forth. In these processes, the quality of the image displayed on the display part 102 is adjusted. Thus, even if the surroundings are too bright, the user can see the surroundings in the image displayed on the display part 102 without feeling glare.
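The dark/too-bright classification of steps F1202 and F1203 can be sketched as follows. The embodiment leaves the x and y lux thresholds unspecified, so the values and names below are illustrative assumptions.

```python
DARK_LUX = 50       # assumed value of "x lux" (dark side)
BRIGHT_LUX = 10000  # assumed value of "y lux" (glare side)

def brightness_trigger(lux):
    """Classify the surrounding brightness and return the kind of image
    control trigger, if any, as in Fig. 23."""
    if lux <= DARK_LUX:
        return "raise_sensitivity"   # steps F1204/F1205: brighten the display
    if lux >= BRIGHT_LUX:
        return "lower_sensitivity"   # steps F1206/F1207: reduce glare
    return None                      # normal brightness: no trigger
```

Note that the intermediate band between the two thresholds deliberately produces no trigger, which prevents the display from oscillating between adjustments in ordinary lighting.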
Fig. 24A shows an exemplary process of determining whether an image control trigger has occurred corresponding to information detected by the ambient sensor 119 or supplied from the communications portion 126.
In step F1301 shown in Fig. 24A, system controller 110 monitors one or both of the information detected by the ambient sensor 119 and the information received by the communications portion 126. Examples of the ambient sensor 119 include a temperature sensor, a humidity sensor, and an atmospheric pressure sensor. The communications portion 126 continuously receives weather information from, for example, a network server.
System controller 110 can determine the surrounding weather situation corresponding to the atmospheric pressure, humidity, and temperature detected by, for example, the ambient sensor 119. In addition, system controller 110 can determine the weather situation corresponding to the weather information received by the communications portion 126. To receive the weather situation from the network server, system controller 110 continuously sends the current location information obtained by the GPS receiving unit 121 to the network server, and receives from the network server the weather information for the area corresponding to the current location.
Although system controller 110 can determine the surrounding weather situation with either the information detected by the ambient sensor 119 or the information received by the communications portion 126, it can determine the weather situation more accurately with both types of information than with the foregoing structure.
System controller 110 determines whether the image needs to be adjusted corresponding to the weather situation, such as clear sky, cloudy sky, rain, lightning, typhoon, or snowfall, or corresponding to a change of the weather situation, such as the start or stop of rain. When the determination result indicates that the image needs to be adjusted, the flow advances from step F1302 to step F1303. In step F1303, system controller 110 determines that an image control trigger has occurred. In step F1304, system controller 110 calculates adjustment values corresponding to the current weather. For example, system controller 110 obtains adjustment values for display brightness, contrast, sharpness, image acquisition sensitivity, and so forth, so that the user can clearly see the surroundings in the acquired image.
When the processes of step F1303 and step F1304 have been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In step F106, system controller 110 commands the image acquisition section 103 to adjust the image acquisition sensitivity, and commands the acquired image signal processing section 115 or the display image processing section 112 to adjust the brightness, contrast, sharpness, and so forth. In these processes, the quality of the image displayed on the display part 102 is adjusted corresponding to the weather situation. The user can clearly see the surroundings in the image displayed on the display part 102.
Alternatively, system controller 110 may control the illumination section 104 to provide illumination corresponding to the weather situation.
In this example, system controller 110 determines the weather situation corresponding to the information detected by the ambient sensor 119 or the information received by the communications portion 126. When the image analyzing section 128 recognizes an image of rain, system controller 110 can accurately detect the start of rain, the end of rain, the occurrence of lightning, and so forth.
Fig. 24B shows an exemplary process of determining whether an image control trigger that causes the night vision function to be operated has occurred, corresponding to the information detected by the ambient sensor 119.
In step F1310 shown in Fig. 24B, system controller 110 monitors the information detected by the ambient sensor 119. In this example, the ambient sensor 119 is a luminance sensor.
System controller 110 determines whether the surroundings are dark corresponding to the information detected by the ambient sensor 119. System controller 110 quantifies the detected brightness; when the detected brightness is x lux or less, system controller 110 determines that the surroundings are dark.
When being dark around definite result representes, flow process advances to step F 1313 from step F 1311.In step F 1313, system controller 110 confirms that the image control that has occurred causing night vision function to be opened triggers.
When the process of step F1313 has been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In this case, in step F106, system controller 110 controls the image acquisition and display device 101 to turn on the night vision function. In other words, system controller 110 commands the image acquisition control section 111 to increase the infrared image acquisition sensitivity of the image acquisition section 103.
In this process, the night vision function is performed. In a situation such as that shown in Fig. 7A, where the surroundings are so dark that the user cannot see them, the image acquired with increased infrared sensitivity shown in Fig. 7B is displayed on the display part 102. Thus the user can see the surrounding situation in a dark place.
When dark, flow process advances to step F 1312 from step F 1311 around definite result representes.In this case, in step F 1312, when having opened night vision function (increasing the image acquisition operations of infrared sensitivity), flow process advances to step F 1314.In this case, in step F 1314, system controller 110 confirms that the image control that has occurred causing night vision function to be closed triggers.When having carried out the processing of step F 1314, flow process advances to step F 106 from step F shown in Figure 10 104.In this case, in step F 106, system controller 110 control image acquisition and display device 101 are closed night vision function.In other words, system controller 110 order image acquisition control sections 111 obtain sensitivity with infrared image and revert to normal image acquisition sensitivity and carry out normal image acquisition operations.
In the process shown in Fig. 24B, when the user is in a dark room or the like, the night vision function is automatically turned on so that he or she can clearly see the surroundings. Conversely, when the user has left the dark room or the like, the night vision function is automatically turned off. In other words, a process that increases the user's visual capability corresponding to the surrounding situation is realized.
Alternatively, the image analyzing section 128 may detect whether the surroundings are dark by analyzing the acquired image. When the overall brightness of the acquired image has decreased greatly, it is determined that the surroundings are dark, and system controller 110 can determine that an image control trigger that causes the night vision function to be turned on has occurred.
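The on/off behavior of Fig. 24B amounts to a small state machine, sketched below. The class name and the DARK_LUX value stand in for details the embodiment does not specify.

```python
DARK_LUX = 50  # assumed value of the "x lux" darkness threshold

class NightVisionController:
    """Tracks whether the infrared night vision mode is on and emits a
    trigger only when the state must change (steps F1313/F1314)."""
    def __init__(self):
        self.night_vision_on = False

    def update(self, lux):
        if lux <= DARK_LUX and not self.night_vision_on:
            self.night_vision_on = True
            return "turn_on"    # step F1313: surroundings became dark
        if lux > DARK_LUX and self.night_vision_on:
            self.night_vision_on = False
            return "turn_off"   # step F1314: surroundings brightened
        return None             # no trigger; state unchanged
```

Emitting a trigger only on a state change, rather than on every sensor reading, matches the flow in which step F1312 checks whether the function is already on before issuing the turn-off trigger.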
Fig. 25 shows an exemplary process of determining whether an image control trigger has occurred corresponding to the information detected by the acquisition target sensor 120 or obtained by the image analyzing section 128.
In step F1401 shown in Fig. 25, system controller 110 monitors one or both of the information detected by the acquisition target sensor 120 and the information obtained by the image analyzing section 128. The acquisition target sensor 120 is, for example, a distance measuring sensor. The image analyzing section 128 calculates the distance to the subject by analyzing the acquired image.
In other words, system controller 110 determines, corresponding to the information detected by the acquisition target sensor 120 and/or obtained by the image analyzing section 128, whether the target the user is watching (the acquisition target) is far away or close at hand, for example near his or her hands.
When the user is watching a distant scene, or watching a match from a seat far from the field of a stadium, and system controller 110 has determined that the acquisition target is far away, the flow advances from step F1402 to step F1404. In step F1404, system controller 110 determines that an image control trigger that causes the display mode to be switched to the distant view zoom display mode has occurred. Thereafter, the flow advances to step F1405. In step F1405, system controller 110 calculates an appropriate zoom magnification corresponding to the distance to the acquisition target.
When the user is watching a nearby scene, or a newspaper in his or her hands, and system controller 110 has determined that the acquisition target is close at hand, the flow advances from step F1403 to step F1406. In step F1406, system controller 110 determines that an image control trigger that causes the display mode to be switched to the enlargement (zoom-in) display mode has occurred. Thereafter, the flow advances to step F1407. In step F1407, system controller 110 calculates an appropriate zoom magnification corresponding to the distance to the acquisition target.
When the processes of step F1404 and step F1405, or of step F1406 and step F1407, have been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In step F106, system controller 110 commands the image acquisition control section 111 to perform the zoom operation with the calculated magnification.
Thus, corresponding to the scene the user is watching, the display part 102 displays the distant view image shown in Fig. 3C or the wide-angle zoom image shown in Fig. 4B.
In this example, system controller 110 controls the distant view/wide-angle zoom operation. Alternatively, system controller 110 may control the image acquisition and display device 101 to change the focus position and enlarge/reduce the image corresponding to the distance to the acquisition target.
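One way the mode decision and the zoom magnification of steps F1402 to F1407 might be derived from the measured distance is sketched below. The distance bounds, the magnification formula, and the mode names are all illustrative assumptions; the embodiment only states that the magnification corresponds to the distance.

```python
NEAR_M = 1.0     # assumed bound for "close at hand" (step F1403), meters
FAR_M = 20.0     # assumed bound for "far away" (step F1402), meters
MAX_ZOOM = 8.0   # assumed magnification limit of the zoom lens

def zoom_control(distance_m):
    """Return (display_mode, magnification) for a subject distance."""
    if distance_m >= FAR_M:
        # Distant view zoom: more magnification the farther the target.
        return "distant_view", min(MAX_ZOOM, distance_m / FAR_M * 2.0)
    if distance_m <= NEAR_M:
        # Enlargement mode for reading material in the user's hands.
        return "enlarge", min(MAX_ZOOM, NEAR_M / max(distance_m, 0.1) * 2.0)
    return "normal", 1.0
```

Clamping to MAX_ZOOM reflects that any real zoom mechanism has a finite magnification range regardless of the measured distance.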
Fig. 26 shows an exemplary process of determining whether an image control trigger has occurred corresponding to the information detected by the acquisition target sensor 120 and the information obtained by the image analyzing section 128. In particular, in this exemplary process, it is determined whether the acquisition target contains characters, as in a newspaper, a book, or the like.
In step F1501 shown in Fig. 26, system controller 110 monitors the information detected by the acquisition target sensor 120 and the information obtained by the image analyzing section 128. In this example, the acquisition target sensor 120 is a distance measuring sensor. The image analyzing section 128 detects whether the subject contains characters by analyzing the acquired image.
System controller 110 determines, corresponding to the information detected by the acquisition target sensor 120 and/or obtained by the image analyzing section 128, whether the target the user is watching (the acquisition target) is close at hand and contains characters, as in a newspaper, a book, or the like. In other words, system controller 110 determines whether the user is reading a newspaper in his or her hands.
When the determination result indicates that the acquisition target is nearby and contains characters, the flow advances from step F1502 to step F1503. In step F1503, system controller 110 determines that an image control trigger has occurred.
In step F1504, system controller 110 calculates adjustment values with which the user can comfortably read the newspaper or book. For example, system controller 110 obtains adjustment values for display brightness, contrast, sharpness, image acquisition sensitivity, and so forth, so that the user can comfortably read the newspaper or the like.
When the processes of step F1503 and step F1504 have been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In this case, in step F106, system controller 110 commands the image acquisition section 103 to adjust the image acquisition sensitivity, and commands the acquired image signal processing section 115 or the display image processing section 112 to adjust the brightness, contrast, sharpness, and so forth. In this process, the quality of the image displayed on the display part 102 is adjusted. Thus, as shown in Fig. 6B, the display part 102 displays an image that the user can easily read.
In addition to character detection, the surrounding brightness may also be detected, and the detection result may be reflected in the calculation of the adjustment values.
In addition, when the image analyzing section 128 analyzes the image, it may also recognize the shape of a newspaper or a book as a condition for the flow to advance to step F1503.
When system controller 110 determines that the acquisition target is a newspaper or the like, system controller 110 may control the illumination section 104 to provide illumination.
Instead of adjusting the image quality, the display image processing section 112 may perform an enlargement process and cause the display part 102 to display an enlarged image as shown in Fig. 4B, so that the user can clearly read the characters.
When the image contains characters, the image analyzing section 128 may recognize the characters and supply them to system controller 110 as text data. In this case, system controller 110 causes the speech synthesis section 127 to perform a speech synthesis process corresponding to the text data detected from the image.
Thus, the speech synthesis section 127 generates an audio signal as a reading voice for the characters contained in the acquired image. System controller 110 causes the voice output part 105 to output the reading voice.
Thus, while the user is watching a newspaper or the like, he or she can hear it read aloud.
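The condition checked in step F1502 combines the two inputs of Fig. 26, which could be sketched as follows (the 0.6 m reading-distance bound and the function name are assumptions, since the embodiment leaves the actual bound unspecified):

```python
NEAR_M = 0.6  # assumed "close at hand" bound for reading distance, meters

def reading_trigger(distance_m, contains_characters):
    """Step F1502 of Fig. 26: fire the reading-adjustment trigger only when
    the target is both close at hand and contains characters."""
    return distance_m <= NEAR_M and contains_characters
```

Requiring both conditions keeps the trigger from firing when the user merely holds a blank object close, or sees distant signage containing text.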
Fig. 27 shows an exemplary process of determining whether an image control trigger has occurred corresponding to the current date and time counted by the date and time counting section 122.
In step F1601 shown in Fig. 27, system controller 110 checks the current date and time counted by the date and time counting section 122. System controller 110 determines the time zone corresponding to the current time, for example early morning, morning, daytime, dusk, or night. For example, the time zone from 4 o'clock to 7 o'clock is early morning, the time zone from 7 o'clock to 9 o'clock is morning, the time zone from 9 o'clock to 17 o'clock is daytime, the time zone from 17 o'clock to 19 o'clock is dusk, and the time zone from 19 o'clock to 4 o'clock is night.
It is preferable to change the time zones corresponding to the determined month and date. For example, since the sunrise and sunset times vary with the month and date, the time zones are changed accordingly. For example, in summer the early morning time zone may be from 4 o'clock to 7 o'clock, while in winter it may be from 6 o'clock to 8 o'clock.
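The time-zone classification described above can be sketched as follows. The seasonal shift is reduced to a single boolean for illustration; a fuller version would derive the boundaries from the month and date.

```python
def time_zone(hour, winter=False):
    """Map an hour (0-23) to a time-zone label as in step F1601.
    The winter flag shifts early morning later, per the text."""
    early_start = 6 if winter else 4   # sunrise comes later in winter
    early_end = 8 if winter else 7
    if early_start <= hour < early_end:
        return "early_morning"
    if early_end <= hour < 9:
        return "morning"
    if 9 <= hour < 17:
        return "daytime"
    if 17 <= hour < 19:
        return "dusk"
    return "night"
```

Note that with the winter boundaries, 5 o'clock falls through to "night", consistent with the later sunrise.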
When the determination result of step F1601 indicates that the time zone has changed, the flow advances from step F1602 to step F1603.
When the time zone has changed to early morning, the flow advances from step F1603 to step F1607. In step F1607, system controller 110 determines that an image control trigger that causes the image acquisition operation/display operation for early morning to be performed has occurred.
When the time zone has changed to morning, the flow advances from step F1604 to step F1608. In step F1608, system controller 110 determines that an image control trigger that causes the image acquisition operation/display operation for morning to be performed has occurred.
When the time zone has changed to daytime, the flow advances from step F1605 to step F1609. In step F1609, system controller 110 determines that an image control trigger that causes the image acquisition operation/display operation for daytime to be performed has occurred.
When the time zone has changed to dusk, the flow advances from step F1606 to step F1610. In step F1610, system controller 110 determines that an image control trigger that causes the image acquisition operation/display operation for dusk to be performed has occurred.
When the time zone has changed to night, the flow advances to step F1611. In step F1611, system controller 110 determines that an image control trigger that causes the image acquisition operation/display operation for night to be performed has occurred.
When the definite result at step F 1607, F1608, F1609, F1610 or F1611 represented to have occurred image control triggering, flow process advanced to step F 106 from step F shown in Figure 10 104.In step F 106, system controller 110 order image acquisition control sections 111 are carried out and the corresponding image acquisition operations/display operation in time zone with display control section 114.For example, system controller 110 order image acquisition control sections 111 are adjusted for example image acquisition sensitivity, brightness, contrast, acutance etc. with display control section 114.Alternatively, system controller 110 can order image acquisition control section 111 and display control section 114 to carry out the image effect operation such as the soft focus display operation.
In this process, an image corresponding to the time zone is provided to the user. For example, in the early morning, a soft image is provided to the user. In the morning, a sharp image with strong contrast is provided to the user. At dusk, a sepia-toned image is provided to the user. At night, a subdued image is provided to the user. Thus, images matching the user's mood in each time zone can be provided to the user.
Of course, the image quality may also be adjusted corresponding to the brightness that changes in each time zone so as to improve visibility.
In addition to the time zone, the image quality may be adjusted corresponding to the weather condition and to the user's situation, such as whether the user is indoors.
Alternatively, the image quality may be adjusted corresponding to the season determined from the date and time information, rather than to the time zone. For example, in summer the blue component of the image is emphasized; in autumn the red component is emphasized; in winter the white component is emphasized; and in spring the green/pink components are emphasized. Thus, images with a seasonal feel can be provided to the user.
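The seasonal color emphasis can be sketched as a per-channel gain, as below. The gain values and the month-to-season mapping are assumptions for illustration; the text only states which color component is emphasized in each season.

```python
# Illustrative seasonal gains; only the emphasized component per season
# comes from the text, the numeric values are assumed.
SEASON_GAINS = {
    "spring": {"r": 1.05, "g": 1.15, "b": 1.0},   # green/pink emphasis
    "summer": {"r": 1.0,  "g": 1.0,  "b": 1.2},   # blue emphasis
    "autumn": {"r": 1.2,  "g": 1.0,  "b": 1.0},   # red emphasis
    "winter": {"r": 1.1,  "g": 1.1,  "b": 1.1},   # lift all channels toward white
}

def season_of(month: int) -> str:
    """Map a month number to a season (Northern Hemisphere assumption)."""
    return {12: "winter", 1: "winter", 2: "winter",
            3: "spring", 4: "spring", 5: "spring",
            6: "summer", 7: "summer", 8: "summer"}.get(month, "autumn")

def apply_season(pixel, month: int):
    """Weight an (R, G, B) pixel by the seasonal gains, clamped to 8 bits."""
    gains = SEASON_GAINS[season_of(month)]
    r, g, b = pixel
    return (min(255, round(r * gains["r"])),
            min(255, round(g * gains["g"])),
            min(255, round(b * gains["b"])))
```

In the apparatus, such a gain adjustment would be carried out by the captured image signal processing section 115 or the display image processing section 112 rather than in software on individual pixels.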
Fig. 28A shows an exemplary process for determining whether an image control trigger has occurred corresponding to information received by the GPS receiving section 121 and the communication section 126.
In step F1701 shown in Fig. 28A, the system controller 110 causes the communication section 126 to transmit latitude and longitude information of the current position, obtained from the GPS receiving section 121, to a network server or a device having a built-in map database, causes the server or device to retrieve information about the current position from the database, and causes the communication section 126 to receive the retrieved information. When the image capturing and displaying apparatus 101 has a built-in map database, the system controller 110 can retrieve the information about the current position from the built-in map database corresponding to the latitude and longitude information received from the GPS receiving section 121.
The system controller 110 determines, corresponding to the obtained current position information, whether the user is in a place where predetermined image control should be performed. When the determined result denotes that the current position is a place where predetermined image control should be performed, the flow advances from step F1702 to step F1703. In step F1703, the system controller 110 determines that an image control trigger has occurred that causes the predetermined image control to be performed.
When the determined result denotes that an image control trigger has occurred, the flow advances from step F104 to step F106 shown in Fig. 10. In step F106, the system controller 110 commands the image capturing control section 111 and the display control section 114 to perform the predetermined image control.
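The branch of steps F1701 to F1703 can be sketched as a simple lookup, as below. The place categories, the hard-coded coordinate cell, and the control strings are hypothetical stand-ins for the map-database query and the control examples described in this section.

```python
# Hypothetical mapping from place category to the image control to perform;
# the categories echo the examples given in the text.
PLACE_CONTROLS = {
    "sports ground": "raise frame rate",
    "concert hall":  "telephoto capture",
    "beach":         "raise ultraviolet sensitivity",
}

def lookup_place(lat: float, lon: float) -> str:
    """Stand-in for the map-database query; returns a place category."""
    # Hypothetical: a single hard-coded cell for illustration.
    return "beach" if (26.0 <= lat <= 27.0 and 127.0 <= lon <= 128.0) else "other"

def image_control_trigger(lat: float, lon: float):
    """Return the control to perform, or None when no trigger occurs (F1702)."""
    return PLACE_CONTROLS.get(lookup_place(lat, lon))
```

A None result corresponds to the case where step F1702 finds no place requiring predetermined image control.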
In this case, examples of the image control are as follows.
When the detected result denotes that the current position is a sports ground, a racing circuit, or the like, since the target the user sees (the capturing object) is a fast-moving person, car, or the like, the system controller 110 commands the image capturing control section 111 to increase the image capturing frame rate so that a fast-moving subject can be displayed properly.
When the current position is a concert hall, a music hall, an entertainment hall, a sports ground, or the like, the system controller 110 can command the image capturing control section 111 to perform a telephoto image capturing operation corresponding to the distance to the capturing target such as a stage. When the distance to the capturing target can be determined, the telephoto magnification can be set corresponding to the distance detected as the current position information. The distance to the capturing target may also be detected by the capturing target sensor 120 (a distance measuring sensor), and the telephoto magnification set corresponding to the detected distance. Instead of the telephoto operation, the system controller 110 may command the captured image signal processing section 115 or the display image processing section 112 to perform an image enlarging process.
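Setting the telephoto magnification from a measured distance can be sketched as below. The linear mapping, the reference distance, and the maximum zoom are assumptions; the text only states that the magnification is set corresponding to the detected distance.

```python
MAX_ZOOM = 10.0           # hypothetical optical zoom limit
REFERENCE_DISTANCE = 5.0  # metres at which no zoom is needed (assumed)

def telephoto_magnification(distance_m: float) -> float:
    """Return a zoom factor proportional to distance, clamped to [1, MAX_ZOOM]."""
    zoom = distance_m / REFERENCE_DISTANCE
    return max(1.0, min(MAX_ZOOM, zoom))
```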
When the current position is a beach or a mountain, the system controller 110 commands the image capturing control section 111 to perform an image capturing operation with increased ultraviolet sensitivity, so that the display section 102 shows an image as in Fig. 8B and the user can recognize the amount of ultraviolet irradiation.
In addition, a character image or text corresponding to the obtained current position information, such as the place name or the names of facilities, shops, and so forth being captured, can be superimposed on the display. Advertising information, facility guide information, and warnings for the surrounding area can also be displayed on the display section 102.
Fig. 28B shows an exemplary process for determining whether an image control trigger has occurred corresponding to information received by the GPS receiving section 121 and information received by the communication section 126. In particular, this exemplary process is performed while an image capturing operation with increased infrared sensitivity is being performed.
When the image capturing section 103 is performing an image capturing operation with increased infrared sensitivity, the flow advances from step F1710 to step F1711 shown in Fig. 28B.
In step F1711, the system controller 110 causes the communication section 126 to transmit latitude and longitude information of the current position, obtained by the GPS receiving section 121, to a network server or a device having a built-in map database, causes the server or device to retrieve information about the current position from the map database, and causes the communication section 126 to receive the retrieved information. When the image capturing and displaying apparatus 101 has a built-in map database, the system controller 110 retrieves the information about the current position from the built-in map database corresponding to the latitude and longitude information received by the GPS receiving section 121.
When the system controller 110 has obtained the information about the current position, it determines whether the image capturing operation with increased infrared sensitivity should be prohibited at the current position.
When the determined result denotes that the image capturing operation with increased infrared sensitivity should be prohibited at the current position, the flow advances from step F1712 to step F1713. In step F1713, the system controller 110 determines that an image control trigger has occurred that causes the image capturing operation with increased infrared sensitivity to be completed.
When the determined result in step F1713 denotes that an image control trigger has occurred, the flow advances from step F104 to step F106 shown in Fig. 10. In step F106, the system controller 110 commands the image capturing control section 111 to complete the image capturing operation with increased infrared sensitivity.
Since the image capturing operation with increased infrared sensitivity is prohibited corresponding to the position in this manner, improper use of a special image capturing function such as increased infrared sensitivity can be prevented.
Fig. 29A shows an exemplary process for determining whether an image control trigger has occurred corresponding to information obtained by the image analyzing section 128.
In step F1801 shown in Fig. 29A, the system controller 110 monitors information obtained by the image analyzing section 128. By analyzing the captured image, the image analyzing section 128 detects whether the subject includes a target.
When the analyzed result denotes that the captured image includes the target, the flow advances from step F1802 to step F1803. In step F1803, the system controller 110 determines that an image control trigger has occurred.
When the determined result in step F1803 denotes that an image control trigger has occurred, the flow advances from step F104 to step F106 shown in Fig. 10. In step F106, the system controller 110 commands the image capturing control section 111 and the display control section 114 to perform predetermined image control.
Examples of the image control are described below.
If the target is a bird, then when a bird is detected in the captured image, the system controller 110 can command the display image processing section 112 to perform a highlight display operation for the bird portion of the image, as shown in Fig. 22A and Fig. 22B. In this case, when the user is watching wild birds, he or she can easily find and follow them.
If the target is a kitten and the user likes cats, then when a kitten enters his or her field of view, since the kitten is highlighted in the displayed image, he or she can easily notice it.
If the target is a person, then when he or she is detected in the captured image, the system controller 110 can command the display image processing section 112, the captured image signal processing section 115, or the image capturing section 103 to highlight or enlarge the person portion of the image.
If the target is a person, an animal, a building, or the like, only the target may be displayed, with the scene around the target erased.
Alternatively, when a person as the target is detected, image processing that erases only the person from the image may be performed. For example, an image in which people and artificial objects such as cars are masked out of a natural scene can be displayed. In this case, an interpolation process that fills the pixel portion of the target with the pixels surrounding the target to be concealed may be performed.
In addition, an image effect operation such as a mosaic display operation may be performed for a target such as a person.
The process shown in Fig. 29A is performed corresponding to information obtained by the image analyzing section 128. Alternatively, if the target is a living body such as a person or an animal, then when the target has been detected by a pyroelectric sensor serving as the capturing target sensor 120, the system controller 110 can determine that an image control trigger has occurred.
Fig. 29B shows an exemplary process for determining whether an image control trigger has occurred corresponding to information obtained by the image analyzing section 128.
In step F1810 shown in Fig. 29B, the system controller 110 monitors information obtained by the image analyzing section 128. The image analyzing section 128 analyzes the captured image, for example using differences between frames of the captured image, to detect whether the subject is moving quickly.
When the detected result denotes that the subject is moving quickly, the flow advances from step F1811 to step F1812. In step F1812, the system controller 110 determines that an image control trigger has occurred that causes a flicker display operation to be performed.
When the determined result in step F1812 denotes that an image control trigger has occurred, the flow advances from step F104 to step F106 shown in Fig. 10. In step F106, the system controller 110 commands the display control section 114 to perform image processing so that images as shown in Fig. 9C are displayed.
In this process, the flicker display operation is performed so that, for example, when the user is watching a sports match and a player is moving quickly, he or she can see the player's motion.
In this example, a trigger that causes the flicker display operation to be performed occurs when quick motion is detected. Alternatively, a trigger that causes the display mode to be switched to a high frame rate display operation may occur. Alternatively, when the capturing target includes quick motion, a trigger that causes the display mode to be switched to a zoom display mode or a highlight display mode may occur.
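The inter-frame difference test used above to decide that the subject is moving quickly can be sketched as follows. The threshold value is an assumption; the text only states that differences between frames of the captured image are used.

```python
def mean_abs_diff(frame_a, frame_b) -> float:
    """Average per-pixel absolute difference between two grayscale frames."""
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b)) / len(frame_a)

def fast_motion_trigger(frame_a, frame_b, threshold: float = 30.0) -> bool:
    """True when the frame difference is large enough to fire the trigger (F1811)."""
    return mean_abs_diff(frame_a, frame_b) > threshold
```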
Fig. 30A shows an exemplary process for determining whether an image control trigger has occurred corresponding to information obtained by the image analyzing section 128. In this example, when a person has been captured, he or she is identified.
In step F1901 shown in Fig. 30A, the system controller 110 monitors information obtained by the image analyzing section 128. By analyzing the captured image, the image analyzing section 128 determines whether the subject includes a person's face. When the subject includes a person's face, the image analyzing section 128 generates personal feature data from the face image. As described above, the personal feature data are, for example, the ratio (Ed/EN) of the distance Ed between the eyes to the distance EN between the center of the eyes and the nose, and the ratio (Ed/EM) of the distance Ed between the eyes to the distance EM between the center of the eyes and the mouth.
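The personal feature data (Ed/EN, Ed/EM) can be computed from facial landmark coordinates as below. The landmark inputs are hypothetical; the ratios follow the definition given in the text.

```python
import math

def dist(p, q) -> float:
    """Euclidean distance between two 2-D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def personal_feature_data(left_eye, right_eye, nose, mouth):
    """Return (Ed/EN, Ed/EM): eye spacing relative to eye-nose and eye-mouth distance."""
    eye_center = ((left_eye[0] + right_eye[0]) / 2,
                  (left_eye[1] + right_eye[1]) / 2)
    ed = dist(left_eye, right_eye)  # distance Ed between the eyes
    en = dist(eye_center, nose)     # distance EN, eye center to nose
    em = dist(eye_center, mouth)    # distance EM, eye center to mouth
    return ed / en, ed / em
```

Because these ratios are scale-invariant, they can serve as a compact key for the personal database lookup described below.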
When the personal feature data have been extracted, the flow advances from step F1902 to step F1903. In step F1903, the system controller 110 retrieves personal information corresponding to the personal feature data from a personal database.
For example, the system controller 110 causes the communication section 126 to transmit the personal feature data to a network server or a device having a built-in personal database, causes the server or device to retrieve the personal information from the personal database, and causes the communication section 126 to receive the retrieved result. When the image capturing and displaying apparatus 101 has a built-in personal database, the system controller 110 can retrieve the personal information corresponding to the personal feature data from the built-in personal database.
When the external device or the system controller 110 has retrieved the personal information of a predetermined person from the personal database, the flow advances from step F1904 to step F1905. In step F1905, the system controller 110 determines that an image control trigger has occurred that causes the personal information to be displayed.
When the determined result denotes that an image control trigger has occurred, the flow advances from step F104 to step F106 shown in Fig. 10. In step F106, the system controller 110 commands the display control section 114 to, for example, superimpose the retrieved personal information on the captured image.
When the user sees, in a walking crowd, a person whom the user has met or a celebrity, and that person or celebrity is registered in the personal database, the display section 102 shows the information registered in the personal database (name, organization, the place where the user met him or her, and so forth) together with the person's image. Thus, the user can recognize the person accurately.
Fig. 30B shows an exemplary process for determining whether an image control trigger has occurred corresponding to information obtained by the image analyzing section 128. This process is performed when, as shown in Fig. 21A, the image is hard to see due to sunlight and shade.
In step F1910 shown in Fig. 30B, the system controller 110 monitors information obtained by the image analyzing section 128. By analyzing the captured image, the image analyzing section 128 detects whether bright areas and dark areas have occurred in the captured image due to sunshine conditions.
When the analyzed result denotes that there is a sunlight/shade difference in the image, the flow advances from step F1911 to step F1912. In step F1912, the system controller 110 determines that an image control trigger has occurred.
When the determined result in step F1912 denotes that an image control trigger has occurred, the flow advances from step F104 to step F106 shown in Fig. 10. In step F106, the system controller 110 commands the image capturing control section 111 and the display control section 114 to perform image processing or to partially change the image capturing sensitivity so that the sunlight/shade difference disappears. Thus, as shown in Fig. 21B, an image that is less affected by sunlight and shade and is easy to see is provided to the user.
When the image contains bright and dark portions due to the influence of lighting in a room or facility rather than sunshine conditions, or when part of the image is unclear, the system controller 110 can command the image capturing control section 111 and the display control section 114 to partially adjust the brightness, image capturing sensitivity, contrast, and so forth.
The exemplary processes for determining, in step F104 of Fig. 10, whether an image control trigger has occurred have been described with reference to Fig. 23 to Fig. 30A and Fig. 30B. These exemplary processes can also be applied to the process of determining whether a monitor display state start trigger has occurred in step F102 shown in Fig. 10, and to the process of determining whether a monitor display state completion trigger has occurred in step F105 shown in Fig. 10.
When the display state in step F101 shown in Fig. 10 is the through state, then, similarly to the exemplary process shown in Fig. 23, if dark or too-bright surroundings have been detected, it can be determined that a monitor display state start trigger has occurred, and the through state can be switched to the monitor display state.
Similarly to the exemplary process shown in Fig. 24A, when the determined result denotes that the captured image should be adjusted because of a weather condition, it can be determined that a monitor display state start trigger has occurred. In this case, the monitor display function can be executed under a predetermined weather condition.
Similarly to the exemplary process shown in Fig. 24B, when the determined result denotes that the surroundings are dark, it can be determined that a monitor display state start trigger has occurred. In this case, the monitor display function is automatically executed when the surroundings are dark.
Similarly to the exemplary process shown in Fig. 25, when the capturing target is far away or nearby, it can be determined that a monitor display state start trigger has occurred.
Similarly to the exemplary process shown in Fig. 26, when an image containing characters is detected near the user, it can be determined that a monitor display state start trigger has occurred.
Similarly to the exemplary process shown in Fig. 27, it can be determined that a monitor display state start trigger corresponding to the time zone has occurred.
Similarly to the exemplary process shown in Fig. 28A, when the current position is a predetermined area, it can be determined that a monitor display state start trigger has occurred. In this case, the monitor display function can be executed corresponding to the predetermined area or facility type.
Similarly to the exemplary process shown in Fig. 29A, when there is a predetermined target, it can be determined that a monitor display state start trigger has occurred.
Similarly to the exemplary process shown in Fig. 29B, when quick motion is detected in the capturing target, it can be determined that a monitor display state start trigger has occurred.
Similarly to the exemplary process shown in Fig. 30A, when a predetermined person has been detected, it can be determined that a monitor display state start trigger has occurred.
Similarly to the exemplary process shown in Fig. 30B, when a bright/dark distribution exists in the image, it can be determined that a monitor display state start trigger has occurred.
In these exemplary processes, when it is determined that a monitor display state start trigger has occurred, the flow advances to step F103 shown in Fig. 10. Thus, when the user wears the image capturing and displaying apparatus 101 in the through state, even if he or she performs no special operation, the apparatus operates in a monitor display state corresponding to the situation, and the user can see images in the monitor display state corresponding to that situation.
Likewise, whether a monitor display state completion trigger has occurred can be determined.
In the exemplary process shown in Fig. 23, when dark or too-bright surroundings have been detected, it can be determined, in a situation where the surroundings are neither dark nor too bright, that a monitor display state completion trigger has occurred, and the display state can be restored to the through state.
In the exemplary process shown in Fig. 24A, it is determined whether the captured image should be adjusted because of a weather condition. When the determined result denotes that the captured image no longer needs to be adjusted, it can be determined that a monitor display state completion trigger has occurred, and the display state can be restored to the through state.
Similarly to the exemplary process shown in Fig. 24B, whether the surroundings are dark can be determined. When the surroundings are not dark, it can be determined that a monitor display state completion trigger has occurred, and the display state can be restored to the through state.
Similarly to the exemplary process shown in Fig. 25, it can be determined whether the capturing target is far away or nearby. When the determined result denotes that the capturing target is neither far away nor nearby, it can be determined that a monitor display state completion trigger has occurred, and the display state can be restored to the through state.
Similarly to the exemplary process shown in Fig. 26, when an image containing characters has been detected near the user, then in a situation where such an image is no longer detected, it can be determined that a monitor display state completion trigger has occurred, and the display state can be restored to the through state.
Similarly to the exemplary process shown in Fig. 27, it can be determined that a monitor display state completion trigger has occurred corresponding to the time zone, the month and/or date, the season, and so forth.
Similarly to the exemplary process shown in Fig. 28A, when the current position is a predetermined position, it can be determined that a monitor display state completion trigger has occurred. In this case, the image capturing function and the monitor display function can be stopped corresponding to the predetermined position or facility type.
Similarly to the exemplary process shown in Fig. 28B, when the image capturing operation with increased infrared sensitivity is stopped, it can be determined in step F1713 that a monitor display state completion trigger has occurred, and the display state can be restored to the through state.
Similarly to the exemplary process shown in Fig. 29A, when there is a predetermined subject, it can be determined that a monitor display state completion trigger has occurred, and the display state can be restored to the through state. For example, in this case, capturing and/or displaying the predetermined subject in the monitor display state is prohibited.
Alternatively, when the determined result denotes that the predetermined subject is no longer present, it can be determined that a monitor display state completion trigger has occurred, and the display state can be restored to the through state.
Similarly to the exemplary process shown in Fig. 29B, when quick motion of the capturing target has been detected, then in a situation where quick motion is no longer detected, it can be determined that a monitor display state completion trigger has occurred, and the display state can be restored to the through state.
Similarly to the exemplary process shown in Fig. 30A, when a predetermined person has been detected, it can be determined that a monitor display state completion trigger has occurred, and the display state can be restored to the through state. In this case, capturing and/or displaying the predetermined person in the monitor display state is prohibited.
Alternatively, when it is determined that the predetermined person is not in the image, it can be determined that a monitor display state completion trigger has occurred, and the display state can be restored to the through state.
Similarly to the exemplary process shown in Fig. 30B, when a bright/dark distribution has been detected in the image, then in a situation where the bright/dark difference is no longer detected, it can be determined that a monitor display state completion trigger has occurred, and the display state can be restored to the through state.
In these exemplary processes, when it is determined that a monitor display state completion trigger has occurred and the flow returns to step F101 shown in Fig. 10, the display state can automatically switch to the through state in situations where the user's need for the monitor display state has decreased or disappeared, or where the monitor display function is prohibited.
[6. Effects, modifications, and extensions of the second embodiment]
According to this embodiment, an image captured by the image capturing section 103 disposed in an eyeglass-type mounting unit or a helmet-type mounting unit, that is, an image captured in the direction of the user's eyes as the direction of the subject, is displayed on the display section 102 in front of his or her eyes. In this case, the image capturing operation or the display operation is controlled corresponding to information about external circumstances such as surrounding brightness, weather, the situation, recognition, and motion of the subject, position, and date and time. Thus, a situation in which the user's visual ability is substantially assisted or extended can be created.
Since the image capturing operation of the image capturing section 103 and the display mode changes achieved by the signal processing of the captured image signal processing section 115 and the display image processing section 112 are performed corresponding to external circumstances, no operation burden is imposed on the user. In addition, since the image capturing and displaying apparatus 101 is appropriately controlled, the user can use it easily.
In addition, since the display section 102 can be made transparent or translucent by controlling its transmissivity, the mounting unit does not disturb the user's normal life while he or she wears it. Thus, the benefits of the image capturing and displaying apparatus 101 according to this embodiment can be used effectively in the user's normal life.
In this embodiment, the image capturing operation of the image capturing section 103 and the display modes achieved by the signal processing of the captured image signal processing section 115 and the display image processing section 112 have mainly been described. In addition, for example, the switching among power-on, power-off, and standby states, and the volume and tone of the sound output from the audio output section 105, may be controlled corresponding to external circumstances. For example, the volume may be adjusted corresponding to the time and/or place. Alternatively, the surrounding volume may be detected and the output volume of the speaker adjusted corresponding to the detected surrounding volume.
The appearance and structure of the image capturing and displaying apparatus 101 are not limited to those shown in Fig. 1, Fig. 2, and Fig. 20. Instead, various modifications may be made.
For example, a storage section that stores the image signal captured by the image capturing section 103, and a transmission section that transmits the image signal to other devices, may be disposed in the image capturing and displaying apparatus 101.
In addition to the image capturing section 103, an input section and a receiving section that input an image from an external device as a source of images displayed on the display section 102 may be disposed in the image capturing and displaying apparatus 101.
In this embodiment, examples in which the image capturing and displaying apparatus 101 is an eyeglass-type mounting unit or a helmet-type mounting unit have been described. However, as long as the image capturing and displaying apparatus captures an image in the direction of the user's eyes and displays an image in front of his or her eyes, the apparatus may be of any type that the user can wear, such as a headphone type, a neckband type, an ear-hook type, and so forth. Alternatively, the image capturing and displaying apparatus 101 may be a unit attached to eyeglasses, a visor, headphones, or the like with a mounting member such as a clip.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations, and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.

Claims (27)

1. image acquisition and display device comprise:
Image acquiring device is used for obtaining image like this so that make the user see that the direction of subject is the direction of said subject;
Display unit is arranged in the front of eyes of user, and is used to show the image that is obtained by image acquiring device;
User profile obtains device, is used to obtain relevant user's the motion and the information of physical condition; And
Control device is used for controlling the operation of image acquiring device or the operation of display unit accordingly with the information that is obtained the device acquisition by user profile,
Wherein said display unit can be with show state from switching to image display status or conversely show state being switched to through state from image display status through state; Wherein said is transparent or translucent through state; And in said image display status; The image that demonstration is obtained by said image acquiring device, and
Wherein said control device is controlled the transmissivity of said display unit, so that show state is perhaps switched to show state through state from image display status from switch to image display status through state conversely.
2. image acquisition as claimed in claim 1 and display device,
Wherein said control device is confirmed and is hoped or user situation by the user profile acquisition corresponding user of information that device obtained, and controls the operation of said image acquiring device or the operation of said display unit accordingly with definite result.
3. The image capturing and displaying apparatus as claimed in claim 1,
wherein the user information obtaining means is a sensor that detects acceleration, angular velocity, or vibration.
4. The image capturing and displaying apparatus as claimed in claim 1,
wherein the user information obtaining means is a sensor that detects a motion of the user's head, a motion of the user's arm, a motion of the user's hand, a motion of the user's leg, or a motion of the user's whole body.
5. The image capturing and displaying apparatus as claimed in claim 1,
wherein the user information obtaining means is a sensor that detects a non-walking state, a walking state, and a running state of the user.
6. The image capturing and displaying apparatus as claimed in claim 1,
wherein the user information obtaining means is a visual sensor that detects visual information of the user.
7. The image capturing and displaying apparatus as claimed in claim 1,
wherein the user information obtaining means is a sensor that detects the user's gaze direction, the user's focal distance, the user's pupil dilation, the user's fundus pattern, or the user's eyelid movement as the user's visual information.
8. The image capturing and displaying apparatus as claimed in claim 1,
wherein the user information obtaining means is a biological sensor that detects biological information of the user.
9. The image capturing and displaying apparatus as claimed in claim 1,
wherein the user information obtaining means is a sensor that detects heart rate information, pulse information, perspiration information, brain wave information, galvanic skin response information, blood pressure information, body temperature information, or respiratory activity information as the biological information of the user.
10. The image capturing and displaying apparatus as claimed in claim 1,
wherein the user information obtaining means is a biological sensor that detects information representing a tense state or an excited state of the user.
11. The image capturing and displaying apparatus as claimed in claim 1,
wherein the user information obtaining means is formed as an input section capable of inputting at least visual field information.
12. The image capturing and displaying apparatus as claimed in claim 1,
wherein the control means controls the image capturing means to start and stop an image capturing operation.
13. The image capturing and displaying apparatus as claimed in claim 1,
wherein the control means variably controls the image capturing means to perform an image capturing operation ranging from a telephoto image capturing operation to a wide-angle image capturing operation.
14. The image capturing and displaying apparatus as claimed in claim 1,
wherein the control means controls a focal distance of the image capturing means.
15. The image capturing and displaying apparatus as claimed in claim 1,
wherein the control means variably controls an image capturing sensitivity of the image capturing means.
16. The image capturing and displaying apparatus as claimed in claim 1,
wherein the control means variably controls an infrared image capturing sensitivity of the image capturing means.
17. The image capturing and displaying apparatus as claimed in claim 1,
wherein the control means variably controls an ultraviolet image capturing sensitivity of the image capturing means.
18. The image capturing and displaying apparatus as claimed in claim 1,
wherein the control means variably controls a frame rate of the image capturing means.
19. The image capturing and displaying apparatus as claimed in claim 1,
wherein the control means controls an operation of an image capturing lens system in the image capturing means.
20. The image capturing and displaying apparatus as claimed in claim 1,
wherein the control means controls an operation of an image capturing signal processing section that processes an image signal captured by an image sensor of the image capturing means.
21. The image capturing and displaying apparatus as claimed in claim 1,
wherein the control means controls the display means to enlarge or reduce the image displayed thereon.
22. The image capturing and displaying apparatus as claimed in claim 1,
wherein the control means controls the display means to divide the image displayed thereon.
23. The image capturing and displaying apparatus as claimed in claim 1,
wherein the control means controls a display brightness of the image displayed on the display means.
24. The image capturing and displaying apparatus as claimed in claim 1,
wherein the control means controls signal processing of an image signal displayed on the display means.
25. The image capturing and displaying apparatus as claimed in claim 1, further comprising:
an illuminating means for illuminating the subject in the direction of the subject,
wherein the control means controls the illuminating means to perform an illuminating operation corresponding to the information obtained by the user information obtaining means.
26. An image capturing and displaying method for an image capturing and displaying apparatus which has: an image capturing means for capturing an image such that a direction in which a user sees a subject is a direction of the subject; and a display means, disposed in front of the user, for displaying the image captured by the image capturing means, wherein the display means is switchable from a through-state to an image display state or, conversely, from the image display state to the through-state, the through-state being transparent or translucent, and the image captured by the image capturing means being displayed in the image display state, the method comprising the steps of:
obtaining information about a motion or a physical situation of the user; and
controlling an operation of the image capturing means or an operation of the display means corresponding to the information obtained in the obtaining step,
wherein controlling the operation of the image capturing means comprises: controlling the transmissivity of the display means so as to switch the display state from the through-state to the image display state or, conversely, from the image display state to the through-state.
27. An image capturing and displaying apparatus, comprising:
an image capturing section that captures an image such that a direction in which a user sees a subject is a direction of the subject;
a display section that is disposed in front of the eyes of the user and displays the image captured by the image capturing section;
a user information obtaining section that obtains information about a motion and a physical situation of the user; and
a control section that controls an operation of the image capturing section or an operation of the display section corresponding to the information obtained by the user information obtaining section,
wherein the display section is switchable from a through-state to an image display state or, conversely, from the image display state to the through-state, the through-state being transparent or translucent, and the image captured by the image capturing section being displayed in the image display state, and
wherein the control section controls the transmissivity of the display section so as to switch the display state from the through-state to the image display state or, conversely, from the image display state to the through-state.
CN 200710153620 2006-09-08 2007-09-07 Image capturing and displaying apparatus and image capturing and displaying method Expired - Fee Related CN101141567B (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2006-244685 2006-09-08
JP2006244685 2006-09-08
JP2006244685A JP4961914B2 (en) 2006-09-08 2006-09-08 Imaging display device and imaging display method
JP2006261975A JP2008083289A (en) 2006-09-27 2006-09-27 Imaging display apparatus, and imaging display method
JP2006-261975 2006-09-27
JP2006261975 2006-09-27

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN2009101268115A Division CN101520690B (en) 2006-09-08 2007-09-07 Image capturing and displaying apparatus and image capturing and displaying method

Publications (2)

Publication Number Publication Date
CN101141567A CN101141567A (en) 2008-03-12
CN101141567B true CN101141567B (en) 2012-12-05

Family

ID=39193289

Family Applications (2)

Application Number Title Priority Date Filing Date
CN 200710153620 Expired - Fee Related CN101141567B (en) 2006-09-08 2007-09-07 Image capturing and displaying apparatus and image capturing and displaying method
CN2009101268115A Expired - Fee Related CN101520690B (en) 2006-09-08 2007-09-07 Image capturing and displaying apparatus and image capturing and displaying method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN2009101268115A Expired - Fee Related CN101520690B (en) 2006-09-08 2007-09-07 Image capturing and displaying apparatus and image capturing and displaying method

Country Status (2)

Country Link
JP (1) JP4961914B2 (en)
CN (2) CN101141567B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103970258A (en) * 2013-01-28 2014-08-06 联想(北京)有限公司 Wearable electronic equipment and display method
CN108536284A (en) * 2018-03-14 2018-09-14 广东欧珀移动通信有限公司 Image display method and relevant device

Families Citing this family (92)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4730569B2 (en) 2009-03-27 2011-07-20 カシオ計算機株式会社 Imaging apparatus, imaging method, and program
KR100957575B1 (en) * 2009-10-01 2010-05-11 (주)올라웍스 Method, terminal and computer-readable recording medium for performing visual search based on movement or pose of terminal
JP5494153B2 (en) 2010-04-08 2014-05-14 ソニー株式会社 Image display method for head mounted display
CN102404494B (en) * 2010-09-08 2015-03-04 联想(北京)有限公司 Electronic equipment and method for acquiring image in determined area
EP2499960B1 (en) * 2011-03-18 2015-04-22 SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH Method for determining at least one parameter of two eyes by setting data rates and optical measuring device
JP5118266B2 (en) * 2011-03-25 2013-01-16 パナソニック株式会社 Display device
TWI425498B (en) * 2011-05-04 2014-02-01 Au Optronics Corp Video-audio playing system relating to 2-views application and method thereof
CN103033936A (en) * 2011-08-30 2013-04-10 微软公司 Head mounted display with iris scan profiling
JP2014531662A (en) * 2011-09-19 2014-11-27 アイサイト モバイル テクノロジーズ リミテッド Touch-free interface for augmented reality systems
DK2590008T3 (en) * 2011-11-03 2016-06-27 Fortuna Urbis S R L Glasses
US20130137076A1 (en) * 2011-11-30 2013-05-30 Kathryn Stone Perez Head-mounted display based education and instruction
CN103258107A (en) * 2012-02-17 2013-08-21 普天信息技术研究院有限公司 Monitoring method and assistant monitoring system
CN108595009B (en) * 2012-02-29 2020-12-18 联想(北京)有限公司 Man-machine interaction control method and electronic terminal
JP5938977B2 (en) * 2012-03-23 2016-06-22 ソニー株式会社 Head mounted display and surgical system
CN103369212B (en) * 2012-03-28 2018-06-05 联想(北京)有限公司 A kind of image-pickup method and equipment
WO2013169237A1 (en) * 2012-05-09 2013-11-14 Intel Corporation Eye tracking based selective accentuation of portions of a display
US9600931B2 (en) 2012-07-25 2017-03-21 Sony Corporation Information processing device and program
CN103576315B (en) 2012-07-30 2017-03-01 联想(北京)有限公司 Display device
CN103595984A (en) * 2012-08-13 2014-02-19 辉达公司 3D glasses, a 3D display system, and a 3D display method
CN103677704B (en) * 2012-09-20 2018-11-09 联想(北京)有限公司 Display device and display methods
CN103713387A (en) * 2012-09-29 2014-04-09 联想(北京)有限公司 Electronic device and acquisition method
KR20140066848A (en) * 2012-11-22 2014-06-02 경북대학교 산학협력단 Face detecting device and method for detect of face
CN103873998B (en) * 2012-12-17 2018-07-03 联想(北京)有限公司 Electronic equipment and sound collection method
EP2940985A4 (en) * 2012-12-26 2016-08-17 Sony Corp Image processing device, and image processing method and program
JP6264855B2 (en) * 2013-11-18 2018-01-24 セイコーエプソン株式会社 Head-mounted display device and method for controlling head-mounted display device
RU2621488C2 (en) 2013-02-14 2017-06-06 Сейко Эпсон Корпорейшн Display fixed on head and method of controlling display fixed on head
JP6299067B2 (en) * 2013-02-14 2018-03-28 セイコーエプソン株式会社 Head-mounted display device and method for controlling head-mounted display device
JP5273323B1 (en) * 2013-03-13 2013-08-28 パナソニック株式会社 Head mounted display device
US9661221B2 (en) * 2013-03-15 2017-05-23 Qualcomm Incorporated Always-on camera sampling strategies
JP5967597B2 (en) * 2013-06-19 2016-08-10 パナソニックIpマネジメント株式会社 Image display device and image display method
KR102083596B1 (en) 2013-09-05 2020-03-02 엘지전자 주식회사 Display device and operation method thereof
CN103501406B (en) * 2013-09-16 2017-04-12 北京智谷睿拓技术服务有限公司 Image collecting system and image collecting method
CN103499885B (en) * 2013-09-30 2014-10-08 北京智谷睿拓技术服务有限公司 Imaging device and method
CN103499886B (en) * 2013-09-30 2015-07-08 北京智谷睿拓技术服务有限公司 Imaging device and method
JP6529491B2 (en) * 2013-10-14 2019-06-12 サード ドット アーベー Operating method of wearable life log device
CN103558971A (en) * 2013-10-29 2014-02-05 小米科技有限责任公司 Browsing method, browsing device and terminal device
CN103593051B (en) * 2013-11-11 2017-02-15 百度在线网络技术(北京)有限公司 Head-mounted type display equipment
JP5751315B2 (en) * 2013-11-20 2015-07-22 ソニー株式会社 Image display method for head mounted display
CN104238120A (en) * 2013-12-04 2014-12-24 全蕊 Smart glasses and control method
WO2015083316A1 (en) * 2013-12-05 2015-06-11 ソニー株式会社 Display device
WO2015098253A1 (en) * 2013-12-26 2015-07-02 株式会社ニコン Electronic device
CN104850217A (en) * 2014-02-19 2015-08-19 联想(北京)有限公司 Human eye movement monitoring device, method and equipment
CN103823563B (en) * 2014-02-28 2016-11-09 北京云视智通科技有限公司 A kind of head-wearing type intelligent display device
JP2015192697A (en) * 2014-03-31 2015-11-05 ソニー株式会社 Control device and control method, and photographing control system
CN103976733A (en) * 2014-05-21 2014-08-13 蓝江涌 Multi-passage brain wave control glasses
CN104092935B (en) * 2014-06-05 2018-06-26 西安中兴新软件有限责任公司 A kind for the treatment of method and apparatus of image taking
KR102184272B1 (en) * 2014-06-25 2020-11-30 엘지전자 주식회사 Glass type terminal and control method thereof
CN105511750B (en) * 2014-09-26 2020-01-31 联想(北京)有限公司 switching method and electronic equipment
CN104281266B (en) * 2014-10-17 2017-10-27 深圳鼎界科技有限公司 head-mounted display apparatus
CN104360737A (en) * 2014-11-05 2015-02-18 深圳市中兴移动通信有限公司 Method and device for adjusting color temperature of screen
CN105654894B (en) * 2014-11-12 2018-01-12 西安诺瓦电子科技有限公司 LED bright chroma bearing calibrations
US20160140390A1 (en) * 2014-11-13 2016-05-19 Intel Corporation Liveness detection using progressive eyelid tracking
JP6405991B2 (en) * 2014-12-24 2018-10-17 セイコーエプソン株式会社 Electronic device, display device, and control method of electronic device
WO2016139976A1 (en) * 2015-03-02 2016-09-09 ソニー株式会社 Information processing apparatus, information processing method, and program
JP6459684B2 (en) * 2015-03-23 2019-01-30 カシオ計算機株式会社 Information output device, information output method, and program
CN105007424A (en) * 2015-07-22 2015-10-28 深圳市万姓宗祠网络科技股份有限公司 Automatic focusing system, method and wearable device based on eye tracking
EP3333808B1 (en) * 2015-08-06 2021-10-27 Sony Interactive Entertainment Inc. Information processing device
CN105137601B (en) * 2015-10-16 2017-11-14 上海斐讯数据通信技术有限公司 A kind of intelligent glasses
CN108345384A (en) * 2015-11-03 2018-07-31 安溪县智睿电子商务有限公司 A kind of Intelligent control device
JP2017102618A (en) * 2015-11-30 2017-06-08 株式会社ニコン Display device and display program
WO2017094800A1 (en) * 2015-11-30 2017-06-08 株式会社ニコン Display device, display program, and display method
JP6798106B2 (en) * 2015-12-28 2020-12-09 ソニー株式会社 Information processing equipment, information processing methods, and programs
CN105607256A (en) * 2016-01-04 2016-05-25 深圳市华星光电技术有限公司 Intelligent wearable device
US10908694B2 (en) 2016-02-01 2021-02-02 Microsoft Technology Licensing, Llc Object motion tracking with remote device
CN105676458A (en) * 2016-04-12 2016-06-15 王鹏 Wearable calculation device and control method thereof, and wearable equipment with wearable calculation device
CN107402698A (en) * 2016-05-19 2017-11-28 杨冬源 A kind of image display method
JP2017216667A (en) * 2016-05-31 2017-12-07 フォーブ インコーポレーテッド Image provision system
JP6647150B2 (en) * 2016-06-15 2020-02-14 株式会社Nttドコモ Information display device
JP6685397B2 (en) * 2016-07-12 2020-04-22 三菱電機株式会社 Equipment control system
JP7016211B2 (en) * 2016-08-05 2022-02-04 株式会社コーエーテクモゲームス Production processing program and information processing equipment
JP7148501B2 (en) 2016-09-22 2022-10-05 マジック リープ, インコーポレイテッド Augmented reality spectroscopy
CN106445164A (en) * 2016-10-18 2017-02-22 北京小米移动软件有限公司 Adjusting method and device of intelligent glasses
JP6919222B2 (en) * 2017-02-27 2021-08-18 セイコーエプソン株式会社 Display device and control method of display device
CN106951316B (en) * 2017-03-20 2021-07-09 北京安云世纪科技有限公司 Virtual mode and real mode switching method and device and virtual reality equipment
CN108496107A (en) * 2017-03-28 2018-09-04 深圳市柔宇科技有限公司 Head-mounted display apparatus and its display changeover method
JP7008424B2 (en) 2017-04-10 2022-01-25 株式会社ジャパンディスプレイ Display device
WO2019087996A1 (en) * 2017-10-30 2019-05-09 ピクシーダストテクノロジーズ株式会社 Retinal projection device and retinal projection system
CN108234980A (en) * 2017-12-28 2018-06-29 北京小米移动软件有限公司 Image processing method, device and storage medium
CN108108022B (en) * 2018-01-02 2021-05-18 联想(北京)有限公司 Control method and auxiliary imaging device
US10859830B2 (en) * 2018-01-31 2020-12-08 Sony Interactive Entertainment LLC Image adjustment for an eye tracking system
CN108391049A (en) * 2018-02-11 2018-08-10 广东欧珀移动通信有限公司 Filming control method and relevant device
WO2019183399A1 (en) 2018-03-21 2019-09-26 Magic Leap, Inc. Augmented reality system and method for spectroscopic analysis
JP2020016869A (en) * 2018-07-27 2020-01-30 伸也 佐藤 Digital telescopic eyeglasses
CN110109256A (en) * 2019-06-24 2019-08-09 京东方科技集团股份有限公司 Glasses and its control method
CN110688005A (en) * 2019-09-11 2020-01-14 塔普翊海(上海)智能科技有限公司 Mixed reality teaching environment, teacher and teaching aid interaction system and interaction method
CN110625625A (en) * 2019-09-18 2019-12-31 天津工业大学 Music robot based on electroencephalogram control
CN112684603B (en) * 2019-10-17 2023-03-28 杭州海康威视数字技术股份有限公司 Intelligent glasses
JP2021089351A (en) * 2019-12-03 2021-06-10 キヤノン株式会社 Head-mounted system and information processing apparatus
JP7170277B2 (en) * 2019-12-09 2022-11-14 株式会社辰巳菱機 Reporting device
CA3189540A1 (en) 2020-07-16 2022-01-20 Invacare Corporation System and method for concentrating gas
CA3189534A1 (en) 2020-07-16 2022-01-20 Invacare Corporation System and method for concentrating gas
CN113960788B (en) * 2020-07-17 2023-11-14 宇龙计算机通信科技(深圳)有限公司 Image display method, device, AR glasses and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5623703A (en) * 1990-10-12 1997-04-22 Nikon Corporation Camera capable of detecting eye-gaze
US6091546A (en) * 1997-10-30 2000-07-18 The Microoptical Corporation Eyeglass interface system
EP1593964A1 (en) * 2004-05-07 2005-11-09 Sony Corporation Biological sensor device and content playback method and apparatus
CN1705961A (en) * 2003-07-01 2005-12-07 松下电器产业株式会社 Eye imaging device
CN101141568A (en) * 2006-09-08 2008-03-12 索尼株式会社 Image pickup apparatus and image pickup method

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0923451A (en) * 1995-07-05 1997-01-21 Sanyo Electric Co Ltd Sensitivity response controller
JPH09211382A (en) * 1996-02-07 1997-08-15 Canon Inc Optical device
JP3877366B2 (en) * 1997-01-20 2007-02-07 本田技研工業株式会社 Head mounted display device for vehicle
JPH11164186A (en) * 1997-11-27 1999-06-18 Fuji Photo Film Co Ltd Image recorder
JP2003011722A (en) * 2001-06-29 2003-01-15 Toyota Motor Corp Nighttime running support device
JP4182730B2 (en) * 2002-11-19 2008-11-19 ソニー株式会社 Imaging apparatus and method
JP4239738B2 (en) * 2003-07-22 2009-03-18 ソニー株式会社 Imaging device
JP2005078045A (en) * 2003-09-04 2005-03-24 Pentax Corp Optical equipment for observation with image display function
JP3968522B2 (en) * 2003-10-06 2007-08-29 ソニー株式会社 Recording apparatus and recording method
JP3979394B2 (en) * 2004-02-19 2007-09-19 松下電器産業株式会社 Imaging device
JP2006129288A (en) * 2004-10-29 2006-05-18 Konica Minolta Photo Imaging Inc Video display device
JP2006148541A (en) * 2004-11-19 2006-06-08 Denso Corp Navigation device and program
JP2006208997A (en) * 2005-01-31 2006-08-10 Konica Minolta Photo Imaging Inc Video display device and video display system
JP4378636B2 (en) * 2005-02-28 2009-12-09 ソニー株式会社 Information processing system, information processing apparatus, information processing method, program, and recording medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5623703A (en) * 1990-10-12 1997-04-22 Nikon Corporation Camera capable of detecting eye-gaze
US6091546A (en) * 1997-10-30 2000-07-18 The Microoptical Corporation Eyeglass interface system
CN1705961A (en) * 2003-07-01 2005-12-07 松下电器产业株式会社 Eye imaging device
EP1593964A1 (en) * 2004-05-07 2005-11-09 Sony Corporation Biological sensor device and content playback method and apparatus
CN101141568A (en) * 2006-09-08 2008-03-12 索尼株式会社 Image pickup apparatus and image pickup method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JP特开2006-227236A 2006.08.31

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103970258A (en) * 2013-01-28 2014-08-06 联想(北京)有限公司 Wearable electronic equipment and display method
CN103970258B (en) * 2013-01-28 2018-08-07 联想(北京)有限公司 Wearable electronic equipment and display methods
CN108536284A (en) * 2018-03-14 2018-09-14 广东欧珀移动通信有限公司 Image display method and relevant device

Also Published As

Publication number Publication date
CN101141567A (en) 2008-03-12
CN101520690B (en) 2011-07-06
CN101520690A (en) 2009-09-02
JP2008067218A (en) 2008-03-21
JP4961914B2 (en) 2012-06-27

Similar Documents

Publication Publication Date Title
CN101141567B (en) Image capturing and displaying apparatus and image capturing and displaying method
US7855743B2 (en) Image capturing and displaying apparatus and image capturing and displaying method
US9846304B2 (en) Display method and display apparatus in which a part of a screen area is in a through-state
CN101512632B (en) Display apparatus and display method
CN101165538B (en) Imaging display apparatus and method
CN101155258A (en) Imaging apparatus and imaging method
JP5664677B2 (en) Imaging display device and imaging display method
JP2013083994A (en) Display unit and display method
JP2015046885A (en) Display device and display method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20080314

Address after: Tokyo, Japan

Applicant after: Sony Corp.

Address before: Tokyo

Applicant before: Sony Corp.

Co-applicant before: Sony Computer Entertainment Inc.

C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20121205

Termination date: 20210907

CF01 Termination of patent right due to non-payment of annual fee