CN104636051A - User interface unlocking method and electronic equipment - Google Patents

User interface unlocking method and electronic equipment

Info

Publication number
CN104636051A
Authority
CN
China
Prior art keywords
eye
user
eye motion
motion information
action sequence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310567222.7A
Other languages
Chinese (zh)
Other versions
CN104636051B (en)
Inventor
王健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201310567222.7A priority Critical patent/CN104636051B/en
Publication of CN104636051A publication Critical patent/CN104636051A/en
Application granted granted Critical
Publication of CN104636051B publication Critical patent/CN104636051B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/316 User authentication by observing the pattern of computer usage, e.g. typical user behaviour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints

Abstract

The invention discloses a user interface unlocking method comprising the following steps: when an electronic device is in a user-interface-locked state, eye motion information of a user is extracted from images acquired by an image acquisition component of the electronic device; the user is authenticated according to reference eye motion information and the eye motion information of the user; if the user is an authorized user, the electronic device is switched to a user-interface-unlocked state; if the user is an unauthorized user, the electronic device is kept in the user-interface-locked state. The invention also discloses a corresponding electronic device.

Description

User interface unlocking method and electronic device
Technical field
The present invention relates to the field of electronic technology, and in particular to a user interface unlocking method and an electronic device.
Background
At present, many electronic devices (e.g. smartphones and tablet computers) are equipped with a touchscreen. When the touchscreen acts as a display device, it presents a user interface to the user; when it acts as an input device, it detects the user's touch operations on the user interface and responds to them, thereby enabling human-machine interaction between the user and the electronic device.
On electronic devices with touchscreens, in order to avoid user misoperation (e.g. a finger accidentally touching the screen and unintentionally activating or deactivating some function), or for information-security reasons, a user interface locking function (also known as screen locking) and a user interface unlocking function are usually provided. In existing user interface unlocking methods, the user performs a sliding operation along a predetermined trajectory on the touchscreen, or presses a group of predetermined buttons, or enters a code or password through the touchscreen (simultaneously or in sequence).
All current user interface unlocking methods require the user to perform touch operations on the touchscreen, which can cause physical damage to the touchscreen (e.g. scratches on its surface).
Summary of the invention
Embodiments of the present application provide a user interface unlocking method and an electronic device, which solve the technical problem that existing user interface unlocking methods cause physical damage to the touchscreen because they require the user to perform touch operations on it.
In a first aspect, a user interface unlocking method is provided, comprising:
When an electronic device is in a user-interface-locked state, extracting eye motion information of a user from images acquired by an image acquisition component of the electronic device;
Authenticating the user according to reference eye motion information and the eye motion information of the user;
If the user is an authorized user, switching the electronic device to a user-interface-unlocked state;
If the user is an unauthorized user, keeping the electronic device in the user-interface-locked state.
With reference to the first aspect, in a first possible implementation of the first aspect, authenticating the user according to the reference eye motion information and the eye motion information of the user comprises:
If the eye motion information of the user matches the reference eye motion information, determining that the user is an authorized user;
If the eye motion information of the user does not match the reference eye motion information, determining that the user is an unauthorized user.
With reference to the first possible implementation of the first aspect, in a second possible implementation of the first aspect, the eye motion information of the user comprises a first eye action sequence containing at least two eye actions, and the reference eye motion information comprises a second eye action sequence containing at least two eye actions; in this case, the eye motion information of the user matching the reference eye motion information comprises:
The ratio of the number of positions at which the eye action in the first eye action sequence differs from the eye action at the same position in the second eye action sequence to the total number of eye actions in the second eye action sequence being lower than a set threshold.
With reference to the second possible implementation of the first aspect, in a third possible implementation of the first aspect, the eye motion information of the user not matching the reference eye motion information comprises:
The ratio of the number of positions at which the eye action in the first eye action sequence differs from the eye action at the same position in the second eye action sequence to the total number of eye actions in the second eye action sequence being higher than the set threshold.
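For illustration, the threshold-based matching criterion of the second and third implementations can be sketched in a few lines (a non-authoritative sketch; the function names, action labels, and the default 30% threshold are assumptions for demonstration, not part of the patent):

```python
def mismatch_ratio(user_seq, reference_seq):
    """Fraction of positions where the user's eye action differs from the
    reference action at the same position, relative to the reference length."""
    mismatches = sum(1 for u, r in zip(user_seq, reference_seq) if u != r)
    return mismatches / len(reference_seq)

def sequences_match(user_seq, reference_seq, threshold=0.3):
    """Match if the position-wise mismatch ratio is below the set threshold."""
    return mismatch_ratio(user_seq, reference_seq) < threshold
```

For example, with a reference sequence of four actions, one differing position gives a ratio of 25%, which would still count as a match under a 30% threshold.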
With reference to the first possible implementation of the first aspect, in a fourth possible implementation of the first aspect, the eye motion information of the user comprises a first eye action sequence containing at least two eye actions, and the reference eye motion information comprises a second eye action sequence containing at least two eye actions; in this case, the eye motion information of the user matching the reference eye motion information comprises:
The first eye action sequence containing every eye action in the second eye action sequence, and the eye actions at the same positions in the first and second eye action sequences being identical.
With reference to the fourth possible implementation of the first aspect, in a fifth possible implementation of the first aspect, the eye motion information of the user not matching the reference eye motion information comprises:
At least one eye action in the second eye action sequence not being contained in the first eye action sequence, and/or the eye actions at at least one same position in the first and second eye action sequences being different.
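The stricter criterion of the fourth and fifth implementations, which requires both containment and position-wise agreement, can be sketched as follows (the function name and the exact handling of unequal-length sequences are illustrative assumptions):

```python
def strict_match(user_seq, reference_seq):
    """Match only if every reference eye action appears in the user's sequence
    AND the two sequences agree at every shared position."""
    contains_all = all(action in user_seq for action in reference_seq)
    same_positions = all(u == r for u, r in zip(user_seq, reference_seq))
    return contains_all and same_positions
```

Note that a user sequence containing all reference actions but in a different order fails the position-wise check, so it does not match.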
With reference to the first aspect or any of the foregoing possible implementations of the first aspect, in a sixth possible implementation of the first aspect, before extracting the eye motion information of the user from the images acquired by the image acquisition component of the electronic device, the method further comprises:
Obtaining, by a distance sensor of the electronic device, the distance between the user and the electronic device;
When the distance is less than a preset distance, acquiring images through the image acquisition component within a preset time period.
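The proximity-triggered acquisition of this sixth implementation can be sketched as follows (a hedged illustration; the 40 cm trigger distance, the 5 s acquisition window, and the function names are assumptions for demonstration):

```python
import time

PRESET_DISTANCE_CM = 40  # assumed preset distance threshold
CAPTURE_WINDOW_S = 5     # assumed preset acquisition time period

def capture_if_near(read_distance_cm, grab_frame, now=time.monotonic):
    """Start image acquisition only when the user is closer than the preset
    distance, then collect frames for a fixed time window."""
    if read_distance_cm() >= PRESET_DISTANCE_CM:
        return []  # user too far away: no acquisition is triggered
    frames, deadline = [], now() + CAPTURE_WINDOW_S
    while now() < deadline:
        frames.append(grab_frame())
    return frames
```

Gating acquisition on the distance sensor avoids running the camera continuously, which would otherwise waste power while the device sits locked in a pocket or on a desk.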
With reference to the first aspect or any of the foregoing possible implementations of the first aspect, in a seventh possible implementation of the first aspect, the eye motion information comprises the following eye actions:
Both eyes open; and/or both eyes closed; and/or left eye open with right eye closed; and/or right eye open with left eye closed; and/or both eyes looking in a preset direction.
Based on the same inventive concept, in a second aspect, an electronic device is provided, comprising:
an extraction unit, configured to, when the electronic device is in a user-interface-locked state, extract eye motion information of a user from images acquired by an image acquisition component of the electronic device;
an identity authentication unit, configured to receive the eye motion information of the user from the extraction unit, and to authenticate the user according to reference eye motion information and the eye motion information of the user;
a user interface control unit, configured to switch the electronic device to a user-interface-unlocked state if the user is an authorized user, and to keep the electronic device in the user-interface-locked state if the user is an unauthorized user.
With reference to the second aspect, in a first possible implementation of the second aspect, the identity authentication unit comprises:
a first determination module, configured to determine that the user is an authorized user if the eye motion information of the user matches the reference eye motion information;
a second determination module, configured to determine that the user is an unauthorized user if the eye motion information of the user does not match the reference eye motion information.
With reference to the first possible implementation of the second aspect, in a second possible implementation of the second aspect, the eye motion information of the user comprises a first eye action sequence containing at least two eye actions, and the reference eye motion information comprises a second eye action sequence containing at least two eye actions; in this case, the first determination module is further configured to:
determine that the eye motion information of the user matches the reference eye motion information if the ratio of the number of positions at which the eye action in the first eye action sequence differs from the eye action at the same position in the second eye action sequence to the total number of eye actions in the second eye action sequence is lower than a set threshold.
With reference to the second possible implementation of the second aspect, in a third possible implementation of the second aspect, the second determination module is further configured to:
determine that the eye motion information of the user does not match the reference eye motion information if the ratio of the number of positions at which the eye action in the first eye action sequence differs from the eye action at the same position in the second eye action sequence to the total number of eye actions in the second eye action sequence is higher than the set threshold.
With reference to the first possible implementation of the second aspect, in a fourth possible implementation of the second aspect, the eye motion information of the user comprises a first eye action sequence containing at least two eye actions, and the reference eye motion information comprises a second eye action sequence containing at least two eye actions; in this case, the first determination module is further configured to:
determine that the eye motion information of the user matches the reference eye motion information if the first eye action sequence contains every eye action in the second eye action sequence and the eye actions at the same positions in the first and second eye action sequences are identical.
With reference to the fourth possible implementation of the second aspect, in a fifth possible implementation of the second aspect, the second determination module is further configured to:
determine that the eye motion information of the user does not match the reference eye motion information if at least one eye action in the second eye action sequence is not contained in the first eye action sequence, and/or the eye actions at at least one same position in the first and second eye action sequences are different.
With reference to the second aspect or any of the foregoing possible implementations of the second aspect, in a sixth possible implementation of the second aspect, the electronic device further comprises:
a first acquiring unit, configured to obtain, by a distance sensor of the electronic device, the distance between the user and the electronic device before the eye motion information of the user is extracted from the images acquired by the image acquisition component;
a second acquiring unit, configured to acquire images through the image acquisition component within a preset time period when the distance is less than a preset distance.
With reference to the second aspect or any of the foregoing possible implementations of the second aspect, in a seventh possible implementation of the second aspect, the eye motion information comprises the following eye actions:
Both eyes open; and/or both eyes closed; and/or left eye open with right eye closed; and/or right eye open with left eye closed; and/or both eyes looking in a preset direction.
Based on the same inventive concept, in a third aspect, an electronic device is provided, comprising:
an image acquisition component, configured to acquire images;
a memory, configured to store reference eye motion information and program code;
a processor, connected to the image acquisition component and the memory, configured to read the program code stored in the memory and perform:
when the electronic device is in a user-interface-locked state, extracting eye motion information of a user from the images acquired by the image acquisition component; obtaining the reference eye motion information from the memory, and authenticating the user according to the reference eye motion information and the eye motion information of the user; if the user is an authorized user, switching the electronic device to a user-interface-unlocked state; if the user is an unauthorized user, keeping the electronic device in the user-interface-locked state.
With reference to the third aspect, in a first possible implementation of the third aspect, the processor is further configured to:
determine that the user is an authorized user if the eye motion information of the user matches the reference eye motion information; and determine that the user is an unauthorized user if the eye motion information of the user does not match the reference eye motion information.
With reference to the first possible implementation of the third aspect, in a second possible implementation of the third aspect, the eye motion information of the user comprises a first eye action sequence containing at least two eye actions, and the reference eye motion information comprises a second eye action sequence containing at least two eye actions; in this case, the processor is further configured to:
determine that the eye motion information of the user matches the reference eye motion information if the ratio of the number of positions at which the eye action in the first eye action sequence differs from the eye action at the same position in the second eye action sequence to the total number of eye actions in the second eye action sequence is lower than a set threshold.
With reference to the second possible implementation of the third aspect, in a third possible implementation of the third aspect, the processor is further configured to:
determine that the eye motion information of the user does not match the reference eye motion information if the ratio of the number of positions at which the eye action in the first eye action sequence differs from the eye action at the same position in the second eye action sequence to the total number of eye actions in the second eye action sequence is higher than the set threshold.
With reference to the first possible implementation of the third aspect, in a fourth possible implementation of the third aspect, the eye motion information of the user comprises a first eye action sequence containing at least two eye actions, and the reference eye motion information comprises a second eye action sequence containing at least two eye actions; in this case, the processor is further configured to:
determine that the eye motion information of the user matches the reference eye motion information if the first eye action sequence contains every eye action in the second eye action sequence and the eye actions at the same positions in the first and second eye action sequences are identical.
With reference to the fourth possible implementation of the third aspect, in a fifth possible implementation of the third aspect, the processor is further configured to:
determine that the eye motion information of the user does not match the reference eye motion information if at least one eye action in the second eye action sequence is not contained in the first eye action sequence, and/or the eye actions at at least one same position in the first and second eye action sequences are different.
With reference to the third aspect or any of the foregoing possible implementations of the third aspect, in a sixth possible implementation of the third aspect, the electronic device further comprises a distance sensor connected to the processor; in this case, the processor is further configured to:
before the eye motion information of the user is extracted from the images acquired by the image acquisition component of the electronic device, obtain the distance between the user and the electronic device through the distance sensor; and when the distance is less than a preset distance, acquire images through the image acquisition component within a preset time period.
With reference to the third aspect or any of the foregoing possible implementations of the third aspect, in a seventh possible implementation of the third aspect, the eye motion information comprises the following eye actions:
Both eyes open; and/or both eyes closed; and/or left eye open with right eye closed; and/or right eye open with left eye closed; and/or both eyes looking in a preset direction.
In the present application, when the electronic device is in the user-interface-locked state, the user's eye motion information is extracted from images acquired by the image acquisition component of the electronic device; the user is authenticated according to reference eye motion information and the user's eye motion information; if the user is an authorized user, the electronic device is switched to the user-interface-unlocked state; if the user is an unauthorized user, the electronic device is kept in the user-interface-locked state. This effectively solves the technical problem of prior-art unlocking methods causing physical damage to the touchscreen through required touch operations: the user interface can be unlocked without any touch operation on the touchscreen, which reduces the physical wear caused by frequent swiping and achieves the technical effect of extending the service life of the touchscreen.
Brief description of the drawings
Fig. 1 is a flowchart of the user interface unlocking method in Embodiment 1 of the present application;
Fig. 2 is a schematic structural diagram of the electronic device in Embodiment 2 of the present application;
Fig. 3 is a schematic structural diagram of the electronic device in Embodiment 3 of the present application.
Detailed description of the embodiments
Embodiments of the present application provide a user interface unlocking method and an electronic device, which solve the technical problem in the prior art that user interface unlocking methods cause physical damage to the touchscreen because they require the user to perform touch operations on it.
The general idea of the technical solution of the embodiments of the present application for solving the above technical problem is as follows:
A user interface unlocking method comprises: when an electronic device is in a user-interface-locked state, extracting eye motion information of a user from images acquired by an image acquisition component (e.g. a camera) of the electronic device; authenticating the user according to reference eye motion information and the eye motion information of the user; if the user is an authorized user, switching the electronic device to a user-interface-unlocked state; if the user is an unauthorized user, keeping the electronic device in the user-interface-locked state.
Because the user interface is unlocked on the basis of eye motion information extracted from acquired images rather than touch operations, the technical problem of prior-art unlocking methods causing physical damage to the touchscreen is effectively solved: no touch operation on the touchscreen is needed for unlocking, which reduces the physical wear caused by frequent swiping and achieves the technical effect of extending the service life of the touchscreen.
To make the objectives, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present invention.
It should first be noted that the term "and/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent three cases: A alone, both A and B, and B alone. In addition, the character "/" herein generally indicates an "or" relationship between the associated objects.
It should further be noted that the term "electronic device" herein may refer to a mobile phone, a tablet computer, a vehicle-mounted computer, a digital camera, a game console, a PDA (Personal Digital Assistant), or the like; the present application places no specific restriction on the kind of electronic device.
Embodiment 1
As shown in Fig. 1, an embodiment of the present application provides a user interface unlocking method. The method may be executed by an electronic device, or by an application program in an electronic device, and comprises:
Step 101: When the electronic device is in a user-interface-locked state, extract eye motion information of a user from images acquired by an image acquisition component (e.g. a camera) of the electronic device. In a specific implementation, whether the eyes are open or closed, and which eye, can be determined from the positional relationship between the upper eyelid and the lower eyelid, and the gaze direction can be determined from the position of the iris relative to the upper and lower eyelids. The eye motion information of the user thus includes eye actions such as: both eyes open; and/or both eyes closed; and/or left eye open with right eye closed; and/or right eye open with left eye closed; and/or both eyes looking in a preset direction (where the preset direction includes but is not limited to: ahead, up, down, left, right, upper left, lower left, upper right, and lower right).
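The per-frame eye-state classification described in step 101 can be sketched as follows (a simplified, non-authoritative illustration; the normalized landmark coordinates, the thresholds, and the action labels are assumptions for demonstration, not part of the patent):

```python
def classify_eye(upper_lid_y, lower_lid_y, iris_x, iris_y, open_gap=0.15):
    """Classify one eye's state from normalized landmark positions (0..1).

    The gap between the upper and lower eyelid decides open vs. closed;
    the iris position relative to the eye centre decides the gaze direction.
    """
    if (lower_lid_y - upper_lid_y) < open_gap:
        return "closed"
    if iris_x < 0.35:
        return "look_left"
    if iris_x > 0.65:
        return "look_right"
    if iris_y < 0.35:
        return "look_up"
    if iris_y > 0.65:
        return "look_down"
    return "look_ahead"
```

Running this classifier on each eye in each acquired frame, and combining the left-eye and right-eye results, yields per-frame eye actions from which the user's eye action sequence can be assembled.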
Step 102: Authenticate the user according to reference eye motion information and the eye motion information of the user. Optionally, in a specific implementation, the reference eye motion information may be custom-set by the user before step 101 is executed. For example, the electronic device provides an interactive interface through its capacitive touchscreen on which options corresponding to multiple eye actions are displayed; the user selects a predetermined number (e.g. 4, 6, or 8) of eye actions in order (e.g. look ahead - look down - look left - look right), and the electronic device generates the reference eye motion information based on the sequence of the selected eye actions. Alternatively, the reference eye motion information may be freely recorded through the image acquisition component before step 101 is executed. For example, the electronic device provides an interactive interface through its capacitive touchscreen on which a "record reference eye motion information" option is displayed; after the user selects this option, the electronic device starts its image acquisition component, completes image acquisition within a preset time (e.g. 5, 10, or 15 seconds), analyzes the acquired images to obtain an eye action sequence (e.g. look ahead - look down - look left - look right), and generates the reference eye motion information based on the obtained sequence.
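The two enrollment options described in step 102, picking actions from an on-screen list or recording them with the camera, can be sketched as follows (the option list, function names, and the collapsing of consecutive identical classifications are illustrative assumptions):

```python
# Assumed on-screen options for the selection-based enrollment path
ACTION_OPTIONS = ["look_ahead", "look_up", "look_down", "look_left",
                  "look_right", "both_closed", "left_open_right_closed"]

def build_reference_from_selection(selected_indices):
    """Build the reference sequence from the options the user tapped, in order."""
    return [ACTION_OPTIONS[i] for i in selected_indices]

def build_reference_from_recording(frames, classify):
    """Build the reference sequence from recorded frames, collapsing runs of
    consecutive identical classifications into single eye actions."""
    sequence = []
    for frame in frames:
        action = classify(frame)
        if not sequence or sequence[-1] != action:
            sequence.append(action)
    return sequence
```

Either path ends with the same artifact, an ordered eye action sequence, so the matching logic of step 102 does not need to know how the reference was enrolled.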
Step 103: If the user is an authorized user, switch the electronic device to a user-interface-unlocked state.
Step 104: If the user is an unauthorized user, keep the electronic device in the user-interface-locked state.
In this embodiment of the present application, because the user's eye motion information is extracted from images acquired by the image acquisition component while the electronic device is in the user-interface-locked state, the user is authenticated against reference eye motion information, and the electronic device is switched to the user-interface-unlocked state only if the user is authorized, the technical problem of prior-art unlocking methods causing physical damage to the touchscreen through required touch operations is effectively solved. The user interface can be unlocked without any touch operation on the touchscreen, which reduces the physical wear caused by frequent swiping and achieves the technical effect of extending the service life of the touchscreen.
Further, in this embodiment of the present application, unlocking the user interface through eye actions frees the user from conventional, tedious unlocking operations and makes unlocking enjoyable, thereby improving the user experience.
Moreover, in this embodiment of the present application, because the reference eye motion information can be custom-set by the user or freely recorded through the image acquisition component, the eye actions in the reference eye motion information can be combined in many ways and are not easily cracked by others, which improves the security of the unlocking operation.
Moreover, in this embodiment of the present application, because the user interface is unlocked through eye actions, when the user's fingers are inconvenient for unlocking (e.g. a finger is injured or dirty), the user only needs to make a few eye movements while facing the electronic device to unlock the user interface, which is convenient and efficient.
In this embodiment of the present application, optionally, the method further comprises:
when the electronic device is in the user-interface-locked state, preventing the electronic device from responding to a detected user operation on the user interface; and/or
when the electronic device is in the user-interface-unlocked state, responding to a detected user operation on the user interface.
In the embodiment of the present application, optionally, the eye motion information includes the following eye motions:
both eyes open; both eyes closed; left eye open, right eye closed; right eye open, left eye closed; both eyes looking straight ahead; looking up; looking down; looking left; looking right; looking up-left; looking down-left; looking up-right; looking down-right; and so on.
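As an illustrative sketch only (not part of the claimed method), the thirteen eye motions enumerated above could be encoded as a small Python enumeration; all names and string values here are hypothetical:

```python
from enum import Enum

class EyeMotion(Enum):
    """Hypothetical encoding of the eye motions enumerated above."""
    BOTH_OPEN = "both eyes open"
    BOTH_CLOSED = "both eyes closed"
    LEFT_OPEN_RIGHT_CLOSED = "left eye open, right eye closed"
    RIGHT_OPEN_LEFT_CLOSED = "right eye open, left eye closed"
    LOOK_AHEAD = "look straight ahead"
    LOOK_UP = "look up"
    LOOK_DOWN = "look down"
    LOOK_LEFT = "look left"
    LOOK_RIGHT = "look right"
    LOOK_UP_LEFT = "look up-left"
    LOOK_DOWN_LEFT = "look down-left"
    LOOK_UP_RIGHT = "look up-right"
    LOOK_DOWN_RIGHT = "look down-right"
```

An eye action sequence is then simply an ordered list of such values.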
In the embodiment of the present application, optionally, step 102 comprises:
if the user's eye motion information matches the reference eye motion information, determining that the user is a legitimate user;
if the user's eye motion information does not match the reference eye motion information, determining that the user is an illegitimate user.
In the embodiment of the present application, optionally, the user's eye motion information comprises a first eye action sequence, which includes at least two eye motions; the reference eye motion information comprises a second eye action sequence, which includes at least two eye motions. In this case, the user's eye motion information matching the reference eye motion information comprises:
the ratio of the number of positions at which the eye motion in the first eye action sequence differs from the eye motion at the same position in the second eye action sequence, to the total number of eye motions in the second eye action sequence, being lower than a set threshold. This tolerance mainly accounts for the possibility that, because of insensitive eye muscles, the user may perform an eye motion imprecisely.
For example, suppose the first eye action sequence is: look left - look right - eyes closed - look up - look down; the second eye action sequence is: look up-left - look right - eyes closed - look up - look down; and the set threshold is 30%. Because the eye motion at the first position of the first sequence (look left) differs from the eye motion at the first position of the second sequence (look up-left), the number of positions at which the two sequences differ is 1. The second sequence contains 5 eye motions in total, so the ratio is 20%, which is lower than the 30% threshold. It is therefore determined that the user's eye motion information matches the reference eye motion information, and hence that the user is a legitimate user.
In a specific implementation, the set threshold can be chosen according to actual conditions, for example 40%, 30%, 20%, or 5%; the embodiment of the present application places no specific restriction on its value.
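A minimal sketch of this threshold-based matching rule, reproducing the worked example above; the function names, sequence encoding, and the choice to count missing positions as mismatches are assumptions, not part of the patent:

```python
def mismatch_ratio(first_seq, second_seq):
    """Fraction of positions where the user's sequence differs from the reference.

    Positions past the end of the user's sequence count as mismatches,
    so a too-short user sequence cannot pass trivially (an assumption here).
    """
    mismatches = sum(
        1
        for i, ref_motion in enumerate(second_seq)
        if i >= len(first_seq) or first_seq[i] != ref_motion
    )
    return mismatches / len(second_seq)

def is_legitimate(first_seq, second_seq, threshold=0.30):
    """Match when the mismatch ratio is strictly below the set threshold."""
    return mismatch_ratio(first_seq, second_seq) < threshold

# The worked example: 1 differing position out of 5, ratio 20% < 30%.
user = ["look left", "look right", "eyes closed", "look up", "look down"]
reference = ["look up-left", "look right", "eyes closed", "look up", "look down"]
print(is_legitimate(user, reference))  # True (ratio 0.2 is below 0.3)
```

With a looser or stricter threshold, the same comparison yields more or less tolerance for imprecise eye motions.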
In the embodiment of the present application, optionally, the user's eye motion information not matching the reference eye motion information comprises:
the ratio of the number of positions at which the eye motion in the first eye action sequence differs from the eye motion at the same position in the second eye action sequence, to the total number of eye motions in the second eye action sequence, being higher than the set threshold.
For example, suppose the first eye action sequence is: look left - look right - eyes closed - look up - look down; the second eye action sequence is: look down - look left - eyes closed - look up - look down; and the set threshold is 30%. The eye motion at the first position of the first sequence (look left) differs from that at the first position of the second sequence (look down), and the eye motion at the second position of the first sequence (look right) differs from that at the second position of the second sequence (look left), so the number of positions at which the two sequences differ is 2. The second sequence contains 5 eye motions in total, so the ratio is 40%, which is higher than the 30% threshold. It is therefore determined that the user's eye motion information does not match the reference eye motion information, and hence that the user is an illegitimate user.
In a specific implementation, the set threshold can be chosen according to actual conditions, for example 40%, 30%, 20%, or 5%; the embodiment of the present application places no specific restriction on its value.
In the embodiment of the present application, optionally, the user's eye motion information comprises a first eye action sequence, which includes at least two eye motions; the reference eye motion information comprises a second eye action sequence, which includes at least two eye motions. In this case, the user's eye motion information matching the reference eye motion information comprises:
the first eye action sequence containing every eye motion in the second eye action sequence, and the eye motions at each same position in the two sequences being identical.
In the embodiment of the present application, optionally, the user's eye motion information not matching the reference eye motion information comprises:
at least one eye motion in the second eye action sequence not being contained in the first eye action sequence, and/or the eye motions at at least one same position in the two sequences differing.
For example, suppose the first eye action sequence is: look left - look right - look up - look down, and the second eye action sequence is: look left - look right - eyes closed - eyes open. Because the first sequence does not contain every eye motion in the second sequence, the user's eye motion information does not match the reference eye motion information; the user is not a legitimate user, and the electronic device remains in the user-interface locked state.
As another example, suppose the first eye action sequence is: look left - look right - look up - look down, and the second eye action sequence is: look left - look right - look down - look up. Although the first sequence contains every eye motion in the second sequence, the eye motion at the third position of the first sequence (look up) differs from that at the third position of the second sequence (look down), and the eye motion at the fourth position of the first sequence (look down) differs from that at the fourth position of the second sequence (look up). The user's eye motion information therefore does not match the reference eye motion information; the user is not a legitimate user, and the electronic device remains in the user-interface locked state.
As a further example, suppose the first eye action sequence is: look left - look right - look up - look down, and the second eye action sequence is also: look left - look right - look up - look down. The first sequence contains every eye motion in the second sequence, and the eye motions at each same position are identical, so the user's eye motion information matches the reference eye motion information; the user is a legitimate user, and the electronic device is switched to the user-interface unlocked state.
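The stricter exact-match rule can be sketched in the same style; the names are hypothetical, and both stated conditions (containment and positional identity) are checked explicitly to mirror the text, even though positional identity over the reference's length implies containment for equal-length sequences:

```python
def exact_match(first_seq, second_seq):
    """Match only if the first sequence contains every motion of the second
    and the motions at each shared position are identical."""
    contains_all = all(motion in first_seq for motion in second_seq)
    same_positions = len(first_seq) >= len(second_seq) and all(
        first_seq[i] == second_seq[i] for i in range(len(second_seq))
    )
    return contains_all and same_positions

seq = ["look left", "look right", "look up", "look down"]
print(exact_match(seq, ["look left", "look right", "eyes closed", "eyes open"]))  # False
print(exact_match(seq, ["look left", "look right", "look down", "look up"]))      # False
print(exact_match(seq, ["look left", "look right", "look up", "look down"]))      # True
```

The three calls correspond to the three examples above: missing motions, reordered motions, and a full positional match.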
In the embodiment of the present application, optionally, before the user's eye motion information is extracted from the image captured by the image acquisition component of the electronic device, the method further comprises:
obtaining the distance between the user and the electronic device through a distance sensor in the electronic device;
when the distance is less than a preset distance, capturing images through the image acquisition component within a preset time period.
In a specific implementation, the preset distance can be any value between 20 and 40 centimeters, or can be customized by the user.
In a specific implementation, the electronic device should complete the image acquisition task within the preset time period after image acquisition starts; this period can be customized by the user (for example, any value between 5 and 15 seconds). The user must complete all intended eye motions within this period; otherwise, the electronic device cannot capture all of the user's eye motions and the unlock attempt fails. After a failed attempt, the electronic device should prompt the user to cooperate with the image acquisition component and capture the eye motion information again.
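One way to sketch the distance-gated, time-bounded capture described above; the sensor and camera are stand-in callables, and the constant values and names are assumptions within the ranges the text gives:

```python
import time

PRESET_DISTANCE_CM = 30.0   # any value in 20-40 cm, or user-defined
PRESET_WINDOW_SEC = 10.0    # any value in 5-15 s, or user-defined

def acquire_eye_images(read_distance_cm, capture_frame, now=time.monotonic):
    """Capture frames for the preset window, but only if the user is close enough.

    read_distance_cm: callable returning the distance sensor's reading in cm.
    capture_frame: callable returning one frame from the image component.
    Returns the list of captured frames, or None when the user is out of
    range (in which case capture never starts).
    """
    if read_distance_cm() >= PRESET_DISTANCE_CM:
        return None  # too far away: do not start image acquisition
    frames = []
    deadline = now() + PRESET_WINDOW_SEC
    while now() < deadline:
        frames.append(capture_frame())
    return frames
```

If the frames gathered within the window do not contain the user's complete eye action sequence, the failed-unlock prompt described above would apply.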
The technical effects of the embodiment of the present application are as follows:
1. In the embodiment of the present application, when the electronic device is in the user-interface locked state, the user's eye motion information is extracted from an image captured by the image acquisition component of the electronic device; identity authentication is performed on the user according to the reference eye motion information and the user's eye motion information; if the user is a legitimate user, the electronic device is switched to the user-interface unlocked state; if the user is an illegitimate user, the electronic device remains in the user-interface locked state. This effectively solves the prior-art problem that unlocking requires touch operations on the touchscreen and thereby causes physical wear to it: since no touch operation is required to unlock the user interface, wear from frequent swiping is reduced and the service life of the touchscreen is extended.
2. In the embodiment of the present application, unlocking the user interface through the user's eye motions frees the user from the conventional, tedious unlocking operation and makes unlocking enjoyable, thereby improving the user experience.
3. In the embodiment of the present application, because the reference eye motion information can be customized by the user or freely recorded through the image acquisition component, the eye motions in the reference eye motion information can be combined in many ways and are not easily cracked by others, which improves the security of the unlocking operation.
4. In the embodiment of the present application, since the user interface is unlocked through the user's eye motions, when the user's fingers are momentarily unable to perform an unlocking operation (for example, when a finger is injured or dirty), the user only needs to perform a few eye motions while facing the electronic device to unlock the user interface, which is convenient and efficient.
Embodiment two
Based on the same inventive concept, as shown in Figure 2, the embodiment of the present application provides an electronic device 200, comprising:
an extraction unit 201, configured to, when the electronic device 200 is in the user-interface locked state, extract the user's eye motion information from an image captured by the image acquisition component of the electronic device 200;
an identity authentication unit 202, configured to receive the user's eye motion information from the extraction unit 201 and perform identity authentication on the user according to the reference eye motion information and the user's eye motion information;
a user interface control unit 203, configured to switch the electronic device 200 to the user-interface unlocked state if the user is a legitimate user, and to keep the electronic device 200 in the user-interface locked state if the user is an illegitimate user.
In a specific implementation, the reference eye motion information can be customized by the user or freely recorded through the image acquisition component.
In the embodiment of the present application, optionally, the eye motion information includes the following eye motions:
both eyes open; both eyes closed; left eye open, right eye closed; right eye open, left eye closed; both eyes looking straight ahead; looking up; looking down; looking left; looking right; looking up-left; looking down-left; looking up-right; looking down-right; and so on.
In a specific implementation, the electronic device 200 may further comprise:
a prevention unit, configured to, when the electronic device 200 is in the user-interface locked state, prevent the electronic device 200 from responding to any detected user operation on the user interface; and/or
a response unit, configured to, when the electronic device 200 is in the user-interface unlocked state, respond to any detected user operation on the user interface.
In the embodiment of the present application, optionally, the identity authentication unit 202 comprises:
a first determination module, configured to determine that the user is a legitimate user if the user's eye motion information matches the reference eye motion information;
a second determination module, configured to determine that the user is an illegitimate user if the user's eye motion information does not match the reference eye motion information.
In the embodiment of the present application, optionally, the user's eye motion information comprises a first eye action sequence, which includes at least two eye motions; the reference eye motion information comprises a second eye action sequence, which includes at least two eye motions. In this case, the first determination module is further configured to:
determine that the user's eye motion information matches the reference eye motion information if the ratio of the number of positions at which the eye motion in the first eye action sequence differs from the eye motion at the same position in the second eye action sequence, to the total number of eye motions in the second eye action sequence, is lower than a set threshold.
In the embodiment of the present application, optionally, the second determination module is further configured to:
determine that the user's eye motion information does not match the reference eye motion information if the ratio of the number of positions at which the eye motion in the first eye action sequence differs from the eye motion at the same position in the second eye action sequence, to the total number of eye motions in the second eye action sequence, is higher than the set threshold.
In the embodiment of the present application, optionally, the eye motion information comprises a first eye action sequence, which includes at least two eye motions; the reference eye motion information comprises a second eye action sequence, which includes at least two eye motions. In this case, the first determination module is further configured to:
determine that the user's eye motion information matches the reference eye motion information if the first eye action sequence contains every eye motion in the second eye action sequence and the eye motions at each same position in the two sequences are identical.
In the embodiment of the present application, optionally, the second determination module is further configured to:
determine that the user's eye motion information does not match the reference eye motion information if at least one eye motion in the second eye action sequence is not contained in the first eye action sequence, and/or the eye motions at at least one same position in the two sequences differ.
In the embodiment of the present application, optionally, the electronic device 200 further comprises:
a first obtaining unit, configured to obtain the distance between the user and the electronic device through a distance sensor in the electronic device 200 before the user's eye motion information is extracted from the image captured by the image acquisition component;
a second obtaining unit, configured to capture images through the image acquisition component within a preset time period when the distance is less than a preset distance.
The technical effects of the embodiment of the present application are as follows:
1. In the embodiment of the present application, when the electronic device is in the user-interface locked state, the extraction unit can extract the user's eye motion information from an image captured by the image acquisition component; the identity authentication unit then performs identity authentication on the user according to the reference eye motion information and the user's eye motion information; if the user is a legitimate user, the user interface control unit switches the electronic device to the user-interface unlocked state; if the user is an illegitimate user, the user interface control unit keeps the electronic device in the user-interface locked state. This effectively solves the prior-art problem that unlocking requires touch operations on the touchscreen and thereby causes physical wear to it: since no touch operation is required to unlock the user interface, wear from frequent swiping is reduced and the service life of the touchscreen is extended.
2. In the embodiment of the present application, the electronic device unlocks the user interface through the user's eye motions, which frees the user from the conventional, tedious unlocking operation and makes unlocking enjoyable, thereby improving the user experience.
3. In the embodiment of the present application, because the reference eye motion information can be customized by the user or freely recorded through the image acquisition component, the eye motions in the reference eye motion information can be combined in many ways and are not easily cracked by others, which improves the security of the unlocking operation.
4. In the embodiment of the present application, since the electronic device unlocks the user interface through the user's eye motions, when the user's fingers are momentarily unable to perform an unlocking operation (for example, when a finger is injured or dirty), the user only needs to perform a few eye motions while facing the electronic device to unlock the user interface, which is convenient and efficient.
Embodiment three
Based on the same inventive concept, as shown in Figure 3, the embodiment of the present application provides an electronic device 300, comprising:
an image acquisition component 310 (such as a camera), configured to capture images;
a memory 370, configured to store the reference eye motion information and program code;
one or more processors 320, connected with the image acquisition component 310 and the memory 370, configured to read the program code stored in the memory and execute:
when the electronic device 300 is in the user-interface locked state, extracting the user's eye motion information from an image captured by the image acquisition component 310; obtaining the reference eye motion information from the memory 370, and performing identity authentication on the user in the image captured by the image acquisition component 310 according to the reference eye motion information and the user's eye motion information; if the user is a legitimate user, switching the electronic device 300 to the user-interface unlocked state; if the user is an illegitimate user, keeping the electronic device 300 in the user-interface locked state.
In a specific implementation, the reference eye motion information can be customized by the user or freely recorded through the image acquisition component.
In a specific implementation, the memory 370 can include high-speed random access memory and can also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. In some embodiments, the memory 370 can further include memory remote from the one or more processors 320, for example network-attached storage accessed via a communication network (not shown), where the communication network can be the Internet, one or more intranets, a local area network, a wide area network, a storage area network, or the like, or a suitable combination thereof.
In the embodiment of the present application, optionally, the eye motion information includes the following eye motions:
both eyes open; both eyes closed; left eye open, right eye closed; right eye open, left eye closed; both eyes looking straight ahead; looking up; looking down; looking left; looking right; looking up-left; looking down-left; looking up-right; looking down-right; and so on.
In the embodiment of the present application, optionally, the processor 320 is further configured to:
when the electronic device 300 is in the user-interface locked state, prevent the electronic device 300 from responding to any detected user operation on the user interface; and/or
when the electronic device 300 is in the user-interface unlocked state, respond to any detected user operation on the user interface.
In the embodiment of the present application, optionally, the user's eye motion information comprises a first eye action sequence, which includes at least two eye motions; the reference eye motion information comprises a second eye action sequence, which includes at least two eye motions. In this case, the processor 320 is further configured to:
determine that the user's eye motion information matches the reference eye motion information if the ratio of the number of positions at which the eye motion in the first eye action sequence differs from the eye motion at the same position in the second eye action sequence, to the total number of eye motions in the second eye action sequence, is lower than a set threshold.
In the embodiment of the present application, optionally, the processor 320 is further configured to:
determine that the user's eye motion information does not match the reference eye motion information if the ratio of the number of positions at which the eye motion in the first eye action sequence differs from the eye motion at the same position in the second eye action sequence, to the total number of eye motions in the second eye action sequence, is higher than the set threshold.
In the embodiment of the present application, optionally, the eye motion information comprises a first eye action sequence, which includes at least two eye motions; the reference eye motion information comprises a second eye action sequence, which includes at least two eye motions. In this case, the processor 320 is further configured to:
determine that the user's eye motion information matches the reference eye motion information if the first eye action sequence contains every eye motion in the second eye action sequence and the eye motions at each same position in the two sequences are identical.
In the embodiment of the present application, optionally, the processor 320 is further configured to:
determine that the user's eye motion information does not match the reference eye motion information if at least one eye motion in the second eye action sequence is not contained in the first eye action sequence, and/or the eye motions at at least one same position in the two sequences differ.
In the embodiment of the present application, optionally, the electronic device 300 further comprises:
a distance sensor 330 (for example, an infrared distance sensor or an ultrasonic distance sensor), connected with the processor 320. In this case, the processor 320 is further configured to:
before the user's eye motion information is extracted from the image captured by the image acquisition component 310 of the electronic device 300, obtain the distance between the user and the electronic device 300 through the distance sensor 330, and, when this distance is less than a preset distance, capture images through the image acquisition component 310 within a preset time period.
In the embodiment of the present invention, optionally, the electronic device 300 further comprises:
a capacitive touchscreen 340. When serving as the input device of the electronic device 300, the capacitive touchscreen 340 can, based on capacitive sensing technology, sense the capacitance change produced in the area touched by a finger, thereby determining the touch area of the finger and enabling the user to interact with the electronic device 300 through the capacitive touchscreen 340. In addition, when serving as the display device of the electronic device, the capacitive touchscreen 340 can present visual output to the user; this visual output can include text, graphics, video, and any combination thereof.
In the embodiment of the present invention, optionally, the electronic device 300 further comprises:
an RF (Radio Frequency) circuit 350, connected with the processor 320 and configured to receive and transmit electromagnetic waves. The RF circuit 350 converts electrical signals into electromagnetic waves, or electromagnetic waves into electrical signals, and communicates with communication networks and other communication devices via the electromagnetic waves. The RF circuit 350 can include known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, memory, and so on. The RF circuit 350 can communicate with networks and other devices by wireless communication; such a network can be, for example, the Internet (also known as the World Wide Web), an intranet, and/or a wireless network such as a cellular telephone network or a wireless local area network.
In the embodiment of the present invention, optionally, the electronic device 300 further comprises:
a WiFi module 360, connected with the processor 320 and configured to access a network and communicate via WiFi signals.
In the embodiment of the present invention, optionally, the electronic device 300 further comprises:
an audio circuit 390, connected with the processor 320 and comprising a loudspeaker and a microphone; it provides the audio interface between the user and the electronic device 300. The audio circuit 390 receives audio data from the processor 320, converts the audio data into an electrical signal, and sends the electrical signal to the loudspeaker, which converts it into sound waves audible to humans. The audio circuit 390 also receives, through the microphone, electrical signals converted from sound waves; it converts these electrical signals into audio data and sends the audio data to the processor 320 for processing.
In the embodiment of the present invention, optionally, the electronic device 300 further comprises:
a power supply system 380, which can include a power management system, one or more power sources (such as a battery or alternating current), a charging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and any other components associated with generating, managing, and distributing electrical energy in a portable device.
The technical effects of the embodiment of the present application are as follows:
1. In the embodiment of the present application, when the electronic device is in the user-interface locked state, the processor can extract the user's eye motion information from an image captured by the image acquisition component; the processor then performs identity authentication on the user according to the reference eye motion information and the user's eye motion information; if the user is a legitimate user, the processor switches the electronic device to the user-interface unlocked state; if the user is an illegitimate user, the processor keeps the electronic device in the user-interface locked state. This effectively solves the prior-art problem that unlocking requires touch operations on the touchscreen and thereby causes physical wear to it: since no touch operation is required to unlock the user interface, wear from frequent swiping is reduced and the service life of the touchscreen is extended.
2. In the embodiment of the present application, the electronic device unlocks the user interface through the user's eye motions, which frees the user from the conventional, tedious unlocking operation and makes unlocking enjoyable, thereby improving the user experience.
3. In the embodiment of the present application, because the reference eye motion information can be customized by the user or freely recorded through the image acquisition component, the eye motions in the reference eye motion information can be combined in many ways and are not easily cracked by others, which improves the security of the unlocking operation.
4. In the embodiment of the present application, since the electronic device unlocks the user interface through the user's eye motions, when the user's fingers are momentarily unable to perform an unlocking operation (for example, when a finger is injured or dirty), the user only needs to perform a few eye motions while facing the electronic device to unlock the user interface, which is convenient and efficient.
Those skilled in the art should understand, embodiments of the invention can be provided as method, system or computer program.Therefore, the present invention can adopt the form of complete hardware embodiment, completely software implementation or the embodiment in conjunction with software and hardware aspect.And the present invention can adopt in one or more form wherein including the upper computer program implemented of computer-usable storage medium (including but not limited to magnetic disk memory, CD-ROM, optical memory etc.) of computer usable program code.
The present invention describes with reference to according to the process flow diagram of the method for the embodiment of the present invention, equipment (system) and computer program and/or block scheme.Should understand can by the combination of the flow process in each flow process in computer program instructions realization flow figure and/or block scheme and/or square frame and process flow diagram and/or block scheme and/or square frame.These computer program instructions can being provided to the processor of multi-purpose computer, special purpose computer, Embedded Processor or other programmable data processing device to produce a machine, making the instruction performed by the processor of computing machine or other programmable data processing device produce device for realizing the function of specifying in process flow diagram flow process or multiple flow process and/or block scheme square frame or multiple square frame.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus, and the instruction apparatus implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce a computer-implemented process, and the instructions executed on the computer or other programmable device thereby provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the present invention have been described, those skilled in the art, once they learn of the basic inventive concept, can make further changes and modifications to these embodiments. Therefore, the appended claims are intended to be interpreted as covering the preferred embodiments and all changes and modifications that fall within the scope of the present invention.
Obviously, those skilled in the art can make various changes and variations to the present invention without departing from the spirit and scope of the present invention. Thus, if these modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalent technologies, the present invention is also intended to include these changes and variations.

Claims (18)

1. A user interface unlocking method, characterized by comprising:
when an electronic device is in a user interface locked state, extracting eye motion information of a user from an image acquired by an image acquisition component of the electronic device;
performing identity authentication on the user according to reference eye motion information and the eye motion information of the user;
if the user is a valid user, switching the electronic device to a user interface unlocked state; and
if the user is an invalid user, keeping the electronic device in the user interface locked state.
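The flow of claim 1 can be summarized in a short sketch. This is a minimal illustration only: the `Device` class, `capture_eye_motions` helper, and the simple equality check stand in for the claimed image acquisition component, extraction step, and authentication step, none of which are specified at this level of detail in the claims.

```python
class Device:
    """Hypothetical stand-in for the electronic device of claim 1."""
    def __init__(self, captured_sequence):
        self.ui_locked = True                 # user interface locked state
        self._captured = captured_sequence    # what the camera would observe

    def capture_eye_motions(self):
        # stands in for: image acquisition component + eye-motion extraction
        return self._captured


def try_unlock(device, reference_sequence):
    """Claim 1 flow: extract, authenticate, then switch or keep the UI state.

    Returns the resulting locked state (True = still locked).
    """
    if not device.ui_locked:                  # only act in the locked state
        return device.ui_locked
    user_sequence = device.capture_eye_motions()
    if user_sequence == reference_sequence:   # simplest possible authentication
        device.ui_locked = False              # valid user: unlock
    # invalid user: the device simply remains in the locked state
    return device.ui_locked
```

Claims 3 through 6 replace the naive equality check here with the looser or stricter matching rules they define.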
2. The method according to claim 1, characterized in that the performing identity authentication on the user according to the reference eye motion information and the eye motion information of the user comprises:
if the eye motion information of the user matches the reference eye motion information, determining that the user is a valid user; and
if the eye motion information of the user does not match the reference eye motion information, determining that the user is an invalid user.
3. The method according to claim 2, characterized in that the eye motion information of the user comprises a first eye action sequence, and the first eye action sequence comprises at least two eye motions; the reference eye motion information comprises a second eye action sequence, and the second eye action sequence comprises at least two eye motions; and the eye motion information of the user matching the reference eye motion information comprises:
the ratio of the number of positions at which the eye motions in the first eye action sequence and the second eye action sequence differ to the total number of eye motions in the second eye action sequence being lower than a set threshold.
4. The method according to claim 3, characterized in that the eye motion information of the user not matching the reference eye motion information comprises:
the ratio of the number of positions at which the eye motions in the first eye action sequence and the second eye action sequence differ to the total number of eye motions in the second eye action sequence being higher than the set threshold.
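The tolerance rule of claims 3 and 4 can be sketched as follows. The function name, the default threshold value, and the decision to count a length difference between the two sequences as additional mismatches are all illustrative assumptions; the claims only fix the ratio-versus-threshold comparison itself.

```python
def sequences_match(first_seq, second_seq, threshold=0.2):
    """Claims 3/4 tolerance rule: the fraction of same-position mismatches,
    relative to the number of actions in the reference (second) sequence,
    must stay below the set threshold."""
    # count the positions at which the two sequences disagree
    mismatches = sum(1 for u, r in zip(first_seq, second_seq) if u != r)
    # assumption: actions missing due to a length difference also count
    mismatches += abs(len(first_seq) - len(second_seq))
    return mismatches / len(second_seq) < threshold
```

With a reference sequence of five actions and a threshold of 0.2, for example, a single wrong action (ratio 1/5 = 0.2) already fails, so the threshold effectively sets how many slips the user is allowed.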
5. The method according to claim 2, characterized in that the eye motion information of the user comprises a first eye action sequence, and the first eye action sequence comprises at least two eye motions; the reference eye motion information comprises a second eye action sequence, and the second eye action sequence comprises at least two eye motions; and the eye motion information of the user matching the reference eye motion information comprises:
the first eye action sequence comprising each eye motion in the second eye action sequence, and the eye motions at the same positions in the first eye action sequence and the second eye action sequence being identical.
6. The method according to claim 5, characterized in that the eye motion information of the user not matching the reference eye motion information comprises:
at least one eye motion in the second eye action sequence not being included in the first eye action sequence, and/or the eye motions at at least one same position in the first eye action sequence and the second eye action sequence being different.
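The stricter rule of claims 5 and 6 combines two checks: containment of every reference action and position-wise equality. A minimal sketch, under the assumed reading that the user's sequence may be at least as long as the reference sequence:

```python
def strict_match(first_seq, second_seq):
    """Claims 5/6 strict rule: every action of the reference (second)
    sequence appears in the user's (first) sequence, and the actions at
    each shared position are identical."""
    # containment: no reference action may be missing from the user sequence
    if any(action not in first_seq for action in second_seq):
        return False
    # position-wise equality over the full length of the reference sequence
    if len(first_seq) < len(second_seq):
        return False
    return all(u == r for u, r in zip(first_seq, second_seq))
```

Note that when the two sequences have equal length, the position-wise check alone already implies containment, so this rule reduces to exact sequence equality in that case.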
7. The method according to any one of claims 1 to 6, characterized in that before the extracting eye motion information of a user from an image acquired by the image acquisition component of the electronic device, the method further comprises:
obtaining, by a distance sensor of the electronic device, the distance between the user and the electronic device; and
when the distance is less than a preset distance, acquiring the image by the image acquisition component within a preset time period.
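Claim 7 gates image capture on a proximity check: the camera captures frames for a preset period only once the distance sensor reports the user within a preset range. A sketch of that control flow, where `read_distance` and `capture_frame`, the preset distance, and the capture period are all hypothetical placeholders for the claimed sensor and camera interfaces:

```python
import time

def capture_when_near(read_distance, capture_frame,
                      preset_distance=0.5, capture_period=2.0):
    """Claim 7 gating: acquire images for a preset time period, but only
    when the distance sensor reports the user within the preset range."""
    if read_distance() >= preset_distance:
        return []                                  # user too far: no capture
    frames = []
    deadline = time.monotonic() + capture_period   # the preset time period
    while time.monotonic() < deadline:
        frames.append(capture_frame())
    return frames
```

Gating on distance in this way avoids running the camera and the extraction pipeline continuously, which is presumably the power-saving motivation behind the claim.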
8. The method according to any one of claims 1 to 7, characterized in that the eye motion information comprises the following eye motions:
both eyes open; and/or both eyes closed; and/or the left eye open and the right eye closed; and/or the right eye open and the left eye closed; and/or both eyes looking in a preset direction.
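The action vocabulary of claim 8 can be modeled as a small enumeration, over which a reference sequence (the "eye password") is simply a list. The member names and the four gaze directions shown are illustrative; the claim only fixes the open/closed combinations and "looking in a preset direction":

```python
from enum import Enum

class EyeMotion(Enum):
    """The eye motion vocabulary listed in claim 8 (names are illustrative)."""
    BOTH_OPEN = "both eyes open"
    BOTH_CLOSED = "both eyes closed"
    LEFT_OPEN_RIGHT_CLOSED = "left eye open, right eye closed"
    RIGHT_OPEN_LEFT_CLOSED = "right eye open, left eye closed"
    LOOK_LEFT = "both eyes look left"      # "looking in a preset direction"
    LOOK_RIGHT = "both eyes look right"
    LOOK_UP = "both eyes look up"
    LOOK_DOWN = "both eyes look down"

# a user-defined reference sequence ("eye password") built from claim 8 actions
reference_sequence = [EyeMotion.BOTH_CLOSED, EyeMotion.LOOK_LEFT,
                      EyeMotion.RIGHT_OPEN_LEFT_CLOSED]
```

Even this eight-action vocabulary yields 8^n possible sequences of length n, which is the combinatorial point made in advantage 3 of the description.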
9. An electronic device, characterized by comprising:
an extraction unit, configured to, when the electronic device is in a user interface locked state, extract eye motion information of a user from an image acquired by an image acquisition component of the electronic device;
an identity authentication unit, configured to receive the eye motion information of the user from the extraction unit, and perform identity authentication on the user according to reference eye motion information and the eye motion information of the user; and
a user interface control unit, configured to switch the electronic device to a user interface unlocked state if the user is a valid user, and keep the electronic device in the user interface locked state if the user is an invalid user.
10. The electronic device according to claim 9, characterized in that the identity authentication unit comprises:
a first determination module, configured to determine that the user is a valid user if the eye motion information of the user matches the reference eye motion information; and
a second determination module, configured to determine that the user is an invalid user if the eye motion information of the user does not match the reference eye motion information.
11. The electronic device according to claim 10, characterized in that the eye motion information of the user comprises a first eye action sequence, and the first eye action sequence comprises at least two eye motions; the reference eye motion information comprises a second eye action sequence, and the second eye action sequence comprises at least two eye motions; and the first determination module is further configured to:
if the first eye action sequence comprises each eye motion in the second eye action sequence, and the eye motions at the same positions in the first eye action sequence and the second eye action sequence are identical, determine that the eye motion information of the user matches the reference eye motion information.
12. The electronic device according to claim 11, characterized in that the second determination module is further configured to:
if at least one eye motion in the second eye action sequence is not included in the first eye action sequence, and/or the eye motions at at least one same position in the first eye action sequence and the second eye action sequence are different, determine that the eye motion information of the user does not match the reference eye motion information.
13. The electronic device according to any one of claims 9 to 12, characterized in that the electronic device further comprises:
a first acquiring unit, configured to, before the eye motion information of the user is extracted from the image acquired by the image acquisition component of the electronic device, obtain, by a distance sensor of the electronic device, the distance between the user and the electronic device; and
a second acquiring unit, configured to acquire the image by the image acquisition component within a preset time period when the distance is less than a preset distance.
14. An electronic device, characterized by comprising:
an image acquisition component, configured to acquire images;
a memory, configured to store reference eye motion information and program code; and
a processor, connected to the image acquisition component and the memory, and configured to read the program code stored in the memory and perform the following:
when the electronic device is in a user interface locked state, extracting eye motion information of a user from an image acquired by the image acquisition component; obtaining the reference eye motion information from the memory, and performing identity authentication on the user according to the reference eye motion information and the eye motion information of the user; if the user is a valid user, switching the electronic device to a user interface unlocked state; and if the user is an invalid user, keeping the electronic device in the user interface locked state.
15. The electronic device according to claim 14, characterized in that the processor is further configured to:
if the eye motion information of the user matches the reference eye motion information, determine that the user is a valid user; and if the eye motion information of the user does not match the reference eye motion information, determine that the user is an invalid user.
16. The electronic device according to claim 15, characterized in that the eye motion information of the user comprises a first eye action sequence, and the first eye action sequence comprises at least two eye motions; the reference eye motion information comprises a second eye action sequence, and the second eye action sequence comprises at least two eye motions; and the processor is further configured to:
if the first eye action sequence comprises each eye motion in the second eye action sequence, and the eye motions at the same positions in the first eye action sequence and the second eye action sequence are identical, determine that the eye motion information of the user matches the reference eye motion information.
17. The electronic device according to claim 16, characterized in that the processor is further configured to:
if at least one eye motion in the second eye action sequence is not included in the first eye action sequence, and/or the eye motions at at least one same position in the first eye action sequence and the second eye action sequence are different, determine that the eye motion information of the user does not match the reference eye motion information.
18. The electronic device according to any one of claims 14 to 17, characterized in that the electronic device further comprises a distance sensor connected to the processor, and the processor is further configured to:
before the eye motion information of the user is extracted from the image acquired by the image acquisition component of the electronic device, obtain, by the distance sensor, the distance between the user and the electronic device; and when the distance is less than a preset distance, acquire the image by the image acquisition component within a preset time period.
CN201310567222.7A 2013-11-14 2013-11-14 User interface unlocking method and electronic device Active CN104636051B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310567222.7A CN104636051B (en) 2013-11-14 2013-11-14 User interface unlocking method and electronic device


Publications (2)

Publication Number Publication Date
CN104636051A true CN104636051A (en) 2015-05-20
CN104636051B CN104636051B (en) 2018-04-27

Family

ID=53214867

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310567222.7A Active CN104636051B (en) 2013-11-14 2013-11-14 User interface unlocking method and electronic device

Country Status (1)

Country Link
CN (1) CN104636051B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104915589A (en) * 2015-06-24 2015-09-16 北京百纳威尔科技有限公司 Terminal unlocking method and terminal
CN105025018A (en) * 2015-07-06 2015-11-04 国网山东寿光市供电公司 Method for safety verification in communication process
CN105812143A (en) * 2016-03-29 2016-07-27 努比亚技术有限公司 Eye pattern decryption device and method
WO2021258948A1 (en) * 2020-06-23 2021-12-30 中兴通讯股份有限公司 Terminal control method and apparatus, and terminal and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005036523A (en) * 2003-07-16 2005-02-10 Nec Corp Electronic lock control system and method, and portable information terminal and authentication device used for the same
CN101809581A (en) * 2007-09-24 2010-08-18 苹果公司 Embedded authentication systems in an electronic device
CN102981736A (en) * 2012-10-29 2013-03-20 华为终端有限公司 Screen unlocking method and screen unlocking terminal
US20130135196A1 (en) * 2011-11-29 2013-05-30 Samsung Electronics Co., Ltd. Method for operating user functions based on eye tracking and mobile device adapted thereto
US20130275309A1 (en) * 2012-04-13 2013-10-17 Francis King Hei KWONG Electronic-payment authentication process with an eye-positioning method for unlocking a pattern lock


Also Published As

Publication number Publication date
CN104636051B (en) 2018-04-27

Similar Documents

Publication Publication Date Title
KR102080183B1 (en) Electronic device and method for unlocking in the electronic device
RU2637900C2 (en) Element activation method and device
CN104991721B (en) A kind of fingerprint operating method and device
KR102041984B1 (en) Mobile apparatus having function of face recognition with additional component
US9965086B2 (en) Method for enabling function module of terminal, and terminal device
CN106022072A (en) Method and device for achieving fingerprint unlocking and electronic equipment
CN104361272A (en) Fingerprint input information processing method and system and mobile terminal
CN103886237A (en) Control method and system for electronic device with fingerprint sensor and touch screen
CN103064606A (en) Screen unlocking method for mobile terminal
CN105094621A (en) Application starting method and terminal device
CN104809402B (en) A kind of information fuzzy display methods and terminal
CN105022955B (en) A kind of locking means and mobile terminal of application program
CN104992092A (en) Method, device and system for fingerprint information verification
CN104679387A (en) Privacy information protection method and terminal
CN103716309A (en) Security authentication method and terminal
CN104346063A (en) Information processing method and electronic equipment
CN103019599B (en) Electronic equipment and unlocking screen method thereof
CN106815509B (en) A kind of multimedia file guard method, device and electronic equipment
CN105550557A (en) Method for logging in different systems through fingerprint recognition and terminal device
CN104636051A (en) User interface unlocking method and electronic equipment
CN104035698A (en) Terminal equipment and state switchover method for same
CN103902029A (en) Mobile terminal and unlocking method thereof
CN107016337A (en) A kind of fingerprint identification method and mobile terminal
CN104754215A (en) Shooting method and terminal
CN106022071A (en) Fingerprint unlocking method and terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant