CN101799717A - Man-machine interaction method based on hand action catch

Info

Publication number
CN101799717A
Authority
CN
China
Prior art keywords: mark, hand, coordinate, mouse, camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN 201010118444
Other languages
Chinese (zh)
Inventor
何明霞
宁福星
李萌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University
Original Assignee
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CN 201010118444 priority Critical patent/CN101799717A/en
Publication of CN101799717A publication Critical patent/CN101799717A/en
Pending legal-status Critical Current

Abstract

The invention belongs to the fields of vision measurement and computer technology and relates to a man-machine interaction method based on hand action capture. The method comprises the following steps: attaching distinct markers to two different parts of a hand and placing the hand within the field of view of a camera; initializing the program, including initializing the video configuration and reading a marker sample library and the camera parameters; acquiring video images, detecting the markers, and identifying the different marker samples; computing the three-dimensional coordinates of each detected marker's center point in the marker coordinate system; and controlling the mouse according to changes in the coordinate values of the different markers. The invention implements the basic functions of a mouse from three-dimensional information; it is simple in structure and low in cost, and because the measurement is contactless it is free of the constraint of a communication cable. It can be used in stereo vision, virtual reality, and augmented reality systems, lends itself to civilian products, and provides a new technique for developing novel man-machine interaction and control modes.

Description

Man-machine interaction method based on hand action capture
Technical field
The present invention relates to a method that realizes man-machine interaction by capturing hand actions, and belongs to the fields of vision measurement technology and computer technology.
Background technology
Constrained by mainstream display devices and application software, traditional man-machine interaction is usually limited to operations in a two-dimensional plane. With the emergence of concepts such as stereo vision, virtual reality, and augmented reality, users prefer to interact with objects in a more natural way and in more imaginative virtual environments, producing a sense of "immersion" comparable to a real environment; data-glove technology has therefore been developed and applied. A data glove can track the user's flexible, changing gestures and their spatial orientation, allowing the operator to convey intentions to the computer naturally and intuitively. A number of research institutions at home and abroad have done substantial work on the research and development of data gloves and have released data-glove products and applications. All data-glove products rely on precision sensors for measurement, and differences in the sensors used lead to large differences in two key specifications: measurement accuracy and maximum sampling frequency. Data gloves are mostly made of expensive elastic fiber, must be worn on the operator's hand, and must remain connected to the computer by a communication cable during use; the communication interface is RS-232 at a maximum rate of 115.2 kbps. These data-glove products are expensive and used mainly for scientific research; no consumer-grade product has yet appeared.
Summary of the invention
The object of the invention is to address the shortcomings of the traditional interaction mode (the conventional mouse), which uses only two-dimensional information and is ill-suited to novel display technologies (such as 3D projection and 3D display), and of data gloves, which are complex, expensive, and tethered by a cable. The invention proposes a man-machine interaction mode based on hand action capture, i.e. a vision mouse. Using contactless vision measurement, it obtains three-dimensional information about specific markers on the operating hand and performs man-machine interaction according to the coordinate changes of the markers, thereby realizing the functions of a conventional mouse.
The present invention adopts the following technical scheme:
A man-machine interaction method based on hand action capture, in which a camera captures images of a hand bearing markers and sends them to a computer, and the computer stores marker sample parameters and camera characteristic parameters, comprising the following steps (a sketch of the resulting per-frame loop follows this list):
(1) attach distinct markers to at least two independently movable parts of the hand, and place the hand within the field of view of the camera;
(2) use the camera to continuously acquire a sequence of hand images containing all the markers and send it to the computer;
(3) the computer detects the markers and distinguishes the different marker samples according to the stored marker sample parameters;
(4) compute, from the acquired hand images and the camera characteristic parameters, the three-dimensional coordinates of each detected marker's center point in the marker coordinate system;
(5) set a change threshold for the coordinate values of each marker between successive hand images, and control the mouse according to whether the coordinate change of each marker exceeds the set threshold and according to the direction of the change.
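Read as an algorithm, steps (2) through (5) form a per-frame processing loop. The sketch below outlines one possible shape of that loop in C++; every function and type in it is an illustrative stand-in for a stage named above, with stub bodies so the sketch compiles, and none of it is code disclosed by the patent.

    #include <iostream>

    // Illustrative skeleton of the per-frame loop formed by steps (2)-(5).
    struct Marker3D { double x = 0, y = 0, z = 0; bool found = false; };

    bool     captureFrame()              { return false; } // step (2): grab the next image
    void     detectAndIdentifyMarkers()  {}                // step (3): find and identify marks
    Marker3D markerCenter3D(char /*id*/) { return {}; }    // step (4): 3-D center coordinates
    void     controlMouse(const Marker3D&, const Marker3D&,
                          const Marker3D&, const Marker3D&) {} // step (5): threshold logic

    int main()
    {
        Marker3D prevA, prevB;
        while (captureFrame()) {
            detectAndIdentifyMarkers();
            Marker3D a = markerCenter3D('A');
            Marker3D b = markerCenter3D('B');
            controlMouse(a, b, prevA, prevB);  // compare against the previous frame
            prevA = a;
            prevB = b;
        }
        std::cout << "capture closed, program ends\n";
        return 0;
    }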
As a preferred implementation of the man-machine interaction method based on hand action capture of the present invention, step (3) comprises the following steps (a detection sketch follows this list):
(1) set a binarization threshold and convert the captured hand image into a binary image;
(2) perform connected-domain analysis on the binary image, searching for and identifying all image regions with the expected edge features;
(3) compare each image region identified as having the edge features with each set of marker sample parameters, thereby identifying each distinct marker.
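The following C++ fragment sketches steps (1) and (2) of this list: binarize the frame, then run connected-domain analysis and keep regions whose contours show the four-corner edge feature of a square marker frame. OpenCV, and the fixed threshold of 100 taken from the embodiment below, are assumptions; the patent does not name an image-processing library.

    #include <opencv2/opencv.hpp>
    #include <vector>

    // Binarize a color frame and return contours that look like square
    // marker frames (four corners, convex, not vanishingly small).
    std::vector<std::vector<cv::Point>> findMarkerCandidates(const cv::Mat& frame)
    {
        cv::Mat gray, binary;
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::threshold(gray, binary, 100, 255, cv::THRESH_BINARY_INV); // marker frame is black

        std::vector<std::vector<cv::Point>> contours, candidates;
        cv::findContours(binary, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

        for (const auto& c : contours) {
            std::vector<cv::Point> poly;
            cv::approxPolyDP(c, poly, 0.02 * cv::arcLength(c, true), true);
            if (poly.size() == 4 && cv::isContourConvex(poly) &&
                cv::contourArea(poly) > 400.0)            // reject small noise blobs
                candidates.push_back(poly);
        }
        return candidates;
    }

Step (3) of the list, deciding which marker a candidate region is, is sketched after step (2) of the embodiment below.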
Suppose two distinct markers, marker A and marker B, are attached to the hand, and the acquired hand images are planar images of the hand. Step (5) then comprises the following steps:
(1) for each pair of adjacent frames, compute the three-dimensional coordinate values of marker A and marker B and the change in each marker's coordinates; determine whether the coordinate change of marker A in the left-right or front-back direction of the hand's operating plane exceeds the change threshold, and if so, call the mouse_event function of the Windows API function library to move the mouse cursor from its current position in a direction and by a number of pixels determined by the coordinate change;
(2) if marker A has not moved, determine whether the coordinate change of marker B in the up-down direction relative to the operating plane exceeds the change threshold, and if so, call the mouse_event function of the Windows API so that the mouse cursor generates a right-button operation;
(3) if marker B has not performed a right-button operation, determine whether the coordinate change of marker B within the operating plane exceeds the change threshold, and if so, call the mouse_event function of the Windows API function library so that the mouse cursor generates a left-button operation.
The man-machine interaction mode of the present invention uses vision techniques to obtain three-dimensional coordinates. Limited by the camera sampling rate and the program running time, the maximum sampling frequency is 15.6 Hz. The precision of cursor control reaches 2 pixels and the error rate of button control is 4%. The method realizes mouse cursor movement control and button control and can accomplish routine computer operations (web browsing, file operations, etc.); the program occupies 40 MB of system memory at run time. The present invention implements the basic functions of a mouse from three-dimensional information, and has the advantages of simple structure, low cost, and, thanks to contactless measurement, freedom from the constraint of a connecting cable.
Description of drawings
Fig. 1 is a schematic diagram of the markers used in the present invention; (a): the back-of-hand marker; (b): the index fingertip marker.
Fig. 2 is a schematic diagram of the pinhole imaging model used in the present invention.
Fig. 3 is the flow chart of the mouse control program of the present invention.
Fig. 4 is the flow chart of the mouse cursor positioning function of the present invention.
Fig. 5 is the flow chart of the mouse button control function of the present invention.
Embodiment
The invention provides a man-machine interaction method in which a camera replaces dedicated fiber-optic or piezoelectric sensors to obtain three-dimensional information about the operator's hand. Only specific markers need to be attached to the back of the hand and the fingertip; the contactless measurement dispenses with gloves and communication cables and improves the comfort and freedom of the operator's hand. The present invention can perform the basic functions of a traditional mouse: the mouse control method based on hand action capture can be used together with a mouse or independently without one, and can fully accomplish all kinds of computer operations. Experimental tests gave satisfactory results.
The invention is further described below with reference to the accompanying drawings and an example.
The peripheral equipment of the present invention consists of a camera, the specific markers, a fixed support, an image data cable, and a computer. The camera selected is a Logitech webcam (mini HD model) with autofocus and a Carl Zeiss lens; its technical parameters are a 2-megapixel CCD image sensor and a sampling rate of 30 frames per second. The camera is fixed on the support, facing the plane on which the hand operates. The markers used are shown in Fig. 1: the two markers A and B are attached to the back of the hand and to the tip of the index finger respectively, customarily on the right hand. Each marker is of the black square frame type, and the markers are distinguished by the patterns at their centers. The side length of the back-of-hand marker A is 5 cm and that of the fingertip marker B is 1.5 cm; the center pattern of marker A is a circle and that of marker B is a right triangle.
The relationship between the effective tracking range (the maximum distance between the camera and the marker plane at which the markers can still be captured) and the marker size was verified experimentally. The ability of this interaction mode to capture the hand markers is closely related to marker size. Markers of four sizes, 7 cm, 9 cm, 11 cm, and 18 cm, were used to obtain the effective capture range of the hand markers for each size. The experiments support the following conclusion: the larger the marker, the larger the effective tracking range. Because of the limited field of view, however, the relationship is not linear; as the marker size increases, the effective tracking range grows by proportionally less. Since the markers must be attached to the back of the hand and the fingertip, their size is limited by those areas, so the sizes of markers A and B were finally set to 5 cm and 1.5 cm respectively; both can be captured reliably within a tracking range of 25 cm. The height of the fixed support was therefore set to 30 cm, placing the camera 25 cm from the marker plane.
The present invention was developed in the Microsoft Visual C++ 6.0 integrated development environment; with the hardware and program described above, the functions of mouse operation were realized. Fig. 3 is the flow chart of mouse operation in the present invention. The concrete steps are as follows:
(1) Program initialization, comprising initializing the video configuration and reading the marker sample parameters and the camera characteristic parameters. The video configuration comprises the captured image size, the sampling frame rate, and the color space, configured as 640*480, 30 frames per second, and RGB24 respectively.
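The patent does not name a capture API; the fragment below applies the same configuration through OpenCV's VideoCapture as one hedged possibility (a program built on Visual C++ 6.0 would more plausibly have used Video for Windows or DirectShow). Note that OpenCV delivers frames in BGR order, the byte-swapped equivalent of the RGB24 color space named above.

    #include <opencv2/opencv.hpp>
    #include <stdexcept>

    // Open camera 'deviceIndex' and apply the video configuration of
    // step (1): 640x480 frames sampled at 30 frames per second.
    cv::VideoCapture openConfiguredCamera(int deviceIndex = 0)
    {
        cv::VideoCapture cap(deviceIndex);
        if (!cap.isOpened())
            throw std::runtime_error("camera not found");

        cap.set(cv::CAP_PROP_FRAME_WIDTH,  640);
        cap.set(cv::CAP_PROP_FRAME_HEIGHT, 480);
        cap.set(cv::CAP_PROP_FPS,          30);
        return cap;
    }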
(2) Acquire video images, detect the markers, and identify the different marker samples. The single camera obtains an image containing all the markers, which is input to the system. With a binarization threshold of 100, the captured color image is converted into a binary image. Connected-domain analysis is performed on the binary image, searching for and identifying all image regions with square edge features. A template matching algorithm then compares the extracted image regions with the markers in the marker sample library; a region whose matching degree exceeds the set parameter is taken to be one of the distinct markers used by the system.
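The matching step can be sketched as follows. The helper name, the rescaling, and the use of normalized cross-correlation are assumptions; the patent states only that regions whose matching degree exceeds a set parameter are accepted.

    #include <opencv2/opencv.hpp>

    // Score how well a candidate marker region matches one sample from
    // the marker library; higher is better, 1.0 is a perfect match.
    // Both images are expected to be single-channel 8-bit.
    double matchScore(const cv::Mat& candidate, const cv::Mat& sample)
    {
        cv::Mat resized, result;
        cv::resize(candidate, resized, sample.size());   // normalize scale
        cv::matchTemplate(resized, sample, result, cv::TM_CCOEFF_NORMED);
        return result.at<float>(0, 0);                   // single-position result
    }

Each candidate region would be scored against every sample in the library (here, the circle pattern of marker A and the right-triangle pattern of marker B) and assigned to the best-scoring sample above the threshold.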
(3) Compute the three-dimensional coordinates of each detected marker's center point in the marker coordinate system. Using the pinhole imaging model, a marker coordinate system O_m-X_mY_mZ_m, a camera coordinate system O_c-X_cY_cZ_c, and an ideal screen coordinate system O_u-X_uY_u are established; the relationships between the coordinate systems are shown in Fig. 2. From the ideal screen coordinates (x_u, y_u) of a marker's center point and the transformation matrix T_cm (the transformation from the marker coordinate system to the camera coordinate system), the three-dimensional coordinates (x_m, y_m, z_m) of each marker's center point in the marker coordinate system are computed.
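The forward direction of this chain of transformations can be written compactly: a marker-frame point X_m is mapped into the camera frame by T_cm and then projected to ideal screen coordinates by the pinhole model. The sketch below shows that forward mapping; the 3x4 layout of T_cm and the parameter names fx, fy, cx, cy (focal lengths and principal point taken from the camera characteristic parameters) are assumptions. Recovering (x_m, y_m, z_m) from (x_u, y_u) is the inverse problem, solved in marker-based tracking by fitting T_cm to the four detected corners of the square marker.

    #include <opencv2/opencv.hpp>

    // Forward mapping of the pinhole model in Fig. 2: marker frame ->
    // camera frame (via T_cm) -> ideal screen coordinates (x_u, y_u).
    cv::Point2d projectMarkerPoint(const cv::Matx34d& Tcm, const cv::Vec3d& Xm,
                                   double fx, double fy, double cx, double cy)
    {
        cv::Vec4d homog(Xm[0], Xm[1], Xm[2], 1.0);  // homogeneous marker-frame point
        cv::Vec3d Xc = Tcm * homog;                 // camera-frame point (x_c, y_c, z_c)
        return { fx * Xc[0] / Xc[2] + cx,           // pinhole projection
                 fy * Xc[1] / Xc[2] + cy };
    }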
(4) Call the mouse_event function of the Windows API function library according to the changes in the coordinate values of the different markers, realizing control of the mouse; the flow of the cursor positioning operation is shown in Fig. 4. The three-dimensional coordinate values of marker A are recorded and the change between two adjacent frames is computed. If the coordinate change of marker A in the x direction (left-right on the operating plane) or the y direction (front-back on the operating plane) exceeds the threshold of 10, the SetCursorPos(xvalue, yvalue) function of the Windows API function library is called to move the mouse cursor from its current position in the corresponding direction by the corresponding number of pixels. The number of pixels moved is linear in the coordinate change, with a scale factor that can be set in the program.
(5) If marker A has not moved, the program then checks whether the coordinate change of marker B in the z direction (up-down relative to the operating plane) exceeds the threshold of 20; if so, the mouse_event function of the Windows API function library is called and the mouse cursor generates a left-button operation. If marker B has not performed a left-button operation, the program checks whether the coordinate change of marker B within the x-y plane (the plane on which the hand operates) exceeds the set threshold; if so, the mouse_event function of the Windows API function library is called and the mouse cursor generates a right-button operation. The flow chart of the left and right button operations is shown in Fig. 5. A sketch of this decision logic follows.
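The fragment below sketches steps (4) and (5) with the real Win32 calls named in the text. The thresholds 10 and 20 and the linear gain come from the text; the Delta struct, the gain value of 1.5, and the in-plane threshold of 20 for the right-button test (the text says only "the set threshold") are illustrative assumptions.

    #include <windows.h>
    #include <cmath>

    // Per-frame coordinate change of one marker.
    struct Delta { double x, y, z; };

    // dA, dB: coordinate changes of markers A and B between adjacent frames.
    void updateMouse(const Delta& dA, const Delta& dB, double gain = 1.5)
    {
        if (std::fabs(dA.x) > 10.0 || std::fabs(dA.y) > 10.0) {
            POINT p;
            GetCursorPos(&p);                       // cursor's current position
            SetCursorPos(p.x + static_cast<int>(gain * dA.x),
                         p.y + static_cast<int>(gain * dA.y));
        } else if (std::fabs(dB.z) > 20.0) {        // fingertip press: left button
            mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, 0);
            mouse_event(MOUSEEVENTF_LEFTUP,   0, 0, 0, 0);
        } else if (std::fabs(dB.x) > 20.0 || std::fabs(dB.y) > 20.0) { // in-plane: right button
            mouse_event(MOUSEEVENTF_RIGHTDOWN, 0, 0, 0, 0);
            mouse_event(MOUSEEVENTF_RIGHTUP,   0, 0, 0, 0);
        }
    }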
(6) Close the video capture; the program ends.
Finally, interaction experiments based on capturing the hand markers were carried out to quantify the main technical parameters of this hand-action-capture interaction technique and to evaluate its practical effect. The tests gave a maximum sampling frequency of 15.6 Hz, a cursor control precision of 2 pixels, a button control error rate of 4%, and a run-time memory footprint of 40 MB. The method realizes mouse cursor movement control and button control and can accomplish routine computer operations (web browsing, file operations, etc.).
Because its system is simple, the present invention greatly reduces cost and favors civilian commercialization of the technology. Unlike traditional human-computer interaction devices, this interaction mode uses three-dimensional spatial information to realize interactive operation. As concepts such as stereo vision, virtual reality, and augmented reality continue to be put forward, this interaction mode has marked advantages and can expect wider application and longer-term development.

Claims (3)

1. A man-machine interaction method based on hand action capture, in which a camera captures images of a hand bearing markers and sends them to a computer, and the computer stores marker sample parameters and camera characteristic parameters, comprising the following steps:
(1) attach distinct markers to at least two independently movable parts of the hand, and place the hand within the field of view of the camera;
(2) use the camera to continuously acquire a sequence of hand images containing all the markers and send it to the computer;
(3) the computer detects the markers and distinguishes the different marker samples according to the stored marker sample parameters;
(4) compute, from the acquired hand images and the camera characteristic parameters, the three-dimensional coordinates of each detected marker's center point in the marker coordinate system;
(5) set a change threshold for the coordinate values of each marker between successive hand images, and control the mouse according to whether the coordinate change of each marker exceeds the set threshold and according to the direction of the change.
2. The man-machine interaction method based on hand action capture according to claim 1, characterized in that step (3) comprises the following steps:
(1) set a binarization threshold and convert the captured hand image into a binary image;
(2) perform connected-domain analysis on the binary image, searching for and identifying all image regions with the expected edge features;
(3) compare each image region identified as having the edge features with each set of marker sample parameters, thereby identifying each distinct marker.
3. The man-machine interaction method based on hand action capture according to claim 1, characterized in that two distinct markers, marker A and marker B, are attached to the hand, the acquired hand images are planar images of the hand, and step (5) comprises the following steps:
(1) for each pair of adjacent frames, compute the three-dimensional coordinate values of marker A and marker B and the change in each marker's coordinates; determine whether the coordinate change of marker A in the left-right or front-back direction of the hand's operating plane exceeds the change threshold, and if so, call the mouse_event function of the Windows API function library to move the mouse cursor from its current position in a direction and by a number of pixels determined by the coordinate change;
(2) if marker A has not moved, determine whether the coordinate change of marker B in the up-down direction relative to the operating plane exceeds the change threshold, and if so, call the mouse_event function of the Windows API so that the mouse cursor generates a right-button operation;
(3) if marker B has not performed a right-button operation, determine whether the coordinate change of marker B within the operating plane exceeds the change threshold, and if so, call the mouse_event function of the Windows API function library so that the mouse cursor generates a left-button operation.
CN 201010118444 2010-03-05 2010-03-05 Man-machine interaction method based on hand action catch Pending CN101799717A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201010118444 CN101799717A (en) 2010-03-05 2010-03-05 Man-machine interaction method based on hand action catch

Publications (1)

Publication Number Publication Date
CN101799717A true CN101799717A (en) 2010-08-11

Family

ID=42595416

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201010118444 Pending CN101799717A (en) 2010-03-05 2010-03-05 Man-machine interaction method based on hand action catch

Country Status (1)

Country Link
CN (1) CN101799717A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5825352A (en) * 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
CN1664755A (en) * 2005-03-11 2005-09-07 西北工业大学 Video recognition input system
EP1852774A2 (en) * 2006-05-03 2007-11-07 Mitsubishi Electric Corporation Method and system for emulating a mouse on a multi-touch sensitive surface
CN101344816A (en) * 2008-08-15 2009-01-14 华南理工大学 Human-machine interaction method and device based on sight tracing and gesture discriminating

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
褥铜, "Research on interaction technology for capturing virtual objects with the human hand" (人手捕捉虚拟物体交互技术研究), China Master's Theses Full-text Database (《中国优秀硕士学位论文全文数据库》), August 2008, pp. 12-43, relevant to claims 1-3, 2 *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101923433A (en) * 2010-08-17 2010-12-22 北京航空航天大学 Man-computer interaction mode based on hand shadow identification
CN102411453A (en) * 2010-09-21 2012-04-11 北京市通州区科学技术协会 Method and device for enhancing outdoor practicability of virtual touch-screen system
CN102156808A (en) * 2011-03-30 2011-08-17 北京触角科技有限公司 System and method for improving try-on effect of reality real-time virtual ornament
CN103135882A (en) * 2011-12-02 2013-06-05 深圳泰山在线科技有限公司 Method and system for control of display of window image
CN103135882B (en) * 2011-12-02 2016-08-03 深圳泰山体育科技股份有限公司 Control the method and system that window picture shows
CN102662462B (en) * 2012-03-12 2016-03-30 中兴通讯股份有限公司 Electronic installation, gesture identification method and gesture application process
CN102662462A (en) * 2012-03-12 2012-09-12 中兴通讯股份有限公司 Electronic device, gesture recognition method and gesture application method
CN103389793A (en) * 2012-05-07 2013-11-13 深圳泰山在线科技有限公司 Human-computer interaction method and human-computer interaction system
CN103389793B (en) * 2012-05-07 2016-09-21 深圳泰山在线科技有限公司 Man-machine interaction method and system
CN103699213A (en) * 2012-12-19 2014-04-02 苏州贝腾特电子科技有限公司 Control method for double click of virtual mouse
CN103941861B (en) * 2014-04-02 2017-02-08 北京理工大学 Multi-user cooperation training system adopting mixed reality technology
CN103941861A (en) * 2014-04-02 2014-07-23 北京理工大学 Multi-user cooperation training system adopting mixed reality technology
CN106796649A (en) * 2014-05-24 2017-05-31 远程信息技术发展中心 Use the man-machine interface based on attitude of label
CN104199549A (en) * 2014-08-29 2014-12-10 福州瑞芯微电子有限公司 Man-machine interactive type virtual touch device, system and method
CN104199547A (en) * 2014-08-29 2014-12-10 福州瑞芯微电子有限公司 Man-machine interactive type virtual touch device, system and method
CN104199548A (en) * 2014-08-29 2014-12-10 福州瑞芯微电子有限公司 Man-machine interactive type virtual touch device, system and method
CN104199547B (en) * 2014-08-29 2017-05-17 福州瑞芯微电子股份有限公司 Virtual touch screen operation device, system and method
CN104199548B (en) * 2014-08-29 2017-08-25 福州瑞芯微电子股份有限公司 A kind of three-dimensional man-machine interactive operation device, system and method
CN104199549B (en) * 2014-08-29 2017-09-26 福州瑞芯微电子股份有限公司 A kind of virtual mouse action device, system and method
US10262197B2 (en) 2015-11-17 2019-04-16 Huawei Technologies Co., Ltd. Gesture-based object measurement method and apparatus
CN107341818A (en) * 2016-04-29 2017-11-10 北京博酷科技有限公司 Image analysis algorithm for the test of touch-screen response performance
CN107450714A (en) * 2016-05-31 2017-12-08 大唐电信科技股份有限公司 Man-machine interaction support test system based on augmented reality and image recognition
CN106293078A (en) * 2016-08-02 2017-01-04 福建数博讯信息科技有限公司 Virtual reality exchange method based on photographic head and device
CN108523281A (en) * 2017-03-02 2018-09-14 腾讯科技(深圳)有限公司 Gloves peripheral hardware, method, apparatus and system for virtual reality system
WO2019134606A1 (en) * 2018-01-05 2019-07-11 Oppo广东移动通信有限公司 Terminal control method, device, storage medium, and electronic apparatus
CN109243575A (en) * 2018-09-17 2019-01-18 华南理工大学 A kind of virtual acupuncture-moxibustion therapy method and system based on mobile interaction and augmented reality
CN109243575B (en) * 2018-09-17 2022-04-22 华南理工大学 Virtual acupuncture method and system based on mobile interaction and augmented reality
WO2022126775A1 (en) * 2020-12-14 2022-06-23 安徽鸿程光电有限公司 Cursor control method and apparatus, device and medium

Similar Documents

Publication Publication Date Title
CN101799717A (en) Man-machine interaction method based on hand action catch
CN104460951A (en) Human-computer interaction method
US11030237B2 (en) Method and apparatus for identifying input features for later recognition
CN100585329C (en) Location system of video finger and location method based on finger tip marking
US9767563B2 (en) Image processing apparatus and method for obtaining position and orientation of imaging apparatus
JP4768196B2 (en) Apparatus and method for pointing a target by image processing without performing three-dimensional modeling
CN105934775B (en) Method and system for constructing virtual images anchored on real-world objects
CN102508578B (en) Projection positioning device and method as well as interaction system and method
Lee et al. 3D natural hand interaction for AR applications
CN103365411A (en) Information input apparatus, information input method, and computer program
CN104102343A (en) Interactive Input System And Method
CN101673161A (en) Visual, operable and non-solid touch screen system
CN104364733A (en) Position-of-interest detection device, position-of-interest detection method, and position-of-interest detection program
CN103472916A (en) Man-machine interaction method based on human body gesture recognition
CN104246664B (en) The transparent display virtual touch device of pointer is not shown
CN102868811B (en) Mobile phone screen control method based on real-time video processing
CN105094335A (en) Scene extracting method, object positioning method and scene extracting system
CN108027656B (en) Input device, input method, and program
CN112304248A (en) Tactile sensor, robot, elastic body, object sensing method, and computing device
CN108022264A (en) Camera pose determines method and apparatus
WO2011146070A1 (en) System and method for reporting data in a computer vision system
CN105701828A (en) Image-processing method and device
CN104952104A (en) Three-dimensional human body gesture estimating method and device thereof
CN103176606B (en) Based on plane interaction system and the method for binocular vision identification
CN113487674B (en) Human body pose estimation system and method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Open date: 20100811