WO2004032095A1 - Interactive medical training system and method - Google Patents

Interactive medical training system and method

Info

Publication number
WO2004032095A1
WO2004032095A1 (PCT/CH2002/000556)
Authority
WO
WIPO (PCT)
Prior art keywords
organ
instrument
medical procedure
user
computer system
Prior art date
Application number
PCT/CH2002/000556
Other languages
French (fr)
Inventor
Ivan Vecerina
Murielle Launay
Jurjen Zoethout
Original Assignee
Xitact S.A.
Priority date
Filing date
Publication date
Application filed by Xitact S.A. filed Critical Xitact S.A.
Priority to EP02764475A (EP1550099A1)
Priority to PCT/CH2002/000556 (WO2004032095A1)
Publication of WO2004032095A1
Priority to US11/101,154 (US8591236B2)

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes, for medicine


Abstract

A method for training a user in a medical procedure utilizes an interactive computer system with a virtual representation of at least one organ (1) and at least one instrument (2, 3). The representations of surface area parts (4, 5, 7, 8, 9) of the virtual organ (1) to be treated are covered with graphical identification markers (14, 15A to 15C, 17). The computer system comprises a table stored in its memory comprising situations arising within the medical procedure, predefined through positions of the instrument(s) (2, 3), positions of the organ(s) (1), the logical sequence of steps to be executed within said medical procedure, different aspect values for the graphical identification markers (14, 15A to 15C, 17) and assessment values for said situations. Upon interaction of the user with the at least one organ (1) and the at least one instrument (2, 3), one or more graphical identification markers (14, 15A to 15C, 17, 18A + 18B, 19) change their aspect according to the stored value in said table for the predefined situation, and the corresponding assessment value is stored in a history log table within the computer system.

Description

Interactive medical training system and method
Field of the invention
The invention relates generally to an interactive medical training system.
Background of the invention
Specialized force-feedback devices are increasingly proposed for the medical training of surgeons, especially for endoscopic interventions. Haptic interfaces accommodate the different controllers and environments.
US 6,131,097 shows a haptic/visual authoring tool, disclosing the control of an avatar that interacts with a virtual object and generates responsive forces.
It is known from the prior art that the user in training manipulates handles of actual endoscopic instruments in front of a box simulating the torso of a patient. The images of the user's actions with said instruments are shown on a video display. Said video display shows computer-generated images of the instruments used and of the environment (i.e. organs etc.).
US 5,791,907 discloses a method for training a user in a medical procedure utilizing an interactive computer system, said medical procedure having a plurality of steps. Based on the user's answers to specific questions posed by the system, the system provides an ongoing display history of the correct answers and the errors.
The need remains, however, for a method and system that organizes the training, guides the user, and provides a measurable result at the end of a training session, usable both by the person trained and by the training center.
Summary of the invention
The present invention is based on the insight that a method for training a user in a medical procedure has to divide the training session into smaller parts, that the user should be able to start each of these parts repeatedly in an environment corresponding to a perfectly completed former part, that the method has to guide the user within the different steps, and that the method has to provide an assessment at the end of a completed training session clearly showing the results and, in particular, possible weaknesses of the user in view of further training sessions.
Preferably the result (i.e. the assessment) of a training session is available to a supervisor of the training session, together with a stored session history, so that the errors can be evaluated together with the user at any later moment.
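The organization described above can be pictured with a small data-model sketch (Python); every name and field in it is a hypothetical illustration, not an interface taken from the patent. The idea it captures: each step can be launched independently in the end state of a perfectly completed former step, and every attempt is appended to a session history that a supervisor can review later.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical sketch of the session structure described in the summary;
# names and fields are illustrative assumptions, not the patent's own API.

@dataclass
class StepResult:
    step_id: int
    errors: List[str]
    score: float

@dataclass
class TrainingSession:
    # end_states[i] holds the simulator state after step i was completed
    # perfectly, so the user can restart any step in a "clean" environment.
    end_states: Dict[int, dict]
    history: List[StepResult] = field(default_factory=list)

    def start_step(self, step_id: int) -> dict:
        # Load the perfectly completed state of the former step (if any).
        return dict(self.end_states.get(step_id - 1, {}))

    def record(self, result: StepResult) -> None:
        # The stored history allows a supervisor to review errors later.
        self.history.append(result)

    def assessment(self) -> Dict[int, float]:
        # Per-step summary, highlighting possible weaknesses per step.
        scores: Dict[int, List[float]] = {}
        for r in self.history:
            scores.setdefault(r.step_id, []).append(r.score)
        return {sid: sum(v) / len(v) for sid, v in scores.items()}
```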
Brief description of the drawings
Fig. 1 shows a schematic view of a virtual organ, two instruments and several graphical identification markers, Fig. 2 shows a screen display showing different steps of the training session available in connection with the simulated treatment of the organ according to Fig. 1,
Fig. 3 shows a screen display showing different sub-steps of a step according to Fig. 2, and
Fig. 4 shows a schematic view of another virtual organ with further graphical identification markers.
Detailed description of the invention
Fig. 1 shows a schematic view of an organ 1 together with two instruments 2 and 3, as it is generated on a screen visible to the user during the training session. Said representations of organ 1 and instruments 2 and 3 are virtual geometrical elements within a computer-controlled virtual environment. The instruments 2 and 3 are connected through a control arrangement to physical devices, namely handles of (fake) surgical instruments, which are manipulated by the user in said training session. The background of the image according to Fig. 1 shows a virtual representation of the abdominal cavity (not shown).
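As a rough illustration of the control arrangement between the physical handles and the virtual instruments, the following sketch (Python) forwards a handle reading to the pose of the corresponding virtual instrument each frame. The patent does not specify any interface; every type and field below is an assumption for illustration.

```python
from dataclasses import dataclass

# Illustrative only: the patent describes a "control arrangement" but no API.
# HandleReading stands in for whatever the tracked instrument handle reports.

@dataclass
class HandleReading:
    insertion_mm: float   # depth of the shaft inside the trocar
    yaw_deg: float
    pitch_deg: float
    roll_deg: float
    grip_closed: bool     # state of the handle's grip lever

@dataclass
class VirtualInstrument:
    tip_position: tuple = (0.0, 0.0, 0.0)
    grip_closed: bool = False

    def update_from_handle(self, reading: HandleReading) -> None:
        # A real simulator would apply the trocar kinematics here; this
        # placeholder simply forwards the values that drive the rendering.
        self.tip_position = (reading.yaw_deg, reading.pitch_deg,
                             reading.insertion_mm)
        self.grip_closed = reading.grip_closed
```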
The user has to perform a surgical interaction, i.e. a simulated treatment of said organ 1. Fig. 2 shows a screen display of the different steps of the training session available in connection with the simulated treatment of the organ 1 according to Fig. 1. Within the embodiment shown, and as an example, this surgical interaction may comprise: step 51: gripping of the organ in the area 4 with instrument 2, step 52: setting clips in the area 5 with the help of instrument 3, and step 53: cutting a vessel in an area 7.
In reality this surgical operation has to be performed in one continuous sequence within a preferred time limit. In the simulated environment the training is separated into different training steps 51 to 53; there are three training steps within this embodiment. The different steps 51 to 53 can be chosen independently from a menu as shown in Fig. 2. Additionally the menu comprises step 50: information about the virtual patient, step 55: tutored intervention of all steps 51 to 53 within one step, step 56: complete "free" intervention of all steps 51 to 53 within one step, and step 57: assessment of the training session. Further steps for maintenance of the user information, storing or retrieving parameters for the intervention (e.g. parameters of the virtual patient, surgical problems arising with said patient, etc.) and/or storing or retrieving different training sessions (e.g. for later assessment) may be included within the menu display and are summarized under reference numeral 58: administration.
Each of the three steps 51 to 53 is organized into sub-steps which preferably have an identical layout within every step 51 to 53 for easy reference. Therefore, choosing one of the steps 51 to 53 on the screen leads to a new menu. In the case depicted in Fig. 3, step 52 "clipping of the organ" is the header of the menu, wherein each step preferably comprises the following identical sub-steps, which can be chosen independently from the menu: sub-step 61: a prerecorded video representation of said step 51 or 52 or 53 within a real intervention, sub-step 62: a prerecorded virtual reality video representation of the intervention to be performed within said step 51 or 52 or 53 based on the structure of the virtual reality environment, sub-step 63: instructions for the intervention of said step 51 or 52 or 53, sub-step 64: a prerecorded virtual reality graphical (stills) or video representation of common errors within said step 51 or 52 or 53, which can usually be picked from a further listing submenu, sub-step 65: a training session of the step in question with guidance, and sub-step 66: a training session of the step in question without guidance. The main advantage of this procedure is based on the insight that the user can be directed within each step and can repeat the steps in which he needs the most training. This preparation, together with the guidance through the graphical codes mentioned below, gives a better training result. Additionally the result of the simulated intervention is stored, and evaluation is possible for parts of the intervention or for the whole process.
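The step and sub-step menus of Figs. 2 and 3 can be summarized as a nested configuration. The sketch below (Python) shows one way the identical sub-step layout 61 to 66 could be attached to each training step 51 to 53; the dictionary layout and function are illustrative assumptions, only the numbering follows the figures.

```python
# Hypothetical menu description; the numbering follows Figs. 2 and 3,
# but the data layout itself is an assumption.

SUB_STEPS = {
    61: "prerecorded video of the step within a real intervention",
    62: "prerecorded virtual-reality video of the step",
    63: "instructions for the intervention of the step",
    64: "stills/video of common errors (with listing submenu)",
    65: "training session of the step with guidance",
    66: "training session of the step without guidance",
}

MENU = {
    50: {"label": "information about the virtual patient"},
    51: {"label": "gripping of the organ in area 4", "sub_steps": SUB_STEPS},
    52: {"label": "setting clips in area 5", "sub_steps": SUB_STEPS},
    53: {"label": "cutting a vessel in area 7", "sub_steps": SUB_STEPS},
    55: {"label": "tutored intervention of steps 51-53"},
    56: {"label": "free intervention of steps 51-53"},
    57: {"label": "assessment of the training session"},
    58: {"label": "administration (users, parameters, stored sessions)"},
}

def list_sub_steps(step_id: int) -> list:
    """Return the selectable sub-steps of a training step, if any."""
    return sorted(MENU.get(step_id, {}).get("sub_steps", {}).items())
```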
Within every sub-step 65 the graphical representation is directly correlated with the action of the instrument handled by the user. This can be shown within the embodiment of the intervention as follows.
Within the step 52 "clipping", the user has to move the instrument 2 towards the organ 1 and grip it in an area 4 (the same procedure as in the step 51 "gripping"). Then he has to hold the organ 1 in a certain position and apply three clips in the two areas 5 (in order to enable the cutting of the vessel between the clips in area 7 during step 53 "cutting").
Before the user has gripped the organ 1, the area within which the instrument 2 has to be applied is marked by a blue dot 14 (the area shaded in the drawing). When the area 4 is gripped, the dot changes its color, e.g. to yellow, until the user has pulled the organ 1 into the correct position, upon which the dot changes its color a further time, e.g. to green. Since the user has to hold instrument 2 throughout the step for an extended period of time, he may deviate from the correct position, and the controller then changes the color from green to red when the position is no longer correct. A less incorrect position may be marked yellow. Such graphical feedback enables the user to find the correct position again, and the dot 14 becomes green once more. The step 52 "clipping" asks the user to put three clips into predefined positions, which are marked 15A, 15B and 15C. The necessity to apply the same number of clips to the second vessel in the background is not treated within this simplified description. At the beginning only one part of area 5, i.e. the ring 15A, is blue. The user virtually loads a clip 13 into instrument 3 and approaches the blue ring 15A. Upon application of the clip 13 the blue ring 15A becomes green. If the clip is misplaced, the ring may become yellow or red. Preferably the color red is associated with a hazardous situation and yellow with a situation which is not correct according to the taught procedure but not "dangerous" for the virtual patient. Upon completion of the first clipping action the next ring, here 15B or 15C, becomes blue and the process is repeated until all clips are set. Then all areas 14, 15A to 15C, which are shaded in the drawings and which should be green, vanish as an indication that the step is completed.
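The color behaviour of dot 14 and of rings 15A to 15C amounts to a small state machine. A minimal sketch follows (Python); the states and transitions are taken from the description above, while the function names and the criteria for "hazardous" are assumptions for illustration.

```python
from enum import Enum

class MarkerColor(Enum):
    BLUE = "to be treated"
    YELLOW = "incorrect but not dangerous"
    GREEN = "correctly treated"
    RED = "hazardous deviation"

def grip_marker_color(gripped: bool, in_correct_position: bool,
                      hazardous_deviation: bool) -> MarkerColor:
    # Dot 14: blue before gripping, yellow until the organ is pulled into
    # the correct position, green while held correctly, red (or yellow)
    # when the user deviates during the hold. What exactly counts as
    # "hazardous" is not specified in the text and is assumed here.
    if not gripped:
        return MarkerColor.BLUE
    if hazardous_deviation:
        return MarkerColor.RED
    return MarkerColor.GREEN if in_correct_position else MarkerColor.YELLOW

def clip_marker_color(clip_applied: bool, misplaced: bool,
                      hazardous: bool) -> MarkerColor:
    # Rings 15A-15C: blue while waiting for the clip, green when the clip
    # is correctly applied, yellow or red when it is misplaced.
    if not clip_applied:
        return MarkerColor.BLUE
    if hazardous:
        return MarkerColor.RED
    return MarkerColor.YELLOW if misplaced else MarkerColor.GREEN
```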
It is then the subject of step 53 "cutting" to exchange instrument 3 for scissors and to cut the vessel in area 7, which is afterwards indicated by a (separate) green ring 7.
Although the intervention of opening surface tissues in an area is not part of the menus according to Fig. 2 or Fig. 3, these further graphical indications are shown in connection with a further schematic drawing, Fig. 4.
Fig. 4 shows a schematic view of another virtual organ 1 with further graphical identification markers 18A, 18B and 19. Organ 1 has a cylindrical form. In case the user has to mark a path on the organ 1 in the area 8 by treating the surface of the organ with current, a blue line 18A appears on the organ. This line 18A preferably has a starting point and an end point which are shown as enlarged dots. If the user marks correctly, the relevant part of the line 18A turns green. If the user burns the area or inserts the instrument too deep into the organ, then besides the effect of the virtual representation showing the results of this treatment (smoke, burned points, bleeding), this part of the line turns red. When all blue parts of the line 18A have been treated, the line and the dots turn green and the next segment 18B is lit up in blue colour. This advises the user where to continue his task until the related step is completed.
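The segment-by-segment guidance along line 18A and then 18B can be sketched as follows (Python); the data layout is an illustrative assumption. Only the active segment is lit blue, correctly treated points turn green, damage turns them red, and completing one segment activates the next.

```python
# Illustrative sketch of the progressive line guidance of Fig. 4.

class LineSegment:
    def __init__(self, points):
        self.points = list(points)          # sampled points along the path
        self.state = ["blue"] * len(points) # per-point marker state

    def treat(self, index: int, burned_or_too_deep: bool) -> None:
        # Correct treatment turns a point green; burning the area or
        # inserting the instrument too deep turns it red.
        self.state[index] = "red" if burned_or_too_deep else "green"

    def completed(self) -> bool:
        return "blue" not in self.state

def active_segment(segments):
    """Return the first uncompleted segment; it is the one lit in blue."""
    for seg in segments:
        if not seg.completed():
            return seg
    return None  # all segments done: the related step is completed
```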
Area 19 shows a region in which the user has to open the surface over a larger area. Here the area to be opened is blue, and all cut-off pieces lose their blue color and turn green. Upon completion of the task the whole area has green borders, and the marker areas then vanish.
The concept of guidance is to project (two-dimensional) surface area information onto the area which has to be treated by the user in training. This may be a larger spot, such as spot 14 for a gripping action. This may be a ring, such as ring 15A for the clipping action. This may be a line or a segment of a line, such as lines 18A and 18B for marking or cutting purposes. Finally, this can be of any geometrical form, such as polygon 19, which marks the area to be treated. The correlation between the instrument and the graphical representation is immediate and gives a direct visual assessment in the training. In the case of the line it has been shown that it is possible to directly show the three states: a) parts still to be treated are blue, b) parts which have been correctly treated are green, and c) parts which have been maltreated are red (or yellow). Besides the possibility of showing the errors in color, they can also be explained through text which is correlated to the errors through a suitable table. Such a direct computer-calculated assessment gives the user the opportunity to correct and improve his performance in one or more of the steps before trying the final (not guided) stage of the intervention.
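The claims summarize this guidance as a table stored in memory that maps predefined situations to marker aspects, assessment values and explanatory error text, together with a history log of the assessments. The sketch below (Python) shows one possible layout of such a table; every field name and every entry is an assumption for illustration, not content taken from the patent.

```python
# Hypothetical layout of the situation table and history log referred to
# in the abstract and claims; all entries are illustrative examples.

SITUATION_TABLE = [
    {
        "situation": "instrument 2 outside gripping area 4",
        "marker": 14, "aspect": "blue",
        "assessment": 0.0, "error_text": None,
    },
    {
        "situation": "organ 1 held in correct position",
        "marker": 14, "aspect": "green",
        "assessment": 1.0, "error_text": None,
    },
    {
        "situation": "clip misplaced near ring 15A",
        "marker": "15A", "aspect": "yellow",
        "assessment": -0.5,
        "error_text": "Clip not placed on the marked position.",
    },
    {
        "situation": "surface burned outside line 18A",
        "marker": "18A", "aspect": "red",
        "assessment": -1.0,
        "error_text": "Hazardous coagulation outside the marked path.",
    },
]

HISTORY_LOG = []

def register_situation(situation: str) -> str:
    """Look up a predefined situation, update the log, return the aspect."""
    for row in SITUATION_TABLE:
        if row["situation"] == situation:
            HISTORY_LOG.append((situation, row["assessment"],
                                row["error_text"]))
            return row["aspect"]
    return "unchanged"
```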

Claims

Claims
1. Method for training a user in a medical procedure utilizing an interactive computer system with a virtual representation of at least one organ (1) and at least one instrument (2, 3), wherein the representation of surface area parts (4, 5, 7, 8, 9) of the virtual organ (1) to be treated are covered with at least one graphical identification marker (14, 15A to 15C, 17, 18A + 18B, 19), wherein the computer system comprises a table stored in its memory comprising situations arising within the medical procedure predefined through positions of the instrument(s) (2, 3), positions of the organ(s) (1), the logical sequence of steps to be executed within said medical procedure, one or more different aspect values for the one or more graphical identification markers (14, 15A to 15C, 17, 18A + 18B, 19) and one or more assessment values for said situation, wherein upon interaction of the user with the at least one organ (1) and the at least one instrument (2, 3) one or more graphical identification markers (14, 15A to 15C, 17, 18A + 18B, 19) change their aspect according to the stored value in said table for the predefined situation and the corresponding assessment value is stored in a history log table within the computer system.
2. Method according to claim 1, wherein the graphical identification marker is a spot (14), a ring contour (15A to 15C, 17), a line (18A + 18B) or a polygonal surface (19).
3. Method according to claim 1 or 2, wherein the change of aspect of a graphical identification marker is a change of color according to a color table.
4. Method according to one of claims 1 to 3, wherein the assessment value is correlated to a success-and-error table comprising predefined assessment information.
5. Method according to one of claims 1 to 4, wherein said table of situations comprises one-, two-, or three-dimensional interval values for the positions of the organs (1) and instruments (2, 3).
6. Method for training a user in a medical procedure utilizing an interactive computer system with a virtual representation of an organ (1) and at least one instrument (2, 3), said medical procedure having a plurality of steps (51, 52, 53) to be performed one after another, wherein each of the plurality of steps (51, 52, 53) is subdivided into identical sub-steps (61 to 66), wherein at least one sub-step comprises a method according to one of claims 1 to 5.
7. Method according to claim 6, wherein the sub-steps (61 to 66) comprise one or more of the following: a prerecorded video representation of said step (51 or 52 or 53) within a real intervention (61), a prerecorded virtual reality video representation of the intervention to be performed within said step (51 or 52 or 53) based on the structure of the virtual reality environment (62), instructions for the intervention of said step (51 or 52 or 53), a prerecorded virtual reality graphical stills or video representation of common errors within said step (51 or 52 or 53), and a training session of said step (51 or 52 or 53) without guidance.
8. System for training a user in a medical procedure comprising an interactive computer system able to create a virtual representation of at least one organ (1) and at least one instrument (2, 3), wherein the representation of surface area parts (4, 5, 7, 8, 9) of the virtual organ (1) to be treated are covered with at least one graphical identification marker (14, 15A to 15C, 17, 18A + 18B, 19), wherein the computer system comprises a table stored in its memory comprising situations arising within the medical procedure predefined through positions of the instrument(s) (2, 3), positions of the organ(s) (1), the logical sequence of steps to be executed within said medical procedure, one or more different aspect values for the one or more graphical identification markers (14, 15A to 15C, 17, 18A + 18B, 19) and one or more assessment values for said situation, wherein upon interaction of the user with the at least one organ (1) and the at least one instrument (2, 3) one or more graphical identification markers (14, 15A to 15C, 17, 18A + 18B, 19) change their aspect according to the stored value in said table for the predefined situation and the corresponding assessment value is stored in a history log table within the computer system.
PCT/CH2002/000556 2002-10-07 2002-10-07 Interactive medical training system and method WO2004032095A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP02764475A EP1550099A1 (en) 2002-10-07 2002-10-07 Interactive medical training system and method
PCT/CH2002/000556 WO2004032095A1 (en) 2002-10-07 2002-10-07 Interactive medical training system and method
US11/101,154 US8591236B2 (en) 2002-10-07 2005-04-07 Interactive medical training system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CH2002/000556 WO2004032095A1 (en) 2002-10-07 2002-10-07 Interactive medical training system and method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/101,154 Continuation-In-Part US8591236B2 (en) 2002-10-07 2005-04-07 Interactive medical training system and method

Publications (1)

Publication Number Publication Date
WO2004032095A1 (en)

Family

ID=32046617

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CH2002/000556 WO2004032095A1 (en) 2002-10-07 2002-10-07 Interactive medical training system and method

Country Status (2)

Country Link
EP (1) EP1550099A1 (en)
WO (1) WO2004032095A1 (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4742815A (en) 1986-01-02 1988-05-10 Ninan Champil A Computer monitoring of endoscope
DE3834553A1 (en) * 1988-10-11 1990-04-12 Siegfried Dr Med Kubin Coloscopy simulator
US5791907A (en) 1996-03-08 1998-08-11 Ramshaw; Bruce J. Interactive medical training system
WO1999038141A1 (en) * 1998-01-26 1999-07-29 Simbionix Ltd. Endoscopic tutorial system
US6113395A (en) 1998-08-18 2000-09-05 Hon; David C. Selectable instruments with homing devices for haptic virtual reality medical simulation
US6131097A (en) 1992-12-02 2000-10-10 Immersion Corporation Haptic authoring
WO2001078039A2 (en) 2000-04-12 2001-10-18 Simbionix Ltd. Endoscopic tutorial system for urology

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9472121B2 (en) 2010-10-01 2016-10-18 Applied Medical Resources Corporation Portable laparoscopic trainer
US10854112B2 (en) 2010-10-01 2020-12-01 Applied Medical Resources Corporation Portable laparoscopic trainer
US11158212B2 (en) 2011-10-21 2021-10-26 Applied Medical Resources Corporation Simulated tissue structure for surgical training
US11403968B2 (en) 2011-12-20 2022-08-02 Applied Medical Resources Corporation Advanced surgical simulation
US10198965B2 (en) 2012-08-03 2019-02-05 Applied Medical Resources Corporation Simulated stapling and energy based ligation for surgical training
WO2014022815A1 (en) * 2012-08-03 2014-02-06 Applied Medical Resources Corporation Simulated stapling and energy based ligation for surgical training
US11514819B2 (en) 2012-09-26 2022-11-29 Applied Medical Resources Corporation Surgical training model for laparoscopic procedures
US10535281B2 (en) 2012-09-26 2020-01-14 Applied Medical Resources Corporation Surgical training model for laparoscopic procedures
US9959786B2 (en) 2012-09-27 2018-05-01 Applied Medical Resources Corporation Surgical training model for laparoscopic procedures
US11361679B2 (en) 2012-09-27 2022-06-14 Applied Medical Resources Corporation Surgical training model for laparoscopic procedures
US10121391B2 (en) 2012-09-27 2018-11-06 Applied Medical Resources Corporation Surgical training model for laparoscopic procedures
US11869378B2 (en) 2012-09-27 2024-01-09 Applied Medical Resources Corporation Surgical training model for laparoscopic procedures
US10679520B2 (en) 2012-09-27 2020-06-09 Applied Medical Resources Corporation Surgical training model for laparoscopic procedures
US10395559B2 (en) 2012-09-28 2019-08-27 Applied Medical Resources Corporation Surgical training model for transluminal laparoscopic procedures
US9898937B2 (en) 2012-09-28 2018-02-20 Applied Medical Resources Corporation Surgical training model for laparoscopic procedures
US9940849B2 (en) 2013-03-01 2018-04-10 Applied Medical Resources Corporation Advanced surgical simulation constructions and methods
US10140889B2 (en) 2013-05-15 2018-11-27 Applied Medical Resources Corporation Hernia model
US9922579B2 (en) 2013-06-18 2018-03-20 Applied Medical Resources Corporation Gallbladder model
US11735068B2 (en) 2013-06-18 2023-08-22 Applied Medical Resources Corporation Gallbladder model
US11049418B2 (en) 2013-06-18 2021-06-29 Applied Medical Resources Corporation Gallbladder model
US11854425B2 (en) 2013-07-24 2023-12-26 Applied Medical Resources Corporation First entry model
US9548002B2 (en) 2013-07-24 2017-01-17 Applied Medical Resources Corporation First entry model
US10657845B2 (en) 2013-07-24 2020-05-19 Applied Medical Resources Corporation First entry model
US11450236B2 (en) 2013-07-24 2022-09-20 Applied Medical Resources Corporation Advanced first entry model for surgical simulation
US10026337B2 (en) 2013-07-24 2018-07-17 Applied Medical Resources Corporation First entry model
US10198966B2 (en) 2013-07-24 2019-02-05 Applied Medical Resources Corporation Advanced first entry model for surgical simulation
US10796606B2 (en) 2014-03-26 2020-10-06 Applied Medical Resources Corporation Simulated dissectible tissue
US11887504B2 (en) 2014-11-13 2024-01-30 Applied Medical Resources Corporation Simulated tissue models and methods
US10818201B2 (en) 2014-11-13 2020-10-27 Applied Medical Resources Corporation Simulated tissue models and methods
US11100815B2 (en) 2015-02-19 2021-08-24 Applied Medical Resources Corporation Simulated tissue structures and methods
US10354556B2 (en) 2015-02-19 2019-07-16 Applied Medical Resources Corporation Simulated tissue structures and methods
US11034831B2 (en) 2015-05-14 2021-06-15 Applied Medical Resources Corporation Synthetic tissue structures for electrosurgical training and simulation
US10081727B2 (en) 2015-05-14 2018-09-25 Applied Medical Resources Corporation Synthetic tissue structures for electrosurgical training and simulation
US11721240B2 (en) 2015-06-09 2023-08-08 Applied Medical Resources Corporation Hysterectomy model
US10733908B2 (en) 2015-06-09 2020-08-04 Applied Medical Resources Corporation Hysterectomy model
US10223936B2 (en) 2015-06-09 2019-03-05 Applied Medical Resources Corporation Hysterectomy model
US10755602B2 (en) 2015-07-16 2020-08-25 Applied Medical Resources Corporation Simulated dissectible tissue
US10332425B2 (en) 2015-07-16 2019-06-25 Applied Medical Resources Corporation Simulated dissectible tissue
US11587466B2 (en) 2015-07-16 2023-02-21 Applied Medical Resources Corporation Simulated dissectible tissue
US10490105B2 (en) 2015-07-22 2019-11-26 Applied Medical Resources Corporation Appendectomy model
US10720084B2 (en) 2015-10-02 2020-07-21 Applied Medical Resources Corporation Hysterectomy model
US11721242B2 (en) 2015-10-02 2023-08-08 Applied Medical Resources Corporation Hysterectomy model
US10706743B2 (en) 2015-11-20 2020-07-07 Applied Medical Resources Corporation Simulated dissectible tissue
US11120708B2 (en) 2016-06-27 2021-09-14 Applied Medical Resources Corporation Simulated abdominal wall
US11830378B2 (en) 2016-06-27 2023-11-28 Applied Medical Resources Corporation Simulated abdominal wall
US11030922B2 (en) 2017-02-14 2021-06-08 Applied Medical Resources Corporation Laparoscopic training system
US10847057B2 (en) 2017-02-23 2020-11-24 Applied Medical Resources Corporation Synthetic tissue structures for electrosurgical training and simulation

Also Published As

Publication number Publication date
EP1550099A1 (en) 2005-07-06

Similar Documents

Publication Publication Date Title
EP1550099A1 (en) Interactive medical training system and method
US8591236B2 (en) Interactive medical training system and method
Abdi et al. Control of a supernumerary robotic hand by foot: An experimental study in virtual reality
Panait et al. The role of haptic feedback in laparoscopic simulation training
Eadie et al. Telemedicine in surgery
Chang et al. Robotic surgery: identifying the learning curve through objective measurement of skill
Mori et al. Significance of "hands-on training" in laparoscopic surgery
US20110306986A1 (en) Surgical robot system using augmented reality, and method for controlling same
CA2168555A1 (en) Computerized device useful in teaching problem solving
CN110381873A (en) Robotic surgical system, instrument and control
Loukas et al. Deconstructing laparoscopic competence in a virtual reality simulation environment
KR20200048830A (en) Cataract surgery Simulation System for education based on virtual reality
Gopher Skill training in multimodal virtual environments
Hong et al. Simulation-based surgical training systems in laparoscopic surgery: a current review
US20130302765A1 (en) Methods and systems for assessing and developing the mental acuity and behavior of a person
WO2004029908A1 (en) Improved computer-based minimal-invasive surgery simulation system
JP2024016211A (en) Surgical content evaluation system, surgical content evaluation method, and computer program
Feng et al. Surgical training and performance assessment using a motion tracking system
de Almeida et al. Impact of 3D laparoscopic surgical training on performance in standard 2D laparoscopic simulation: a randomised prospective study
CN114288519A (en) Training display method
Haase et al. Virtual reality and habitats for learning microsurgical skills
JP2020071418A (en) Operation simulation result evaluation method using computer simulator, education sequence processing method in endoscope operation, operation training device, and program for operation training device
Hance et al. Skills training in telerobotic surgery
Krauthausen Robotic surgery training in AR: multimodal record and replay
Jones et al. The effectiveness of virtual reality for training human-robot teams

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LU MC NL PT SE SK TR

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2002764475

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 11101154

Country of ref document: US

WWP Wipo information: published in national office

Ref document number: 2002764475

Country of ref document: EP