US20110050852A1 - Stereo telestration for robotic surgery - Google Patents

Stereo telestration for robotic surgery

Info

Publication number
US20110050852A1
US20110050852A1 (application US12/941,038)
Authority
US
United States
Prior art keywords
telestration
stereo
image
surgical site
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/941,038
Inventor
Ben Lamprecht
William C. Nowlin
John D. Stern
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intuitive Surgical Operations Inc
Original Assignee
Intuitive Surgical Operations Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intuitive Surgical Operations Inc
Priority to US12/941,038
Publication of US20110050852A1
Current legal status: Abandoned

Classifications

    • A HUMAN NECESSITIES
      • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
            • A61B1/00163 Optical arrangements
              • A61B1/00193 Optical arrangements adapted for stereoscopic vision
              • A61B1/00194 Optical arrangements adapted for three-dimensional imaging
            • A61B1/313 Instruments for introducing through surgical openings, e.g. laparoscopes
              • A61B1/3132 Instruments for laparoscopy
          • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
            • A61B34/30 Surgical robots
              • A61B34/37 Master-slave robots
              • A61B2034/305 Details of wrist mechanisms at distal ends of robotic arms
            • A61B34/70 Manipulators specially adapted for use in surgery
          • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
            • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
              • A61B90/361 Image-producing devices, e.g. surgical cameras
              • A61B90/37 Surgical systems with images on a monitor during operation
                • A61B2090/371 Surgical systems with images on a monitor during operation with simultaneous use of two cameras

Definitions

  • the embodiments of the invention relate generally to telestration systems. More particularly, the embodiments of the invention relate to telestration mentoring systems for robotic surgery.
  • a telestrator is a device that allows its operator to draw a freehand sketch over a motion picture image.
  • the act of drawing a freehand sketch over a motion picture image is often referred to as telestration.
  • the freehand sketch may be referred to as a telestration image.
  • Telestrators have been used to annotate televised weather reports and televised sporting events.
  • Telestration systems are often used in television broadcasts of football games to make a point to a television audience regarding one or more plays during the game.
  • a sports commentator may draw sketches of objects, such as X's and O's, circles, or lines, that are overlaid and displayed on still or moving video images of the play on the television monitor.
  • the telestration image is displayed on a single television monitor in a mono-visual (“mono-view”) format and viewed by both eyes of the television viewer.
  • the mono-view provided by the single television monitor is limited to two dimensional images.
  • FIG. 1 is a block diagram of a robotic surgery system including a stereo viewer and a stereo telestration system to provide annotated stereo images to a surgeon.
  • FIG. 2A is a block diagram of a first system to provide a stereo telestration image overlay in both left and right video channels to provide three-dimensional images in a stereo viewer.
  • FIG. 2B is a block diagram of a second system to provide a stereo telestration image overlay in both left and right video channels to provide three-dimensional images in a stereo viewer.
  • FIG. 3 is a perspective view of a robotic surgical master control console including the stereo viewer.
  • FIG. 4 illustrates the stereo viewer of the master control console of FIG. 3 with a stereo telestration image overlay in both left and right monitors to provide three-dimensional images of the surgical site and the telestration images.
  • FIG. 5A illustrates a block diagram of a digital composite video mixer to mix a surgical site video signal and a telestration video signal together.
  • FIG. 5B illustrates a block diagram of a digital component video mixer to mix a surgical site video signal and a telestration video signal together.
  • FIG. 5C illustrates a block diagram of an analog video mixer to mix an analog surgical site video signal and an analog telestration video signal together.
  • FIG. 6A illustrates left and right annotated surgical site images.
  • FIG. 6B illustrates a three dimensional coordinate system for the stereo images of a background object and the stereo telestration images.
  • FIG. 6C illustrates a stereo window of left and right images in the stereo viewer to show the horizontal offset between the left telestration image and the right telestration image to achieve fusing and the same depth.
  • FIG. 7 is a block diagram of an exemplary endoscopic camera.
  • FIG. 8 is a magnified perspective view of the exemplary endoscopic camera and the plane of a tissue or object.
  • FIGS. 9A-9C are diagrams to illustrate the generation of a disparity map.
  • FIG. 10 is a side perspective stereo view illustrating differences between a telestration mark generated at an apparent constant depth and a telestration mark generated with an apparent depth continuum to appear painted onto a surface.
  • One application for telestration systems is robotic surgery.
  • two monitors are used to provide a stereo-visual (“stereo-view”) and a three-dimensional image to a pair of eyes.
  • the three-dimensional image is important for depth perception of the surgical site and for viewing the robotic surgical tools performing surgery on a patient within the surgical site.
  • a mono-visual image in a single monitor to a single eye is less desirable in robotic surgery.
  • a stereo image of the surgical site is desirable
  • a mono-visual telestration image in only one monitor of the pair of monitors is less desirable during robotic surgery.
  • a surgeon may be confused, as one eye sees one half of a stereo image without the telestration image.
  • viewing a mono-visual telestration image for extended periods may strain the surgeon's eyes and brain, causing undesirable fatigue during surgery.
  • a video frame or a frame of pixel data may be used interchangeably with image herein.
  • an image is what is perceived by a user when viewing the video frame or pixel frame of data on the viewing device.
  • a stereo image includes a pair of images, e.g., a left image and a right image.
  • a mono-visual image or mono-image has one of a left image or a right image and one of a left or right video frame or a left or right frame of pixel data.
  • the embodiments of the invention include a method, apparatus, and system for stereo telestration for robotic surgery.
  • Robotic surgery generally involves the use of a robot manipulator that has multiple robotic manipulator arms.
  • One or more of the robotic manipulator arms often support a surgical tool which may be articulated (such as jaws, scissors, graspers, needle holders, micro dissectors, staple appliers, tackers, suction/irrigation tools, clip appliers, or the like) or non-articulated (such as cutting blades, cautery probes, irrigators, catheters, suction orifices, or the like).
  • At least one of the robotic manipulator arms 153 is used to support a stereo or three dimensional surgical image capture device 110 such as a stereo endoscope (which may be any of a variety of structures such as a stereo laparoscope, arthroscope, hysteroscope, or the like), or, optionally, some other stereo imaging modality (such as ultrasound, fluoroscopy, magnetic resonance imaging, or the like).
  • Robotic surgery may be used to perform a wide variety of surgical procedures, including but not limited to open surgery, neurosurgical procedures (such as stereotaxy), endoscopic procedures (such as laparoscopy, arthroscopy, thoracoscopy), and the like.
  • a user or operator O (generally a surgeon) performs a minimally invasive surgical procedure on patient P by manipulating input devices at a master control console 150 .
  • a computer 151 of the console 150 directs movement of robotically controlled endoscopic surgical instruments 101 A- 101 B and 110 , by means of one or more feedback/control cables 159 , effecting movement of the instruments using a robotic surgical manipulator 152 .
  • the robotic surgical manipulator 152 may also be referred to as robotic patient-side cart system or simply as a cart.
  • the robotic surgical manipulator 152 has one or more robotic arms 153 .
  • the robotic surgical manipulator 152 includes at least three robotic manipulator arms 153 supported by linkages 156 , 156 ′, with a central arm 153 supporting an endoscopic camera 110 and the robotic arms 153 to left and right of center supporting tissue manipulation tools 101 A- 101 B.
  • the robotic arms 153 of robotic surgical manipulator 152 include a positioning portion and a driven portion.
  • the positioning portion of the robotic surgical manipulator 152 remains in a fixed configuration during surgery while manipulating tissue.
  • the driven portion of the robotic surgical manipulator 152 is actively articulated under the direction of the operator O generating control signals at the surgeon's console 150 during surgery.
  • the actively driven portion of the arms 153 is herein referred to as an end effector 158 .
  • the positioning portion of the robotic arms 153 that are in a fixed configuration during surgery may be referred to as positioning linkage and/or “set-up joint” 156 , 156 ′.
  • An assistant A may assist in pre-positioning of the robotic surgical manipulator 152 relative to patient P as well as swapping tools or instruments 101 for alternative tool structures, and the like, while viewing the internal surgical site via an assistant's display 154 .
  • the image of the internal surgical site shown to A by the assistant's display 154 is provided by a left or right channel 176 of the stereo endoscopic camera 110 supported by the robotic surgical manipulator 152 .
  • both left and right channels of the stereo endoscopic camera 110 are provided to the operator O in a stereo display 164 at the surgeon's console 150 , one channel for each eye.
  • a teacher, instructor, or other person may be on site or at a remote location and use a telestrator to generate telestration and provide comments and instructions to the operator O regarding the robotic surgical procedure in the surgical site of the patient P.
  • an expert on the robotic surgical procedure such as mentor M, may guide a less experienced operating surgeon O.
  • a typical telestration system provides mono-view images.
  • the robotic surgical system has a stereo viewer which displays a three dimensional image of the surgical site to the surgeon O. If the telestration image is displayed in only one eye, confusion can result since the other eye is seeing the other image of the stereo pair without the telestration image overlay.
  • the robotic surgical system 100 includes a stereo telestration system 160 coupled between the console 150 and remote located telestration equipment 161 .
  • the remote located telestration equipment 161 may be located remotely in the same room as the patient and surgeon or in a different room, a different hospital, or a different city, country, continent or other differing location.
  • the stereo telestration system 160 processes left and right channels of stereo video signals and optionally, full duplex audio signals for audio/video communication.
  • the stereo telestration system 160 receives stereo images of the surgical site (“stereo surgical images”) from the stereo endoscopic camera 110 over the stereo video communication link 170 .
  • a mono-view of telestration images (“mono telestration images”) is generated by a telestrator or telestration generator 162 (such as a drawing tablet 262 and drawing pen 263 illustrated in FIG. 2B for example) and coupled into the stereo telestration system 160 over the communication link 172 .
  • the telestration generator 162 digitizes a telestration mark or telestration graphic into a digital telestration graphic image for communication over the link 172 .
  • the stereo telestration system 160 overlays the mono telestration images onto the stereo surgical images of the surgical site generated by the stereo endoscopic camera 110 to form annotated stereo surgical images.
  • the telestration system 160 couples the annotated stereo surgical images into the stereo display 164 of the console 150 over the stereo video communication link 175 for viewing by the operator O.
  • the telestration system 160 may also couple the annotated stereo surgical images over a video communication link 176 to a stereo viewer at a remote location for viewing by the person generating the telestration.
  • a single left or right channel of the annotated stereo surgical images may be coupled by the telestration system 160 over the video communication link 176 to a single video monitor 165 A at a remote location for viewing by the person generating the telestration.
  • While the telestration system 160 generates video images, it may optionally provide a full duplex audio communication channel between the operator O and the person generating the telestration. Alternatively, a wireless or wired telephone system, such as a cellular telephone system, internet protocol telephone system, or plain old telephone system may be used to provide the full duplex audio communication channel.
  • the remote located telestration equipment 161 may include a video monitor 165 A, a telestration generator, a microphone 180 , a speaker 181 , and an audio processor 182 coupled together as shown. Telestration images are generated by the telestration generating device 162 that is coupled to the stereo telestration system 160 over the communication link 172 .
  • a telestration generating device may also be referred to herein as a telestrator or a telestration generator.
  • the video monitor 165 A receives either the left or right channel of the annotated stereo surgical images over the video communication link 176 for viewing by the mentor M at the remote location.
  • the stereo viewer receives both of the left and right channels of the annotated stereo surgical images over the video communication link 176 at the remote location so that the mentor M may view stereo images similar to the stereo viewer 164 in the console 150 . That is, the communication link 176 may carry either one or both of a left or right channel of annotated surgical images.
  • the stereo telestration system 160 may overlay a mono telestration image onto stereo images of the surgical site (referred to as “stereo surgical images”) generated by the stereo endoscopic camera 110 to form annotated stereo surgical images.
  • the mono telestration image is not immediately overlaid onto the stereo surgical images.
  • the mentor M privately previews his telestration graphics overlaid onto the surgical site images on the monitor 165 A, 165 B before the telestration graphics are overlaid onto the surgical site images displayed at the stereo viewer 164 to the operator O. That is, the mentor M views the annotated surgical images before the telestration goes “live” on the stereo viewer for the operator O to see.
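  • The overlay step described above, in which a mono telestration graphic is duplicated into both channels of the stereo pair (with a horizontal offset so the mark fuses at a chosen apparent depth), can be sketched as follows. This is a minimal illustration assuming NumPy arrays for video frames; the function name and the uniform-disparity shift are illustrative, not from the patent.

```python
import numpy as np

def annotate_stereo(left_frame, right_frame, telestration, alpha_mask, disparity_px=0):
    """Overlay a mono telestration graphic onto both channels of a stereo pair.
    `telestration` is an RGB image the same size as the frames; `alpha_mask` is
    a float array in [0, 1] marking the drawn pixels.  `disparity_px` shifts the
    right-eye copy horizontally so the left and right marks fuse at the intended
    apparent depth (0 places the mark at the depth of the display screen)."""
    a = alpha_mask[..., None]                       # broadcast alpha over RGB
    left_out = (1 - a) * left_frame + a * telestration

    # Shift the telestration and its mask for the right eye.
    # np.roll wraps at the border; a real implementation would pad instead.
    t_shifted = np.roll(telestration, disparity_px, axis=1)
    a_shifted = np.roll(alpha_mask, disparity_px, axis=1)[..., None]
    right_out = (1 - a_shifted) * right_frame + a_shifted * t_shifted
    return left_out.astype(left_frame.dtype), right_out.astype(right_frame.dtype)
```

A disparity of zero makes the mark appear at the depth of the display screen; a nonzero horizontal offset between the left and right copies moves the fused mark toward or away from the viewer.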
  • the telestration system 160 may optionally provide a full duplex audio communication channel 184 between the operator O and the mentor M.
  • the remote located telestration equipment 161 may include a microphone 180 , a speaker 181 , and an audio processor 182 coupled to the communication channel 184 .
  • the console may also include a microphone 180 , a speaker 181 , and an audio processor 186 coupled to the channel 184 to support full duplex communication.
  • the communication network 190 is a wide area network such as the internet and the communication devices 191 , 192 are wide area network routers.
  • hands-free telephones may be used at each end to communicate between remote locations over the plain old telephone system (POTS).
  • the master control console 150 of the robotic surgical system 100 may include the computer 151 , a binocular or stereo viewer 312 , an arm support 314 , a pair of control input wrists and control input arms in a workspace 316 , foot pedals 318 (including foot pedals 318 A- 318 B), and a viewing sensor 320 .
  • the master control console 150 may further include the telestration system 160 for providing the telestration images overlaid on the surgical site images.
  • the master control console 150 may also include an audio processor or transceiver 317 coupled to a speaker 320 and a microphone 315 for a bi-directional voice communication system to provide full duplex voice communication between the operating surgeon O and the mentor M.
  • the audio processor or transceiver 317 may couple to or be a part of the telestration system 160 in embodiments of the invention.
  • the stereo viewer 312 has two displays where stereo three-dimensional images of the telestration and surgical site may be viewed to perform minimally invasive surgery.
  • the operator O typically sits in a chair and moves his or her head into alignment with the stereo viewer 312 to view the three-dimensional annotated images of the surgical site.
  • the master control console 150 may include the viewing sensor 320 disposed adjacent the binocular display 312 .
  • when the system operator aligns his or her eyes with the binocular eye pieces of the display 312 to view a stereoscopic image of the telestration and surgical worksite, the operator's head sets off the viewing sensor 320 to enable the control of the robotic surgical tools 101 .
  • the viewing sensor 320 can disable or stop generating new control signals in response to movements of the touch sensitive handles in order to hold the state of the robotic surgical tools.
  • the arm support 314 can be used to rest the elbows or forearms of the operator O (typically a surgeon) while gripping touch sensitive handles of the control input wrists, one in each hand, in the workspace 316 to generate control signals.
  • the touch sensitive handles are positioned in the workspace 316 disposed beyond the arm support 314 and below the viewer 312 . This allows the touch sensitive handles to be moved easily in the control space 316 in both position and orientation to generate control signals.
  • the operator O can use his feet to control the foot-pedals 318 to change the configuration of the surgical system and generate additional control signals to control the robotic surgical instruments.
  • the computer 151 may include one or more microprocessors 302 to execute instructions and a storage device 304 to store software with executable instructions that may be used to generate control signals to control the robotic surgical system 100 .
  • the computer 151 with its microprocessors 302 interprets movements and actuation of the touch sensitive handles (and other inputs from the operator O or other personnel) to generate control signals to control the robotic surgical instruments 101 in the surgical worksite.
  • the computer 151 and the stereo viewer 312 map the surgical worksite into the controller workspace 316 so it feels and appears to the operator that the touch sensitive handles are working over the surgical worksite.
  • the viewer 312 provides stereo images for each eye, including a left image 400 L and a right image 400 R of the surgical site (including any robotic surgical tools 400 ), in a left viewfinder 401 L and a right viewfinder 401 R, respectively.
  • the images 400 L and 400 R in the viewfinders may be provided by a left display device 402 L and a right display device 402 R, respectively.
  • the display devices 402 L, 402 R may optionally be pairs of cathode ray tube (CRT) monitors, liquid crystal displays (LCDs), or other type of image display devices (e.g., plasma, digital light projection, etc.).
  • the images are provided in color by a pair of color display devices 402 L, 402 R; such as color CRTs or color LCDs.
  • telestration images may be provided to a surgeon by overlaying them onto the three dimensional image of the surgical site.
  • a right telestration image (RTI) 410 R is merged into or overlaid on the right image 400 R being displayed by the display device 402 R.
  • a left telestration image (LTI) 410 L is merged into or overlaid on the left image 400 L of the surgical site provided by the display device 402 L. In this manner, a stereo telestration image may be displayed to provide instructions to the operator O in the control of the robotic surgical tools in the surgical site.
  • a first embodiment of the stereo telestration imaging system includes the stereo endoscopic camera 110 , the telestration system 160 , remote telestration equipment 161 A, and the stereo viewer 164 .
  • the remote telestration equipment 161 A includes a telestration generator 162 A and a single video monitor 165 A for the mentor M to view a mono view of the annotated surgical site generated by the telestration system 160 .
  • the remote telestration equipment 161 A may further include a part of a full duplex audio communication system such as a telephone or speaker phone described previously with reference to FIG. 1 .
  • the telestration generator 162 A may include a drawing tablet 262 and a drawing pen 263 , to generate the mono view telestration images for overlay onto the stereo images of the surgical site.
  • the drawing tablet 262 and drawing pen 263 may also be referred to herein as a digitizing tablet and digitizing pen as they digitize a sketched drawing into a digital telestration graphic image.
  • the telestration generator 162 A may also include a keyboard 264 .
  • the telestration generator 162 A may additionally or in the alternative include one or more elements of the telestration generator 162 B described in greater detail below.
  • a mentor M may preview the telestration graphics that are to be overlaid onto the surgical site images on the monitor 165 A, 165 B before the telestration graphics are overlaid onto the surgical site images displayed at the stereo viewer 164 to the operator O.
  • a mono-view telestration image may be generated for multiple video frames until an erase command is issued to the drawing tablet. That is, as the sketch is made on the drawing tablet, the mono view telestration images show the growth of the sketch until completion, which is then shown in a steady state until erased.
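  • The grow-until-erased behavior described above can be modeled with a persistent canvas that accumulates digitized stroke points across video frames until an erase command clears it. A sketch under assumed names (the class and its methods are illustrative, not from the patent):

```python
import numpy as np

class TelestrationCanvas:
    """Accumulates freehand strokes across video frames: the sketch grows as
    the mentor draws and persists in a steady state until erased."""

    def __init__(self, height, width):
        self.mask = np.zeros((height, width), dtype=bool)

    def draw(self, points):
        """Add newly digitized (row, col) points from the drawing tablet."""
        for r, c in points:
            self.mask[r, c] = True

    def erase(self):
        """Handle an erase command: clear the accumulated sketch."""
        self.mask[:] = False

    def overlay(self, frame, color=(0, 255, 0)):
        """Burn the current sketch into a copy of an RGB video frame."""
        out = frame.copy()
        out[self.mask] = color
        return out
```

Calling `overlay` on each successive frame reproduces the described behavior: the sketch appears to grow while being drawn, then remains fixed on every frame until `erase` is called.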
  • the stereo endoscopic camera 110 includes an endoscope 202 for insertion into a patient, a camera head 204 , a left image forming device (e.g., a charge coupled device (CCD)) 206 L, a right image forming device 206 R, a left camera control unit (CCU) 208 L, and a right camera control unit (CCU) 208 R coupled together as shown.
  • the stereo endoscopic camera 110 generates a left video channel 211 L and a right video channel 211 R of frames of images of the surgical site.
  • a lock reference signal is coupled between the left and right camera control units 208 L, 208 R.
  • the right camera control unit generates the lock signal that is coupled to the left camera control unit to synchronize the left video channel to the right video channel.
  • alternatively, the left camera control unit generates the lock reference signal and the right video channel synchronizes to the left video channel.
  • the stereo display 164 includes a left monitor 230 L and a right monitor 230 R.
  • the viewfinders or monitors 230 L, 230 R may be provided by a left display device 402 L and a right display device 402 R, respectively.
  • the stereo images are provided in color by a pair of color display devices 402 L, 402 R.
  • Stereo images of a surgical site may be captured by other types of endoscopic devices and cameras with different structures. For example, a single optical channel may be used with a pair of spatially offset sensors to capture stereo images of the surgical site.
  • the telestration device or system 160 for the left video channel includes a left video combiner 210 L and a left synchronizer/noise reducer/enhancer device 214 L coupled to a VSD board 218 ; while the right channel includes a right video combiner 210 R and a right synchronizer/noise reducer/enhancer device 214 R coupled to the VSD board 218 .
  • the telestration device or system 160 may further include left and right power transformers 240 L- 240 R coupled to an isolation transformer 242 to receive power.
  • the left video combiner 210 L combines the telestration graphics or images with the left video images of the surgical site on the left video channel 211 L.
  • the right video combiner 210 R combines the telestration graphics or images with the right video images of the surgical site on the right video channel 211 R.
  • the left and right synchronizer/noise reducer/enhancer devices 214 L- 214 R perform analog-to-digital conversion as necessary, plus electronic noise reduction and image enhancement/sharpening in order to improve (“sweeten”) the left and right images. Synchronization may also be provided by the devices 214 L- 214 R; however, it is not strictly necessary since the camera control units (CCUs) are already synchronized.
  • the VSD board 218 performs interlaced-to-progressive video scan conversion; electronic image-shifting to correct endoscope and camera optical misalignment as is described further in U.S. Pat. No. 6,720,988 by Gere et al. (previously incorporated by reference); and control graphic overlay for the respective left and right video channels.
  • the left and right video combiners 210 L, 210 R may combine video signals in various ways depending upon the type of video signals being provided.
  • the stereo video signals of the surgical site provided on the left and right video channels 211 L, 211 R are analog video signals.
  • the stereo video signals of the surgical site provided on the left and right video channels 211 L, 211 R are digital video signals.
  • the mono telestration video signals on the link 172 are analog video signals in one embodiment of the invention and are digital video signals in another embodiment of the invention.
  • various mixing techniques may be employed to mix the stereo surgical site video signals with the telestration video signals to form the stereo annotated surgical site video signals.
  • the type of mixing techniques used may vary to mix the stereo surgical site video signals and the telestration video signals together.
  • an alpha synchronizing signal may be provided that can be used to overlay the graphic telestration images onto the video signal of the surgical site.
  • Mixing two digital video sources may be performed simply by using a multiplexer to switch between sources, or by soft keying implemented with full alpha mixing.
  • In FIG. 5A , two digital composite video signals, each having its own alpha channel, are mixed together.
  • the digital video signal of the surgical site is coupled into the mixer 500 A as one source and the digital video signal of the telestration image is coupled into the mixer 500 A as a second source.
  • the sources are keyed by their respective alpha signals alpha_ 0 and alpha_ 1 by the keying device (e.g., multiplier) 504 A- 504 B and then added together at the summer or adder 506 .
  • the result from the summer 506 is then rounded and limited by a rounding/limiting device 508 to an appropriate number of bits of digital video.
  • the black level is then added back into the digital video signal at the adder or summer 510 to generate the annotated surgical site video signal as the resultant output from the mixer 500 A.
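The keying, summing, rounding/limiting, and black-level-restoration steps of the mixer 500 A can be sketched in a few lines of NumPy. This is an illustrative model only: the function name, the 8-bit depth, and the black level of 16 are assumptions, not details taken from the patent.

```python
import numpy as np

def alpha_mix(site, telestration, alpha_0, alpha_1, bits=8, black_level=16):
    """Sketch of the mixer-500A pipeline: key each source by its alpha
    signal, sum, round/limit to the video bit depth, then restore the
    black level. The 8-bit / black-level-16 convention is illustrative."""
    # Key (multiply) each source by its alpha signal, as in devices 504A-504B,
    # then sum the keyed sources (summer 506).
    keyed = alpha_0 * site.astype(float) + alpha_1 * telestration.astype(float)
    # Round and limit to the available digital range (device 508), leaving
    # headroom for the black level that is added back afterwards.
    limit = (1 << bits) - 1 - black_level
    out = np.clip(np.rint(keyed), 0, limit)
    # Add the black level back in (summer 510).
    return (out + black_level).astype(np.uint8)
```

The same per-sample arithmetic applies to each component of an RGB mixer such as 500 B, where the black level is zero by convention and the final addition drops out.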
  • for RGB component digital video signals, the mixing may be somewhat similar for each component signal.
  • an RGB component digital video signal is provided for the surgical site video signal (Surgical Site R_ 0 , G_ 0 , and B_ 0 ) and the telestration video signal (Telestration R_ 1 , G_ 1 , and B_ 1 ) and coupled into the video mixer 500 B.
  • the resultant outputs from the video mixer 500 B are the RGB components of the annotated surgical site video signal (Annotated Surgical Site R_out, G_out, and B_out).
  • for RGB component signals, the black level is typically zero by convention and therefore of little concern, so the mixer 500 B can be simplified relative to the mixer 500 A.
  • the sources are keyed by their respective alpha signals alpha_ 0 and alpha_ 1 by the keying devices (e.g., multipliers) 504 A- 504 B to synchronize when the signals are to be added.
  • the synchronized signals are then added together at the summer or adders 506 A- 506 C for each respective component signal.
  • the result from each of the summers 506 A- 506 C is then rounded and limited by the rounding/limiting devices 508 A- 508 C to an appropriate number of bits of digital video to generate each respective RGB component of the annotated surgical site video signal (Annotated Surgical Site R_out, G_out, and B_out).
  • FIG. 5C illustrates a simple analog video mixer 500 C consisting of an analog multiplexer 520 that is responsive to a keying signal coupled to its select terminal.
  • the multiplexer 520 selects one of its two input video signals to output.
  • the multiplexer selects between the surgical video signal coupled to one input terminal and the telestration video signal coupled to a second input terminal.
  • the multiplexer 520 can generate the annotated surgical site video signal.
  • the keying signal is generated in response to a level of the input telestration video signal.
  • the luminance level of the telestration video signal may be used as the keying signal. With the luminance of the telestration video signal above a predetermined level, the telestration video signal is selected to be output from the multiplexer 520 . With the luminance level of the telestration video signal below the predetermined level, the surgical site video signal is selected to be output from the multiplexer 520 . In this manner, the annotated surgical site video signal can be generated by the mixer 500 C.
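The luminance-keyed multiplexer of mixer 500 C can be modeled per pixel as a simple threshold switch. The function name and threshold value below are illustrative assumptions; a real analog mixer would use a comparator driving the select terminal of an analog multiplexer rather than software.

```python
import numpy as np

def luminance_key_mux(site, telestration, threshold=32):
    """Per-pixel model of the analog-style mixer 500C: where the
    telestration luminance exceeds the threshold, output the telestration
    pixel; elsewhere output the surgical-site pixel. The threshold value
    is an illustrative assumption."""
    key = telestration > threshold      # keying signal derived from luminance
    return np.where(key, telestration, site)
```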
  • the analog video signal may be converted into a digital video signal and mixed according to digital mixing techniques.
  • the digital video signal may be used to key the analog video signal to select a monochrome image in the analog mixing technique or the digital video signal may be converted to an analog video signal and mixed according to analog mixing techniques.
  • the right video combiner 210 R may be a master video combiner feeding through the telestration graphics or images to a slave video combiner, the left video combiner 210 L, over a communication link 272 .
  • the right video combiner 210 R receives control/data signals and the telestration images on the communication link 172 (at COMM IN input) from the remote telestration generator 162 A.
  • the COMM-OUT output of the right video combiner 210 R is coupled to the COMM-IN input of the left video combiner 210 L by means of the communication link 272 .
  • the left video combiner may be the master combiner and the right video combiner may be the slave combiner.
  • the remote telestration device 162 A may couple to the telestration system 160 through the communication link 172 over the communication system 190 by means of the communication devices 191 , 192 .
  • the telestration images on the communication link 172 are in a digital data format in a preferred embodiment of the invention.
  • the communication link 172 may use a standard RS-232 digital communication protocol as the telestration data may be simple X and Y coordinates which are not of high bandwidth.
  • the right video combiner 210 R may be coupled to the left video combiner 210 L by way of the communication link 272 .
  • the communication link 272 may be another RS-232 link, for example.
  • the right video combiner 210 R simply relays the control/data signals and the telestration images on the communication link 172 to the left video combiner 210 L over the communication link 272 .
  • the remote telestration equipment 161 A includes the single video monitor 165 A for a mono view of the annotated surgical site generated by the telestration system 160 .
  • the video monitor 165 A couples to either a left annotated video channel 212 L or a right annotated video channel 212 R of the annotated surgical images to generate the mono view.
  • the video monitor 165 A may couple to either the left annotated video channel 212 L or the right annotated video channel 212 R over the communication system 190 by means of the communication devices 191 , 192 .
  • the stereo telestration imaging system includes the stereo endoscopic camera 110 , the telestration system 160 , remote telestration equipment 161 B, and the stereo viewer 164 .
  • the stereo telestration imaging system of FIG. 2B while substantially similar to that of FIG. 2A , differs in the remote telestration equipment 161 B (e.g., includes a stereo viewer 165 B instead of a monitor 165 A) and how it may be connected.
  • the annotated stereo surgical images from the telestration system 160 may be coupled over the video communication link 176 to a stereo viewer 165 B at a remote location for viewing by the person generating the telestration, such as the mentor M.
  • the stereo viewer 165 B may couple to the left and right video channels 220 L, 220 R to receive the stereo annotated surgical images and display them in the left display L and the right display R for viewing by the left and right eyes, respectively.
  • the stereo viewer 165 B may couple to the left and right video channels elsewhere in the telestration system 160 after the telestration images are mixed with the surgical site images, such as at left and right video channels 212 L, 212 R after the devices 210 L, 210 R or the left and right video channels 216 L, 216 R after the devices 214 L, 214 R.
  • the remote stereo viewer may couple to the telestration system 160 through the video link 176 over the communication system 190 by means of the communication devices 191 , 192 .
  • the remote telestration equipment 161 may be connected differently. Instead of the left and right video combiners being connected to the telestrator device in a master-slave configuration, they may be coupled in parallel to it. In this case, both of the left and right video combiners 210 L, 210 R receive control/data signals and the telestration data signals over the communication link 172 (at the COMM-IN inputs) from the remote telestration generator 162 B. If for some reason analog video signals are used, the communication link 172 may be split in two. If digital signals are used, the digital signal can be readily fanned out into two signals as illustrated and coupled into each communication input of the left and right video combiners 210 L, 210 R. The remote telestration generator 162 B may couple to the telestration system 160 through the communication link 172 over the communication system 190 by means of the communication devices 191 , 192 .
  • the telestration generator 162 B may include a computer 265 , a keyboard 264 , and an input device 266 (such as a mouse, for example) to generate the mono view telestration images for overlay onto the stereo images of the surgical site.
  • the telestration generator 162 B may additionally, or in the alternative, include one or more elements of the telestration generator 162 A, such as the drawing tablet 262 and the drawing pen 263 described in greater detail above.
  • the stereo telestration imaging system of FIG. 2B is modified to include a three-dimensional input device 266 as part of the remote telestration equipment 161 B with the stereo viewer 165 B.
  • the three-dimensional input device 266 may be a three-D mouse or a duplicate of the three-D input control devices at the master console 150 .
  • a mentoring surgeon M could view a three dimensional surgical site and draw one or more telestration marks at a depth he/she desires by means of the three-dimensional input device without need of any depth perception correction.
  • while FIGS. 2A-2B illustrate separate functional blocks for the telestration device or system 160 , such as the left video combiner 210 L and the right video combiner 210 R, a plurality of the functional blocks may be incorporated into one integral electronic system, one integrated printed circuit board, or one integrated circuit, such as the VSD board 218 for example.
  • in typical telestration systems, a telestration graphic image is placed in the foreground while the image being telestrated or sketched on is placed in the background.
  • the telestration graphic image may be a pure opaque overlay, so that background objects are not visible through it. This implies that the depth of the telestration graphic should be no deeper than the depth of the background object in order to preserve a foreground/background illusion.
  • the telestration image is displayed to both left and right eyes as is discussed above.
  • the left and right telestration images derived from the mono view of the telestration image may not fuse into a stereo or three dimensional image. In some cases, this may not matter and no depth perception correction is needed.
  • when a mono-view telestration image is used to generate stereo telestration, it is desirable in most applications to correct for the differences in depth perception between the surgical site image and the telestration image. That is, it is desirable to fuse the left and right telestration images together in the stereo viewer at the same apparent depth as the surgical site stereo image when using a mono-view telestration image.
  • the telestration images are placed at a depth less than or equal to that of the surgical site image, and not greater, if the surgical site image is the background. Placing the telestration images at a depth equal to the depth of the surgical site image is particularly useful when the mentor generates a mono view telestration image from a mono view. However, if the mentor has a stereo view and can directly generate a stereo image of the telestration graphics, placing the telestration images at a depth equal to the depth of the surgical site is less important. In that case, the stereo image of the telestration graphics can be placed at a depth less than the depth of the surgical image, because both mentor and operator viewing the stereo telestration images can agree on the interpretation of the telestration graphic.
  • the right image 602 R includes a right telestration image 610 R in the surgical site around the needle 605 .
  • Simply mixing the mono telestration image drawn with respect to the right channel may result in a left telestration image 610 L offset within the surgical site from the needle 605 as illustrated in FIG. 6A .
  • when the telestration graphic is positioned at a depth other than that of the background object, it cannot uniquely identify any particular point to an operator O.
  • the telestration image is adjusted to the same depth of the background object so that the stereo telestration image may uniquely identify a background location.
  • the horizontal position of one half of the stereo pair of images is adjusted further away from the other so as to move the stereo telestration image 612 A down towards the perceived depth of the object of interest 611 .
  • the horizontal position of one half of the stereo pair of images is adjusted closer to the other so as to move the stereo telestration image 612 A up above the perceived depth of the object of interest 611 .
  • a stereo window 620 of the annotated stereo surgical site is illustrated having a left image 621 L and a right image 621 R that may be viewed in the stereo viewer.
  • the images in the stereo window may be moved in depth with respect to the plane of the stereo window by adjusting the stereo base or horizontal offset of the images.
  • the left telestration image 612 L 1 or 612 L 2 is horizontally adjusted to fuse and form a stereo telestration image at the perceived depth of the stereo image 611 of the object of interest.
  • the horizontal separation distance 625 between the left telestration image 612 L 1 or 612 L 2 and the right telestration image 612 R may also be referred to herein as the horizontal offset or stereo base.
  • the horizontal position of the left telestration image 612 L 2 in the left image 621 L is adjusted further away from the right telestration image 612 R to a position of the left telestration image 612 L 1 , for example, to fuse and form the stereo telestration image at the perceived depth of the stereo image 611 of the object of interest. That is, the horizontal separation or horizontal offset is increased.
  • the horizontal position of the left telestration image 612 L 1 in the left image 621 L is adjusted closer to the right telestration image 612 R to a position of the left telestration image 612 L 2 , for example, to fuse and form the stereo telestration image at the perceived depth of the stereo image 611 of the object of interest. That is, the horizontal separation or horizontal offset is decreased.
  • the left or right image of the surgical site associated with the non-viewed channel is adjusted horizontally to move the perceived depth of the surgical image deeper or shallower in the stereo window.
  • the left and right telestration images are both adjusted horizontally to move close together or farther apart so as to adjust the perceived depth in the stereo window.
  • the left and right surgical site images are both adjusted horizontally to move close together or farther apart so as to adjust the perceived depth in the stereo window. Moving the left and right images further apart in the stereo window, increasing the horizontal offset, moves the stereo image farther away, increasing the perceived depth of the stereo image. Moving the left and right images closer together in the stereo window, decreasing the horizontal offset, moves the stereo image closer, reducing the perceived depth of the stereo image.
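The horizontal-offset adjustment described above amounts to translating one channel's image left or right before mixing. A minimal sketch, assuming images are stored as 2-D arrays and vacated columns are simply zero-filled (the function name is illustrative):

```python
import numpy as np

def shift_horizontal(image, offset):
    """Shift a 2-D image horizontally by `offset` pixels (positive = right),
    filling vacated columns with zeros. Shifting the telestration image of
    one channel this way changes the stereo base (horizontal offset) and
    hence the perceived depth at which the left and right marks fuse."""
    out = np.zeros_like(image)
    if offset > 0:
        out[:, offset:] = image[:, :-offset]
    elif offset < 0:
        out[:, :offset] = image[:, -offset:]
    else:
        out[:] = image
    return out
```

Increasing the separation between the left and right marks moves the fused mark deeper in the stereo window; decreasing it moves the mark closer, as described above.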
  • the telestration image associated with the video channel not viewed by the mentor is positionally adjusted.
  • the right channel 212 R of the annotated surgical site video signal is viewed by the mentor M over the video monitor 165 A.
  • the mentor generates the telestration graphic images relative to the right video channel 211 R images of the surgical site video signal so that it appears at the correct position therein.
  • the left video channel 211 L images of the surgical site video signal may not be viewed by the mentor M and may be referred to as the “non-viewed channel”.
  • the position of the telestration image associated with the non-viewed channel, left video channel 211 L of the surgical site is positionally adjusted.
  • the position of the left telestration image 610 L is adjusted to correct for the offset so that it is similarly positioned around the needle 605 as illustrated in the right image 602 R.
  • the telestration images for the non-viewed channel are positionally (i.e., horizontally assuming parallel camera and viewer/eyes) adjusted so that telestration images and the surgical site images are fusible and appear at the same depth, as located by the mentor.
  • the telestration images for the non-viewed channel may be automatically adjusted in position by the stereo telestration video system, or the adjustment may be performed manually.
  • the robotic surgery system 100 may further include a control input 187 , such as a control knob, at the console 150 .
  • the control input may generate one or more control signals onto one or more control lines 186 to control the stereo telestration system 160 .
  • the control input may mechanically or electromechanically control the stereo endoscopic camera 110 through one or more control lines 159 .
  • a manual control input such as a control knob in the console 150 may be provided to allow the surgeon O in some embodiments of the invention to adjust the horizontal position of at least one of the left or right telestration images until they are fusible together.
  • the control knob may be used to generate an electronic control signal to control the mixing of the telestration image with the surgical site image for one channel.
  • the electronic control signal may alter the alpha signal in a digital mixer or the keying signal in an analog mixer, as to where the telestration image is to be overlaid onto the surgical site image.
  • the electronic control signal may cause a horizontal shift in the position of the digital pixel data of the telestration image in the video signal on one channel with respect to the surgical site image.
  • the control knob may be used to mechanically or electro-mechanically (e.g., by electric motor control) control the left or right channels of the endoscopic camera 110 to move a left or right image of the surgical site to be properly located under the telestration image.
  • the robotic surgery system 100 may further include a control input 187 ′, such as a control knob, that may be manipulated by the mentor M at the remote telestration equipment 161 to generate an electronic control signal transmitted to the telestration system 160 .
  • the control input 187 ′ may generate one or more control signals onto one or more control lines 186 ′ to control the stereo telestration system 160 as further described herein.
  • the control input may mechanically or electromechanically control the stereo endoscopic camera 110 as further described herein through the one or more control lines 186 ′. If local cabling is unavailable, the control signals for the one or more control lines 186 ′ may be communicated over the communication link 190 by means of the communication devices 191 , 192 .
  • the exemplary endoscopic camera 110 includes a left observation optical system 702 L and a right observation optical system 702 R in the endoscope 202 .
  • the exemplary endoscopic camera 110 further includes a first mirror 711 L and a second mirror 712 L and one or more image formation lenses 703 L- 704 L in the left channel and a first mirror 711 R and a second mirror 712 R and one or more image formation lenses 703 R- 704 R in the right channel as part of the camera head 204 .
  • the exemplary endoscopic camera 110 further includes a focusing arrangement.
  • the lenses 704 L and 704 R may be adjusted in position by a position adjustment mechanism 724 to focus left and right images into the left and right cameras 206 L, 206 R, respectively.
  • the position adjustment mechanism 724 may be moved by an electric motor 784 through an appropriate transmission coupled there-between.
  • a position sensor 785 may be coupled to the position adjustment mechanism 724 , the motor 784 or the transmission coupled there-between to obtain a measure of focus position.
  • the motor 784 is controlled by means of a focus controller 786 that is typically connected to an input device at the console.
  • the left and right cameras 206 L, 206 R couple to the camera head 204 to receive the respective left and right images of the surgical site to provide a stereo image thereof.
  • the cameras 206 L, 206 R in one embodiment of the invention are charge coupled devices to generate a digital video signal.
  • the exemplary endoscopic camera 110 further includes the left and right camera control units 208 L, 208 R coupled to the left and right cameras 206 L, 206 R.
  • the left and right cameras 206 L, 206 R are movable about the respective optical axes 750 L, 750 R of the camera head 204 by position adjustment mechanisms 706 L, 706 R. That is, the position adjustment mechanisms 706 L, 706 R adjust the relative positions of the cameras 206 L, 206 R with respect to the left and right optical systems. In this manner, the position adjustment mechanisms 706 L, 706 R can be used to manually adjust the horizontal position of the left or right cameras 206 L- 206 R by a control knob 187 , 187 ′ to move a left or right image of the surgical site so that it is properly located under the telestration image.
  • the mirror 712 L and the one or more image formation lenses 703 L- 704 L in the left channel are movable by a position adjustment mechanism 714 L, while the mirror 712 R and the one or more image formation lenses 703 R- 704 R in the right channel are movable by a position adjustment mechanism 714 R.
  • the position adjustment mechanisms 714 L, 714 R may move the left or right optical axes 750 L, 750 R of the camera head 204 under the left and right cameras 206 L, 206 R by a control knob 187 , 187 ′ to move a left or right image of the surgical site so that it is properly located under the telestration image.
  • the control knob for adjusting the position of the left or right telestration image may also be manipulated by the mentor M at the remote telestration equipment instead of by the operator O at the console.
  • the control knob 187 ′ of the remote telestration equipment under control of the mentor M generates an electronic control signal transmitted to the telestration system 160 over communication link 190 through the communication devices 191 - 192 .
  • the mentor M views both left and right channels of the stereo pair of images such as illustrated in FIG. 2B . This allows the mentoring surgeon M to view the same stereo pair of images as the operating surgeon O.
  • the mentoring surgeon M “marks” one half or side (i.e., one of the left or right channel) of the stereo pair with a telestration marking instrument.
  • the telestration system duplicates the mark in the other half or side of the stereo pair and displays both to the operating surgeon O and the mentor M.
  • the mentoring surgeon M uses the control knob 187 ′ of the remote telestration equipment 161 to adjust the horizontal offset of the second mark (with respect to the first mark) until the stereo representation of the mark appears to be at the correct depth with respect to whatever the mentoring surgeon M determines is appropriate.
  • the control knob 187 , 187 ′ may be a generic control input device, which could be replaced with some other input device capable of representing a continuum of choices in the horizontal offset of the telestration image.
  • the automatic positional adjustment of the telestration image in the non-viewed channel uses a plurality of values for the position of the endoscopic camera in relationship to the surgical site, such as a plurality of distances between the endoscopic camera and the tissue at a plurality of points of the surgical site and a plurality of angles between lines from the endoscopic camera to the points in the tissue and line segments between the respective points.
  • FIG. 8 illustrates a first distance 801 A between the end of the endoscopic camera 110 and a first point P 1 on a plane of tissue 800 .
  • the first distance 801 A represents the depth of the object of interest in the stereo field of view at the first point P 1 , with P 1 being in the tissue plane and along the centerline C of the endoscopic camera 110 .
  • FIG. 8 further illustrates a second point P 2 and a third point P 3 on the tissue 800 with respective distances 801 B- 801 C between a line of sight of the range finder 805 and the plane of the tissue 800 . Additionally one may define angles 802 A- 802 D representing the angles between the various line-of-sight line segments 801 A- 801 C and the line segments between the points P 1 -P 3 as illustrated.
  • a plurality of points P on the tissue 800 with respective angles 802 and distances 801 may be used to determine the horizontal offset. If one angle 802 A, 802 D between the camera and the tissue is known, at least two distances ( 801 A, 801 B or 801 A, 801 C) between at least three points (P 1 ,P 2 , and P 3 ) may be used to determine the orientation of the tissue plane 800 and hence the horizontal offset at any point on that plane. Otherwise, at least three distances ( 801 A, 801 B, 801 C) between the camera and the tissue to at least three points (P 1 ,P 2 ,P 3 ) may be used to determine the horizontal offset.
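The geometry just described amounts to fitting a plane through three measured tissue points and then reading off the depth along any line of sight. A sketch under assumed conventions (camera at the origin, rays as unit vectors; all function and variable names are illustrative, not from the patent):

```python
import numpy as np

def tissue_plane(p1, p2, p3):
    """Given three measured points on the tissue (e.g., from range-finder
    distances along known ray directions), return the plane as (normal, d)
    satisfying normal . x = d. Point names follow P1-P3 in the text."""
    p1, p2, p3 = (np.asarray(p, float) for p in (p1, p2, p3))
    normal = np.cross(p2 - p1, p3 - p1)   # plane orientation from two edges
    normal = normal / np.linalg.norm(normal)
    return normal, float(normal @ p1)

def depth_along_ray(normal, d, ray):
    """Distance from the camera origin to the plane along the unit ray
    direction `ray`, i.e., the tissue depth for that line of sight."""
    ray = np.asarray(ray, float)
    return d / float(normal @ ray)
```

With the plane known, a horizontal offset can be derived for any point on it, which is the role the distances 801 and angles 802 play in the text.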
  • sensing or computing modalities may be used to determine or estimate the distance 801 that represents the depth of the object of interest in the stereo field.
  • the sensing techniques may use hardware, software, or a combination thereof.
  • one or more range finders 805 similar to that used in auto-focus cameras may be used to determine the distances 801 A- 801 C.
  • the distances 801 A- 801 C may be computed from the position sensed by the focus sensor 785 associated with the focus motor 784 of the focusing arrangement of the endoscopic camera.
  • the one or more angles 802 A- 802 C between the endoscopic camera 110 and the respective one or more points P 1 -P 3 on the tissue plane 800 may be determined by using a plurality of range finders 805 .
  • the one or more angles may be determined by using a scanning range finder that scans in a circle around an axis on the tissue plane 800 .
  • angles may be determined using known tool tip locations in the surgical site acquired during an initialization sequence, for example.
  • Such an initialization sequence may ask the operator O to provide the location of the tissue plane to the electronics system by touching it with the system's surgical instruments, which may be positionally encoded to supply joint angles.
  • one may deduce the position of the instrument tips relative to the endoscopic camera tip if all joints are encoded and the kinematics are known.
  • image processing may be used, in which left and right images of the tissue in a surgical site are captured or registered as digital pixels into respective left and right digital arrays similar to the one array illustrated in FIG. 13 of U.S. Pat. No. 6,720,988.
  • a three-dimensional model of the left and right images is further formed similar to that described and illustrated in FIGS. 15 and 16 of U.S. Pat. No. 6,720,988.
  • the depth of the central feature in the three-dimensional model at point 128 in FIG. 16 of U.S. Pat. No. 6,720,988 may be used to represent the distance 801 A, for example.
  • image processing methods may be used to compare the left and right images of the tissue in a surgical site to determine a measure for the distance 801 , such as spatial correlation, where the spatial delay provides an indication of the desired horizontal offset (“the crucial number”) between the left and right telestration images to fuse them together at an appropriate depth.
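The spatial-correlation idea can be illustrated with a brute-force search over candidate shifts of a single scan line, scoring each by mean absolute difference; the best-scoring shift is the "crucial number" used as the fusing offset. This is a simplified stand-in for the correlation methods the text alludes to, and the function name and parameters are assumptions.

```python
import numpy as np

def estimate_offset(left_row, right_row, max_shift=16):
    """Estimate the horizontal offset between corresponding scan lines of
    the left and right images: try each candidate shift and keep the one
    with the smallest mean absolute difference over the overlap region."""
    best_shift, best_err = 0, np.inf
    n = len(left_row)
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(n, n + s)
        if hi <= lo:            # no overlap at this shift
            continue
        err = np.abs(left_row[lo:hi].astype(float)
                     - right_row[lo - s:hi - s].astype(float)).mean()
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift
```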
  • a depth map may be generated by software to judge the depth of a surgical site and render the telestration images at that depth.
  • a depth map may be constructed by several ways known in the field of computer vision depth estimation including generating a depth map from the stereo images of the surgical site using left and right image correlation. Alternately, a depth map could be generated by a scanning range sensor, or similar raster depth measurement instrument, attached or otherwise registered to the endoscope tip.
  • a disparity map may be used to indicate how a pixel in the left eye should be associated with a pixel in the right eye.
  • a depth map is formed by first creating a disparity map. With a disparity map, a depth map need not be created as the disparity map may be used directly to generate a stereo telestration graphic at desired depths. In some cases, a disparity map is created from a pure depth map (such as from a scanning range finder for example) to generate the stereo telestration mark.
  • with reference to FIGS. 9A-9C , and ignoring well-known issues of occlusion for the purpose of simplification, diagrams illustrating the generation of a disparity map are now described.
  • the endoscopic camera 110 scans the surgical site 900 within its field of view using its left and right image forming devices 206 L, 206 R.
  • a feature A 902 in the surgical site 900 is received and scanned by different areas and pixels of the left and right image forming devices 206 L, 206 R.
  • FIG. 9B illustrates left pixels of an exemplary left image 906 L and right pixels of an exemplary right image 906 R in the field of view of surgical site including the feature A 902 scanned in FIG. 9A .
  • the exemplary left image 906 L includes a matrix of a plurality of left pixels LP 0 , 0 through LPM,N on N left scan lines 910 L.
  • the exemplary right image 906 R includes a matrix of a plurality of right pixels RP 0 , 0 through RPM,N on N right scan lines 910 R.
  • the feature A 902 scans into the left and right images 906 L, 906 R at different horizontal pixel locations along respective scan lines 910 L-A and 910 R-A. From an edge (e.g., the left edge) of the left image, a left horizontal distance d l along the scan line 910 L-A can be determined to the scanned location of the feature A 902 . From a similar edge (e.g., the left edge) of the right image, a right horizontal distance d r along the scan line 910 R-A can be determined to the scanned location of the feature A 902 .
  • a disparity DP x,y for each pixel along scan lines in the right image 906 R may be determined in comparison with pixels in corresponding scan lines in the left image 906 L to form a disparity map.
  • a disparity DP x,y for each pixel along scan lines in the left image 906 L may be determined in comparison with pixels in corresponding scan lines in the right image 906 R to form a disparity map.
  • a mixture of feature-based matching and interpolation is employed to provide a DP x,y for every single point in one image, relative to the other image, where interpolation is useful to match points with which no feature is clearly associated.
  • a matrix 950 of disparities DP x,y for each pixel in one image forms the disparity map between right and left images.
  • DP 0 , 0 represents the disparity for one of the left pixel LP 0 , 0 or right pixel RP 0 , 0 .
  • DPm,n represents the disparity for one of the left pixel LPm,n or right pixel RPm,n.
  • a depth map is related to the disparity map by elementary geometric relationships. Given the optics of the viewer and/or endoscope, a depth map can be deduced from the disparity map. With the depth map and pixels of the left image 906 L as the base image, most of the right image 906 R may be generated, but for right-eye scenes that are occluded in the left eye.
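For parallel cameras, the "elementary geometric relationship" between disparity and depth is Z = f·B/d, where f is the focal length (in pixels), B the stereo baseline, and d the disparity. A hedged sketch of that conversion (parameter names and units are illustrative, not from the patent):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    """Convert one disparity-map entry to a depth for parallel stereo
    cameras: Z = f * B / d. Larger disparities correspond to closer
    features; a zero disparity would place the feature at infinity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_mm / disparity_px
```

Applying this entry-by-entry to the disparity map matrix 950 yields the corresponding depth map.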
  • a two-dimensional (“depth-less”) telestration mark or image may be “painted” onto a surgical site over a continuum of depths. That is, a telestration mark, drawing, or image may be drawn on top of one (e.g., the left) image of the stereo pair, and the artificial disparity in the other image (e.g., the right) of the stereo pair is created at a variety of depths, including different depths for different parts of the telestration mark. Digital image processing techniques may be applied to generate a continuum of depths for the stereo telestration image.
  • FIG. 10 is a side perspective view of a surgical site illustrating differences between a telestration mark having an apparent constant depth and a telestration mark having a depth continuum generated by a disparity map, such as the disparity map matrix 950 , between left and right images of the surgical site.
  • a surface 1000 of tissue, for example in a surgical site, is captured by a camera from above the tissue and viewed in stereo by a stereo viewer.
  • the surface 1000 is uneven, having varying surface characteristics that are viewed at differing depths in the field of view of the stereo viewer.
  • a mentor M generates a mono-view of a telestration mark 1002A using a two-dimensional input device.
  • the mono-view telestration is transformed into a stereo view of left and right telestration images that are fused together and overlaid over the surface 1000 in the surgical site using a single horizontal offset value.
  • the mentor M may generate a stereo view of the telestration mark using a three-dimensional input device, but it is constrained to be above the surface 1000.
  • the telestration mark 1002A may appear to be hovering at an apparent constant depth over the varying surface 1000.
  • a “painted” telestration mark 1002B may be generated that appears to be painted onto the varying surface 1000 over its depth continuum.
  • the constant depth telestration mark 1002A may be generated using a single horizontal offset value and a mono-view telestration image as previously discussed with reference to FIGS. 6B-6C.
  • the “painted” telestration mark 1002B may be generated using the pixels of the mono-view telestration image and a disparity map with disparities for each pixel.
  • the mono-view of the telestration image is directly coupled to the left image for viewing by a left eye of the operator.
  • the disparity map is applied to the pixels of the left image to transform them into pixels for the right image.
  • the transformed pixels of the right image are viewed by the right eye of the operator.
  • the right image can be generated on a pixel-by-pixel basis so that, when viewed by a stereo viewer, the mark 1002B appears to be painted on top of the surface 1000.
  • Visual feedback may be provided to show the difference between the placements of the constant depth telestration mark 1002A and the painted telestration mark 1002B.
  • the constant depth telestration mark 1002A may be viewed as a red color image in the stereo viewer and the “painted” telestration mark 1002B may be viewed as a blue color image on the surface 1000 in the stereo viewer.
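The pixel-by-pixel generation of the right-eye mark described above can be sketched as follows. The boolean-mask representation and function name are illustrative assumptions; the operation itself mirrors the description: the mono-view mark is kept as the left image, and each mark pixel is shifted horizontally by its local disparity to form the right image, so the fused mark follows the surface's depth continuum.

```python
import numpy as np

def paint_right_telestration(left_mark, disparity):
    """left_mark: HxW boolean mask of telestration pixels drawn over
    the left image. disparity: HxW integer disparity map (px).
    Returns the right-eye mask with each mark pixel shifted by its
    local disparity; pixels shifted out of frame are dropped."""
    h, w = left_mark.shape
    right_mark = np.zeros_like(left_mark)
    ys, xs = np.nonzero(left_mark)
    for y, x in zip(ys, xs):
        xr = x - disparity[y, x]   # horizontal shift only
        if 0 <= xr < w:
            right_mark[y, xr] = True
    return right_mark

# A single mark pixel over a region of constant disparity 2.
mark = np.zeros((4, 8), dtype=bool)
mark[1, 5] = True
disp = np.full((4, 8), 2)
right = paint_right_telestration(mark, disp)
```

Because each pixel carries its own disparity, different parts of one mark can land at different depths, which is exactly what makes the mark appear painted onto an uneven surface rather than hovering at a constant depth.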
  • the horizontal offset between the left and right telestration images may be a function of one or more distances 801A-801C and one or more angles 802A-802C. Regardless of how the distances and angles are determined, it is desirable to determine the amount of horizontal offset between the left and right telestration images that represents a point in space as points in a stereo pair, such that the left and right telestration images fuse together and the operator O perceives the point as being at the appropriate depth, which in some cases is the same apparent depth as the object of interest in the stereo pair image. It is advantageous to adjust the position of the telestration image so that the operator O can view a three-dimensional image on a stereo viewer with a telestration overlay without being confused or distracted by a non-fused stereo telestration image.
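Under a simple pinhole-camera model, the horizontal offset needed for a given apparent depth follows directly from the disparity relation d = f·b/Z. This sketch deliberately ignores the convergence angles 802A-802C and treats the offset as a pure function of focal length, baseline, and target depth, which is an assumption for illustration only.

```python
def telestration_offset_px(focal_px, baseline_mm, depth_mm):
    """Horizontal offset (px) between the left and right telestration
    images so that, when fused, the mark is perceived at depth_mm;
    this is the same relation as disparity d = f * b / Z."""
    return focal_px * baseline_mm / depth_mm

# A mark meant to appear 100 mm away, with a 5 mm stereo baseline and
# a 1000 px focal length, needs a 50 px horizontal offset.
offset = telestration_offset_px(1000.0, 5.0, 100.0)
```

Too large or too small an offset places the mark at the wrong apparent depth, and a grossly wrong offset prevents fusion entirely, which is the confusing, distracting case the passage above warns against.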
  • the control knob 187, 187′ used to control the position of a left or right telestration image may be one or more of control buttons, keys, wheels, a track ball, or another control input device.

Abstract

In one embodiment of the invention, a robotic surgical system includes a master control console having a stereo viewer to view stereo images; a surgical manipulator having a stereo endoscopic camera coupled to a robotic arm to generate the stereo images of a surgical site; a stereo telestration device coupled between the stereo endoscopic camera and the stereo viewer to mix telestration graphics and the stereo images of the surgical site together for viewing by the stereo viewer; and a telestration generator coupled to the stereo telestration device to generate the telestration graphics for overlay on the stereo images of the surgical site.

Description

    FIELD
  • The embodiments of the invention relate generally to telestration systems. More particularly, the embodiments of the invention relate to telestration mentoring systems for robotic surgery.
  • BACKGROUND
  • A telestrator is a device that allows its operator to draw a freehand sketch over a motion picture image. The act of drawing a freehand sketch over a motion picture image is often referred to as telestration. The freehand sketch may be referred to as a telestration image. Telestrators have been used to annotate televised weather reports and televised sporting events.
  • Telestration systems are often used in television broadcasts of football games to make a point to a television audience regarding one or more plays during the game. A sports commentator may draw sketches of objects, such as X's and O's, circles, or lines, that are overlaid and displayed on still or moving video images of the play on the television monitor. Typically, the telestration image is displayed on a single television monitor in a mono-visual (“mono-view”) format and viewed by both eyes of the television viewer. The mono-view provided by the single television monitor is limited to two-dimensional images.
  • BRIEF SUMMARY
  • The embodiments of the invention are summarized by the claims that follow below.
  • BRIEF DESCRIPTIONS OF THE DRAWINGS
  • FIG. 1 is a block diagram of a robotic surgery system including a stereo viewer and a stereo telestration system to provide annotated stereo images to a surgeon.
  • FIG. 2A is a block diagram of a first system to provide a stereo telestration image overlay in both left and right video channels to provide three-dimensional images in a stereo viewer.
  • FIG. 2B is a block diagram of a second system to provide a stereo telestration image overlay in both left and right video channels to provide three-dimensional images in a stereo viewer.
  • FIG. 3 is a perspective view of a robotic surgical master control console including the stereo viewer.
  • FIG. 4 illustrates the stereo viewer of the master control console of FIG. 3 with a stereo telestration image overlay in both left and right monitors to provide three-dimensional images of the surgical site and the telestration images.
  • FIG. 5A illustrates a block diagram of a digital composite video mixer to mix a surgical site video signal and a telestration video signal together.
  • FIG. 5B illustrates a block diagram of a digital component video mixer to mix a surgical site video signal and a telestration video signal together.
  • FIG. 5C illustrates a block diagram of an analog video mixer to mix an analog surgical site video signal and an analog telestration video signal together.
  • FIG. 6A illustrates left and right annotated surgical site images.
  • FIG. 6B illustrates a three dimensional coordinate system for the stereo images of a background object and the stereo telestration images.
  • FIG. 6C illustrates a stereo window of left and right images in the stereo viewer to show the horizontal offset between the left telestration image and the right telestration image to achieve fusing and the same depth.
  • FIG. 7 is a block diagram of an exemplary endoscopic camera.
  • FIG. 8 is a magnified perspective view of the exemplary endoscopic camera and the plane of a tissue or object.
  • FIGS. 9A-9C are diagrams to illustrate the generation of a disparity map.
  • FIG. 10 is a side perspective stereo view illustrating differences between a telestration mark generated at an apparent constant depth and a telestration mark generated with an apparent depth continuum to appear painted onto a surface.
  • DETAILED DESCRIPTION
  • In the following detailed description of the embodiments of the invention, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be obvious to one skilled in the art that the embodiments of the invention may be practiced without these specific details. In other instances well known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments of the invention.
  • One application for telestration systems is robotic surgery. In robotic surgery, two monitors are used to provide a stereo-visual (“stereo-view”), three-dimensional image to a pair of eyes. The three-dimensional image is important for depth perception of the surgical site and for viewing the robotic surgical tools as they perform surgery on a patient within the surgical site.
  • A mono-visual image in a single monitor to a single eye is less desirable in robotic surgery. Similarly, while a stereo image of the surgical site is desirable, a mono-visual telestration image in only one monitor of the pair of monitors is less desirable during robotic surgery. With only a mono-visual telestration image, a surgeon may be confused, as one eye sees one half of a stereo image without the telestration image. Moreover, viewing a mono-visual telestration image for extended periods may strain the surgeon's eyes and brain, causing undesirable fatigue during surgery.
  • The terms video frame, frame of pixel data, and image may be used interchangeably herein. However, at a viewing device, an image is what is perceived by a user when viewing the video frame or pixel frame of data on the viewing device. A stereo image with a pair of images (e.g., a left image and a right image) has left and right video frames or left and right frames of pixel data. A mono-visual image or mono-image has one of a left image or a right image and one of a left or right video frame or a left or right frame of pixel data.
  • The embodiments of the invention include a method, apparatus, and system for stereo telestration for robotic surgery.
  • Robotic Surgical System
  • Referring now to FIG. 1, a block diagram of a robotic surgery system 100 is illustrated to perform minimally invasive robotic surgical procedures using a stereo telestration system. Robotic surgery generally involves the use of a robot manipulator that has multiple robotic manipulator arms. One or more of the robotic manipulator arms often support a surgical tool which may be articulated (such as jaws, scissors, graspers, needle holders, micro dissectors, staple appliers, tackers, suction/irrigation tools, clip appliers, or the like) or non-articulated (such as cutting blades, cautery probes, irrigators, catheters, suction orifices, or the like). At least one of the robotic manipulator arms 153 (e.g., the center robotic manipulator arm 153) is used to support a stereo or three dimensional surgical image capture device 110 such as a stereo endoscope (which may be any of a variety of structures such as a stereo laparoscope, arthroscope, hysteroscope, or the like), or, optionally, some other stereo imaging modality (such as ultrasound, fluoroscopy, magnetic resonance imaging, or the like). Robotic surgery may be used to perform a wide variety of surgical procedures, including but not limited to open surgery, neurosurgical procedures (such as stereotaxy), endoscopic procedures (such as laparoscopy, arthroscopy, thoracoscopy), and the like.
  • A user or operator O (generally a surgeon) performs a minimally invasive surgical procedure on patient P by manipulating input devices at a master control console 150. A computer 151 of the console 150 directs movement of robotically controlled endoscopic surgical instruments 101A-101B and 110, by means of one or more feedback/control cables 159, effecting movement of the instruments using a robotic surgical manipulator 152. The robotic surgical manipulator 152 may also be referred to as a robotic patient-side cart system or simply as a cart. The robotic surgical manipulator 152 has one or more robotic arms 153. Typically, the robotic surgical manipulator 152 includes at least three robotic manipulator arms 153 supported by linkages 156, 156′, with a central arm 153 supporting an endoscopic camera 110 and the robotic arms 153 to the left and right of center supporting tissue manipulation tools 101A-101B.
  • Generally, the robotic arms 153 of robotic surgical manipulator 152 include a positioning portion and a driven portion. The positioning portion of the robotic surgical manipulator 152 remains in a fixed configuration during surgery while manipulating tissue. The driven portion of the robotic surgical manipulator 152 is actively articulated under the direction of the operator O generating control signals at the surgeon's console 150 during surgery. The actively driven portion of the arms 153 is herein referred to as an end effector 158. The positioning portion of the robotic arms 153 that are in a fixed configuration during surgery may be referred to as positioning linkage and/or “set-up joint” 156, 156′.
  • An assistant A may assist in pre-positioning of the robotic surgical manipulator 152 relative to patient P as well as swapping tools or instruments 101 for alternative tool structures, and the like, while viewing the internal surgical site via an assistant's display 154.
  • The image of the internal surgical site shown to A by the assistant's display 154 is provided by a left or right channel 176 of the stereo endoscopic camera 110 supported by the robotic surgical manipulator 152. In contrast, both left and right channels of the stereo endoscopic camera 110 are provided to the operator O in a stereo display 164 at the surgeon's console 150, one channel for each eye.
  • Stereo Telestration
  • A teacher, instructor, or other person, referred to generally as mentor M, may be on site or at a remote location and use a telestrator to generate telestration and provide comments and instructions to the operator O regarding the robotic surgical procedure in the surgical site of the patient P. In this manner an expert on the robotic surgical procedure, such as mentor M, may guide a less experienced operating surgeon O.
  • A typical telestration system provides mono-view images. The robotic surgical system has a stereo viewer which displays a three dimensional image of the surgical site to the surgeon O. If the telestration image is displayed in only one eye, confusion can result since the other eye is seeing the other image of the stereo pair without the telestration image overlay. To support stereo telestration from the mentor M, the robotic surgical system 100 includes a stereo telestration system 160 coupled between the console 150 and remote located telestration equipment 161. The remote located telestration equipment 161 may be located remotely in the same room as the patient and surgeon or in a different room, a different hospital, or a different city, country, continent or other differing location.
  • The stereo telestration system 160 processes left and right channels of stereo video signals and, optionally, full duplex audio signals for audio/video communication. The stereo telestration system 160 receives stereo images of the surgical site (“stereo surgical images”) from the stereo endoscopic camera 110 over the stereo video communication link 170. A mono-view of telestration images (“mono telestration images”) is generated by a telestrator or telestration generator 162 (such as the drawing tablet 262 and drawing pen 263 illustrated in FIG. 2B) and coupled into the stereo telestration system 160 over the communication link 172. The telestration generator 162 digitizes a telestration mark or telestration graphic into a digital telestration graphic image for communication over the link 172.
  • The stereo telestration system 160 overlays the mono telestration images onto the stereo surgical images of the surgical site generated by the stereo endoscopic camera 110 to form annotated stereo surgical images. The telestration system 160 couples the annotated stereo surgical images into the stereo display 164 of the console 150 over the stereo video communication link 175 for viewing by the operator O. The telestration system 160 may also couple the annotated stereo surgical images over a video communication link 176 to a stereo viewer at a remote location for viewing by the person generating the telestration. Alternatively, a single left or right channel of the annotated stereo surgical images may be coupled by the telestration system 160 over the video communication link 176 to a single video monitor 165A at a remote location for viewing by the person generating the telestration.
  • While the telestration system 160 generates video images, it may optionally provide a full duplex audio communication channel between the operator O and the person generating the telestration. Alternatively, a wireless or wired telephone system, such as cellular telephone system, internet protocol telephone system, or plain old telephone system may be used to provide the full duplex audio communication channel.
  • The remote located telestration equipment 161 may include a video monitor 165A, a telestration generator, a microphone 180, a speaker 181, and an audio processor 182 coupled together as shown. Telestration images are generated by the telestration generating device 162 that is coupled to the stereo telestration system 160 over the communication link 172. A telestration generating device may also be referred to herein as a telestrator or a telestration generator.
  • In the case of a mono-view monitor, the video monitor 165A receives either the left or right channel of the annotated stereo surgical images over the video communication link 176 for viewing by the mentor M at the remote location. In the case of a stereo viewer for the mentor M, the stereo viewer receives both of the left and right channels of the annotated stereo surgical images over the video communication link 176 at the remote location so that the mentor M may view stereo images similar to the stereo viewer 164 in the console 150. That is, the communication link 176 may carry either one or both of a left or right channel of annotated surgical images.
  • As discussed previously, the stereo telestration system 160 may overlay a mono telestration image onto stereo images of the surgical site (referred to as “stereo surgical images”) generated by the stereo endoscopic camera 110 to form annotated stereo surgical images. However, in an alternate embodiment of the invention, the mono telestration image is not immediately overlaid onto the stereo surgical images. Instead, the mentor M privately previews his telestration graphics overlaid onto the surgical site images on the monitor 165A, 165B before the telestration graphics are overlaid onto the surgical site images displayed at the stereo viewer 164 to the operator O. That is, the mentor M views the annotated surgical images before the telestration goes “live” on the stereo viewer for the operator O to see.
  • As previously discussed, the telestration system 160 may optionally provide a full duplex audio communication channel 184 between the operator O and the mentor M. To support full duplex communication, the remote located telestration equipment 161 may include a microphone 180, a speaker 181, and an audio processor 182 coupled to the communication channel 184. The console may also include a microphone 180, a speaker 181, and an audio processor 186 coupled to the channel 184 to support full duplex communication.
  • If cables cannot be used to reach the remote located telestration equipment 161, modems, transceivers, or other communication devices 191, 192 may be used to form data/audio/video communication channels 172, 176, 184 over a communication network 190. In one embodiment of the invention, the communication network 190 is a wide area network such as the internet and the communication devices 191, 192 are wide area network routers. For the audio channel, hands-free telephones may be used at each end to communicate between remote locations over the plain old telephone system (POTS).
  • Referring now to FIG. 3, a perspective view of the robotic surgical master control console 150 is illustrated. The master control console 150 of the robotic surgical system 100 may include the computer 151, a binocular or stereo viewer 312, an arm support 314, a pair of control input wrists and control input arms in a workspace 316, foot pedals 318 (including foot pedals 318A-318B), and a viewing sensor 320. The master control console 150 may further include the telestration system 160 for providing the telestration images overlaid on the surgical site images. The master control console 150 may also include an audio processor or transceiver 317 coupled to a speaker 320 and a microphone 315 for a bi-directional voice communication system to provide full duplex voice communication between the operating surgeon O and the mentor M. The audio processor or transceiver 317 may couple to or be a part of the telestration system 160 in embodiments of the invention.
  • The stereo viewer 312 has two displays where stereo three-dimensional images of the telestration and surgical site may be viewed to perform minimally invasive surgery. When using the master control console, the operator O typically sits in a chair and moves his or her head into alignment with the stereo viewer 312 to view the three-dimensional annotated images of the surgical site. To ensure that the operator is viewing the surgical site when controlling the robotic surgical tools 101, the master control console 150 may include the viewing sensor 320 disposed adjacent the binocular display 312. When the system operator aligns his or her eyes with the binocular eye pieces of the display 312 to view a stereoscopic image of the telestration and surgical worksite, the operator's head sets off the viewing sensor 320 to enable the control of the robotic surgical tools 101. When the operator's head is removed from the area of the display 312, the viewing sensor 320 can disable or stop generating new control signals in response to movements of the touch sensitive handles in order to hold the state of the robotic surgical tools.
  • The arm support 314 can be used to rest the elbows or forearms of the operator O (typically a surgeon) while gripping touch sensitive handles of the control input wrists, one in each hand, in the workspace 316 to generate control signals. The touch sensitive handles are positioned in the workspace 316 disposed beyond the arm support 314 and below the viewer 312. This allows the touch sensitive handles to be moved easily in the control space 316 in both position and orientation to generate control signals. Additionally, the operator O can use his feet to control the foot-pedals 318 to change the configuration of the surgical system and generate additional control signals to control the robotic surgical instruments.
  • The computer 151 may include one or more microprocessors 302 to execute instructions and a storage device 304 to store software with executable instructions that may be used to generate control signals to control the robotic surgical system 100. The computer 151 with its microprocessors 302 interprets movements and actuation of the touch sensitive handles (and other inputs from the operator O or other personnel) to generate control signals to control the robotic surgical instruments 101 in the surgical worksite. In one embodiment of the invention, the computer 151 and the stereo viewer 312 map the surgical worksite into the controller workspace 316 so it feels and appears to the operator that the touch sensitive handles are working over the surgical worksite.
  • Referring now to FIG. 4, a perspective view of the stereo viewer 312 of the master control console 150 is illustrated. To provide a three-dimensional perspective, the viewer 312 includes stereo images for each eye including a left image 400L and a right image 400R of the surgical site including any robotic surgical tools 400 respectively in a left viewfinder 401L and a right viewfinder 401R. The images 400L and 400R in the viewfinders may be provided by a left display device 402L and a right display device 402R, respectively. The display devices 402L,402R may optionally be pairs of cathode ray tube (CRT) monitors, liquid crystal displays (LCDs), or other type of image display devices (e.g., plasma, digital light projection, etc.). In the preferred embodiment of the invention, the images are provided in color by a pair of color display devices 402L,402R; such as color CRTs or color LCDs.
  • In the stereo viewer, three dimensional telestration images may be provided to a surgeon by overlaying them onto the three dimensional image of the surgical site. In a right viewfinder 401R, a right telestration image (RTI) 410R is merged into or overlaid on the right image 400R being displayed by the display device 402R. In a left viewfinder 401L, a left telestration image (LTI) 410L is merged into or overlaid on the left image 400L of the surgical site provided by the display device 402L. In this manner, a stereo telestration image may be displayed to provide instructions to the operator O in the control of the robotic surgical tools in the surgical site.
  • Referring now to FIGS. 2A-2B, embodiments of stereo telestration imaging systems are illustrated. In FIG. 2A, a first embodiment of the stereo telestration imaging system includes the stereo endoscopic camera 110, the telestration system 160, remote telestration equipment 161A, and the stereo viewer 164.
  • As discussed previously, the remote telestration equipment 161A includes a telestration generator 162A and a single video monitor 165A for the mentor M to view a mono view of the annotated surgical site generated by the telestration system 160. The remote telestration equipment 161A may further include a part of a full duplex audio communication system such as a telephone or speaker phone described previously with reference to FIG. 1.
  • The telestration generator 162A may include a drawing tablet 262 and a drawing pen 263, to generate the mono view telestration images for overlay onto the stereo images of the surgical site. The drawing tablet 262 and drawing pen 263 may also be referred to herein as a digitizing tablet and digitizing pen as they digitize a sketched drawing into a digital telestration graphic image. The telestration generator 162A may also include a keyboard 264. The telestration generator 162A may additionally or in the alternative include one or more elements of the telestration generator 162B described in greater detail below.
  • As discussed previously for one embodiment of the invention, a mentor M may preview the telestration graphics that are to be overlaid onto the surgical site images on the monitor 165A, 165B before the telestration graphics are overlaid onto the surgical site images displayed at the stereo viewer 164 to the operator O. Additionally, a mono-view telestration image may be generated for multiple video frames until an erase command is issued to the drawing tablet. That is, as the sketch is made on the drawing tablet, the mono-view telestration images show the growth of the sketch until completion, which is then shown in a steady state until erased.
  • The stereo endoscopic camera 110 includes an endoscope 202 for insertion into a patient, a camera head 204, a left image forming device (e.g., a charge coupled device (CCD)) 206L, a right image forming device 206R, a left camera control unit (CCU) 208L, and a right camera control unit (CCU) 208R coupled together as shown. The stereo endoscopic camera 110 generates a left video channel 211L and a right video channel 211R of frames of images of the surgical site. To initially synchronize left and right frames of data, a lock reference signal is coupled between the left and right camera control units 208L, 208R. In one embodiment of the invention, the right camera control unit generates the lock signal that is coupled to the left camera control unit to synchronize the left video channel to the right video channel. However, in another embodiment of the invention, the left camera control unit generates the lock reference signal and the right video channel synchronizes to the left video channel.
  • The stereo display 164 includes a left monitor 230L and a right monitor 230R. As discussed previously with reference to FIG. 4, the viewfinders or monitors 230L,230R may be provided by a left display device 402L and a right display device 402R, respectively. In the preferred embodiment of the invention, the stereo images are provided in color by a pair of color display devices 402L,402R.
  • Additional details of a stereo endoscopic camera and a stereo display may be found in U.S. Pat. No. 5,577,991 entitled “Three Dimensional Vision Endoscope with Position Adjustment Means for Imaging Device and Visual Field Mask” filed on Jul. 7, 1995 by Akui et al; U.S. Pat. No. 6,139,490 entitled “Stereoscopic Endoscope with Virtual Reality Viewing” filed on Nov. 10, 1997 by Breidenthal et al; and U.S. Pat. No. 6,720,988 entitled “Stereo Imaging System and Method for use in Telerobotic Systems” filed on Aug. 20, 1999 by Gere et al.; all of which are incorporated herein by reference. Stereo images of a surgical site may be captured by other types of endoscopic devices and cameras with different structures. For example, a single optical channel may be used with a pair of spatially offset sensors to capture stereo images of the surgical site.
  • The telestration device or system 160 for the left video channel includes a left video combiner 210L and a left synchronizer/noise reducer/enhancer device 214L coupled to a VSD board 218; while the right channel includes a right video combiner 210R and a right synchronizer/noise reducer/enhancer device 214R coupled to the VSD board 218. The telestration device or system 160 may further include left and right power transformers 240L-240R coupled to an isolation transformer 242 to receive power.
  • The left video combiner 210L combines the telestration graphics or images with the left video images of the surgical site on the left video channel 211L. The right video combiner 210R combines the telestration graphics or images with the right video images of the surgical site on the right video channel 211R. For the respective left and right video channels, the left and right synchronizer/noise reducer/enhancer devices 214L-214R perform analog-to-digital conversion as necessary, plus electronic noise reduction and image enhancement/sharpening in order to improve (“sweeten”) the left and right images. Synchronization may also be provided by the devices 214L-214R; however, it is not strictly necessary since the camera control units (CCUs) are already synchronized. The VSD board 218 performs interlaced-to-progressive video scan conversion; electronic image-shifting to correct endoscope and camera optical misalignment as is described further in U.S. Pat. No. 6,720,988 by Gere et al. (previously incorporated by reference); and control graphic overlay for the respective left and right video channels.
  • The left and right video combiners 210L,210R may combine video signals in various ways depending upon the type of video signals being provided. In one embodiment of the invention, the stereo video signals of the surgical site provided on the left and right video channels 211L,211R are analog video signals. In another embodiment of the invention, the stereo video signals of the surgical site provided on the left and right video channels 211L,211R are digital video signals. Similarly, the mono telestration video signals on the link 172 are analog video signals in one embodiment of the invention and are digital video signals in another embodiment of the invention. Depending upon whether analog, digital, or mixed analog and digital video signals are used, various mixing techniques may be employed to mix the stereo surgical site video signals with the telestration video signals to form the stereo annotated surgical site video signals. Additionally, depending upon the format of the video signals (composite video or component video and their respective video formats e.g., RGB, S-Video or Y/C, YUV, YIQ, YCrCb), the type of mixing techniques used may vary to mix the stereo surgical site video signals and the telestration video signals together. In any case, an alpha synchronizing signal may be provided that can be used to overlay the graphic telestration images onto the video signal of the surgical site.
  • Mixing two digital video sources may be performed simply by using a multiplexer to switch between sources, or by soft keying implemented with full alpha mixing. In FIG. 5A, two digital composite video signals, each having its own alpha channel, are mixed together. The digital video signal of the surgical site is coupled into the mixer 500A as one source and the digital video signal of the telestration image is coupled into the mixer 500A as a second source. After subtracting out the digital value of the black level at the subtractors 502A-502B, the sources are keyed by their respective alpha signals alpha_0 and alpha_1 by the keying devices (e.g., multipliers) 504A-504B and then added together at the summer or adder 506. The result from the summer 506 is then rounded and limited by a rounding/limiting device 508 to an appropriate number of bits of digital video. The black level is then added back into the digital video signal at the adder or summer 510 to generate the annotated surgical site video signal as the resultant output from the mixer 500A.
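The per-sample data path of the digital mixer 500A can be sketched as follows. This is an illustrative model only; the function name, the 8-bit black level of 16, and the peak code of 235 (studio-range values borrowed from ITU-R BT.601) are assumptions, not part of the disclosure.

```python
def alpha_mix(src0, src1, alpha0, alpha1, black_level=16, max_code=235):
    """Mix two digital video samples, each keyed by its own alpha (0.0-1.0),
    following the mixer 500A data path of FIG. 5A."""
    keyed0 = (src0 - black_level) * alpha0   # subtractor 502A, keyer 504A
    keyed1 = (src1 - black_level) * alpha1   # subtractor 502B, keyer 504B
    summed = keyed0 + keyed1                 # summer/adder 506
    rounded = int(round(summed))             # rounding/limiting device 508
    limited = max(0, min(rounded, max_code - black_level))
    return limited + black_level             # adder 510 restores the black level
```

For example, mixing a mid-bright surgical site sample with a full-white telestration sample at equal alphas of 0.5 yields a sample roughly halfway between the two.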
  • For RGB component digital video signals, the mixing may be somewhat similar for each component signal. In FIG. 5B, an RGB component digital video signal is provided for the surgical site video signal (Surgical Site R_1, G_1, and B_1) and the telestration video signal (Telestration R_1, G_1, and B_1) and coupled into the video mixer 500B. The resultant outputs from the video mixer 500B are the RGB components of the annotated surgical site video signal (Annotated Surgical Site R_out, G_out, and B_out). With the component video signals, the black level is typically zero by convention and therefore of little concern, so the mixer 500B can be simplified relative to the mixer 500A. For each component signal, the sources are keyed by their respective alpha signals alpha_0 and alpha_1 by the keying devices (e.g., multipliers) 504A-504B to synchronize when the signals are to be added. The synchronized signals are then added together at the summers or adders 506A-506C for each respective component signal. The result from each of the summers 506A-506C is then rounded and limited by the rounding/limiting devices 508A-508C to an appropriate number of bits of digital video to generate each respective RGB component of the annotated surgical site video signal (Annotated Surgical Site R_out, G_out, and B_out).
  • FIG. 5C illustrates a simple analog video mixer 500C consisting of an analog multiplexer 520 that is responsive to a keying signal coupled to its select terminal. The multiplexer 520 selects to output a video signal from two input video signals. The multiplexer selects between the surgical video signal coupled to one input terminal and the telestration video signal coupled to a second input terminal.
  • In response to the keying signal, the multiplexer 520 can generate the annotated surgical site video signal. The keying signal is generated in response to a level of the input telestration video signal. In one embodiment, the luminance level of the telestration video signal may be used as the keying signal. With the luminance of the telestration video signal above a predetermined level, the telestration video signal is selected to be output from the multiplexer 520. With the luminance level of the telestration video signal below the predetermined level, the surgical site video signal is selected to be output from the multiplexer 520. In this manner, the annotated surgical site video signal can be generated by the mixer 500C.
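A per-pixel software model of the luminance-keyed analog multiplexer 500C might look like the following; the Pixel type and the threshold value of 64 are hypothetical stand-ins for the analog signal and the predetermined level.

```python
from collections import namedtuple

# Hypothetical per-pixel representation: luma (y) plus two chroma components.
Pixel = namedtuple("Pixel", "y cr cb")

def keyed_mux(surgical, telestration, threshold=64):
    """Model of analog mixer 500C: the telestration pixel's luminance
    drives the multiplexer's select terminal. Above the threshold the
    telestration pixel is output; otherwise the surgical site pixel is."""
    return telestration if telestration.y > threshold else surgical
```

A bright telestration mark thus replaces the underlying surgical site pixel, while dark (unmarked) regions pass the surgical site video through unchanged.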
  • If mixed analog and digital video signals are provided, the analog video signal may be converted into a digital video signal and mixed according to digital mixing techniques. Alternatively, the digital video signal may be used to key the analog video signal to select a monochrome image in the analog mixing technique or the digital video signal may be converted to an analog video signal and mixed according to analog mixing techniques.
  • The right video combiner 210R may be a master video combiner feeding through the telestration graphics or images to a slave video combiner, the left video combiner 210L, over a communication link 272. In this case, the right video combiner 210R receives control/data signals and the telestration images on the communication link 172 (at COMM IN input) from the remote telestration generator 162A. The COMM-OUT output of the right video combiner 210R is coupled to the COMM-IN input of the left video combiner 210L by means of the communication link 272. Alternatively, the left video combiner may be the master combiner and the right video combiner may be the slave combiner.
  • The remote telestration device 162A may couple to the telestration system 160 through the communication link 172 over the communication system 190 by means of the communication devices 191,192.
  • The telestration images on the communication link 172 are in a digital data format in a preferred embodiment of the invention. The communication link 172 may use a standard RS-232 digital communication protocol as the telestration data may be simple X and Y coordinates which are not of high bandwidth.
  • As discussed previously, the right video combiner 210R may be coupled to the left video combiner 210L by way of the communication link 272. The communication link 272 may be another RS-232 link, for example. In this case, the right video combiner 210R simply relays the control/data signals and the telestration images on the communication link 172 to the left video combiner 210L over the communication link 272.
  • As discussed previously, the remote telestration equipment 161A includes the single video monitor 165A for a mono view of the annotated surgical site generated by the telestration system 160. The video monitor 165A couples to either a left annotated video channel 212L or a right annotated video channel 212R of the annotated surgical images to generate the mono view. The video monitor 165A may couple to either the left annotated video channel 212L or the right annotated video channel 212R over the communication system 190 by means of the communication devices 191,192.
  • Referring now to FIG. 2B, a second embodiment of the stereo telestration imaging system is illustrated. The stereo telestration imaging system includes the stereo endoscopic camera 110, the telestration system 160, remote telestration equipment 161B, and the stereo viewer 164. The stereo telestration imaging system of FIG. 2B, while substantially similar to that of FIG. 2A, differs in the remote telestration equipment 161B (e.g., includes a stereo viewer 165B instead of a monitor 165A) and how it may be connected.
  • As previously discussed, the annotated stereo surgical images from the telestration system 160 may be coupled over the video communication link 176 to a stereo viewer 165B at a remote location for viewing by the person generating the telestration, such as the mentor M. In this case, the stereo viewer 165B may couple to the left and right video channels 220L,220R to receive the stereo annotated surgical images and display them in the left display L and the right display R for viewing by the left and right eyes, respectively. Alternatively, the stereo viewer 165B may couple to the left and right video channels elsewhere in the telestration system 160 after the telestration images are mixed with the surgical site images, such as at left and right video channels 212L,212R after the devices 210L,210R or the left and right video channels 216L,216R after the devices 214L,214R. In any case, the remote stereo viewer may couple to the telestration system 160 through the video link 176 over the communication system 190 by means of the communication devices 191,192.
  • As mentioned previously, the remote telestration equipment 161 may be connected differently. Instead of the left and right video combiners being connected to the telestrator device in a master-slave configuration, they may be coupled in parallel to it. In this case, both of the left and right video combiners 210L,210R receive control/data signals and the telestration data signals over the communication link 172 (at the COMM-IN inputs) from the remote telestration generator 162B. If for some reason analog video signals are used, the communication link 172 may be split in two. If digital signals are used, the digital signal can be readily fanned out into two signals as illustrated and coupled into each communication input of the left and right video combiners 210L,210R. The remote telestration generator 162B may couple to the telestration system 160 through the communication link 172 over the communication system 190 by means of the communication devices 191,192.
  • The telestration generator 162B may include a computer 265, a keyboard 264, and an input device 266 (such as a mouse, for example) to generate the mono view telestration images for overlay onto the stereo images of the surgical site. The telestration generator 162B may additionally, or in the alternative, include one or more elements of the telestration generator 162A, such as the drawing tablet 262 and the drawing pen 263 described in greater detail above.
  • In yet another embodiment of the invention, the stereo telestration imaging system of FIG. 2B is modified to include a three-dimensional input device 266 as part of the remote telestration equipment 161B with the stereo viewer 165B. The three-dimensional input device 266 may be a three-dimensional mouse or a duplicate of the three-dimensional input control devices at the master console 150. In this manner, a mentoring surgeon M could view a three-dimensional surgical site and draw one or more telestration marks at a depth he/she desires by means of the three-dimensional input device without need of any depth perception correction.
  • While FIGS. 2A-2B illustrate separate functional blocks for the telestration device or system 160, such as the left video combiner 210L and the right video combiner 210R, a plurality of the functional blocks may be incorporated into one integral electronic system, one integrated printed circuit board, or one integrated circuit, such as the VSD board 218 for example.
  • Depth Perception Correction for Stereo Telestration
  • In typical telestration systems, a telestration graphic image is typically placed in the foreground while the image being telestrated or sketched on is placed in the background. The telestration graphic image may not be a pure opaque overlay, so that background objects may remain visible. This implies that the depth of the telestration graphic is no deeper than the depth of the background object in order to preserve a foreground/background illusion.
  • In stereo telestration, the telestration image is displayed to both left and right eyes as is discussed above.
  • By simply mixing the stereo surgical site with a mono-view telestration image, there may be a perceived difference in depth between the surgical site image and the telestration image in the annotated stereo surgical site image. Moreover, the left and right telestration images derived from the mono view of the telestration image may not fuse into a stereo or three-dimensional image. In some cases, this may not matter and no depth perception correction is needed. However, if a mono-view telestration image is used to generate stereo telestration, it is desirable in most applications to correct for the differences in depth perception between the surgical site image and the telestration image. That is, it is desirable to fuse the left and right telestration images together in the stereo viewer at the same apparent depth as the surgical site stereo image when using a mono-view telestration image.
  • Note that if the surgical site image is the background, the telestration images are typically placed at a depth less than or equal to that of the surgical site image, and not greater. Placing the telestration images at a depth equal to the depth of the surgical site image is particularly useful when a mono-view telestration image is generated by the mentor from a mono view. However, if the mentor has a stereo view and can directly generate a stereo image of the telestration graphics, placing the telestration images at a depth equal to the depth of the surgical site is less important. In that case, the stereo image of the telestration graphics can be placed at a depth less than the depth of the surgical image because both the mentor and the operator, viewing stereo telestration images, can agree on the interpretation of the telestration graphic.
  • Referring now to FIG. 6A, a left image 602L and a right image 602R of an annotated stereo surgical site image is illustrated. The right image 602R includes a right telestration image 610R in the surgical site around the needle 605. Simply mixing the mono telestration image drawn with respect to the right channel may result in a left telestration image 610L offset within the surgical site from the needle 605 as illustrated in FIG. 6A. In this case, the telestration graphic is positioned at a depth other than the foreground depth and it cannot uniquely identify any particular point to an operator O.
  • Referring now to FIG. 6B, it is desirable to adjust the perceived depth of the stereo telestration image 612A or 612B to the perceived depth of the object of interest 611. The telestration image is adjusted to the same depth as the background object so that the stereo telestration image may uniquely identify a background location. In one case, the horizontal position of one half of the stereo pair of images is adjusted further away from the other so as to move the stereo telestration image 612A down towards the perceived depth of the object of interest 611. In another case, the horizontal position of one half of the stereo pair of images is adjusted closer to the other so as to move the stereo telestration image 612B up above the perceived depth of the object of interest 611.
  • Referring now to FIG. 6C, a stereo window 620 of the annotated stereo surgical site is illustrated having a left image 621L and a right image 621R that may be viewed in the stereo viewer. The images in the stereo window may be moved in depth with respect to the plane of the stereo window by adjusting the stereo base or horizontal offset of the images.
  • Assuming the right channel was used by the mentor to generate a right telestration image 612R around the right image 611R of the object of interest, the left telestration image 612L1 or 612L2 is horizontally adjusted to fuse and form a stereo telestration image at the perceived depth of the stereo image 611 of the object of interest. The horizontal separation distance 625 between the left telestration image 612L1 or 612L2 and the right telestration image 612R may also be referred to herein as the horizontal offset or stereo base.
  • To move the stereo telestration image 612A down towards the perceived depth of the object of interest 611, the horizontal position of the left telestration image 612L2 in the left image 621L is adjusted further away from the right telestration image 612R to a position of the left telestration image 612L1, for example, to fuse and form the stereo telestration image at the perceived depth of the stereo image 611 of the object of interest. That is, the horizontal separation or horizontal offset is increased. Alternatively, to move the stereo telestration image 612B up towards the perceived depth of the object of interest 611, the horizontal position of the left telestration image 612L1 in the left image 621L is adjusted closer to the right telestration image 612R to a position of the left telestration image 612L2, for example, to fuse and form the stereo telestration image at the perceived depth of the stereo image 611 of the object of interest. That is, the horizontal separation or horizontal offset is decreased.
  • In an alternate embodiment of the invention, the left or right image of the surgical site associated with the non-viewed channel is adjusted horizontally to move the perceived depth of the surgical image deeper in the stereo window or shallower in the stereo window. In yet another embodiment of the invention, the left and right telestration images are both adjusted horizontally to move closer together or farther apart so as to adjust the perceived depth in the stereo window. In yet another embodiment of the invention, the left and right surgical site images are both adjusted horizontally to move closer together or farther apart so as to adjust the perceived depth in the stereo window. Moving the left and right images further apart in the stereo window, increasing the horizontal offset, moves the stereo image farther away, increasing the perceived depth of the stereo image. Moving the left and right images closer together in the stereo window, decreasing the horizontal offset, moves the stereo image closer, reducing the perceived depth of the stereo image.
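The horizontal-offset adjustments described above amount to shifting one channel's pixels along the scan line. A minimal sketch, assuming a scan line of luminance values and black (0) fill at the exposed edge:

```python
def shift_horizontal(image_row, offset):
    """Shift one scan line by `offset` pixels. A positive offset moves the
    line's content to the right, a negative offset to the left; pixels
    exposed at the edge are filled with 0 (black) -- an assumption."""
    width = len(image_row)
    out = [0] * width
    for x in range(width):
        src = x - offset
        if 0 <= src < width:
            out[x] = image_row[src]
    return out
```

Whether a given shift increases or decreases the stereo base (and thus the perceived depth) depends on which channel is shifted and in which direction.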
  • In the case of a mono-view being provided to the mentor, for the operator O to view a telestration image on the stereo viewer so that it is fusible with the left and right images of the surgical site, the telestration image associated with the video channel not viewed by the mentor is positionally adjusted. For example, in FIG. 2A the right channel 212R of the annotated surgical site video signal is viewed by the mentor M over the video monitor 165A. The mentor generates the telestration graphic images relative to the right video channel 211R images of the surgical site video signal so that it appears at the correct position therein.
  • The left video channel 211L images of the surgical site video signal may not be viewed by the mentor M and may be referred to as the “non-viewed channel”. In that case, the position of the telestration image associated with the non-viewed channel, the left video channel 211L of the surgical site, is positionally adjusted. For example, in FIG. 6A the position of the left telestration image 610L is adjusted to correct for the offset so that it is similarly positioned around the needle 605 as illustrated in the right image 602R.
  • The telestration images for the non-viewed channel are positionally (i.e., horizontally assuming parallel camera and viewer/eyes) adjusted so that telestration images and the surgical site images are fusible and appear at the same depth, as located by the mentor. The telestration images for the non-viewed channel may be automatically adjusted in position by the stereo telestration video system or it may be manually performed.
  • For the surgeon O to adjust the horizontal offset of the left and right images, the robotic surgery system 100 may further include a control input 187, such as a control knob, at the console 150. The control input may generate one or more control signals onto one or more control lines 186 to control the stereo telestration system 160. Alternately, the control input may mechanically or electromechanically control the stereo endoscopic camera 110 through one or more control lines 159.
  • For manual adjustment, a manual control input such as a control knob in the console 150 may be provided to allow the surgeon O in some embodiments of the invention to adjust the horizontal position of at least one of the left or right telestration images until they are fusible together.
  • The control knob may be used to generate an electronic control signal to control the mixing of the telestration image with the surgical site image for one channel. In this case, the electronic control signal may alter the alpha signal in a digital mixer or the keying signal in an analog mixer to change where the telestration image is overlaid onto the surgical site image. Alternatively, the electronic control signal may cause a horizontal shift in the position of the digital pixel data of the telestration image in the video signal on one channel with respect to the surgical site image. In some embodiments of the invention, the control knob may be used to mechanically or electro-mechanically (e.g., by electric motor control) control the left or right channels of the endoscopic camera 110 to move a left or right image of the surgical site to be properly located under the telestration image.
  • In other embodiments of the invention, the robotic surgery system 100 may further include a control input 187′, such as a control knob, that may be manipulated by the mentor M at the remote telestration equipment 161 to generate an electronic control signal transmitted to the telestration system 160. The control input 187′ may generate one or more control signals onto one or more control lines 186′ to control the stereo telestration system 160 as further described herein. Alternately, the control input may mechanically or electromechanically control the stereo endoscopic camera 110 as further described herein through the one or more control lines 186′. If local cabling is unavailable, the control signals for the one or more control lines 186′ may be communicated over the communication link 190 by means of the communication devices 191,192.
  • Referring now to FIG. 7, a block diagram of an exemplary endoscopic camera 110 is illustrated. The exemplary endoscopic camera 110 includes a left observation optical system 702L and a right observation optical system 702R in the endoscope 202. The exemplary endoscopic camera 110 further includes a first mirror 711L and a second mirror 712L and one or more image formation lenses 703L-704L in the left channel and a first mirror 711R and a second mirror 712R and one or more image formation lenses 703R-704R in the right channel as part of the camera head 204.
  • The exemplary endoscopic camera 110 further includes a focusing arrangement. The lenses 704L and 704R may be adjusted in position by a position adjustment mechanism 724 to focus left and right images into the left and right cameras 206L,206R, respectively. The position adjustment mechanism 724 may be moved by an electric motor 784 through an appropriate transmission coupled there-between. A position sensor 785 may be coupled to the position adjustment mechanism 724, the motor 784 or the transmission coupled there-between to obtain a measure of focus position. The motor 784 is controlled by means of a focus controller 786 that is typically connected to an input device at the console.
  • The left and right cameras 206L,206R couple to the camera head 204 to receive the respective left and right images of the surgical site to provide a stereo image thereof. The cameras 206L,206R in one embodiment of the invention are charge coupled devices to generate a digital video signal. The exemplary endoscopic camera 110 further includes the left and right camera control units 208L,208R coupled to the left and right cameras 206L,206R.
  • In one embodiment of the invention, the left and right cameras 206L,206R are movable about the respective optical axes 750L,750R of the camera head 204 by position adjustment mechanisms 706L,706R. That is, the position adjustment mechanisms 706L,706R adjust the relative positions of the cameras 206L,206R with respect to the left and right optical systems. In this manner, the position adjustment mechanisms 706L,706R can be used to manually adjust the horizontal position of the left or right cameras 206L-206R by a control knob 187,187′ to move a left or right image of the surgical site so that it is properly located under the telestration image.
  • In another embodiment of the invention, the mirror 712L and the one or more image formation lenses 703L-704L in the left channel are movable by a position adjustment mechanism 714L while the mirror 712R and the one or more image formation lenses 703R-704R in the right channel are movable by a position adjustment mechanism 714R. In this manner, the position adjustment mechanisms 714L,714R may move the left or right optical axes 750L,750R of the camera head 204 under the left and right cameras 206L,206R by a control knob 187,187′ to move a left or right image of the surgical site so that it is properly located under the telestration image.
  • As discussed previously, the control knob for adjusting the position of the left or right telestration image may also be manipulated by the mentor M at the remote telestration equipment instead of the operator O at the console. The control knob 187′ of the remote telestration equipment under control of the mentor M generates an electronic control signal transmitted to the telestration system 160 over communication link 190 through the communication devices 191-192. In this case, the mentor M views both left and right channels of the stereo pair of images such as illustrated in FIG. 2B. This allows the mentoring surgeon M to view the same stereo pair of images as the operating surgeon O.
  • Closing one eye (or using some functionally similar technology such as a shutter on the left or right video image), the mentoring surgeon M “marks” one half or side (i.e., one of the left or right channel) of the stereo pair with a telestration marking instrument. The telestration system duplicates the mark in the other half or side of the stereo pair and displays both to the operating surgeon O and the mentor M. The mentoring surgeon M then uses the control knob 187′ of the remote telestration equipment 161 to adjust the horizontal offset of the second mark (with respect to the first mark) until the stereo representation of the mark appears to be at the correct depth with respect to whatever the mentoring surgeon M determines is appropriate.
  • The control knob 187,187′ may be a generic control input device, which could be replaced with some other input device capable of representing a continuum of choices in the horizontal offset of the telestration image.
  • The automatic positional adjustment of the telestration image in the non-viewed channel uses a plurality of values for the position of the endoscopic camera in relationship to the surgical site, such as a plurality of distances between the endoscopic camera and the tissue at a plurality of points of the surgical site and a plurality of angles between lines from the endoscopic camera to the points in the tissue and line segments between the respective points.
  • FIG. 8 illustrates a first distance 801A between the end of the endoscopic camera 110 and a first point P1 on a plane of tissue 800. The first distance 801A represents the depth of the object of interest in the stereo field of view at the first point P1, with P1 being in the tissue plane and along the centerline C of the endoscopic camera 110. FIG. 8 further illustrates a second point P2 and a third point P3 on the tissue 800, with respective distances 801B-801C measured along lines of sight of the range finder 805 to the plane of the tissue 800. Additionally, one may define angles 802A-802D representing the angles between the various line-of-sight line segments 801A-801C and the line segments between the points P1-P3 as illustrated.
  • As previously discussed, a plurality of points P on the tissue 800 with respective angles 802 and distances 801 may be used to determine the horizontal offset. If one angle 802A,802D between the camera and the tissue is known, at least two distances (801A,801B or 801A,801C) between at least three points (P1,P2, and P3) may be used to determine the orientation of the tissue plane 800 and hence the horizontal offset at any point on that plane. Otherwise, at least three distances (801A,801B,801C) between the camera and the tissue to at least three points (P1,P2,P3) may be used to determine the horizontal offset.
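The plane-orientation computation can be sketched with elementary vector math: three measured 3-D points define the tissue plane, after which the depth at any lateral position follows. The camera-centric coordinate convention (z along the endoscope centerline) and the function names are assumptions for illustration.

```python
def plane_from_points(p1, p2, p3):
    """Recover the tissue plane from three measured points (e.g., the
    range-finder distances 801A-801C projected to camera coordinates).
    Returns (n, d) for the plane equation n . p = d."""
    def sub(a, b):
        return tuple(ai - bi for ai, bi in zip(a, b))
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0])
    n = cross(sub(p2, p1), sub(p3, p1))        # plane normal
    d = sum(ni * pi for ni, pi in zip(n, p1))  # plane offset
    return n, d

def depth_on_plane(n, d, x, y):
    """Depth (z) of the tissue plane at lateral position (x, y); valid
    when the plane is not parallel to the z axis (n[2] != 0)."""
    # n . (x, y, z) = d  =>  z = (d - nx*x - ny*y) / nz
    return (d - n[0]*x - n[1]*y) / n[2]
```

With the plane known, the horizontal offset needed at any telestration point on that plane can be derived from its depth.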
  • Several sensing or computing modalities may be used to determine or estimate the distance 801 that represents the depth of the object of interest in the stereo field. The sensing techniques may use hardware, software, or a combination thereof.
  • In one embodiment of the invention, one or more range finders 805 similar to that used in auto-focus cameras may be used to determine the distances 801A-801C. In another embodiment of the invention, the distances 801A-801C may be computed from the position sensed by the focus sensor 785 associated with the focus motor 784 of the focusing arrangement of the endoscopic camera.
  • The one or more angles 802A-802C between the endoscopic camera 110 and the respective one or more points P1-P3 on the tissue plane 800 may be determined by using a plurality of range finders 805. Alternatively, the one or more angles may be determined by using a scanning range finder that scans in a circle around an axis on the tissue plane 800. Without a range finder, angles may be determined using known tool tip locations in the surgical site acquired during an initialization sequence, for example. Such an initialization sequence may ask the operator O to provide the location of the tissue plane to the electronics system by touching it with the system's surgical instruments, which may be positionally encoded to supply joint angles. As is appreciated by those in the art, one may deduce the position of the instrument tips relative to the endoscopic camera tip if all joints are encoded and the kinematics are known.
  • In yet another embodiment of the invention, image processing is used in that left and right images of the tissue in a surgical site are captured or registered as digital pixels into respective left and right digital arrays similar to the one array illustrated in FIG. 13 of U.S. Pat. No. 6,720,988. A three dimensional model of the left and right images are further formed similar to that described and illustrated in FIGS. 15 and 16 of U.S. Pat. No. 6,720,988. The depth of the central feature in the three-dimensional model at point 128 in FIG. 16 of U.S. Pat. No. 6,720,988 may be used to represent the distance 801A, for example.
  • Other image processing methods may be used to compare the left and right images of the tissue in a surgical site to determine a measure for the distance 801, such as spatial correlation, where the spatial delay provides an indication of the desired horizontal offset (“the crucial number”) between the left and right telestration images to fuse them together at an appropriate depth.
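A minimal sketch of the spatial-correlation approach, using a sum-of-absolute-differences score as a simple stand-in for correlation; the scan-line representation, the search range, and the function name are assumptions.

```python
def best_horizontal_offset(left_row, right_row, max_shift=16):
    """Estimate the horizontal offset ("the crucial number") between
    corresponding left/right scan lines: try each candidate shift and
    keep the one with the smallest mean absolute difference over the
    overlapping pixels."""
    best_shift, best_score = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        score, count = 0, 0
        for x in range(len(left_row)):
            xr = x + shift
            if 0 <= xr < len(right_row):
                score += abs(left_row[x] - right_row[xr])
                count += 1
        if count and score / count < best_score:
            best_score = score / count
            best_shift = shift
    return best_shift
```

The returned shift is the spatial delay at which the two scan lines best align, which indicates the horizontal offset needed to fuse the telestration images at the corresponding depth.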
  • In another embodiment of the invention, a depth map may be generated by software to judge the depth of a surgical site and render the telestration images at that depth. A depth map may be constructed by several ways known in the field of computer vision depth estimation including generating a depth map from the stereo images of the surgical site using left and right image correlation. Alternately, a depth map could be generated by a scanning range sensor, or similar raster depth measurement instrument, attached or otherwise registered to the endoscope tip.
  • In yet another embodiment of the invention, a disparity map may be used to indicate how a pixel in the left eye should be associated with a pixel in the right eye. In a number of computer vision depth estimation algorithms, a depth map is formed by first creating a disparity map. With a disparity map, a depth map need not be created as the disparity map may be used directly to generate a stereo telestration graphic at desired depths. In some cases, a disparity map is created from a pure depth map (such as from a scanning range finder for example) to generate the stereo telestration mark.
  • Referring now to FIGS. 9A-9C, ignoring well known issues of occlusion for the purpose of simplification, diagrams illustrating the generation of a disparity map are now described. In FIG. 9A, the endoscopic camera 110 scans the surgical site 900 within its field of view using its left and right image forming devices 206L,206R. A feature A 902 in the surgical site 900 is received and scanned by different areas and pixels of the left and right image forming devices 206L,206R.
  • FIG. 9B illustrates left pixels of an exemplary left image 906L and right pixels of an exemplary right image 906R in the field of view of surgical site including the feature A 902 scanned in FIG. 9A. The exemplary left image 906L includes a matrix of a plurality of left pixels LP0,0 through LPM,N on N left scan lines 910L. The exemplary right image 906R includes a matrix of a plurality of right pixels RP0,0 through RPM,N on N right scan lines 910R.
  • The feature A 902 scans into the left and right images 906L,906R at different horizontal pixel locations along respective scan lines 910L-A and 910R-A. From an edge (e.g., the left edge) of the left image, a left horizontal distance dl along the scan line 910L-A can be determined to the scanned location of the feature A 902. From a similar edge (e.g., the left edge) of the right image, a right horizontal distance dr along the scan line 910R-A can be determined to the scanned location of the feature A 902.
  • Ignoring issues of occlusion, the disparity DP of the feature A 902 between right and left images may be determined by the equation DP=dr−dl. Similarly, a disparity DPx,y for each pixel along scan lines in the right image 906R may be determined in comparison with pixels in corresponding scan lines in the left image 906L to form a disparity map. Alternatively, a disparity DPx,y for each pixel along scan lines in the left image 906L may be determined in comparison with pixels in corresponding scan lines in the right image 906R to form a disparity map. Typically a mixture of feature-based matching and interpolation is employed to provide a DPx,y for every single point in one image, relative to the other image, where interpolation is useful to match points with which no feature is clearly associated.
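The scan-line matching described above can be sketched as a simple sum-of-absolute-differences (SAD) block matcher. This is a minimal illustration of one standard technique, not the implementation described in the patent; it assumes rectified grayscale stereo images supplied as NumPy arrays, and all function and parameter names are illustrative:

```python
import numpy as np

def disparity_map(left, right, max_disp=16, block=5):
    """Estimate a per-pixel disparity for a rectified stereo pair by matching
    a small block around each left-image pixel against candidate blocks along
    the same scan line in the right image, using SAD as the matching cost.
    Uses the standard sign convention d = x_left - x_right."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best_d, best_cost = 0, np.inf
            # Search candidate matches along the corresponding scan line,
            # staying within the right image's left border.
            for d in range(0, min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                cost = np.abs(patch.astype(np.int32)
                              - cand.astype(np.int32)).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

As the text notes, production systems typically combine feature-based matching with interpolation so that weakly textured regions, where SAD costs are ambiguous, still receive a plausible disparity.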
  • Referring now to FIG. 9C, a matrix 950 of disparities DPx,y for each pixel in one image (right or left) forms the disparity map between right and left images. DP0,0 represents the disparity for one of the left pixel LP0,0 or right pixel RP0,0. Similarly, DPm,n represents the disparity for one of the left pixel LPm,n or right pixel RPm,n. Assuming that the left image is the base image, the disparity map for the right image and its pixels RP0,0 through RPm,n may be determined by the equation DPx,y = dr(RPx,y) − dl(LPx,y), that is, the right horizontal distance of each right pixel less the left horizontal distance of its corresponding left pixel.
  • A depth map is related to the disparity map by elementary geometric relationships. Given the optics of the viewer and/or endoscope, a depth map can be deduced from the disparity map. With the depth map and pixels of the left image 906L as the base image, most of the right image 906R may be generated, but for right-eye scenes that are occluded in the left eye.
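For the common rectified, parallel-camera model, that elementary geometric relationship is Z = f·B / d: depth equals focal length (in pixels) times the stereo baseline, divided by disparity. A minimal one-function sketch under that assumption; the units and parameter names are illustrative, not taken from the patent:

```python
def depth_from_disparity(disparity_px, focal_len_px, baseline_mm):
    """Depth of a point from its stereo disparity, assuming a rectified
    parallel stereo rig: Z = f * B / d.  A zero or negative disparity
    corresponds to a point at infinity (or a matching error), so it is
    mapped to +inf rather than dividing by zero."""
    if disparity_px <= 0:
        return float("inf")
    return focal_len_px * baseline_mm / disparity_px
```

The same relation runs in reverse: given a desired depth Z, the disparity needed to render a stereo telestration pixel there is d = f·B / Z.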
  • While the horizontal offset between the left and right telestration images may be used to set a depth of the stereo telestration image, a two-dimensional (“depth-less”) telestration mark or image may be “painted” onto a surgical site over a continuum of depths. That is, a telestration mark, drawing, or image may be drawn on top of one (e.g., the left) image of the stereo pair, and the artificial disparity in the other image (e.g., the right) of the stereo pair is created at a variety of depths, including different depths for different parts of the telestration mark. Digital image processing techniques may be applied to generate a continuum of depths for the stereo telestration image.
  • Referring now to FIG. 10, a side perspective view of a surgical site is shown to illustrate differences between a telestration mark having an apparent constant depth and a telestration mark having a depth continuum generated by a disparity map, such as the disparity map matrix 950, between left and right images of the surgical site.
  • A surface 1000 of tissue, for example, in a surgical site is captured by a camera from above the tissue and viewed in stereo by a stereo viewer. The surface 1000 is uneven, having varying surface characteristics that are viewed at differing depths in the field of view of the stereo viewer.
  • A mentor M generates a mono-view of a telestration mark 1002A using a two-dimensional input device. The mono-view telestration is transformed into a stereo view of left and right telestration images that are fused together and overlaid on the surface 1000 in the surgical site using a single horizontal offset value. Alternatively, the mentor M may generate a stereo view of the telestration mark using a three-dimensional input device, but it is constrained to be above the surface 1000. In either case, the telestration mark 1002A may appear to hover at an apparent constant depth over the varying surface 1000.
  • Instead of generating the telestration mark 1002A at a constant depth, a “painted” telestration mark 1002B may be generated that appears to be painted onto the varying surface 1000 over its depth continuum. The constant depth telestration mark 1002A may be generated using a single horizontal offset value and a mono-view telestration image as previously discussed with reference to FIGS. 6B-6C. In contrast, the “painted” telestration mark 1002B may be generated using the pixels of the mono-view telestration image and a disparity map with disparities for each pixel.
  • For example, assume the mono-view of the telestration image is directly coupled to the left image for viewing by a left eye of the operator. The disparity map is applied to the pixels of the left image to transform them into pixels for the right image. The transformed pixels of the right image are viewed by the right eye of the operator. As the disparity map was generated using each pixel, the right image can be generated on a pixel-by-pixel basis so that when viewed by a stereo viewer, the mark 1002B appears to be painted on top of the surface 1000.
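That per-pixel transfer can be sketched as follows. Like the text, this simplified illustration ignores occlusion; the mask and array names are assumptions for the example, not identifiers from the patent:

```python
import numpy as np

def paint_telestration_right(left_mark, disparity):
    """Transfer telestration pixels drawn on the left image into the right
    image by shifting each marked pixel along its scan line by the local
    disparity, so the fused stereo mark follows the surface's depth contour.

    left_mark : boolean mask of telestration pixels in the left image
    disparity : integer per-pixel disparity map, same shape as left_mark
    """
    h, w = left_mark.shape
    right_mark = np.zeros_like(left_mark)
    ys, xs = np.nonzero(left_mark)
    for y, x in zip(ys, xs):
        xr = x - int(disparity[y, x])  # horizontal shift by local disparity
        if 0 <= xr < w:                # drop pixels shifted out of frame
            right_mark[y, xr] = True
    return right_mark
```

Because each marked pixel is shifted by its own disparity value rather than by a single global offset, the resulting right-eye mark yields the "painted" appearance of mark 1002B instead of the constant-depth hover of mark 1002A.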
  • Visual feedback may be provided to show the difference between the placements of the constant depth telestration mark 1002A and the painted telestration mark 1002B. For example, the constant depth telestration mark 1002A may be viewed as a red color image in the stereo viewer and the “painted” telestration mark 1002B may be viewed as a blue color image on the surface 1000 in the stereo viewer.
  • As discussed previously, the horizontal offset between the left and right telestration images may be a function of one or more distances 801A-801C and one or more angles 802A-802C. Regardless of how the distances and angles are determined, it is desirable to determine the amount of horizontal offset between the left and right telestration images to represent a point in space as points in a stereo pair, such that the left and right telestration images fuse together and the operator O perceives the point as being at the appropriate depth, which in some cases is at the same apparent depth as the object of interest in the stereo pair image. It is advantageous to adjust the position of the telestration image so that the operator O can view a three-dimensional image on a stereo viewer with a telestration overlay, without being confused or distracted by a non-fused stereo telestration image.
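For the constant-depth case, the single horizontal offset is simply the disparity corresponding to the target depth. A minimal sketch under the same rectified parallel-camera assumption (the function and parameter names are illustrative):

```python
def fusion_offset_px(target_depth_mm, focal_len_px, baseline_mm):
    """Horizontal offset, in pixels, between the left and right copies of a
    telestration image so that the fused mark appears at target_depth_mm.
    This is the disparity-depth relation Z = f * B / d solved for d."""
    if target_depth_mm <= 0:
        raise ValueError("target depth must be positive")
    return focal_len_px * baseline_mm / target_depth_mm
```

An operator control such as the knob described earlier could then nudge this offset up or down, moving the fused mark nearer or farther until it sits at the apparent depth of the object of interest.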
  • While certain exemplary embodiments of the invention have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that the embodiments of the invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art. For example, elements of one embodiment of the invention may be swapped for or combined with elements of another embodiment of the invention. As a further example, the control knob 187,187′ to control the position of a left or right telestration image may be one or more of control buttons, keys, wheels, track ball, or other control input device. Rather, the embodiments of the invention should be construed according to the claims that follow below.

Claims (18)

1-16. (canceled)
17. A method for a robotic surgical system comprising:
receiving a mono-view of a telestration graphic;
generating a stereo-view of the telestration graphic from the mono-view of the telestration graphic; and
overlaying the stereo-view of the telestration graphic onto a stereo-view of a surgical site.
18. (canceled)
19. The method of claim 17, wherein the generating of the stereo-view of the telestration graphic includes determining a disparity map between pixels in left images and right images of the stereo view of the surgical site to position the stereo view of the telestration graphic at one or more depths in the stereo-view of the surgical site.
20. The method of claim 17, wherein the generating of the stereo-view of the telestration graphic includes determining a depth map between left images and right images of the stereo view of the surgical site to position the stereo view of the telestration graphic over a continuum of depths in the stereo-view of the surgical site.
21. The method of claim 17, wherein the overlaying of the stereo-view of the telestration graphic onto the stereo-view of the surgical site includes combining a left image of the telestration graphic with a left image of the surgical site, and combining a right image of the telestration graphic with a right image of the surgical site.
22. The method of claim 21, wherein the combining of the left and right images of the telestration graphic with the left and right images of the surgical site are by mixing in response to a signal.
23-32. (canceled)
33. A method for telestrating on a three-dimensional image of a surgical site, comprising:
receiving a telestration graphic input associated with one of a pair of stereoscopic images of a surgical site;
determining a corresponding telestration graphic input in the other of the pair of stereoscopic images using at least one of a disparity map and a depth map corresponding to the pair of stereoscopic images so that a three-dimensional view of the telestration graphic input is generated which is displayable as a contoured overlay to a three-dimensional view of the surgical site; and
displaying the three-dimensional view of the telestration graphic input as the contoured overlay to the three-dimensional view of the surgical site on a three-dimensional display.
34. The method according to claim 33, further comprising:
transmitting information for the one of the pair of stereoscopic images to a location prior to receiving the telestration graphic input from the location.
35. The method according to claim 34, wherein the location is a computer operated by a mentor.
36. The method according to claim 34, further comprising:
receiving information for the pair of stereoscopic images prior to transmitting the information for the one of the pair of stereoscopic images to the location.
37. The method according to claim 36, wherein the information for the pair of stereoscopic images is received from a stereoscopic endoscope.
38. The method according to claim 37, wherein the pair of stereoscopic images comprises corresponding right and left camera views.
39. The method according to claim 33, further comprising:
displaying the three-dimensional view of the telestration graphic input as a non-destructive graphics overlay to the three-dimensional view of the surgical site.
40. A medical robotic system providing three-dimensional telestration comprising:
a stereo endoscopic camera;
a console having a three-dimensional display for displaying stereo images captured by the stereo endoscopic camera;
a telestration generator receiving at least one of a pair of stereo images captured by the stereo endoscopic camera to generate a two-dimensional telestration graphic input relative to the received stereo image; and
a stereo telestration system configured to receive the two-dimensional telestration graphic input from the telestration generator, determine a corresponding two-dimensional telestration graphic input in the other of the pair of stereoscopic images using at least one of a disparity map and a depth map corresponding to the pair of stereoscopic images so that a three-dimensional view of the telestration graphic input is generated which is displayable as a contoured overlay to a three-dimensional view of the surgical site on the three-dimensional display.
41. The medical robotic system according to claim 40, wherein the telestration generator includes one or more of a digitizing tablet coupled to the stereo telestration system, a digitizing pen coupled to the digitizing tablet, a first keyboard coupled to the digitizing tablet, a computer coupled to the stereo telestration system, a three-dimensional input device coupled to the computer, a mouse coupled to the computer, and a second keyboard coupled to the computer.
42. The medical robotic system according to claim 41, wherein the two-dimensional telestration graphic input is generated from input provided by a mentor operating the telestration generator.
US12/941,038 2005-12-30 2010-11-06 Stereo telestration for robotic surgery Abandoned US20110050852A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/941,038 US20110050852A1 (en) 2005-12-30 2010-11-06 Stereo telestration for robotic surgery

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/322,866 US7907166B2 (en) 2005-12-30 2005-12-30 Stereo telestration for robotic surgery
US12/941,038 US20110050852A1 (en) 2005-12-30 2010-11-06 Stereo telestration for robotic surgery

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/322,866 Continuation US7907166B2 (en) 2005-12-30 2005-12-30 Stereo telestration for robotic surgery

Publications (1)

Publication Number Publication Date
US20110050852A1 true US20110050852A1 (en) 2011-03-03

Family

ID=38225434

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/322,866 Active 2029-12-27 US7907166B2 (en) 2005-12-30 2005-12-30 Stereo telestration for robotic surgery
US12/941,038 Abandoned US20110050852A1 (en) 2005-12-30 2010-11-06 Stereo telestration for robotic surgery

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/322,866 Active 2029-12-27 US7907166B2 (en) 2005-12-30 2005-12-30 Stereo telestration for robotic surgery

Country Status (1)

Country Link
US (2) US7907166B2 (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070144298A1 (en) * 2005-12-27 2007-06-28 Intuitive Surgical Inc. Constraint based control in a minimally invasive surgical apparatus
US20100164950A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Efficient 3-d telestration for local robotic proctoring
US20100194870A1 (en) * 2007-08-01 2010-08-05 Ovidiu Ghita Ultra-compact aperture controlled depth from defocus range sensor
US20100318099A1 (en) * 2009-06-16 2010-12-16 Intuitive Surgical, Inc. Virtual measurement tool for minimally invasive surgery
US20100331855A1 (en) * 2005-05-16 2010-12-30 Intuitive Surgical, Inc. Efficient Vision and Kinematic Data Fusion For Robotic Surgical Instruments and Other Applications
US20140016856A1 (en) * 2012-05-11 2014-01-16 Cornell University Prediction of successful grasps by end of arm tooling
US20140148818A1 (en) * 2011-08-04 2014-05-29 Olympus Corporation Surgical assistant system
US9026247B2 (en) 2011-03-30 2015-05-05 University of Washington through its Center for Communication Motion and video capture for tracking and evaluating robotic surgery and associated systems and methods
US9161772B2 (en) 2011-08-04 2015-10-20 Olympus Corporation Surgical instrument and medical manipulator
US9244523B2 (en) 2011-08-04 2016-01-26 Olympus Corporation Manipulator system
US9244524B2 (en) 2011-08-04 2016-01-26 Olympus Corporation Surgical instrument and control method thereof
US9423869B2 (en) 2011-08-04 2016-08-23 Olympus Corporation Operation support device
US9477301B2 (en) 2011-08-04 2016-10-25 Olympus Corporation Operation support device and assembly method thereof
US9492240B2 (en) 2009-06-16 2016-11-15 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US9519341B2 (en) 2011-08-04 2016-12-13 Olympus Corporation Medical manipulator and surgical support apparatus
US9524022B2 (en) 2011-08-04 2016-12-20 Olympus Corporation Medical equipment
US9568992B2 (en) 2011-08-04 2017-02-14 Olympus Corporation Medical manipulator
US9632577B2 (en) 2011-08-04 2017-04-25 Olympus Corporation Operation support device and control method thereof
US9632573B2 (en) 2011-08-04 2017-04-25 Olympus Corporation Medical manipulator and method of controlling the same
US9671860B2 (en) 2011-08-04 2017-06-06 Olympus Corporation Manipulation input device and manipulator system having the same
US9851782B2 (en) 2011-08-04 2017-12-26 Olympus Corporation Operation support device and attachment and detachment method thereof
WO2021149056A1 (en) * 2020-01-22 2021-07-29 Beyeonics Surgical Ltd. System and method for improved electronic assisted medical procedures
EP4344668A1 (en) * 2022-09-28 2024-04-03 Medicaroid Corporation Remote surgery support system

Families Citing this family (348)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6132368A (en) * 1996-12-12 2000-10-17 Intuitive Surgical, Inc. Multi-component telepresence system and method
US7666191B2 (en) 1996-12-12 2010-02-23 Intuitive Surgical, Inc. Robotic surgical system with sterile surgical adaptor
US8182469B2 (en) 1997-11-21 2012-05-22 Intuitive Surgical Operations, Inc. Surgical accessory clamp and method
US8529582B2 (en) 1996-12-12 2013-09-10 Intuitive Surgical Operations, Inc. Instrument interface of a robotic surgical system
US6331181B1 (en) * 1998-12-08 2001-12-18 Intuitive Surgical, Inc. Surgical robotic tools, data architecture, and use
US7727244B2 (en) 1997-11-21 2010-06-01 Intuitive Surgical Operations, Inc. Sterile surgical drape
US8206406B2 (en) 1996-12-12 2012-06-26 Intuitive Surgical Operations, Inc. Disposable sterile surgical adaptor
US20030061188A1 (en) * 1999-12-23 2003-03-27 Linus Wiebe General information management system
US9155544B2 (en) * 2002-03-20 2015-10-13 P Tech, Llc Robotic systems and methods
DE10349649B3 (en) * 2003-10-17 2005-05-19 Karl Storz Gmbh & Co. Kg A method and apparatus for generating an annotated image in a sterile work area of a medical facility
US9943372B2 (en) * 2005-04-18 2018-04-17 M.S.T. Medical Surgery Technologies Ltd. Device having a wearable interface for improving laparoscopic surgery and methods for use thereof
US10555775B2 (en) 2005-05-16 2020-02-11 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US8073528B2 (en) 2007-09-30 2011-12-06 Intuitive Surgical Operations, Inc. Tool tracking systems, methods and computer products for image guided surgery
US7907166B2 (en) * 2005-12-30 2011-03-15 Intuitive Surgical Operations, Inc. Stereo telestration for robotic surgery
US8219178B2 (en) 2007-02-16 2012-07-10 Catholic Healthcare West Method and system for performing invasive medical procedures using a surgical robot
US10893912B2 (en) 2006-02-16 2021-01-19 Globus Medical Inc. Surgical tool systems and methods
US10357184B2 (en) 2012-06-21 2019-07-23 Globus Medical, Inc. Surgical tool systems and method
US9782229B2 (en) 2007-02-16 2017-10-10 Globus Medical, Inc. Surgical robot platform
US10653497B2 (en) 2006-02-16 2020-05-19 Globus Medical, Inc. Surgical tool systems and methods
US8035685B2 (en) * 2007-07-30 2011-10-11 General Electric Company Systems and methods for communicating video data between a mobile imaging system and a fixed monitor system
US9043018B2 (en) * 2007-12-27 2015-05-26 Intuitive Surgical Operations, Inc. Medical device with orientable tip for robotically directed laser cutting and biomaterial application
EP2252231B1 (en) * 2008-03-11 2019-10-16 Health Research, INC. System and method for robotic surgery simulation
US8184880B2 (en) 2008-12-31 2012-05-22 Intuitive Surgical Operations, Inc. Robust sparse image matching for robotic surgery
US8430892B2 (en) * 2009-10-06 2013-04-30 Covidien Lp Surgical clip applier having a wireless clip counter
US8521331B2 (en) 2009-11-13 2013-08-27 Intuitive Surgical Operations, Inc. Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument
US8682489B2 (en) 2009-11-13 2014-03-25 Intuitive Surgical Operations, Inc. Method and system for hand control of a teleoperated minimally invasive slave surgical instrument
US8996173B2 (en) 2010-09-21 2015-03-31 Intuitive Surgical Operations, Inc. Method and apparatus for hand gesture control in a minimally invasive surgical system
US8543240B2 (en) 2009-11-13 2013-09-24 Intuitive Surgical Operations, Inc. Master finger tracking device and method of use in a minimally invasive surgical system
US8935003B2 (en) 2010-09-21 2015-01-13 Intuitive Surgical Operations Method and system for hand presence detection in a minimally invasive surgical system
WO2012044334A2 (en) 2009-11-13 2012-04-05 Intuitive Surgical Operations, Inc. Method and apparatus for hand gesture control in a minimally invasive surgical system
US9134534B2 (en) 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US9097891B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US8477425B2 (en) 2010-02-28 2013-07-02 Osterhout Group, Inc. See-through near-eye display glasses including a partially reflective, partially transmitting optical element
US10180572B2 (en) 2010-02-28 2019-01-15 Microsoft Technology Licensing, Llc AR glasses with event and user action control of external applications
US8488246B2 (en) 2010-02-28 2013-07-16 Osterhout Group, Inc. See-through near-eye display glasses including a curved polarizing film in the image source, a partially reflective, partially transmitting optical element and an optically flat film
US9341843B2 (en) 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
EP2539759A1 (en) 2010-02-28 2013-01-02 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US8472120B2 (en) 2010-02-28 2013-06-25 Osterhout Group, Inc. See-through near-eye display glasses with a small scale image source
US20150309316A1 (en) 2011-04-06 2015-10-29 Microsoft Technology Licensing, Llc Ar glasses with predictive control of external device based on event input
US9223134B2 (en) 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
US8467133B2 (en) 2010-02-28 2013-06-18 Osterhout Group, Inc. See-through display with an optical assembly including a wedge-shaped illumination system
US8482859B2 (en) 2010-02-28 2013-07-09 Osterhout Group, Inc. See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film
US9182596B2 (en) 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US9366862B2 (en) 2010-02-28 2016-06-14 Microsoft Technology Licensing, Llc System and method for delivering content to a group of see-through near eye display eyepieces
US9759917B2 (en) 2010-02-28 2017-09-12 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered AR eyepiece interface to external devices
US9097890B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc Grating in a light transmissive illumination system for see-through near-eye display glasses
US20120249797A1 (en) 2010-02-28 2012-10-04 Osterhout Group, Inc. Head-worn adaptive display
US9128281B2 (en) 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US9091851B2 (en) 2010-02-28 2015-07-28 Microsoft Technology Licensing, Llc Light control in head mounted displays
JP5530234B2 (en) * 2010-03-29 2014-06-25 オリンパス株式会社 Operation input device and manipulator system
JP5704833B2 (en) * 2010-05-10 2015-04-22 オリンパス株式会社 Operation input device and manipulator system
US8988504B2 (en) 2010-06-03 2015-03-24 Semiconductor Components Industries, Llc Imaging systems with integrated stereo imagers
US9699438B2 (en) 2010-07-02 2017-07-04 Disney Enterprises, Inc. 3D graphic insertion for live action stereoscopic video
FR2963693B1 (en) 2010-08-04 2013-05-03 Medtech PROCESS FOR AUTOMATED ACQUISITION AND ASSISTED ANATOMICAL SURFACES
US8781629B2 (en) * 2010-09-22 2014-07-15 Toyota Motor Engineering & Manufacturing North America, Inc. Human-robot interface apparatuses and methods of controlling robots
US9805625B2 (en) 2010-10-29 2017-10-31 KindHeart, Inc. Surgical simulation assembly
US9342997B2 (en) 2010-10-29 2016-05-17 The University Of North Carolina At Chapel Hill Modular staged reality simulator
US9486189B2 (en) 2010-12-02 2016-11-08 Hitachi Aloka Medical, Ltd. Assembly for use with surgery system
US11412998B2 (en) * 2011-02-10 2022-08-16 Karl Storz Imaging, Inc. Multi-source medical display
US10674968B2 (en) * 2011-02-10 2020-06-09 Karl Storz Imaging, Inc. Adjustable overlay patterns for medical display
US10631712B2 (en) * 2011-02-10 2020-04-28 Karl Storz Imaging, Inc. Surgeon's aid for medical display
US9308050B2 (en) 2011-04-01 2016-04-12 Ecole Polytechnique Federale De Lausanne (Epfl) Robotic system and method for spinal and other surgeries
US9204939B2 (en) 2011-08-21 2015-12-08 M.S.T. Medical Surgery Technologies Ltd. Device and method for assisting laparoscopic surgery—rule based approach
US11561762B2 (en) * 2011-08-21 2023-01-24 Asensus Surgical Europe S.A.R.L. Vocally actuated surgical control system
US9757206B2 (en) 2011-08-21 2017-09-12 M.S.T. Medical Surgery Technologies Ltd Device and method for assisting laparoscopic surgery—rule based approach
US10866783B2 (en) 2011-08-21 2020-12-15 Transenterix Europe S.A.R.L. Vocally activated surgical control system
US9795282B2 (en) 2011-09-20 2017-10-24 M.S.T. Medical Surgery Technologies Ltd Device and method for maneuvering endoscope
US9330477B2 (en) * 2011-09-22 2016-05-03 Digital Surgicals Pte. Ltd. Surgical stereo vision systems and methods for microsurgery
US9766441B2 (en) 2011-09-22 2017-09-19 Digital Surgicals Pte. Ltd. Surgical stereo vision systems and methods for microsurgery
FR2983059B1 (en) * 2011-11-30 2014-11-28 Medtech ROBOTIC-ASSISTED METHOD OF POSITIONING A SURGICAL INSTRUMENT IN RELATION TO THE BODY OF A PATIENT AND DEVICE FOR CARRYING OUT SAID METHOD
US9404871B2 (en) * 2012-01-09 2016-08-02 General Electric Company Method and system for steering an insertion tube of a video inspection device
US8663209B2 (en) 2012-01-24 2014-03-04 William Harrison Zurn Vessel clearing apparatus, devices and methods
US11871901B2 (en) 2012-05-20 2024-01-16 Cilag Gmbh International Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage
US11298196B2 (en) 2012-06-21 2022-04-12 Globus Medical Inc. Surgical robotic automation with tracking markers and controlled tool advancement
US11045267B2 (en) 2012-06-21 2021-06-29 Globus Medical, Inc. Surgical robotic automation with tracking markers
US11317971B2 (en) 2012-06-21 2022-05-03 Globus Medical, Inc. Systems and methods related to robotic guidance in surgery
US10624710B2 (en) 2012-06-21 2020-04-21 Globus Medical, Inc. System and method for measuring depth of instrumentation
US10350013B2 (en) 2012-06-21 2019-07-16 Globus Medical, Inc. Surgical tool systems and methods
US10231791B2 (en) 2012-06-21 2019-03-19 Globus Medical, Inc. Infrared signal based position recognition system for use with a robot-assisted surgery
US11607149B2 (en) 2012-06-21 2023-03-21 Globus Medical Inc. Surgical tool systems and method
US11253327B2 (en) 2012-06-21 2022-02-22 Globus Medical, Inc. Systems and methods for automatically changing an end-effector on a surgical robot
US11793570B2 (en) 2012-06-21 2023-10-24 Globus Medical Inc. Surgical robotic automation with tracking markers
US11116576B2 (en) 2012-06-21 2021-09-14 Globus Medical Inc. Dynamic reference arrays and methods of use
US10758315B2 (en) 2012-06-21 2020-09-01 Globus Medical Inc. Method and system for improving 2D-3D registration convergence
US11395706B2 (en) 2012-06-21 2022-07-26 Globus Medical Inc. Surgical robot platform
US11864839B2 (en) 2012-06-21 2024-01-09 Globus Medical Inc. Methods of adjusting a virtual implant and related surgical navigation systems
US11857266B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. System for a surveillance marker in robotic-assisted surgery
US10136954B2 (en) 2012-06-21 2018-11-27 Globus Medical, Inc. Surgical tool systems and method
US11864745B2 (en) 2012-06-21 2024-01-09 Globus Medical, Inc. Surgical robotic system with retractor
US11857149B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. Surgical robotic systems with target trajectory deviation monitoring and related methods
US11399900B2 (en) 2012-06-21 2022-08-02 Globus Medical, Inc. Robotic systems providing co-registration using natural fiducials and related methods
US9833207B2 (en) 2012-08-08 2017-12-05 William Harrison Zurn Analysis and clearing module, system and method
US10178368B2 (en) 2012-10-23 2019-01-08 Intuitive Surgical Operations, Inc. Stereo imaging system with automatic disparity adjustment for displaying close range objects
US20140176661A1 (en) 2012-12-21 2014-06-26 G. Anthony Reina System and method for surgical telementoring and training with virtualized telestration and haptic holograms, including metadata tagging, encapsulation and saving multi-modal streaming medical imagery together with multi-dimensional [4-d] virtual mesh and multi-sensory annotation in standard file formats used for digital imaging and communications in medicine (dicom)
WO2014110169A1 (en) * 2013-01-08 2014-07-17 Biocardia, Inc. Target site selection, entry and update with automatic remote image annotation
US9962533B2 (en) 2013-02-14 2018-05-08 William Harrison Zurn Module for treatment of medical conditions; system for making module and methods of making module
US9889568B2 (en) 2013-03-14 2018-02-13 Sri International Compact robotic wrist
US9948852B2 (en) 2013-03-15 2018-04-17 Intuitive Surgical Operations, Inc. Intelligent manual adjustment of an image control element
EP2967521B1 (en) 2013-03-15 2019-12-25 SRI International Electromechanical surgical system
US9675419B2 (en) 2013-08-21 2017-06-13 Brachium, Inc. System and method for automating medical procedures
JP6476658B2 (en) * 2013-09-11 2019-03-06 ソニー株式会社 Image processing apparatus and method
US9283048B2 (en) 2013-10-04 2016-03-15 KB Medical SA Apparatus and systems for precise guidance of surgical tools
CN110074844B (en) 2013-12-11 2023-02-17 柯惠Lp公司 Wrist assembly and jaw assembly for robotic surgical system
WO2015107099A1 (en) 2014-01-15 2015-07-23 KB Medical SA Notched apparatus for guidance of an insertable instrument along an axis during spinal surgery
US10039605B2 (en) 2014-02-11 2018-08-07 Globus Medical, Inc. Sterile handle for controlling a robotic surgical system from a sterile field
WO2015129473A1 (en) * 2014-02-28 2015-09-03 ソニー株式会社 Robot arm apparatus, calibration method, and program
US10561469B2 (en) * 2014-02-28 2020-02-18 Sony Corporation Robot arm apparatus and robot arm control method
US10334227B2 (en) 2014-03-28 2019-06-25 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging of surgical scenes from multiport perspectives
US10555788B2 (en) 2014-03-28 2020-02-11 Intuitive Surgical Operations, Inc. Surgical system with haptic feedback based upon quantitative three-dimensional imaging
JP6854237B2 (en) 2014-03-28 2021-04-07 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Quantitative 3D visualization of instruments in the field of view
EP3125806B1 (en) 2014-03-28 2023-06-14 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging of surgical scenes
CN110251047B (en) * 2014-03-28 2022-01-18 直观外科手术操作公司 Quantitative three-dimensional imaging and printing of surgical implants
US10004562B2 (en) 2014-04-24 2018-06-26 Globus Medical, Inc. Surgical instrument holder for use with a robotic surgical system
WO2015184146A1 (en) * 2014-05-30 2015-12-03 Sameh Mesallum Systems for automated biomechanical computerized surgery
CN107072673A (en) 2014-07-14 2017-08-18 KB Medical SA Anti-skid surgical instruments for preparing holes in bone tissue
USD779678S1 (en) 2014-07-24 2017-02-21 KindHeart, Inc. Surgical tray
US11504192B2 (en) 2014-10-30 2022-11-22 Cilag Gmbh International Method of hub communication with surgical instrument systems
KR20230059843A (en) * 2014-11-13 2023-05-03 Intuitive Surgical Operations, Inc. Integrated user environments
US10013808B2 (en) 2015-02-03 2018-07-03 Globus Medical, Inc. Surgeon head-mounted display apparatuses
US10835119B2 (en) 2015-02-05 2020-11-17 Duke University Compact telescope configurations for light scanning systems and methods of using the same
WO2016127088A1 (en) 2015-02-06 2016-08-11 Duke University Stereoscopic display systems and methods for displaying surgical data and information in a surgical microscope
US10555782B2 (en) 2015-02-18 2020-02-11 Globus Medical, Inc. Systems and methods for performing minimally invasive spinal surgery with a robotic surgical system using a percutaneous technique
CN107249498A (en) 2015-02-19 2017-10-13 Covidien LP Repositioning method for the input device of a robotic surgical system
WO2016144937A1 (en) 2015-03-10 2016-09-15 Covidien Lp Measuring health of a connector member of a robotic surgical system
US20160314711A1 (en) * 2015-04-27 2016-10-27 KindHeart, Inc. Telerobotic surgery system for remote surgeon training using robotic surgery station and remote surgeon station with display of actual animal tissue images and associated methods
JP6714618B2 (en) 2015-06-03 2020-06-24 Covidien LP Offset instrument drive
US10507068B2 (en) 2015-06-16 2019-12-17 Covidien Lp Robotic surgical system torque transduction sensing
US10779897B2 (en) 2015-06-23 2020-09-22 Covidien Lp Robotic surgical assemblies
EP3325233A1 (en) 2015-07-23 2018-05-30 SRI International Inc. Robotic arm and robotic surgical system
US10646298B2 (en) 2015-07-31 2020-05-12 Globus Medical, Inc. Robot arm and methods of use
US10058394B2 (en) 2015-07-31 2018-08-28 Globus Medical, Inc. Robot arm and methods of use
US10080615B2 (en) 2015-08-12 2018-09-25 Globus Medical, Inc. Devices and methods for temporary mounting of parts to bone
US10687905B2 (en) 2015-08-31 2020-06-23 KB Medical SA Robotic surgical systems and methods
WO2017044965A1 (en) * 2015-09-10 2017-03-16 Duke University Systems and methods for arbitrary viewpoint robotic manipulation and robotic surgical assistance
US10034716B2 (en) 2015-09-14 2018-07-31 Globus Medical, Inc. Surgical robotic systems and methods thereof
WO2017048922A1 (en) 2015-09-16 2017-03-23 KindHeart, Inc. Surgical simulation system and associated methods
WO2017053363A1 (en) 2015-09-25 2017-03-30 Covidien Lp Robotic surgical assemblies and instrument drive connectors thereof
US9771092B2 (en) 2015-10-13 2017-09-26 Globus Medical, Inc. Stabilizer wheel assembly and methods of use
US10912449B2 (en) 2015-10-23 2021-02-09 Covidien Lp Surgical system for detecting gradual changes in perfusion
US10828125B2 (en) * 2015-11-03 2020-11-10 Synaptive Medical (Barbados) Inc. Dual zoom and dual field-of-view microscope
WO2017087439A1 (en) 2015-11-19 2017-05-26 Covidien Lp Optical force sensor for robotic surgical system
CN105395295B (en) * 2015-11-24 2017-05-10 张海钟 Robot system for treating oral cavity and teeth
US10614555B2 (en) * 2016-01-13 2020-04-07 Sony Corporation Correction processing of a surgical site image
DE112016006299T5 (en) * 2016-01-25 2018-10-11 Sony Corporation Medical safety control device, medical safety control method and medical support system
US11883217B2 (en) 2016-02-03 2024-01-30 Globus Medical, Inc. Portable medical imaging system and method
US10117632B2 (en) 2016-02-03 2018-11-06 Globus Medical, Inc. Portable medical imaging system with beam scanning collimator
US10448910B2 (en) 2016-02-03 2019-10-22 Globus Medical, Inc. Portable medical imaging system
US10842453B2 (en) 2016-02-03 2020-11-24 Globus Medical, Inc. Portable medical imaging system
US11058378B2 (en) 2016-02-03 2021-07-13 Globus Medical, Inc. Portable medical imaging system
US10866119B2 (en) 2016-03-14 2020-12-15 Globus Medical, Inc. Metal detector for detecting insertion of a surgical device into a hollow tube
JP2017176318A (en) * 2016-03-29 2017-10-05 Sony Olympus Medical Solutions Inc. Medical three-dimensional observation device, medical three-dimensional observation method, program and medical three-dimensional observation system
JP2017176307A (en) * 2016-03-29 2017-10-05 Sony Corporation Control unit of medical support arm, control method of medical support arm device, and medical system
WO2017173524A1 (en) 2016-04-07 2017-10-12 Titan Medical Inc. Camera positioning method and apparatus for capturing images during a medical procedure
US10694939B2 (en) 2016-04-29 2020-06-30 Duke University Whole eye optical coherence tomography(OCT) imaging systems and related methods
CA3022149A1 (en) 2016-05-26 2017-11-30 Covidien Lp Robotic surgical assemblies
WO2017205576A1 (en) 2016-05-26 2017-11-30 Covidien Lp Instrument drive units
CN109275333B (en) 2016-06-03 2022-05-17 柯惠Lp公司 System, method and computer readable program product for controlling a robotic delivery manipulator
US11446099B2 (en) 2016-06-03 2022-09-20 Covidien Lp Control arm for robotic surgical systems
WO2017210074A1 (en) 2016-06-03 2017-12-07 Covidien Lp Passive axis system for robotic surgical systems
US11553984B2 (en) 2016-06-03 2023-01-17 Covidien Lp Robotic surgical system with an embedded imager
JP6847218B2 (en) 2016-11-28 2021-03-24 Verb Surgical Inc. Robotic surgical system to reduce unwanted vibrations
JP7233841B2 (en) 2017-01-18 2023-03-07 KB Medical SA Robotic Navigation for Robotic Surgical Systems
CA3048039A1 (en) 2017-02-15 2018-08-23 Covidien Lp System and apparatus for crush prevention for medical robot applications
US11071594B2 (en) 2017-03-16 2021-07-27 KB Medical SA Robotic navigation of robotic surgical systems
US10792119B2 (en) 2017-05-22 2020-10-06 Ethicon Llc Robotic arm cart and uses therefor
CN110650705B (en) 2017-05-24 2023-04-28 柯惠Lp公司 Presence detection of electrosurgical tools in robotic systems
JP7130003B2 (en) 2017-05-25 2022-09-02 Covidien LP Systems and methods for detection of objects within the field of view of an image capture device
EP3629983B1 (en) 2017-05-25 2023-06-28 Covidien LP Robotic surgical systems and drapes for covering components of robotic surgical systems
EP3629980A4 (en) 2017-05-25 2021-03-10 Covidien LP Robotic surgical system with automated guidance
US10856948B2 (en) 2017-05-31 2020-12-08 Verb Surgical Inc. Cart for robotic arms and method and apparatus for registering cart to surgical table
US10485623B2 (en) 2017-06-01 2019-11-26 Verb Surgical Inc. Robotic arm cart with fine position adjustment features and uses therefor
US10913145B2 (en) 2017-06-20 2021-02-09 Verb Surgical Inc. Cart for robotic arms and method and apparatus for cartridge or magazine loading of arms
US11141226B2 (en) * 2017-06-23 2021-10-12 Asensus Surgical Us, Inc. Method of graphically tagging and recalling identified structures under visualization for robotic surgery
US10675094B2 (en) 2017-07-21 2020-06-09 Globus Medical Inc. Robot surgical platform
JP7349992B2 (en) 2017-09-05 2023-09-25 Covidien LP Collision handling algorithms for robotic surgical systems
US11583358B2 (en) 2017-09-06 2023-02-21 Covidien Lp Boundary scaling of surgical robots
JP2020534050A (en) * 2017-09-06 2020-11-26 Covidien LP Systems, methods, and computer-readable media for providing stereoscopic perception notifications and/or recommendations during robotic surgical procedures
US11510741B2 (en) 2017-10-30 2022-11-29 Cilag Gmbh International Method for producing a surgical instrument comprising a smart electrical system
US11311342B2 (en) 2017-10-30 2022-04-26 Cilag Gmbh International Method for communicating with surgical instrument systems
US11911045B2 (en) 2017-10-30 2024-02-27 Cilag Gmbh International Method for operating a powered articulating multi-clip applier
US11026687B2 (en) 2017-10-30 2021-06-08 Cilag Gmbh International Clip applier comprising clip advancing systems
US11229436B2 (en) 2017-10-30 2022-01-25 Cilag Gmbh International Surgical system comprising a surgical tool and a surgical hub
US11129636B2 (en) 2017-10-30 2021-09-28 Cilag Gmbh International Surgical instruments comprising an articulation drive that provides for high articulation angles
US11291510B2 (en) 2017-10-30 2022-04-05 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11801098B2 (en) 2017-10-30 2023-10-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11317919B2 (en) 2017-10-30 2022-05-03 Cilag Gmbh International Clip applier comprising a clip crimping system
US11564756B2 (en) 2017-10-30 2023-01-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US10898252B2 (en) 2017-11-09 2021-01-26 Globus Medical, Inc. Surgical robotic systems for bending surgical rods, and related methods and devices
US11794338B2 (en) 2017-11-09 2023-10-24 Globus Medical Inc. Robotic rod benders and related mechanical and motor housings
US11382666B2 (en) 2017-11-09 2022-07-12 Globus Medical Inc. Methods providing bend plans for surgical rods and related controllers and computer program products
US11134862B2 (en) 2017-11-10 2021-10-05 Globus Medical, Inc. Methods of selecting surgical implants and related devices
US11832840B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical instrument having a flexible circuit
US11864728B2 (en) 2017-12-28 2024-01-09 Cilag Gmbh International Characterization of tissue irregularities through the use of mono-chromatic light refractivity
US11179175B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Controlling an ultrasonic surgical instrument according to tissue location
US11284936B2 (en) 2017-12-28 2022-03-29 Cilag Gmbh International Surgical instrument having a flexible electrode
US11253315B2 (en) 2017-12-28 2022-02-22 Cilag Gmbh International Increasing radio frequency to create pad-less monopolar loop
US11678881B2 (en) 2017-12-28 2023-06-20 Cilag Gmbh International Spatial awareness of surgical hubs in operating rooms
US11464559B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Estimating state of ultrasonic end effector and control system therefor
US20190200981A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Method of compressing tissue within a stapling device and simultaneously displaying the location of the tissue within the jaws
US11589888B2 (en) 2017-12-28 2023-02-28 Cilag Gmbh International Method for controlling smart energy devices
US11096693B2 (en) 2017-12-28 2021-08-24 Cilag Gmbh International Adjustment of staple height of at least one row of staples based on the sensed tissue thickness or force in closing
US11937769B2 (en) 2017-12-28 2024-03-26 Cilag Gmbh International Method of hub communication, processing, storage and display
US11571234B2 (en) 2017-12-28 2023-02-07 Cilag Gmbh International Temperature control of ultrasonic end effector and control system therefor
US11419667B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Ultrasonic energy device which varies pressure applied by clamp arm to provide threshold control pressure at a cut progression location
US11576677B2 (en) 2017-12-28 2023-02-14 Cilag Gmbh International Method of hub communication, processing, display, and cloud analytics
US11410259B2 (en) 2017-12-28 2022-08-09 Cilag Gmbh International Adaptive control program updates for surgical devices
US11202570B2 (en) 2017-12-28 2021-12-21 Cilag Gmbh International Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems
US11857152B2 (en) 2017-12-28 2024-01-02 Cilag Gmbh International Surgical hub spatial awareness to determine devices in operating theater
US11389164B2 (en) 2017-12-28 2022-07-19 Cilag Gmbh International Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices
US20190201118A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Display arrangements for robot-assisted surgical platforms
US11311306B2 (en) 2017-12-28 2022-04-26 Cilag Gmbh International Surgical systems for detecting end effector tissue distribution irregularities
US11529187B2 (en) 2017-12-28 2022-12-20 Cilag Gmbh International Surgical evacuation sensor arrangements
US11633237B2 (en) 2017-12-28 2023-04-25 Cilag Gmbh International Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures
US11786245B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Surgical systems with prioritized data transmission capabilities
US11446052B2 (en) 2017-12-28 2022-09-20 Cilag Gmbh International Variation of radio frequency and ultrasonic power level in cooperation with varying clamp arm pressure to achieve predefined heat flux or power applied to tissue
US11147607B2 (en) 2017-12-28 2021-10-19 Cilag Gmbh International Bipolar combination device that automatically adjusts pressure based on energy modality
US11304763B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Image capturing of the areas outside the abdomen to improve placement and control of a surgical device in use
US11612444B2 (en) 2017-12-28 2023-03-28 Cilag Gmbh International Adjustment of a surgical device function based on situational awareness
US20190201039A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Situational awareness of electrosurgical systems
US11659023B2 (en) * 2017-12-28 2023-05-23 Cilag Gmbh International Method of hub communication
US11376002B2 (en) 2017-12-28 2022-07-05 Cilag Gmbh International Surgical instrument cartridge sensor assemblies
US11291495B2 (en) 2017-12-28 2022-04-05 Cilag Gmbh International Interruption of energy due to inadvertent capacitive coupling
US11308075B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical network, instrument, and cloud responses based on validation of received dataset and authentication of its source and integrity
US11744604B2 (en) 2017-12-28 2023-09-05 Cilag Gmbh International Surgical instrument with a hardware-only control circuit
US11818052B2 (en) 2017-12-28 2023-11-14 Cilag Gmbh International Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US10758310B2 (en) 2017-12-28 2020-09-01 Ethicon Llc Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices
US11559308B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method for smart energy device infrastructure
US11786251B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11266468B2 (en) 2017-12-28 2022-03-08 Cilag Gmbh International Cooperative utilization of data derived from secondary sources by intelligent surgical hubs
US11364075B2 (en) 2017-12-28 2022-06-21 Cilag Gmbh International Radio frequency energy device for delivering combined electrical signals
US11026751B2 (en) 2017-12-28 2021-06-08 Cilag Gmbh International Display of alignment of staple cartridge to prior linear staple line
US11540855B2 (en) 2017-12-28 2023-01-03 Cilag Gmbh International Controlling activation of an ultrasonic surgical instrument according to the presence of tissue
US11278281B2 (en) 2017-12-28 2022-03-22 Cilag Gmbh International Interactive surgical system
US11166772B2 (en) 2017-12-28 2021-11-09 Cilag Gmbh International Surgical hub coordination of control and communication of operating room devices
US11179208B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Cloud-based medical analytics for security and authentication trends and reactive measures
US20190201139A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Communication arrangements for robot-assisted surgical platforms
US20190201146A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Safety systems for smart powered surgical stapling
US11832899B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical systems with autonomously adjustable control programs
US11304699B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11559307B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method of robotic hub communication, detection, and control
US11896443B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Control of a surgical system through a surgical barrier
US11257589B2 (en) 2017-12-28 2022-02-22 Cilag Gmbh International Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes
US11100631B2 (en) 2017-12-28 2021-08-24 Cilag Gmbh International Use of laser light and red-green-blue coloration to determine properties of back scattered light
US11602393B2 (en) 2017-12-28 2023-03-14 Cilag Gmbh International Surgical evacuation sensing and generator control
US11273001B2 (en) 2017-12-28 2022-03-15 Cilag Gmbh International Surgical hub and modular device response adjustment based on situational awareness
US11109866B2 (en) 2017-12-28 2021-09-07 Cilag Gmbh International Method for circular stapler control algorithm adjustment based on situational awareness
US11132462B2 (en) 2017-12-28 2021-09-28 Cilag Gmbh International Data stripping method to interrogate patient records and create anonymized record
US11432885B2 (en) 2017-12-28 2022-09-06 Cilag Gmbh International Sensing arrangements for robot-assisted surgical platforms
US11424027B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Method for operating surgical instrument systems
US11304745B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical evacuation sensing and display
US11234756B2 (en) * 2017-12-28 2022-02-01 Cilag Gmbh International Powered surgical tool with predefined adjustable control algorithm for controlling end effector parameter
US11324557B2 (en) 2017-12-28 2022-05-10 Cilag Gmbh International Surgical instrument with a sensing array
US10892995B2 (en) 2017-12-28 2021-01-12 Ethicon Llc Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US11317937B2 (en) 2018-03-08 2022-05-03 Cilag Gmbh International Determining the state of an ultrasonic end effector
US11903601B2 (en) 2017-12-28 2024-02-20 Cilag Gmbh International Surgical instrument comprising a plurality of drive systems
US11419630B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Surgical system distributed processing
US11160605B2 (en) 2017-12-28 2021-11-02 Cilag Gmbh International Surgical evacuation sensing and motor control
US11896322B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub
US11464535B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Detection of end effector emersion in liquid
US11304720B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Activation of energy devices
US11423007B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Adjustment of device control programs based on stratified contextual data in addition to the data
US11666331B2 (en) 2017-12-28 2023-06-06 Cilag Gmbh International Systems for detecting proximity of surgical end effector to cancerous tissue
CA3085476A1 (en) 2018-01-04 2019-07-11 Covidien Lp Systems and assemblies for mounting a surgical accessory to robotic surgical systems, and providing access therethrough
US11154375B2 (en) 2018-02-02 2021-10-26 Brachium, Inc. Medical robotic work station
US20190254753A1 (en) 2018-02-19 2019-08-22 Globus Medical, Inc. Augmented reality navigation systems for use with robotic surgical systems and methods of their use
US11189379B2 (en) 2018-03-06 2021-11-30 Digital Surgery Limited Methods and systems for using multiple data structures to process surgical data
US11389188B2 (en) 2018-03-08 2022-07-19 Cilag Gmbh International Start temperature of blade
CA3090181A1 (en) 2018-03-08 2019-09-12 Covidien Lp Surgical robotic systems
US11259830B2 (en) 2018-03-08 2022-03-01 Cilag Gmbh International Methods for controlling temperature in ultrasonic device
US11589915B2 (en) 2018-03-08 2023-02-28 Cilag Gmbh International In-the-jaw classifier based on a model
US11213294B2 (en) 2018-03-28 2022-01-04 Cilag Gmbh International Surgical instrument comprising co-operating lockout features
US11219453B2 (en) 2018-03-28 2022-01-11 Cilag Gmbh International Surgical stapling devices with cartridge compatible closure and firing lockout arrangements
US11259806B2 (en) 2018-03-28 2022-03-01 Cilag Gmbh International Surgical stapling devices with features for blocking advancement of a camming assembly of an incompatible cartridge installed therein
US11471156B2 (en) 2018-03-28 2022-10-18 Cilag Gmbh International Surgical stapling devices with improved rotary driven closure systems
US11090047B2 (en) 2018-03-28 2021-08-17 Cilag Gmbh International Surgical instrument comprising an adaptive control system
US11207067B2 (en) 2018-03-28 2021-12-28 Cilag Gmbh International Surgical stapling device with separate rotary driven closure and firing systems and firing member that engages both jaws while firing
US11278280B2 (en) 2018-03-28 2022-03-22 Cilag Gmbh International Surgical instrument comprising a jaw closure lockout
US10573023B2 (en) 2018-04-09 2020-02-25 Globus Medical, Inc. Predictive visualization of medical imaging scanner component movement
WO2019204012A1 (en) 2018-04-20 2019-10-24 Covidien Lp Compensation for observer movement in robotic surgical systems having stereoscopic displays
WO2019245867A1 (en) * 2018-06-19 2019-12-26 Tornier, Inc. Virtual guidance for orthopedic surgical procedures
WO2020009830A1 (en) 2018-07-03 2020-01-09 Covidien Lp Systems, methods, and computer-readable media for detecting image degradation during surgical procedures
US11197728B2 (en) 2018-09-17 2021-12-14 Auris Health, Inc. Systems and methods for concomitant medical procedures
US11689707B2 (en) * 2018-09-20 2023-06-27 Shoppertrak Rct Llc Techniques for calibrating a stereoscopic camera in a device
US11109746B2 (en) 2018-10-10 2021-09-07 Titan Medical Inc. Instrument insertion system, method, and apparatus for performing medical procedures
US11337742B2 (en) 2018-11-05 2022-05-24 Globus Medical Inc Compliant orthopedic driver
US11278360B2 (en) 2018-11-16 2022-03-22 Globus Medical, Inc. End-effectors for surgical robotic systems having sealed optical components
US11744655B2 (en) 2018-12-04 2023-09-05 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US11602402B2 (en) 2018-12-04 2023-03-14 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US11586106B2 (en) 2018-12-28 2023-02-21 Titan Medical Inc. Imaging apparatus having configurable stereoscopic perspective
US11717355B2 (en) 2019-01-29 2023-08-08 Covidien Lp Drive mechanisms for surgical instruments such as for use in robotic surgical systems
US11576733B2 (en) 2019-02-06 2023-02-14 Covidien Lp Robotic surgical assemblies including electrosurgical instruments having articulatable wrist assemblies
US11484372B2 (en) 2019-02-15 2022-11-01 Covidien Lp Articulation mechanisms for surgical instruments such as for use in robotic surgical systems
US11291444B2 (en) 2019-02-19 2022-04-05 Cilag Gmbh International Surgical stapling assembly with cartridge based retainer configured to unlock a closure lockout
US11464511B2 (en) 2019-02-19 2022-10-11 Cilag Gmbh International Surgical staple cartridges with movable authentication key arrangements
US11317915B2 (en) 2019-02-19 2022-05-03 Cilag Gmbh International Universal cartridge based key feature that unlocks multiple lockout arrangements in different surgical staplers
US11357503B2 (en) 2019-02-19 2022-06-14 Cilag Gmbh International Staple cartridge retainers with frangible retention features and methods of using same
US11369377B2 (en) 2019-02-19 2022-06-28 Cilag Gmbh International Surgical stapling assembly with cartridge based retainer configured to unlock a firing lockout
US11918313B2 (en) 2019-03-15 2024-03-05 Globus Medical Inc. Active end effectors for surgical robots
US20200297357A1 (en) 2019-03-22 2020-09-24 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11317978B2 (en) 2019-03-22 2022-05-03 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11571265B2 (en) 2019-03-22 2023-02-07 Globus Medical Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11806084B2 (en) 2019-03-22 2023-11-07 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11382549B2 (en) 2019-03-22 2022-07-12 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11419616B2 (en) 2019-03-22 2022-08-23 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11045179B2 (en) 2019-05-20 2021-06-29 Globus Medical Inc Robot-mounted retractor system
USD952144S1 (en) 2019-06-25 2022-05-17 Cilag Gmbh International Surgical staple cartridge retainer with firing system authentication key
USD950728S1 (en) 2019-06-25 2022-05-03 Cilag Gmbh International Surgical staple cartridge
USD964564S1 (en) 2019-06-25 2022-09-20 Cilag Gmbh International Surgical staple cartridge retainer with a closure system authentication key
US11628023B2 (en) 2019-07-10 2023-04-18 Globus Medical, Inc. Robotic navigational system for interbody implants
US11571171B2 (en) 2019-09-24 2023-02-07 Globus Medical, Inc. Compound curve cable chain
US11426178B2 (en) 2019-09-27 2022-08-30 Globus Medical Inc. Systems and methods for navigating a pin guide driver
US11864857B2 (en) 2019-09-27 2024-01-09 Globus Medical, Inc. Surgical robot with passive end effector
US11890066B2 (en) 2019-09-30 2024-02-06 Globus Medical, Inc Surgical robot with passive end effector
US11510684B2 (en) 2019-10-14 2022-11-29 Globus Medical, Inc. Rotary motion passive end effector for surgical robots in orthopedic surgeries
US20220409324A1 (en) * 2019-12-30 2022-12-29 Intuitive Surgical Operations, Inc. Systems and methods for telestration with spatial memory
US11382699B2 (en) 2020-02-10 2022-07-12 Globus Medical Inc. Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery
US11207150B2 (en) 2020-02-19 2021-12-28 Globus Medical, Inc. Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment
US11253216B2 (en) 2020-04-28 2022-02-22 Globus Medical Inc. Fixtures for fluoroscopic imaging systems and related navigation systems and methods
US11382700B2 (en) 2020-05-08 2022-07-12 Globus Medical Inc. Extended reality headset tool tracking and control
US11510750B2 (en) 2020-05-08 2022-11-29 Globus Medical, Inc. Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications
US11153555B1 (en) 2020-05-08 2021-10-19 Globus Medical Inc. Extended reality headset camera system for computer assisted navigation in surgery
US11317973B2 (en) 2020-06-09 2022-05-03 Globus Medical, Inc. Camera tracking bar for computer assisted navigation during surgery
US11382713B2 (en) 2020-06-16 2022-07-12 Globus Medical, Inc. Navigated surgical system with eye to XR headset display calibration
US11877807B2 (en) 2020-07-10 2024-01-23 Globus Medical, Inc Instruments for navigated orthopedic surgeries
USD963851S1 (en) 2020-07-10 2022-09-13 Covidien Lp Port apparatus
US11793588B2 (en) 2020-07-23 2023-10-24 Globus Medical, Inc. Sterile draping of robotic arms
US11737831B2 (en) 2020-09-02 2023-08-29 Globus Medical Inc. Surgical object tracking template generation for computer assisted navigation during surgical procedure
US11523785B2 (en) 2020-09-24 2022-12-13 Globus Medical, Inc. Increased cone beam computed tomography volume length without requiring stitching or longitudinal C-arm movement
US11911112B2 (en) 2020-10-27 2024-02-27 Globus Medical, Inc. Robotic navigational system
US11941814B2 (en) 2020-11-04 2024-03-26 Globus Medical Inc. Auto segmentation using 2-D images taken during 3-D imaging spin
USD1022197S1 (en) 2020-11-19 2024-04-09 Auris Health, Inc. Endoscope
US11717350B2 (en) 2020-11-24 2023-08-08 Globus Medical Inc. Methods for robotic assistance and navigation in spinal surgery and related systems
KR20230113360A (en) 2020-11-30 2023-07-28 Intuitive Surgical Operations, Inc. A system for providing composite indicators in a user interface for a robot-assisted system
EP4315824A1 (en) * 2021-03-29 2024-02-07 Alcon Inc. Stereoscopic imaging platform with continuous autofocusing mode
US11819302B2 (en) 2021-03-31 2023-11-21 Moon Surgical Sas Co-manipulation surgical system having user guided stage control
US11812938B2 (en) 2021-03-31 2023-11-14 Moon Surgical Sas Co-manipulation surgical system having a coupling mechanism removeably attachable to surgical instruments
US11832909B2 (en) 2021-03-31 2023-12-05 Moon Surgical Sas Co-manipulation surgical system having actuatable setup joints
AU2022247392A1 (en) 2021-03-31 2023-09-28 Moon Surgical Sas Co-manipulation surgical system for use with surgical instruments for performing laparoscopic surgery
US11844583B2 (en) 2021-03-31 2023-12-19 Moon Surgical Sas Co-manipulation surgical system having an instrument centering mode for automatic scope movements
US11948226B2 (en) 2021-05-28 2024-04-02 Covidien Lp Systems and methods for clinical workspace simulation
US11857273B2 (en) 2021-07-06 2024-01-02 Globus Medical, Inc. Ultrasonic robotic surgical navigation
US11439444B1 (en) 2021-07-22 2022-09-13 Globus Medical, Inc. Screw tower and rod reduction tool
CN114098993B (en) * 2021-11-24 2023-05-23 Chongqing Jinshan Medical Robot Co., Ltd. Master hand pitch information acquisition method
US11918304B2 (en) 2021-12-20 2024-03-05 Globus Medical, Inc Flat panel registration fixture and method of using same
US11832910B1 (en) 2023-01-09 2023-12-05 Moon Surgical Sas Co-manipulation surgical system having adaptive gravity compensation

Citations (88)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4219842A (en) * 1978-08-24 1980-08-26 The Magnavox Company Video signal combiner having a common phase response and independent amplitude response
US4603231A (en) * 1983-03-31 1986-07-29 Interand Corporation System for sensing spatial coordinates
US4614366A (en) * 1983-11-18 1986-09-30 Exactident, Inc. Nail identification wafer
US5217003A (en) * 1991-03-18 1993-06-08 Wilk Peter J Automated surgical system and apparatus
US5428192A (en) * 1993-05-10 1995-06-27 Ace Cad Enterprise Co., Ltd. Method and apparatus for finding the location of a pointing instrument on a tablet
US5432528A (en) * 1991-04-12 1995-07-11 Abekas Video Systems, Inc. Video combiner
US5468921A (en) * 1994-08-11 1995-11-21 Boeckeler Instruments, Inc. Extended communication cable for a light pen or the like
US5561708A (en) * 1991-10-03 1996-10-01 Viscorp Method and apparatus for interactive television through use of menu windows
US5577991A (en) * 1992-06-09 1996-11-26 Olympus Optical Co., Ltd. Three-dimensional vision endoscope with position adjustment means for imaging device and visual field mask
US5579057A (en) * 1993-06-07 1996-11-26 Scientific-Atlanta, Inc. Display system for selectively overlaying symbols and graphics onto a video signal
US5657095A (en) * 1993-06-14 1997-08-12 Pioneer Electronic Corporation System for Combining image signals
US5797900A (en) * 1996-05-20 1998-08-25 Intuitive Surgical, Inc. Wrist mechanism for surgical instrument for performing minimally invasive surgery with enhanced dexterity and sensitivity
US5808665A (en) * 1992-01-21 1998-09-15 Sri International Endoscopic surgical instrument and method for use
US5836869A (en) * 1994-12-13 1998-11-17 Olympus Optical Co., Ltd. Image tracking endoscope system
US5839441A (en) * 1996-06-03 1998-11-24 The Trustees Of The University Of Pennsylvania Marking tumors and solid objects in the body with ultrasound
US5855583A (en) * 1996-02-20 1999-01-05 Computer Motion, Inc. Method and apparatus for performing minimally invasive cardiac procedures
US5950629A (en) * 1991-06-13 1999-09-14 International Business Machines Corporation System for assisting a surgeon during surgery
US6014666A (en) * 1997-10-28 2000-01-11 Microsoft Corporation Declarative and programmatic access control of component-based server applications using roles
US6057833A (en) * 1997-04-07 2000-05-02 Shoreline Studios Method and apparatus for providing real time enhancements and animations over a video image
US6108458A (en) * 1996-07-03 2000-08-22 Massachusetts Institute Of Technology Sparse array image correlation
US6139490A (en) * 1996-02-22 2000-10-31 Precision Optics Corporation Stereoscopic endoscope with virtual reality viewing
US6201984B1 (en) * 1991-06-13 2001-03-13 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US6317868B1 (en) * 1997-10-24 2001-11-13 University Of Washington Process for transparently enforcing protection domains and access control as well as auditing operations in software components
US20020012460A1 (en) * 2000-06-26 2002-01-31 Kabushiki Kaisha Topcon Stereo image measuring device
US20020018064A1 (en) * 2000-07-11 2002-02-14 Itaru Hatanaka Apparatus and method for processing three-dimensional graphic images
US20020082612A1 (en) * 1998-11-20 2002-06-27 Intuitive Surgical, Inc. Arm cart for telerobotic surgical system
US6434416B1 (en) * 1998-11-10 2002-08-13 Olympus Optical Co., Ltd. Surgical microscope
US6459926B1 (en) * 1998-11-20 2002-10-01 Intuitive Surgical, Inc. Repositioning and reorientation of master/slave relationship in minimally invasive telesurgery
US6468265B1 (en) * 1998-11-20 2002-10-22 Intuitive Surgical, Inc. Performing cardiac surgery without cardioplegia
US20020157023A1 (en) * 2001-03-29 2002-10-24 Callahan John R. Layering enterprise application services using semantic firewalls
US6515659B1 (en) * 1998-05-27 2003-02-04 In-Three, Inc. Method and system for creating realistic smooth three-dimensional depth contours from two-dimensional images
US6522906B1 (en) * 1998-12-08 2003-02-18 Intuitive Surgical, Inc. Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure
US20030151809A1 (en) * 2002-02-12 2003-08-14 Susumu Takahashi Observation apparatus
US6612980B2 (en) * 1995-07-24 2003-09-02 Medical Media Systems Anatomical visualization system
US20030210812A1 (en) * 2002-02-26 2003-11-13 Ali Khamene Apparatus and method for surgical navigation
US20040002642A1 (en) * 2002-07-01 2004-01-01 Doron Dekel Video pose tracking system and method
US6678090B2 (en) * 1995-05-17 2004-01-13 Leica Microsystems Ag Microscope
US20040009459A1 (en) * 2002-05-06 2004-01-15 Anderson James H. Simulation system for medical procedures
US20040039485A1 (en) * 1999-04-07 2004-02-26 Intuitive Surgical, Inc. Camera referenced control in a minimally invasive surgical apparatus
US20040052333A1 (en) * 2002-09-13 2004-03-18 James Sayre Device and method for margin marking of radiography specimens
US6714841B1 (en) * 1995-09-15 2004-03-30 Computer Motion, Inc. Head cursor control interface for an automated endoscope system for optimal positioning
US6714839B2 (en) * 1998-12-08 2004-03-30 Intuitive Surgical, Inc. Master having redundant degrees of freedom
US6720988B1 (en) * 1998-12-08 2004-04-13 Intuitive Surgical, Inc. Stereo imaging system and method for use in telerobotic systems
US20040070615A1 (en) * 2002-05-31 2004-04-15 Ewing Richard E. Communicating medical information in a communication network
US6728884B1 (en) * 1999-10-01 2004-04-27 Entrust, Inc. Integrating heterogeneous authentication and authorization mechanisms into an application access control system
US6731988B1 (en) * 1992-01-21 2004-05-04 Sri International System and method for remote endoscopic surgery
US6741757B1 (en) * 2000-03-07 2004-05-25 Microsoft Corporation Feature correspondence between images using an image pyramid
US6791601B1 (en) * 1999-11-11 2004-09-14 Stryker Corporation Multi-function image and video capture device for use in an endoscopic camera system
US6799065B1 (en) * 1998-12-08 2004-09-28 Intuitive Surgical, Inc. Image shifting apparatus and method for a telerobotic system
US6817972B2 (en) * 1999-10-01 2004-11-16 Computer Motion, Inc. Heart stabilizer
US6856324B2 (en) * 2001-03-27 2005-02-15 Siemens Corporate Research, Inc. Augmented reality guided instrument positioning with guiding graphics
US6864886B1 (en) * 2000-08-10 2005-03-08 Sportvision, Inc. Enhancing video using a virtual surface
US6866671B2 (en) * 1996-12-12 2005-03-15 Intuitive Surgical, Inc. Surgical robotic tools, data architecture, and use
US20050154288A1 (en) * 1996-06-24 2005-07-14 Computer Motion, Inc. Method and apparatus for accessing medical data over a network
US20050179702A1 (en) * 2004-02-13 2005-08-18 Video Delta, Inc. Embedded video processing system
US20060013473A1 (en) * 1997-04-15 2006-01-19 Vulcan Patents Llc Data processing system and method
US20060087504A1 (en) * 1999-10-21 2006-04-27 Meier Kevin R Telestrator system
US20060100505A1 (en) * 2004-10-26 2006-05-11 Viswanathan Raju R Surgical navigation using a three-dimensional user interface
US20060142657A1 (en) * 2002-03-06 2006-06-29 Mako Surgical Corporation Haptic guidance system and method
US20060258938A1 (en) * 2005-05-16 2006-11-16 Intuitive Surgical Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US20070021738A1 (en) * 2005-06-06 2007-01-25 Intuitive Surgical Inc. Laparoscopic ultrasound robotic surgical system
US7194118B1 (en) * 2000-11-10 2007-03-20 Lucid, Inc. System for optically sectioning and mapping surgically excised tissue
US20070156017A1 (en) * 2005-12-30 2007-07-05 Intuitive Surgical Inc. Stereo telestration for robotic surgery
US20070167702A1 (en) * 2005-12-30 2007-07-19 Intuitive Surgical Inc. Medical robotic system providing three-dimensional telestration
US20070183041A1 (en) * 2006-02-09 2007-08-09 Northern Digital Inc. Retroreflective marker-tracking systems
US20070265527A1 (en) * 2006-05-11 2007-11-15 Richard Wohlgemuth Medical position determination using redundant position detection means and priority weighting for the position detection means
US20080004603A1 (en) * 2006-06-29 2008-01-03 Intuitive Surgical Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US20080033240A1 (en) * 2005-10-20 2008-02-07 Intuitive Surgical Inc. Auxiliary image display and manipulation on a computer display in a medical robotic system
US20080065109A1 (en) * 2006-06-13 2008-03-13 Intuitive Surgical, Inc. Preventing instrument/tissue collisions
US20080285724A1 (en) * 2007-05-05 2008-11-20 Ziehm Imaging Gmbh X-ray diagnostic imaging system with a plurality of coded markers
US20090036902A1 (en) * 2006-06-06 2009-02-05 Intuitive Surgical, Inc. Interactive user interfaces for robotic minimally invasive surgical systems
US20090046146A1 (en) * 2007-08-13 2009-02-19 Jonathan Hoyt Surgical communication and control system
US20090073170A1 (en) * 2004-10-26 2009-03-19 Koninklijke Philips Electronics, N.V. Disparity map
US20090088773A1 (en) * 2007-09-30 2009-04-02 Intuitive Surgical, Inc. Methods of locating and tracking robotic instruments in robotic surgical systems
US20090088634A1 (en) * 2007-09-30 2009-04-02 Intuitive Surgical, Inc. Tool tracking systems and methods for image guided surgery
US20090088897A1 (en) * 2007-09-30 2009-04-02 Intuitive Surgical, Inc. Methods and systems for robotic instrument tool tracking
US20090171332A1 (en) * 2007-12-27 2009-07-02 Intuitive Surgical, Inc. Medical device with orientable tip for robotically directed laser cutting and biomaterial application
US20090171371A1 (en) * 2007-12-26 2009-07-02 Intuitive Surgical, Inc. Medical robotic system with functionality to determine and display a distance indicated by movement of a tool robotically manipulated by an operator
US7563228B2 (en) * 2005-01-24 2009-07-21 Siemens Medical Solutions Usa, Inc. Stereoscopic three or four dimensional ultrasound imaging
US20090192524A1 (en) * 2006-06-29 2009-07-30 Intuitive Surgical, Inc. Synthetic representation of a surgical robot
US20090248036A1 (en) * 2008-03-28 2009-10-01 Intuitive Surgical, Inc. Controlling a robotic surgical tool with a display monitor
US20090268010A1 (en) * 2008-04-26 2009-10-29 Intuitive Surgical, Inc. Augmented stereoscopic visualization for a surgical robot using a captured fluorescence image and captured stereoscopic visible images
US20100074525A1 (en) * 2008-09-23 2010-03-25 Tal Drory Manipulating an Image by Applying a De-Identification Process
US20100164950A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Efficient 3-d telestration for local robotic proctoring
US20100168763A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Configuration marker design and detection for instrument tracking
US20100168562A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Fiducial marker design and detection for locating surgical instrument in images
US20100245541A1 (en) * 2009-03-31 2010-09-30 Intuitive Surgical, Inc. Targets, fixtures, and workflows for calibrating an endoscopic camera
US8184880B2 (en) * 2008-12-31 2012-05-22 Intuitive Surgical Operations, Inc. Robust sparse image matching for robotic surgery

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5175616A (en) * 1989-08-04 1992-12-29 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of National Defence Of Canada Stereoscopic video-graphic coordinate specification system
US5583536A (en) * 1994-06-09 1996-12-10 Intel Corporation Method and apparatus for analog video merging and key detection
US6159016A (en) * 1996-12-20 2000-12-12 Lubell; Alan Method and system for producing personal golf lesson video
ES2180110T3 (en) * 1997-11-24 2003-02-01 Weiglhofer Gerhard Coherence detector
GB0222265D0 (en) 2002-09-25 2002-10-30 Imp College Innovations Ltd Control of robotic manipulation
JP2004309930A (en) * 2003-04-09 2004-11-04 Olympus Corp Stereoscopic observation system
DE10349649B3 (en) 2003-10-17 2005-05-19 Karl Storz Gmbh & Co. Kg A method and apparatus for generating an annotated image in a sterile work area of a medical facility
US7540866B2 (en) 2004-06-04 2009-06-02 Stereotaxis, Inc. User interface for remote control of medical devices

Patent Citations (100)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4219842A (en) * 1978-08-24 1980-08-26 The Magnavox Company Video signal combiner having a common phase response and independent amplitude response
US4603231A (en) * 1983-03-31 1986-07-29 Interand Corporation System for sensing spatial coordinates
US4614366A (en) * 1983-11-18 1986-09-30 Exactident, Inc. Nail identification wafer
US5217003A (en) * 1991-03-18 1993-06-08 Wilk Peter J Automated surgical system and apparatus
US5432528A (en) * 1991-04-12 1995-07-11 Abekas Video Systems, Inc. Video combiner
US5950629A (en) * 1991-06-13 1999-09-14 International Business Machines Corporation System for assisting a surgeon during surgery
US6201984B1 (en) * 1991-06-13 2001-03-13 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US5561708A (en) * 1991-10-03 1996-10-01 Viscorp Method and apparatus for interactive television through use of menu windows
US20020058929A1 (en) * 1992-01-21 2002-05-16 Green Philip S. Roll pitch roll tool
US5808665A (en) * 1992-01-21 1998-09-15 Sri International Endoscopic surgical instrument and method for use
US6731988B1 (en) * 1992-01-21 2004-05-04 Sri International System and method for remote endoscopic surgery
US5577991A (en) * 1992-06-09 1996-11-26 Olympus Optical Co., Ltd. Three-dimensional vision endoscope with position adjustment means for imaging device and visual field mask
US5428192A (en) * 1993-05-10 1995-06-27 Ace Cad Enterprise Co., Ltd. Method and apparatus for finding the location of a pointing instrument on a tablet
US5579057A (en) * 1993-06-07 1996-11-26 Scientific-Atlanta, Inc. Display system for selectively overlaying symbols and graphics onto a video signal
US5657095A (en) * 1993-06-14 1997-08-12 Pioneer Electronic Corporation System for combining image signals
US5468921A (en) * 1994-08-11 1995-11-21 Boeckeler Instruments, Inc. Extended communication cable for a light pen or the like
US5836869A (en) * 1994-12-13 1998-11-17 Olympus Optical Co., Ltd. Image tracking endoscope system
US6678090B2 (en) * 1995-05-17 2004-01-13 Leica Microsystems Ag Microscope
US6612980B2 (en) * 1995-07-24 2003-09-02 Medical Media Systems Anatomical visualization system
US6714841B1 (en) * 1995-09-15 2004-03-30 Computer Motion, Inc. Head cursor control interface for an automated endoscope system for optimal positioning
US5855583A (en) * 1996-02-20 1999-01-05 Computer Motion, Inc. Method and apparatus for performing minimally invasive cardiac procedures
US6139490A (en) * 1996-02-22 2000-10-31 Precision Optics Corporation Stereoscopic endoscope with virtual reality viewing
US5797900A (en) * 1996-05-20 1998-08-25 Intuitive Surgical, Inc. Wrist mechanism for surgical instrument for performing minimally invasive surgery with enhanced dexterity and sensitivity
US5839441A (en) * 1996-06-03 1998-11-24 The Trustees Of The University Of Pennsylvania Marking tumors and solid objects in the body with ultrasound
US20050154288A1 (en) * 1996-06-24 2005-07-14 Computer Motion, Inc. Method and apparatus for accessing medical data over a network
US6108458A (en) * 1996-07-03 2000-08-22 Massachusetts Institute Of Technology Sparse array image correlation
US6866671B2 (en) * 1996-12-12 2005-03-15 Intuitive Surgical, Inc. Surgical robotic tools, data architecture, and use
US6057833A (en) * 1997-04-07 2000-05-02 Shoreline Studios Method and apparatus for providing real time enhancements and animations over a video image
US20060013473A1 (en) * 1997-04-15 2006-01-19 Vulcan Patents Llc Data processing system and method
US6317868B1 (en) * 1997-10-24 2001-11-13 University Of Washington Process for transparently enforcing protection domains and access control as well as auditing operations in software components
US6014666A (en) * 1997-10-28 2000-01-11 Microsoft Corporation Declarative and programmatic access control of component-based server applications using roles
US6515659B1 (en) * 1998-05-27 2003-02-04 In-Three, Inc. Method and system for creating realistic smooth three-dimensional depth contours from two-dimensional images
US6434416B1 (en) * 1998-11-10 2002-08-13 Olympus Optical Co., Ltd. Surgical microscope
US6459926B1 (en) * 1998-11-20 2002-10-01 Intuitive Surgical, Inc. Repositioning and reorientation of master/slave relationship in minimally invasive telesurgery
US6468265B1 (en) * 1998-11-20 2002-10-22 Intuitive Surgical, Inc. Performing cardiac surgery without cardioplegia
US20020082612A1 (en) * 1998-11-20 2002-06-27 Intuitive Surgical, Inc. Arm cart for telerobotic surgical system
US20030216715A1 (en) * 1998-11-20 2003-11-20 Intuitive Surgical, Inc. Cooperative minimally invasive telesurgical system
US6837883B2 (en) * 1998-11-20 2005-01-04 Intuitive Surgical, Inc. Arm cart for telerobotic surgical system
US20060241414A1 (en) * 1998-11-20 2006-10-26 Intuitive Surgical Inc. Repositioning and reorientation of master/slave relationship in minimally invasive telesurgery
US20070038080A1 (en) * 1998-12-08 2007-02-15 Intuitive Surgical Inc. Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure
US7277120B2 (en) * 1998-12-08 2007-10-02 Intuitive Surgical, Inc Stereo imaging system and method for use in telerobotic systems
US6522906B1 (en) * 1998-12-08 2003-02-18 Intuitive Surgical, Inc. Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure
US6714839B2 (en) * 1998-12-08 2004-03-30 Intuitive Surgical, Inc. Master having redundant degrees of freedom
US6720988B1 (en) * 1998-12-08 2004-04-13 Intuitive Surgical, Inc. Stereo imaging system and method for use in telerobotic systems
US6799065B1 (en) * 1998-12-08 2004-09-28 Intuitive Surgical, Inc. Image shifting apparatus and method for a telerobotic system
US20040039485A1 (en) * 1999-04-07 2004-02-26 Intuitive Surgical, Inc. Camera referenced control in a minimally invasive surgical apparatus
US6817972B2 (en) * 1999-10-01 2004-11-16 Computer Motion, Inc. Heart stabilizer
US6728884B1 (en) * 1999-10-01 2004-04-27 Entrust, Inc. Integrating heterogeneous authentication and authorization mechanisms into an application access control system
US20060087504A1 (en) * 1999-10-21 2006-04-27 Meier Kevin R Telestrator system
US7075556B1 (en) * 1999-10-21 2006-07-11 Sportvision, Inc. Telestrator system
US6791601B1 (en) * 1999-11-11 2004-09-14 Stryker Corporation Multi-function image and video capture device for use in an endoscopic camera system
US6741757B1 (en) * 2000-03-07 2004-05-25 Microsoft Corporation Feature correspondence between images using an image pyramid
US20020012460A1 (en) * 2000-06-26 2002-01-31 Kabushiki Kaisha Topcon Stereo image measuring device
US20020018064A1 (en) * 2000-07-11 2002-02-14 Itaru Hatanaka Apparatus and method for processing three-dimensional graphic images
US6864886B1 (en) * 2000-08-10 2005-03-08 Sportvision, Inc. Enhancing video using a virtual surface
US7194118B1 (en) * 2000-11-10 2007-03-20 Lucid, Inc. System for optically sectioning and mapping surgically excised tissue
US6856324B2 (en) * 2001-03-27 2005-02-15 Siemens Corporate Research, Inc. Augmented reality guided instrument positioning with guiding graphics
US20020157023A1 (en) * 2001-03-29 2002-10-24 Callahan John R. Layering enterprise application services using semantic firewalls
US20030151809A1 (en) * 2002-02-12 2003-08-14 Susumu Takahashi Observation apparatus
US20030210812A1 (en) * 2002-02-26 2003-11-13 Ali Khamene Apparatus and method for surgical navigation
US20060142657A1 (en) * 2002-03-06 2006-06-29 Mako Surgical Corporation Haptic guidance system and method
US20040009459A1 (en) * 2002-05-06 2004-01-15 Anderson James H. Simulation system for medical procedures
US20040070615A1 (en) * 2002-05-31 2004-04-15 Ewing Richard E. Communicating medical information in a communication network
US20040002642A1 (en) * 2002-07-01 2004-01-01 Doron Dekel Video pose tracking system and method
US20040052333A1 (en) * 2002-09-13 2004-03-18 James Sayre Device and method for margin marking of radiography specimens
US20050179702A1 (en) * 2004-02-13 2005-08-18 Video Delta, Inc. Embedded video processing system
US20060100505A1 (en) * 2004-10-26 2006-05-11 Viswanathan Raju R Surgical navigation using a three-dimensional user interface
US20090073170A1 (en) * 2004-10-26 2009-03-19 Koninklijke Philips Electronics, N.V. Disparity map
US7563228B2 (en) * 2005-01-24 2009-07-21 Siemens Medical Solutions Usa, Inc. Stereoscopic three or four dimensional ultrasound imaging
US20060258938A1 (en) * 2005-05-16 2006-11-16 Intuitive Surgical Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US20070021738A1 (en) * 2005-06-06 2007-01-25 Intuitive Surgical Inc. Laparoscopic ultrasound robotic surgical system
US20080033240A1 (en) * 2005-10-20 2008-02-07 Intuitive Surgical Inc. Auxiliary image display and manipulation on a computer display in a medical robotic system
US20070167702A1 (en) * 2005-12-30 2007-07-19 Intuitive Surgical Inc. Medical robotic system providing three-dimensional telestration
US20070156017A1 (en) * 2005-12-30 2007-07-05 Intuitive Surgical Inc. Stereo telestration for robotic surgery
US7907166B2 (en) * 2005-12-30 2011-03-15 Intuitive Surgical Operations, Inc. Stereo telestration for robotic surgery
US20070183041A1 (en) * 2006-02-09 2007-08-09 Northern Digital Inc. Retroreflective marker-tracking systems
US20070265527A1 (en) * 2006-05-11 2007-11-15 Richard Wohlgemuth Medical position determination using redundant position detection means and priority weighting for the position detection means
US20090036902A1 (en) * 2006-06-06 2009-02-05 Intuitive Surgical, Inc. Interactive user interfaces for robotic minimally invasive surgical systems
US20080065109A1 (en) * 2006-06-13 2008-03-13 Intuitive Surgical, Inc. Preventing instrument/tissue collisions
US20080004603A1 (en) * 2006-06-29 2008-01-03 Intuitive Surgical Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US20090192524A1 (en) * 2006-06-29 2009-07-30 Intuitive Surgical, Inc. Synthetic representation of a surgical robot
US20080285724A1 (en) * 2007-05-05 2008-11-20 Ziehm Imaging Gmbh X-ray diagnostic imaging system with a plurality of coded markers
US20090046146A1 (en) * 2007-08-13 2009-02-19 Jonathan Hoyt Surgical communication and control system
US20090088773A1 (en) * 2007-09-30 2009-04-02 Intuitive Surgical, Inc. Methods of locating and tracking robotic instruments in robotic surgical systems
US20090088634A1 (en) * 2007-09-30 2009-04-02 Intuitive Surgical, Inc. Tool tracking systems and methods for image guided surgery
US20090088897A1 (en) * 2007-09-30 2009-04-02 Intuitive Surgical, Inc. Methods and systems for robotic instrument tool tracking
US20090171371A1 (en) * 2007-12-26 2009-07-02 Intuitive Surgical, Inc. Medical robotic system with functionality to determine and display a distance indicated by movement of a tool robotically manipulated by an operator
US20090171332A1 (en) * 2007-12-27 2009-07-02 Intuitive Surgical, Inc. Medical device with orientable tip for robotically directed laser cutting and biomaterial application
US20090248036A1 (en) * 2008-03-28 2009-10-01 Intuitive Surgical, Inc. Controlling a robotic surgical tool with a display monitor
US20090268015A1 (en) * 2008-04-26 2009-10-29 Intuitive Surgical, Inc. Augmented stereoscopic visualization for a surgical robot
US20090268012A1 (en) * 2008-04-26 2009-10-29 Intuitive Surgical, Inc. Augmented stereoscopic visualization for a surgical robot using a captured visible image combined with a fluorescence image and a captured visible image
US20090268011A1 (en) * 2008-04-26 2009-10-29 Intuitive Surgical, Inc. Augmented stereoscopic visualization for a surgical robot using a camera unit with a modified prism
US20090270678A1 (en) * 2008-04-26 2009-10-29 Intuitive Surgical, Inc. Augmented stereoscopic visualization for a surgical robot using time duplexing
US20090268010A1 (en) * 2008-04-26 2009-10-29 Intuitive Surgical, Inc. Augmented stereoscopic visualization for a surgical robot using a captured fluorescence image and captured stereoscopic visible images
US20100074525A1 (en) * 2008-09-23 2010-03-25 Tal Drory Manipulating an Image by Applying a De-Identification Process
US20100164950A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Efficient 3-d telestration for local robotic proctoring
US20100168763A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Configuration marker design and detection for instrument tracking
US20100168562A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Fiducial marker design and detection for locating surgical instrument in images
US8184880B2 (en) * 2008-12-31 2012-05-22 Intuitive Surgical Operations, Inc. Robust sparse image matching for robotic surgery
US20100245541A1 (en) * 2009-03-31 2010-09-30 Intuitive Surgical, Inc. Targets, fixtures, and workflows for calibrating an endoscopic camera

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100331855A1 (en) * 2005-05-16 2010-12-30 Intuitive Surgical, Inc. Efficient Vision and Kinematic Data Fusion For Robotic Surgical Instruments and Other Applications
US8971597B2 (en) 2005-05-16 2015-03-03 Intuitive Surgical Operations, Inc. Efficient vision and kinematic data fusion for robotic surgical instruments and other applications
US9266239B2 (en) 2005-12-27 2016-02-23 Intuitive Surgical Operations, Inc. Constraint based control in a minimally invasive surgical apparatus
US20070144298A1 (en) * 2005-12-27 2007-06-28 Intuitive Surgical Inc. Constraint based control in a minimally invasive surgical apparatus
US10159535B2 (en) 2005-12-27 2018-12-25 Intuitive Surgical Operations, Inc. Constraint based control in a minimally invasive surgical apparatus
US20100194870A1 (en) * 2007-08-01 2010-08-05 Ovidiu Ghita Ultra-compact aperture controlled depth from defocus range sensor
US9402690B2 (en) 2008-12-31 2016-08-02 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local and remote robotic proctoring
US8830224B2 (en) 2008-12-31 2014-09-09 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local robotic proctoring
US20100164950A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Efficient 3-d telestration for local robotic proctoring
US9492240B2 (en) 2009-06-16 2016-11-15 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US20100318099A1 (en) * 2009-06-16 2010-12-16 Intuitive Surgical, Inc. Virtual measurement tool for minimally invasive surgery
US9155592B2 (en) 2009-06-16 2015-10-13 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US9026247B2 (en) 2011-03-30 2015-05-05 University of Washington Through Its Center for Commercialization Motion and video capture for tracking and evaluating robotic surgery and associated systems and methods
US9244524B2 (en) 2011-08-04 2016-01-26 Olympus Corporation Surgical instrument and control method thereof
US20140148818A1 (en) * 2011-08-04 2014-05-29 Olympus Corporation Surgical assistant system
US9851782B2 (en) 2011-08-04 2017-12-26 Olympus Corporation Operation support device and attachment and detachment method thereof
US9218053B2 (en) * 2011-08-04 2015-12-22 Olympus Corporation Surgical assistant system
US9161772B2 (en) 2011-08-04 2015-10-20 Olympus Corporation Surgical instrument and medical manipulator
US9423869B2 (en) 2011-08-04 2016-08-23 Olympus Corporation Operation support device
US9477301B2 (en) 2011-08-04 2016-10-25 Olympus Corporation Operation support device and assembly method thereof
US9244523B2 (en) 2011-08-04 2016-01-26 Olympus Corporation Manipulator system
US9519341B2 (en) 2011-08-04 2016-12-13 Olympus Corporation Medical manipulator and surgical support apparatus
US9524022B2 (en) 2011-08-04 2016-12-20 Olympus Corporation Medical equipment
US9568992B2 (en) 2011-08-04 2017-02-14 Olympus Corporation Medical manipulator
US9632577B2 (en) 2011-08-04 2017-04-25 Olympus Corporation Operation support device and control method thereof
US9632573B2 (en) 2011-08-04 2017-04-25 Olympus Corporation Medical manipulator and method of controlling the same
US9671860B2 (en) 2011-08-04 2017-06-06 Olympus Corporation Manipulation input device and manipulator system having the same
US9098913B2 (en) * 2012-05-11 2015-08-04 Cornell University Prediction of successful grasps by end of arm tooling
US20140016856A1 (en) * 2012-05-11 2014-01-16 Cornell University Prediction of successful grasps by end of arm tooling
WO2021149056A1 (en) * 2020-01-22 2021-07-29 Beyeonics Surgical Ltd. System and method for improved electronic assisted medical procedures
IL294987B1 (en) * 2020-01-22 2023-09-01 Beyeonics Surgical Ltd System and method for improved electronic assisted medical procedures
US11931292B2 (en) * 2020-01-22 2024-03-19 Beyeonics Surgical Ltd. System and method for improved electronic assisted medical procedures
EP4344668A1 (en) * 2022-09-28 2024-04-03 Medicaroid Corporation Remote surgery support system

Also Published As

Publication number Publication date
US20070156017A1 (en) 2007-07-05
US7907166B2 (en) 2011-03-15

Similar Documents

Publication Publication Date Title
US7907166B2 (en) Stereo telestration for robotic surgery
US11076748B2 (en) Display monitor control of a telesurgical tool
US10432921B2 (en) Automated panning in robotic surgical systems based on tool tracking
US6919867B2 (en) Method and apparatus for augmented reality visualization
US10622111B2 (en) System and method for image registration of multiple video streams
EP3107286B1 (en) Medical robotic system providing three-dimensional telestration
JP4172816B2 (en) Remote operation method and system with a sense of reality
US9766441B2 (en) Surgical stereo vision systems and methods for microsurgery
US8504136B1 (en) See-through abdomen display for minimally invasive surgery
EP2903551B1 (en) Digital system for surgical video capturing and display
US20050090730A1 (en) Stereoscopic video magnification and navigation system
Breedveld et al. Observation in laparoscopic surgery: overview of impeding effects and supporting aids
EP1705513A1 (en) System for the stereoscopic viewing of real-time or static images
WO2003002011A1 (en) Stereoscopic video magnification and navigation system
EP4221581A1 (en) Auto-navigating digital surgical microscope
Breedveld et al. Eye-Hand Coordination in Laparoscopy

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION