US20120316486A1 - Surgical Component Navigation Systems And Methods - Google Patents
- Publication number: US20120316486A1
- Authority: US (United States)
- Legal status: Abandoned
Classifications
- A61F13/05
- A61C1/082 — Positioning or guiding, e.g. of drills (dental machines)
- A61C1/084 — Positioning or guiding of implanting tools
- A61B34/20 — Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2051 — Electromagnetic tracking systems
- A61B2034/2055 — Optical tracking systems
- G01S17/06 — Systems using reflection of electromagnetic waves other than radio waves, determining position data of a target
- G01S5/0294 — Trajectory determination or predictive filtering, e.g. target tracking or Kalman filtering
- G02B27/017 — Head-up displays, head mounted
- A61F2013/0017 — Wound bandages with the possibility of applying fluid
Definitions
- The present general inventive concept relates generally to navigation of surgical components and, more particularly, to systems and methods that assist a surgeon in navigating anatomical regions of a patient to properly position and locate surgical components, adjuncts, surgical guides, goggles, dressings, instruments, and other surgical components before, during, and after injury or surgery, and for navigation and use around wounds and surgical sites.
- Intra-operative navigation systems are comparable to global positioning satellite (GPS) systems commonly used in automobiles and are composed of three primary components: a localizer, which is analogous to a satellite in space; an instrument or surgical probe, adjunct, guide, goggle, or dressing, which is analogous to the tracked GPS unit in the vehicle; and CT and/or other data sets, such as MRI, PET/CT, or optical data sets, which are analogous to a road map of the anatomical structure of the patient.
- Computer-assisted image guidance techniques typically involve acquiring preoperative images of the relevant anatomical structures and generating a database which represents a three-dimensional model of those structures.
- The position of the instrument relative to the patient is determined by the computer using at least three fixed reference elements that span the coordinate system of the object in question.
- The process of correlating the anatomic references to the digitized data set constitutes the registration process.
- The relevant surgical instruments or other components and surgical sites typically have a known and fixed geometry which is also defined preoperatively.
- The position of the component being used is registered with the anatomical coordinate system, and a graphical display showing the relative positions of the tool and anatomical structure may be computed and displayed to assist the surgeon in properly positioning and manipulating the surgical component with respect to the relevant anatomical structure.
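The registration step described above can be sketched in code. The following is a minimal illustration, not the patent's actual implementation: given three fixed reference points digitized on the patient and the same three landmarks in a CT coordinate system, it builds a rigid mapping from patient space into CT space. All function names and coordinates are hypothetical, and the sketch assumes exact (noise-free) correspondence of the fiducials.

```python
import math

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def unit(a):
    n = math.sqrt(dot(a, a))
    return [x / n for x in a]

def local_frame(p0, p1, p2):
    # Orthonormal frame anchored at p0, built from three reference points.
    x = unit(sub(p1, p0))
    z = unit(cross(x, sub(p2, p0)))
    y = cross(z, x)
    return [x, y, z]

def make_registration(fids_patient, fids_ct):
    # Returns a function mapping patient-space points into CT space.
    fp = local_frame(*fids_patient)
    fc = local_frame(*fids_ct)
    def to_ct(p):
        d = sub(p, fids_patient[0])
        coeffs = [dot(axis, d) for axis in fp]   # coordinates in patient frame
        out = list(fids_ct[0])
        for c, axis in zip(coeffs, fc):          # re-express in CT frame
            out = [out[i] + c * axis[i] for i in range(3)]
        return out
    return to_ct
```

For instance, with fiducials related by a 90° rotation about the z-axis plus a translation of (10, 0, 0), a patient-space point (2, 0, 0) maps to (10, 2, 0).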
- One disadvantage of known systems is the difficulty of maintaining proper positioning of surgical instruments relative to movable anatomic references when those references are moved during surgery, and of enabling surgeons to properly position surgical instruments in real time when anatomical reference points move.
- The present general inventive concept provides systems and methods to digitally register and track movable regions of a patient, enabling a surgeon to accurately position and navigate surgical components such as, but not limited to, surgical instruments, adjuncts, guides, goggles, wound dressings, and other surgical components with respect to reference points, even when the reference points are moved before, during, or after treatment or surgery.
- Example embodiments of the present general inventive concept can be achieved by providing a navigation system to track positions of surgical components before, during, or after an operation of a patient, including a power source to emit a detectable signal during operation of a patient, a first sensor mounted to a movable region of the patient to respond to the emitted signal, and a control unit to track a position of the movable region relative to a fixed region of the patient as the movable region moves with respect to the fixed region, based on the response of the first sensor.
- The navigation system can include a second sensor mounted to a surgical component to respond to the emitted signal, such that the control unit tracks a position of the surgical component relative to the movable region as the surgical component and movable region move with respect to the fixed region, based on the responses of the first and second sensors.
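As a toy illustration of this two-sensor arrangement (hypothetical coordinates, not from the patent), the control unit can report the instrument's offset from the movable-region sensor even while both sensors move in the fixed frame:

```python
def relative_position(instrument_xyz, region_xyz):
    # Offset of the instrument sensor from the movable-region sensor,
    # expressed in the fixed (world) frame.
    return tuple(i - r for i, r in zip(instrument_xyz, region_xyz))

# Two tracking frames: the jaw (movable region) shifts between t0 and t1,
# and the instrument moves with it, so the relative offset is unchanged.
frames = [
    ((5.0, 2.0, 0.0), (4.0, 2.0, 0.0)),  # (instrument, region) at t0
    ((6.0, 3.0, 0.0), (5.0, 3.0, 0.0)),  # both displaced by (1, 1, 0) at t1
]
offsets = [relative_position(inst, reg) for inst, reg in frames]
# offsets[0] == offsets[1] == (1.0, 0.0, 0.0)
```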
- Example embodiments of the present general inventive concept can also be achieved by providing a navigation system to track positions of surgical components before, during, or after an operation of a patient, including a detection unit to detect an LED or electromagnetic signal, a first sensor mounted to a movable region of the patient to emit a first LED or electromagnetic signal to be detected by the detection unit, and a control unit to track a position of the movable region relative to a fixed region of the patient as the movable region moves with respect to the fixed region, based on the detected first LED or electromagnetic signal.
- Example embodiments of the present general inventive concept can also be achieved by providing a method of tracking positions of surgical components before, during, or after an operation of a patient, including emitting tracking signals to a targeted region of a surgical site, coupling a first sensor to a movable region of the patient such that the first sensor responds to the emitted tracking signals, and tracking a position of the movable region relative to a fixed region of the patient as the movable region moves with respect to the fixed region, based on the response of the first sensor.
- Example embodiments of the present general inventive concept can also be achieved by providing a navigation system to track positions of surgical components during surgery of a patient, including a power source to emit a tracking signal during surgery of a patient, a first sensor mounted to a region of the patient to generate a first response signal to the emitted tracking signal, a second sensor mounted to a surgical component to generate a second response signal to the emitted tracking signal, and a control unit to track a position of the surgical component relative to the region as the surgical component and region move with respect to a fixed region of the patient, wherein the tracked position is based on a triangulation calculation relative to the first and second response signals, independent of a shape dimension of the first and second sensors.
- The first sensor can be a digital scanner to read data pertaining to a region of interest of the patient to adjust existing CT scan data of the patient.
- The navigation system can include a set of navigation goggles worn by a surgeon to display in real time the position of the surgical component and/or region during surgery.
- Example embodiments of the present general inventive concept can also be achieved by providing a navigation system to track positions of surgical components, including a power source to emit a tracking signal during an operation of a patient, a first component mounted to a region of interest of the patient, the first component including a first sensor to respond to the emitted tracking signal to provide location information of the first component, a second component including a second sensor to respond to the emitted tracking signal to provide location information of the second component, and a control unit to track the locations of the first and second components relative to a fixed region of the patient as the first or second components move with respect to the fixed region based on the responses of the first and second sensors, independent of a shape dimension of the first or second sensors.
- Example embodiments of the present general inventive concept can also be achieved by providing a wound care device to monitor and treat wounds of a patient, including a dressing to cover a wound of a patient, at least one detector to measure a characteristic parameter of the wound, and to transmit a signal representative of the measured characteristic parameter, and a control unit to receive the transmitted signal and to output a response indicative of the measured characteristic parameter to treat the wound.
- The monitoring device can include a sensor device to facilitate calculation of location information of the monitoring device.
- The monitoring device can be part of the navigation system or can be used as a separate component to monitor and treat wounds.
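The monitoring logic described above can be sketched minimally as follows. The parameter names and normal ranges are hypothetical (the patent does not specify thresholds); the idea is only that the control unit compares each measured wound characteristic against a range and outputs a response when it falls outside.

```python
def assess_wound(readings):
    """Flag measured wound parameters that fall outside their assumed
    normal ranges; each reading is (name, value, (low, high))."""
    alerts = []
    for name, value, (low, high) in readings:
        if not (low <= value <= high):
            alerts.append(f"{name} out of range: {value}")
    return alerts

readings = [
    ("temperature_c", 39.2, (35.0, 38.0)),   # hypothetical febrile reading
    ("moisture_pct", 55.0, (40.0, 70.0)),    # hypothetical in-range reading
]
# assess_wound(readings) -> ["temperature_c out of range: 39.2"]
```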
- FIG. 1 is a perspective view of a system environment in which the features of the present general inventive concept may be implemented;
- FIG. 2A is a perspective view of an exemplary guide member including optical sensor members in accordance with an example embodiment of the present general inventive concept;
- FIG. 2B is a perspective view of an exemplary guide member including electromagnetic sensor members in accordance with another example embodiment of the present general inventive concept;
- FIG. 3 is a perspective view of a surgical instrument including optical or electromagnetic sensor members in accordance with an example embodiment of the present general inventive concept;
- FIG. 4 is a diagram illustrating a power source emitter and detection unit communicating with sensor units configured in accordance with an example embodiment of the present general inventive concept;
- FIG. 5 is a perspective view of a system environment including a scanning wand and navigation goggles for use in accordance with example embodiments of the present general inventive concept;
- FIG. 6 illustrates an exemplary set of navigation goggles configured in accordance with an example embodiment of the present general inventive concept;
- FIG. 7 is a perspective view of a system environment including dressings configured for use in accordance with example embodiments of the present general inventive concept;
- FIG. 8 illustrates an exemplary wound dressing configured in accordance with an example embodiment of the present general inventive concept;
- FIG. 9 illustrates an exemplary wound dressing including a plurality of sensors to aid in navigation and detection of a variety of parameters to assist in treatment of the wound, according to an example embodiment of the present general inventive concept.
- The present general inventive concept provides systems and methods of navigating surgical components with respect to anatomical regions of a patient, and of assisting a surgeon in locating anatomical regions of a patient to properly position and locate surgical components such as, but not limited to, surgical adjuncts, surgical guides, goggles, dressings, and other surgical instruments and treatment components before, during, and after injury or surgery, and for navigation and use around surgical sites.
- The term "surgical components" is intended to encompass, but is not limited to, all surgical devices, instruments, and components for use in navigation around wound sites, whether used before, during, or after surgery or treatment thereof.
- The navigation system enables a surgeon to track a location of a movable reference point relative to a fixed reference point as the movable reference point moves in space with respect to the fixed reference point during a surgical procedure.
- The techniques of the present general inventive concept can be implemented in conjunction with robots to provide a reference in space for surgical components and wound locations to aid in precision surgery.
- The navigation system utilizes known GPS-style triangulation methods to determine the location of sensors on both the patient's body and the surgical component, independent of the shape or size of the sensors.
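A sketch of such a calculation (more precisely, trilateration from three range measurements, one of the standard GPS-style position fixes) is shown below. The beacon positions and ranges are hypothetical, and a real system would also handle measurement noise and the mirror-image solution; this closed-form version simply takes the +z root.

```python
import math

def _sub(a, b): return [a[i] - b[i] for i in range(3)]
def _dot(a, b): return sum(x * y for x, y in zip(a, b))
def _cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def _norm(a): return math.sqrt(_dot(a, a))

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Position of a sensor from its distances r1..r3 to three
    non-collinear beacons p1..p3 (closed form, +z solution)."""
    ex = [v / _norm(_sub(p2, p1)) for v in _sub(p2, p1)]
    i = _dot(ex, _sub(p3, p1))
    ey_raw = [a - i * b for a, b in zip(_sub(p3, p1), ex)]
    ey = [v / _norm(ey_raw) for v in ey_raw]
    ez = _cross(ex, ey)
    d = _norm(_sub(p2, p1))
    j = _dot(ey, _sub(p3, p1))
    x = (r1*r1 - r2*r2 + d*d) / (2*d)
    y = (r1*r1 - r3*r3 + i*i + j*j - 2*i*x) / (2*j)
    z = math.sqrt(max(r1*r1 - x*x - y*y, 0.0))
    return [p1[k] + x*ex[k] + y*ey[k] + z*ez[k] for k in range(3)]
```

With beacons at (0,0,0), (10,0,0), and (0,10,0) and ranges measured to a sensor at (3,4,2), the function recovers that position.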
- FIG. 1 is a perspective view illustrating an exemplary system environment in which the features of the present general inventive concept may be implemented.
- The system environment of FIG. 1 includes a navigation system, generally indicated by reference number 10, to navigate surgical instruments with respect to targeted anatomical structures of a patient 1.
- The simplified diagram of FIG. 1 illustrates a drilling instrument 13 for use in an oral surgery procedure on a patient 1.
- The patient is prepared for oral surgery toward a targeted region of the patient's mandible 19.
- The mandible 19 is a movable anatomical structure, as generally indicated by the phantom lines and direction arrow in FIG. 1.
- Because the mandible 19 is movable with respect to a fixed reference point such as the patient's skull or maxilla 15, the mandible 19 is referred to as a movable region or movable reference point.
- The present general inventive concept is not limited to any particular anatomical structure or type of movable reference point, nor is it limited to oral surgery procedures.
- Those skilled in the art will appreciate that many other anatomical structures could be used as a movable reference depending on the location and scope of the targeted surgical region, such as the head, legs, arms, feet, hands, etc.
- The present general inventive concepts can be used to navigate any type of surgical or medical/dental instrument or component, for example, endoscopic systems, suction devices, screw devices, guides, wires, syringes, needles, drug delivery systems, biopsy systems, arthroscopic systems, wound dressings, etc.
- Embodiments of the present general inventive concept may be used to navigate and/or treat any targeted region or anatomical structure of the patient's body during any medical or dental procedure, internally or externally, in addition to surgery on the mandible region as illustrated in FIG. 1.
- The simplified diagram does not illustrate various connections, for example, power, ground, and interface connections to the various components; however, those skilled in the art will recognize the need for such connections and understand how to implement them, based on the components ultimately selected for use.
- The navigation system 10 includes a surgical aid device such as a movable guide member 11, a power source or emitting device 17, and a control unit 16 having a display monitor 8.
- The movable guide member 11 can be a customized guide fitted to the individual cusps of the teeth, including sensors to provide triangulation information for use in navigating craniofacial or dental operations.
- The system may also include a surgical component, such as surgical instrument 13, to be tracked with respect to the location of a surgical site of interest as represented by the movable guide member 11.
- The movable guide member 11 and surgical instrument 13 can include sensor elements 12 and 14, respectively.
- The emitting device 17 emits a propagating signal to communicate with the sensors 12 and 14 to track the location of the surgical instrument 13 relative to the movable guide member 11.
- With a customized guide member 11, for example, it is possible to use the patient's teeth or dental alveolus as unique registration points (e.g., fixed points) to register the mouthpiece/guide 11 during oral surgery.
- The emitting device 17 may also include a detection unit 17c to detect responses of the sensors 12, 14. Once the responses are detected by the detection unit 17c, the control unit 16 utilizes a multi-triangulation concept to calculate the position of the sensors 12 and 14 based on the detected responses to tracking signals emitted by the emitting device 17.
- The manner in which the emitting device 17 and/or detection unit 17c communicates with the sensors 12 and 14 to track their positions is well known in the art and is therefore only described generally. In some embodiments, the functions of the emitter 17 and sensors 12 and 14 may be reversed and/or combined, using sound engineering judgment, to achieve the same or similar results.
- The sensors 12 and 14 can function as emitters rather than sensors, and the emitter 17 can function as a sensor rather than an emitter.
- In either configuration, it is possible to utilize known triangulation methods to calculate and track the positions of the sensors 12 and 14 relative to the targeted surgical field using the configurations and techniques of the present general inventive concept.
- The navigation system 10 may include an optional imaging device (not illustrated), such as an MRI unit, CT scanner, or other type of imaging, optical, or electromagnetic device, to acquire pre-, intra-, or post-operative or real-time images of the patient 1, in order to determine location coordinates with respect to a fixed portion of the patient's body, for example, to obtain digital coordinates of the various components relative to the patient's maxilla or skull region 15.
- The emitting device 17 can generate a tracking signal which can be received by sensors 12 and/or 14.
- The tracking signal may take the form of an infrared (IR) light signal, electromagnetic (EM) signal, Bluetooth signal, Wi-Fi signal, or other known or later-developed wired or wireless signal.
- In one embodiment, the propagating signal is an LED light signal transmitted from the emitting device 17 to the sensors 12 and 14.
- In order to track the location of the guide member 11 and/or surgical component 13, the sensors 12 and 14 can function as reflecting markers to transmit light signals received from the emitting device 17 to a detection unit 17c, such as a CCD camera device.
- The detection unit 17c can determine the location of the sensors 12 and 14 based on characteristics such as intensity, refraction angle, etc. of the reflected LED signals, and can inform the control unit 16 of the location of the sensors in real time based on those characteristics.
- The sensors 12 and 14 can include one or more emitting devices to emit LED signals directly from the sensors to the detection unit 17c.
- In that case, the position of the sensors 12, 14 can be tracked by the detection unit 17c by detecting and characterizing the LED signals emitted directly from the sensors, in which case the emitting device 17 may not be required.
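A common way a CCD-based detector localizes a bright LED marker in its image is an intensity-weighted centroid over thresholded pixels. The sketch below (hypothetical image data and threshold; the patent does not specify this method) illustrates the idea for one camera view:

```python
def marker_centroid(image, threshold=200):
    # image: 2D grid of 0-255 intensities; returns the intensity-weighted
    # centroid (x, y) of pixels at or above the threshold, or None if no
    # pixel clears it.
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            if v >= threshold:
                total += v
                sx += v * x
                sy += v * y
    if total == 0:
        return None
    return (sx / total, sy / total)

# A symmetric 5x5 blob centered at pixel (2, 2):
img = [[0] * 5 for _ in range(5)]
img[2][2] = 255
for y, x in [(1, 2), (3, 2), (2, 1), (2, 3)]:
    img[y][x] = 250
# marker_centroid(img) -> (2.0, 2.0)
```

In a full system, centroids from two or more camera views would then feed the triangulation step to recover the marker's 3D position.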
- The patient's MRI or CT scans may be fed into the control unit 16 to compare the scanned MRI or CT images to anatomical landmarks or reference points fixed on the patient's head and face, to calibrate a location of the fixed reference point relative to a target point for the procedure or surgery.
- The patient's maxilla 15 can be used as a fixed reference point. To register the fixed reference point, it is possible to calculate its position with respect to the targeted surgical field (e.g., the mandible region) based on coordinates of the patient generated by the MRI or CT scans.
- It is possible to use a fixed device, such as a screw device (not illustrated), adapted to include an integrated sensor device to define a fixed reference point on the patient's skull.
- The fixed sensor device can then be used to communicate with the emitting device 17 and/or detection unit 17c to calibrate the location of the fixed reference point relative to one or more other sensors or reference points of the patient.
- The fixed reference point 15 may be used as a positional reference frame to determine the relative position of the surgical component 13 with respect to the target point of the surgery, and to calibrate a position of the movable guide element 11.
- It is possible to attach a surgical aid component, such as a movable guide member 11 adapted with a sensor array 12, to a portion of the patient's mandible to track movements of the patient's mandible 19, as illustrated in FIG. 1.
- The exemplary movable guide member 11 can be configured in the shape of a semicircular mouthpiece to fit precisely on the patient's mandible.
- The movable guide member 11 typically includes a series of holes 122 which the surgeon uses to locate and orient dental implants during oral surgery.
- The movable guide member 11 can be attached to the patient's mandible by way of fasteners 120 and 121.
- The fasteners 120, 121 may take the form of fixation screws, bolts, or pins, but the present general inventive concept is not limited thereto.
- Fixation methods such as intermaxillary fixation (IMF) methods, IMF screws, and the like can be adapted to include a sensor device in accordance with the present general inventive concept to track movements of a movable region of the patient during a medical or dental procedure. It is also possible to mount a sensor 12 to a guide member, such as a bite plate device and/or a customized guide based on the individual unique cusps of the teeth, secured to the lower jaw of the patient by screws.
- While FIG. 2A illustrates a mouthpiece-shaped guide member 11 incorporating the sensor 12, the present general inventive concept is not limited to such a configuration, and various other types of sensor arrangements may be used in connection with a variety of other types of fixation devices, methods, or splints to track and maintain a movable reference point during surgery.
- It is also possible to incorporate a sensor device into a locating pin or other fastening device, such as a surgical screw, and to attach the pin or screw to the targeted movable region of the patient to track the movable reference during a particular medical or dental (i.e., surgical) procedure.
- the guide members 11 , 11 ′ can be fabricated from a digital scan for use as fixation assist.
- the guide members 11 , 11 ′ can be fabricated from a digital scanner, CT, CBCT, MRI, or similar devices to produce individualized tooth-borne (via tooth cusps) template.
- Other types of guide members can be used to register other anatomical regions of the body, such as a bone borne template for edentulous mandible, maxilla, spine, hip, etc., or soft tissue templates for radial forearm, nose, ear, or other regions.
- the techniques and devices of the present general inventive concept are not limited to craniofacial use, but can be applied in dentistry, oral surgery, orthopedics, ENT, neurosurgery, or other surgical fields.
- the guide members can be sterilized prior to introduction into the operating room, obviating the need for a re-sterilization process.
- The sensors may include RFID sensors and/or other types of sensors, such as Bluetooth-enabled sensors.
- the RFID sensors can be powered by solar cells or other energy harvesting devices, such as RF harvesting devices.
- the integrated device can then be attached to a movable region of interest, such as the patient's lower jaw, to track movements thereof during an operative procedure.
- the present general inventive concept is not limited to the exemplary configurations illustrated and described herein. To the contrary, a variety of other configurations and combinations of dental/medical devices can be adapted with a variety of different sensor technologies (e.g., swarming technology) to carry out the techniques of the present general inventive concept. For example, it is possible to utilize various combinations of sensor technologies, such as EM and/or optical, during a single operative procedure, depending on the particular components and instruments chosen and adapted for use.
- Referring to FIG. 2A , there is illustrated a perspective view of a typical movable guide member 11 adapted to include an array of sensor members 12 a , 12 b , and 12 c to detect light emitted from the emitting device 17 , in accordance with an example embodiment of the present general inventive concept.
- the sensors 12 a , 12 b , and 12 c can function as reflecting markers to transmit light signals received from the emitting device 17 to a detection unit 17 c .
- the detection unit 17 c can continuously acquire the position of the sensors 12 a , 12 b , and 12 c and can inform the control unit 16 of the location of the sensors in real time.
- the control system 16 can compute the position of the movable guide member 11 using a known multi-triangulation method based on information received from the sensors 12 a , 12 b , and 12 c , and can display on display monitor 8 an image displaying the position of the movable guide member 11 with respect to various other components, structures, and reference points of the navigation system 10 .
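The "known multi-triangulation method" referenced above is not specified further in the disclosure; by way of illustration only, one conventional realization is classic three-sphere trilateration, which locates a marker from its measured distances to three known detector positions. The coordinates below are hypothetical.

```python
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Locate a marker from its distances r1..r3 to three known
    detector positions p1..p3 (classic three-sphere trilateration)."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    # Build an orthonormal frame with p1 at the origin and p2 on the x-axis.
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)
    i = np.dot(ex, p3 - p1)
    ey = p3 - p1 - i * ex
    ey /= np.linalg.norm(ey)
    ez = np.cross(ex, ey)
    d = np.linalg.norm(p2 - p1)
    j = np.dot(ey, p3 - p1)
    # Solve for the marker coordinates in that local frame.
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    # Two mirror-image solutions exist; take the +z one.
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    return p1 + x * ex + y * ey + z * ez
```

In a real system, a solution of this kind would be run per marker at the detection unit's acquisition rate, with the emitter/detector geometry calibrated in advance.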
- the sensors 12 a , 12 b , and 12 c can be configured to extend from an outer surface of the guide member 11 to help maintain consistent line-of-sight between the sensors 12 a , 12 b , 12 c and the light emitting device 17 .
- While FIGS. 1 and 2A depict an oral surgery configuration, those skilled in the art will appreciate that the present general inventive concept is not limited to the illustrated embodiments.
- guide members 11 and sensors 12 a , 12 b , 12 c may be used to facilitate mounting of such devices on other parts of the body, internally and externally, and may be used in connection with other types of surgeries where it is useful to maintain a movable reference to help locate surgical instruments or components when the target anatomical structure is moved during surgery.
- It is possible to mount a sensor array 12 to the movable guide member 11 to facilitate tracking of the guide member 11 as the mandible is moved, enabling the surgeon to maintain consistent and proper positioning of the surgical component 13 with respect to the mandible even when the mandible is moved during surgery.
- the surgeon attaches the movable guide member 11 and sensor 12 to the target point, such as the patient's mandible 19 as illustrated in FIG. 1 .
- the control unit 16 can track the location of the movable guide member 11 and the surgical component 13 in real time, enabling the surgeon to maintain proper positioning of the surgical component 13 with respect to the target point even when the movable guide member 11 is moved during surgery.
- the surgeon may move the surgical component 13 with respect to the targeted surgical region of the patient, for example the mandible 19 area as illustrated in FIG. 1 .
- the control unit 16 can track the location of the surgical component 13 via the sensors 14 mounted on the surgical component 13 .
- the control system 16 can interpret the response signals of the sensors 14 to compute the position of the surgical component 13 using a known multi-triangulation method, and can display on display monitor 8 an image displaying the position of the surgical component 13 with respect to the targeted region of the patient.
- In some embodiments, the emitting device 17 emits infrared light signals. In other embodiments, the emitting device does not emit light signals, but instead emits EM or other types of RF or wireless signals.
- FIG. 2B is a perspective view of guide member including sensor members in accordance with another example embodiment of the present general inventive concept, for example, in a case where the emitting device 17 emits EM or other RF-based signals.
- the sensors of the movable guide member 11 ′ can include an array of detectors, such as radio frequency identification (RFID) sensors 12 a ′, 12 b ′, and 12 c ′, to communicate with the EM signals emitted from the emitting device 17 .
- the RFID sensors 12 a ′, 12 b ′, and 12 c ′ can be mounted internally with respect to the guide member 11 ′ as illustrated in FIG. 2B .
- the RFID sensors can be mounted within the internal structure of the guide member 11 ′ since it is not as important to maintain a direct line-of-sight between the sensors and the emitting device 17 due to the penetrating characteristics of EM and other types of RF signals.
- the RFID sensors 12 a ′, 12 b ′, and 12 c ′ function to interact with the electromagnetic field generated by the emitting device 17 . The control unit 16 can recognize disruptions in the field caused by the RFID sensors, enabling the system's computer, which has special tracking software, to recognize the location of each RFID sensor in the surgical field using a known multi-triangulation concept based on the interaction of the RFID sensors 12 a ′, 12 b ′, and 12 c ′ with the electromagnetic field.
- control unit 16 can compute the position of the movable guide member 11 ′ in real time based on this information, and can display on display monitor 8 an image displaying the position of the movable guide member 11 ′ with respect to various other components, structures, and reference points of the navigation system 10 .
- FIG. 3 is a perspective view of an exemplary surgical component 13 including a sensor array 14 configured in accordance with an example embodiment of the present general inventive concept.
- the surgical component 13 includes a sensor array 14 including sensors 14 a , 14 b , and 14 c . These sensors are configured to respond to propagating signals emitted from the emitting device 17 to track the location of the surgical component in the surgical field, in the manners discussed above. As with sensors 12 a , 12 b , and 12 c , sensors 14 a , 14 b , and 14 c can be configured to interact with LED, EM, Wireless, WiFi, Bluetooth, IR, and/or other types and combinations of wired or wireless signals in known ways to track the location of various components associated with the sensors. The sensors can be powered by solar cells or other energy harvesting devices.
- the sensor array may be mounted in a ring-like shape to fit around a shaft or neck region of the surgical component 13 , as illustrated in FIG. 3 .
- Such a configuration is easily adaptable to any number of different shaped and sized surgical components.
- the specific means of mounting the sensors to the various components can be chosen with sound engineering judgment, and a variety of mounting shapes and configurations could be used without departing from the broader scope of the present general inventive concept.
- the sensors 14 a , 14 b , and 14 c could be integrally mounted and formed in the surgical component 13 as a single body to communicate with the propagating signal without sacrificing proper positioning of the surgical component 13 with respect to the surgical field.
- the control unit 16 can calculate the position of the surgical component 13 relative to the movable reference region and can track and compare the relative movements of the guide member 11 with respect to the surgical component 13 . It is possible to include a slot or other type of holding means in one or more of the exemplary devices of the navigation system to hold a microSD card or other memory device to store or upload data to/from the navigation system.
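One way (illustrative only; the disclosure does not prescribe an implementation) to track the surgical component 13 relative to the movable reference region is to fit a rigid-body transform to the guide's tracked sensor positions and express the tool position in the guide's own reference frame. The sketch below uses the conventional Kabsch algorithm for the fit; all coordinates are assumed.

```python
import numpy as np

def rigid_transform(src, dst):
    """Best-fit rotation R and translation t mapping the (N,3) point set
    src onto dst, via the Kabsch algorithm (SVD of the cross-covariance)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guards against a reflection solution.
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = cd - R @ cs
    return R, t

def tool_in_guide_frame(guide_ref, guide_now, tool_now):
    """Express the tracked tool position in the guide's original reference
    frame, so the displayed tool-to-guide relation stays consistent even
    as the movable region (and the guide with it) moves."""
    R, t = rigid_transform(guide_ref, guide_now)   # reference -> current pose
    return R.T @ (np.asarray(tool_now, float) - t)  # invert to map back
```

With this relative formulation, motion of the mandible moves both point sets together and leaves the reported tool-to-guide offset unchanged.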
- the sensors 12 and 14 are not required to be the same or similar types of devices, but instead may be different, wherein the sensors independently interact with one or more of the emitting devices 17 and/or detection unit 17 c to track location information of the respective sensors.
- one of the sensors 12 could be configured to include an EM source and a light reflector sensor, and the other sensor 14 could be configured to include an RFID receptor to interact with the EM field generated by sensor 12 .
- the emitter device 17 and detection unit 17 c could be adapted to track the location of sensor 12 by characterizing the light reflected by sensor 12
- the control unit 16 could be adapted to track the relative distance between the sensors 12 and 14 by detecting disruptions in the EM field caused by movement of the RFID receptor of sensor 14 .
- a variety of other types and combinations of sensors could also be used.
- FIG. 4 is a simple diagram illustrating a light source and light detector in communication with sensor arrays 12 , 14 in accordance with an example embodiment of the present general inventive concept.
- three points of reference are used, corresponding to three sensors on each device ( 12 a , 12 b , 12 c and 14 a , 14 b , 14 c ).
- the sensors 12 a , 12 b , 12 c and 14 a , 14 b , 14 c can communicate with the power source 17 and/or detection unit 17 c to provide information regarding the location of the respective devices, as indicated by the dotted lines extending between the sensors and the power source 17 and detection unit 17 c .
- the sensors 12 a , 12 b , 12 c can communicate directly with the other sensors 14 a , 14 b , 14 c to provide information about the relative positions of the devices, as indicated by the dotted lines extending between the sensor arrays 12 and 14 .
- the sensors 12 a , 12 b , and 12 c could be configured to include an EM source to emit a tracking signal to the sensors 14 a , 14 b , and 14 c
- the sensors 14 a , 14 b , and 14 c could be configured to include an RFID receptor configured to interact with the EM field generated by the EM source based on the position of the RFID receptors.
- disruptions or changes to the EM field caused by movement of the RFID receptors can be detected by the detection unit 17 c and fed to the control unit 16 ( FIG. 1 ) to calculate and display location information about the relative positions of the sensors.
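One conventional way to turn such detected signal changes into ranges for a triangulation calculation, assumed here purely for illustration (the disclosure does not specify a ranging model), is the log-distance path-loss relation between received signal strength and distance.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Estimate range from received signal strength using the
    log-distance path-loss model: RSSI = P_tx - 10*n*log10(d).
    tx_power_dbm is the (calibrated) strength measured at 1 m;
    path_loss_exp is ~2 in free space, higher through tissue."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))
```

Ranges estimated this way for three or more receptors could then feed the same triangulation step used for the optical markers.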
- the use of RFID, Bluetooth, IR, EM, LED, or other types of sensors can be interchanged, mixed, or combined for use with different devices and applications, without departing from the broader principles and scope of the present general inventive concept.
- swarming technology can be used to implement a variety of different sensor technologies (e.g., EM and/or optical) on a variety of different surgical components and regions of interest to track movements thereof during single or multiple operative procedures of a patient.
- It is possible to use thermography in conjunction with the navigation techniques of the present general inventive concept to identify other structures in and around the surgical region of interest, such as nerves, arteries, veins, and the like.
- While the RFID sensors track and identify the location of teeth or other structures in a surgical region of interest, such as the mandible, thermography provides additional navigational information to supplement the information provided by the multi-triangulation techniques of the present general inventive concept.
- It is possible to incorporate thermal imaging cameras into, or in combination with, the exemplary sensors of the present general inventive concept in order to detect variations in the infrared radiation of various body parts and to display thermographic images thereof.
- thermography can be used to identify where the canal is, thus providing additional location information in addition to the information provided by the RFID or other sensors. Accordingly, not only can the multi-triangulation concepts of the present general inventive concept be used to indicate where a bony indentation is in the bone, but thermography concepts can also be incorporated into the navigation system of the present general inventive concept to help identify and locate the nerve, artery, and/or vein during surgery.
- FIG. 5 is a perspective view of a system environment including a pair of navigation goggles 50 and a digital scanning wand 51 for use in accordance with example embodiments of the present general inventive concept.
- the scanning wand 51 can be used to superimpose measurements onto the patient scan data, such as CT scan data.
- the measurements from the scanning wand 51 can be used to supplement or replace patient scan data to enable the surgeon to determine location information of surgical sites of interest that may be modified or moved relative to the original scan data.
- the navigation goggles 50 can interface with the navigation system, via a wired or wireless connection, to enable the surgeon to visualize location information of surgical sites of interest in real time during surgery.
- FIG. 6 illustrates an exemplary set of navigation goggles 50 configured in accordance with an example embodiment of the present general inventive concept.
- the goggles 50 facilitate 3D viewing of the surgical field with an overlay of the scan.
- the goggles can include sensors to sense eyelid blinks and eye movements, which function together with verbal commands and buttons on the instruments to control various aspects of the surgical field, including the 3D viewing experience of the goggles.
- the goggles 50 can include various overlays to display navigation data, such as location of surgical components and/or surgical sites in 3-dimensional space, angular information, target points, and the like.
- the location information provided by the navigation system can be processed and fed to the navigation goggles 50 in various forms to assist the surgeon in visualizing and locating surgical components and surgical sites as the operation is being performed. For example, it is possible for the surgeon to visualize tumors or other surgical sites, to see the depths of invasion, and to superimpose data from the digital wand and/or CT scan while cutting or performing other operations on the patient.
- FIG. 7 is a perspective view of a system environment including exemplary dressings 70 configured for use in accordance with example embodiments of the present general inventive concept.
- the dressings 70 can include suitable sensors, such as RFID sensors, to communicate location information concerning the placement of the dressings 70 .
- the dressings 70 can be placed to reference various aspects of surgical and non-surgical wound dimensions, wherein the wound's orientation and the sensors enable detection of the condition of the wound in conjunction with navigation.
- the dressings 70 can include a solar cell or other energy harvesting device to power the sensors, but the present general inventive concept is not limited to any particular type of sensor or power source.
- location information can be communicated from the dressings 70 to the navigation system using GPS triangulation techniques relative to the sensors of each dressing, thus providing location information of each dressing relative to other surgical components or surgical sites of interest.
- the location information can then be processed by the control unit and displayed in various formats to the surgeon via display monitor 8 ( FIG. 1 ) and/or navigation goggles 50 ( FIG. 6 ).
- FIG. 8 illustrates an exemplary wound dressing 80 including a tripartite sensor arrangement (similar to that described above) to facilitate GPS triangulation calculations and location data of the dressing 80 relative to other surgical components, surgical sites, and/or other anatomical regions of interest.
- FIG. 9 illustrates an exemplary wound dressing 90 including a plurality of treatment devices to aid in the navigation, detection, and/or treatment of a variety of parameters to assist in operations of a wound or surgical site, according to an example embodiment of the present general inventive concept.
- Exemplary treatment devices are illustrated in a circuit fashion in FIG. 9 , with a key indicating some exemplary parameters for use of the treatment devices. The present general inventive concept is not limited to the illustrated parameters, however, and a variety of other parameters could be used without departing from the scope and spirit of the present general inventive concept.
- the treatment devices of FIG. 9 can be implemented in combination with RFID or other navigation sensors to provide navigation and treatment information respecting a particular wound.
- The treatment devices can include one or more gas sensors to detect gases such as NO, O2, CO2, or other gases in or around a particular wound area.
- This information can be communicated to the navigation system to provide a monitoring component of a particular wound area.
- Other parameters can also be monitored, for example, temperature, pH, bacteria level, pressure, and the like.
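A minimal sketch of such parameter monitoring follows. All parameter names and threshold values here are illustrative assumptions for the sketch, not values taken from the present disclosure.

```python
# Hypothetical acceptable ranges per monitored wound parameter:
# name -> (low, high). Values are placeholders, not clinical guidance.
WOUND_LIMITS = {
    "temperature_c": (35.0, 38.5),
    "ph": (6.5, 7.8),
    "o2_pct": (15.0, 25.0),
    "co2_pct": (0.0, 8.0),
}

def flag_wound_readings(readings):
    """Return the out-of-range parameters from a dict of sensor readings,
    as name -> (measured value, (low, high)) for display or alerting."""
    flagged = {}
    for name, value in readings.items():
        low, high = WOUND_LIMITS[name]
        if not (low <= value <= high):
            flagged[name] = (value, (low, high))
    return flagged
```

A dressing's control unit could run such a check on each telemetry update and forward any flagged parameters, together with the dressing's tracked location, to the display monitor or goggles.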
- Treatment devices may also be targeted to various regions of the dressing using navigation information provided by RFID or other GPS devices of the wound dressing.
Abstract
A navigation and monitoring system to track positions of surgical components during surgery of a patient. The navigation system includes a power source to emit a tracking signal during surgery of the patient, a first sensor mounted to a region of the patient to respond to the emitted tracking signal, and a control unit to track a position of the region relative to a fixed region of the patient as the region moves with respect to the fixed region, based on the response of the first sensor. The system can calibrate and register a movable reference point of the patient relative to a fixed reference point, and can maintain that reference point when the movable reference point moves in space during a surgical process.
Description
- This application claims priority from U.S. application Ser. No. 12/860,635 filed on Aug. 20, 2010, the contents of which are incorporated by reference in their entirety.
- 1. Field of Invention
- The present general inventive concept relates generally to navigation of surgical components and, more particularly, to systems and methods to assist a surgeon in navigating anatomical regions of a patient to properly position and locate surgical components, adjuncts, surgical guides, goggles, dressings, instruments, and other surgical components before, during, and after injury or surgery of a patient, and for navigation and use around wounds and surgical sites.
- 2. Description of the Related Art
- The controlled positioning of surgical instruments and other components is of significant importance in many surgical procedures and wound care applications, and various methods and navigation systems have been developed to navigate surgical components relative to a patient during surgery. Intra-operative navigation systems are comparable to the global positioning satellite (GPS) systems commonly used in automobiles and are composed of three primary components: a localizer, which is analogous to a satellite in space; an instrument, surgical probe, adjunct, guide, goggle, or dressing, which is analogous to the in-vehicle GPS unit that tracks the emitted waves; and CT scan and/or other data sets such as MRI, PET/CT, or optical data sets, which are analogous to a road map of the anatomical structure of the patient. These image navigation techniques generally allow positioning of a surgical instrument within a margin of error of about 1 to 2 mm, or with sub-mm accuracy depending on the scan.
- Computer assisted image guidance techniques typically involve acquiring preoperative images of the relevant anatomical structures and generating a database which represents a three-dimensional model of the anatomical structures. The position of the instrument relative to the patient is determined by the computer using at least three fixed reference elements that span the coordinate system of the object in question. The process of correlating the anatomic references to the digitized data set constitutes the registration process. The relevant surgical instruments or other components and surgical sites typically have a known and fixed geometry which is also defined preoperatively. During the surgical procedure, the position of the component being used is registered with the anatomical coordinate system, and a graphical display showing the relative positions of the tool and anatomical structure may be computed and displayed to assist the surgeon in properly positioning and manipulating the surgical component with respect to the relevant anatomical structure.
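The registration described above can be viewed as a chain of coordinate transforms: a patient-to-image registration composed with a tool-to-patient tracking pose maps the preoperatively known tool-tip offset into scan coordinates. The sketch below illustrates the composition; the transform names are hypothetical and chosen for clarity.

```python
import numpy as np

def pose(R, t):
    """Pack a 3x3 rotation and a translation into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def tip_in_image(T_image_from_patient, T_patient_from_tool, tip_in_tool):
    """Map the fixed, preoperatively defined tool-tip offset into scan
    (image) coordinates by chaining registration and tracking transforms."""
    p = np.append(np.asarray(tip_in_tool, float), 1.0)  # homogeneous point
    return (T_image_from_patient @ T_patient_from_tool @ p)[:3]
```

Displaying the returned point over the preoperative model is what lets the surgeon see the instrument "inside" the scan.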
- One of the disadvantages of known systems is the difficulty of maintaining proper positioning of surgical instruments relative to movable anatomic references when those references are moved during surgery, and of enabling surgeons to properly position surgical instruments in real time when anatomical reference points are moved during surgery.
- The present general inventive concept provides systems and methods to digitally register and track movable regions of a patient, enabling a surgeon to accurately position and navigate surgical components such as, but not limited to, surgical instruments, adjuncts, guides, goggles, wound dressings, and other surgical components with respect to reference points even when the reference points are moved before, during, or after treatment or surgery.
- Additional features and embodiments of the present general inventive concept will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the general inventive concept.
- Example embodiments of the present general inventive concept can be achieved by providing a navigation system to track positions of surgical components before, during, or after an operation of a patient, including a power source to emit a detectable signal during operation of a patient, a first sensor mounted to a movable region of the patient to respond to the emitted signal, and a control unit to track a position of the movable region relative to a fixed region of the patient as the movable region moves with respect to the fixed region, based on the response of the first sensor.
- The navigation system can include a second sensor mounted to a surgical component to respond to the emitted signal such that the control unit tracks a position of the surgical component relative to the movable region as the surgical component and movable region move with respect to the fixed region, based on the responses of the first and second sensors.
- Example embodiments of the present general inventive concept can also be achieved by providing a navigation system to track positions of surgical components before, during, or after an operation of a patient, including a detection unit to detect an LED or electromagnetic signal, a first sensor mounted to a movable region of the patient to emit a first LED or electromagnetic signal to be detected by the detection unit, and a control unit to track a position of the movable region relative to a fixed region of the patient as the movable region moves with respect to the fixed region, based on the detected first LED or electromagnetic signal.
- Example embodiments of the present general inventive concept can also be achieved by providing a method of tracking positions of surgical components before, during, or after an operation of a patient, including emitting tracking signals to a targeted region of a surgical site, coupling a first sensor to a movable region of the patient such that the first sensor responds to the emitted tracking signals, and tracking a position of the movable region relative to a fixed region of the patient as the movable region moves with respect to the fixed region, based on the response of the first sensor.
- Example embodiments of the present general inventive concept can also be achieved by providing a navigation system to track positions of surgical components during surgery of a patient, including a power source to emit a tracking signal during surgery of a patient, a first sensor mounted to a region of the patient to generate a first response signal to the emitted tracking signal, a second sensor mounted to a surgical component to generate a second response signal to the emitted tracking signal, and a control unit to track a position of the surgical component relative to the region as the surgical component and region move with respect to a fixed region of the patient, wherein the tracked position is based on a triangulation calculation relative to the first and second response signals independent of a shape dimension of the first and second sensors.
- The first sensor can be a digital scanner to read data pertaining to a region of interest of the patient to adjust existing CT scan data of the patient.
- The navigation system can include a set of navigation goggles worn by a surgeon to display in real-time the position of the surgical component and/or region during surgery.
- Example embodiments of the present general inventive concept can also be achieved by providing a navigation system to track positions of surgical components, including a power source to emit a tracking signal during an operation of a patient, a first component mounted to a region of interest of the patient, the first component including a first sensor to respond to the emitted tracking signal to provide location information of the first component, a second component including a second sensor to respond to the emitted tracking signal to provide location information of the second component, and a control unit to track the locations of the first and second components relative to a fixed region of the patient as the first or second components move with respect to the fixed region based on the responses of the first and second sensors, independent of a shape dimension of the first or second sensors.
- Example embodiments of the present general inventive concept can also be achieved by providing a wound care device to monitor and treat wounds of a patient, including a dressing to cover a wound of a patient, at least one detector to measure a characteristic parameter of the wound, and to transmit a signal representative of the measured characteristic parameter, and a control unit to receive the transmitted signal and to output a response indicative of the measured characteristic parameter to treat the wound.
- The monitoring device can include a sensor device to facilitate calculation of location information of the monitoring device. The monitoring device can be part of the navigation system or can be used as a separate component to monitor and treat wounds.
- The above-mentioned features of the present general inventive concept will become more clearly understood from the following detailed description read together with the drawings in which:
- FIG. 1 is a perspective view of a system environment in which the features of the present general inventive concept may be implemented;
- FIG. 2A is a perspective view of an exemplary guide member including optical sensor members in accordance with an example embodiment of the present general inventive concept;
- FIG. 2B is a perspective view of an exemplary guide member including electromagnetic sensor members in accordance with another example embodiment of the present general inventive concept;
- FIG. 3 is a perspective view of a surgical instrument including optical or electromagnetic sensor members in accordance with an example embodiment of the present general inventive concept;
- FIG. 4 is a diagram illustrating a power source emitter and detection unit communicating with sensor units configured in accordance with an example embodiment of the present general inventive concept;
- FIG. 5 is a perspective view of a system environment including a scanning wand and navigation goggles for use in accordance with example embodiments of the present general inventive concept;
- FIG. 6 illustrates an exemplary set of navigation goggles configured in accordance with an example embodiment of the present general inventive concept;
- FIG. 7 is a perspective view of a system environment including dressings configured for use in accordance with example embodiments of the present general inventive concept;
- FIG. 8 illustrates an exemplary wound dressing configured in accordance with an example embodiment of the present general inventive concept; and
- FIG. 9 illustrates an exemplary wound dressing including a plurality of sensors to aid in navigation and detection of a variety of parameters to assist in treatment of the wound, according to an example embodiment of the present general inventive concept.
- Reference will now be made to various embodiments of the present general inventive concept, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. The following description of the various embodiments is merely exemplary in nature and is in no way intended to limit the present general inventive concept, its application, or uses. The example embodiments are merely described below in order to explain the present general inventive concept by referring to the figures.
- The present general inventive concept provides systems and methods of navigating surgical components with respect to anatomical regions of a patient, and assisting a surgeon in locating anatomical regions of a patient to properly position and locate surgical components such as, but not limited to, surgical adjuncts, surgical guides, goggles, dressings, and other surgical instruments and treatment components before, during, and after injury or surgery of a patient, and for navigation and use around surgical sites. As used herein, the term surgical components is intended to encompass, but is not limited to, all surgical devices, instruments, and components for use in navigation around wound sites, whether used before, during, or after surgery or treatment thereof.
- In some embodiments, the navigation system enables a surgeon to track a location of a movable reference point relative to a fixed reference point as the movable reference point moves in space with respect to the fixed reference point during a surgical procedure.
- The techniques of the present general inventive concept can be implemented in conjunction with robots to provide reference in space for surgical components and wound locations to aid in precision surgery.
- In some embodiments, the navigation system utilizes known GPS triangulation methods to determine the location of sensors on both the patient's body and the surgical component, independent of the shape or size of the sensors.
- FIG. 1 is a perspective view illustrating an exemplary system environment in which the features of the present general inventive concept may be implemented. The system environment of FIG. 1 includes a navigation system generally indicated by reference number 10 to navigate surgical instruments with respect to targeted anatomical structures of a patient 1. The simplified diagram of FIG. 1 illustrates a drilling instrument 13 for use in an oral surgery procedure and a patient 1. In FIG. 1, the patient is prepared for oral surgery toward a targeted region of the patient's mandible 19. As illustrated in FIG. 1, the mandible 19 is a movable anatomical structure, as generally indicated by the phantom lines and direction arrow in FIG. 1. Since the mandible 19 is movable with respect to a fixed reference point such as the patient's skull or maxilla 15, the mandible 19 is referred to as a movable region or movable reference point. However, the present general inventive concept is not limited to any particular anatomical structure or type of movable reference point, nor is it limited to oral surgery procedures. Those skilled in the art will appreciate that many other anatomical structures could be used as a movable reference depending on the location and scope of the targeted surgical region, such as the head, legs, arms, feet, hands, etc. Accordingly, the present general inventive concepts can be used to navigate any type of surgical or medical/dental instrument or component, for example, endoscopic systems, suction devices, screw devices, guides, wires, syringes, needles, drug delivery systems, biopsy systems, arthroscopic systems, wound dressings, etc. Furthermore, embodiments of the present general inventive concept may be used to navigate and/or treat any targeted region or anatomical structure of the patient's body during any medical or dental procedure, internally or externally, in addition to surgery on the mandible region as illustrated in FIG. 1.
It is noted that the simplified diagram does not illustrate various connections, for example, power, ground, and interface connections to the various components; however, those skilled in the art will recognize the need for such connections and understand how to implement such connections, based on the components ultimately selected for use. - Referring to
FIG. 1, the navigation system 10 includes a surgical aid device such as movable guide member 11, a power source or emitting device 17, and a control unit 16 having a display monitor 8. In some embodiments, the movable guide member 11 can be a customized guide fitted to individual cusps of the teeth, including sensors to provide triangulation information for use in navigating craniofacial or dental operations. The system may also include a surgical component, such as drilling instrument 13, to be tracked with respect to the location of a surgical site of interest as represented by movable guide member 11. The movable guide member 11 and surgical instrument 13 can include sensor elements 12 and 14, respectively. The emitting device 17 emits a propagating signal to communicate with the sensors 12 and 14 to track the location of the surgical instrument 13 relative to the movable guide member 11. Thus, using a customized guide member 11, for example, it is possible to use the patient's teeth or dental alveolus as unique registration points (e.g., fixed points) to register the mouthpiece/guide 11 during oral surgery. It is also possible to form other shapes and sizes of guide members, such as but not limited to guidance screws, implants, bandages, dressings, drapes, and the like, and attach them to other body parts to provide registration points for other parts of the body during other types of surgeries. - The emitting
device 17 may also include a detection unit 17 c to detect responses of the sensors 12 and 14. Based on the responses detected by the detection unit 17 c, the control unit 16 utilizes a multi-triangulation concept to calculate the position of the sensors 12 and 14 relative to the emitting device 17. The manner in which the emitting device 17 and/or detection unit 17 c communicates with the sensors 12 and 14 can vary; for example, the roles of the emitter 17 and sensors 12 and 14 can be reversed, with the sensors 12 and 14 emitting signals and the emitter 17 functioning as a sensor rather than an emitter. In any case, it is possible to utilize known triangulation methods to calculate and track the positions of the sensors 12 and 14. The navigation system 10 may include an optional imaging device (not illustrated), such as an MRI unit, CT scanner, or other type of imaging device, optical device, or electromagnetic device, to acquire pre-, intra-, or post-operative or real-time images of the patient 1, in order to determine location coordinates with respect to a fixed portion of the patient's body, for example, to obtain digital coordinates of the various components relative to the patient's maxilla or skull region 15. - Referring to
FIG. 1, the emitting device 17 can generate a tracking signal which can be received by sensors 12 and/or 14. The tracking signal may take the form of an infrared (IR) light signal, electromagnetic (EM) signal, Bluetooth signal, Wi-Fi signal, or other known or later developed wired or wireless signal. In the example embodiment of FIG. 1, it is presumed for convenience of description that the propagating signal is an LED light signal transmitted from the emitting device 17 to the sensors 12 and 14. Upon receiving the LED signal at the guide member 11 and/or surgical component 13, the sensors 12 and 14 reflect the LED signal from the emitting device 17 to a detection unit 17 c, such as a CCD camera device. Using the reflected LED signals, the detection unit 17 c can determine the location of the sensors 12 and 14 and can inform the control unit 16 of the location of the sensors in real time, based on the characteristics of the reflected LED signals. In other embodiments, it is possible that the sensors 12 and 14 emit LED signals directly to the detection unit 17 c. In this case, the position of the sensors 12 and 14 can be determined by the detection unit 17 c by detecting and characterizing the LED signals emitted from the sensors directly, in which case the emitting device 17 may not be required. Those skilled in the art will appreciate that many other configurations and combinations of elements in addition to those illustrated in FIG. 1 could be used without departing from the broader scope of the present general inventive concept. - During typical dental or medical procedures, the patient's MRI or CT scans may be fed into the
control unit 16 to compare the scanned MRI or CT images to anatomical landmarks or reference points fixed on the patient's head and face, to calibrate a location of the fixed reference point relative to a target point for the procedure or surgery. In the embodiment of FIG. 1, the patient's maxilla 15 can be used as a fixed reference point. To register the fixed reference point, it is possible to calculate a position of the fixed reference point with respect to the targeted surgical field (e.g., the mandible region) based on coordinates of the patient generated by the MRI or CT scans. It is also possible to directly register a location of the fixed reference point by mounting a fixed device, such as a screw device (not illustrated), adapted to include an integrated sensor device to correspond to and define a fixed reference point of the patient's skull. The fixed sensor device can then be used to communicate with the emitting device 17 and/or detection unit 17 c to calibrate the location of the fixed reference point relative to one or more other sensors or reference points of the patient. In this way, the fixed reference point 15 may be used as a positional reference frame to determine the relative position of the surgical component 13 with respect to the target point of the surgery, and to calibrate a position of the movable guide element 11. - To carry out a particular surgical process, it may be important to move the patient's
mandible 19 during the process, as indicated by the phantom lines and direction arrow illustrating movement of the mandible 19 as depicted in FIG. 1. Here, the surgeon can attach a surgical aid component, such as a movable guide member 11 adapted with a sensor array 12, to a portion of the patient's mandible to track movements of the patient's mandible 19, as illustrated in FIG. 1. - Referring to
FIGS. 1 and 2A, the exemplary movable guide member 11 can be configured in the shape of a semicircular mouthpiece to fit precisely on the patient's mandible. The movable guide member 11 typically includes a series of holes 122 which the surgeon uses to locate and orient dental implants during oral surgery. The movable guide member 11 can be attached to the patient's mandible by way of fasteners. Those skilled in the art will appreciate that a variety of fasteners could be used to attach the guide member 11 and sensor 12 to these and/or other movable regions of the patient without departing from the broader scope of the present general inventive concept. For example, fixation methods such as intermaxillary fixation (IMF) methods, IMF screws, and the like, can be adapted to include a sensor device in accordance with the present general inventive concept to track movements of a movable region of the patient during a medical or dental procedure. It is possible to mount a sensor 12 to a guide member, such as a bite plate device and/or customized guide based on the individual unique cusps of the teeth, secured to a lower jaw of the patient by screws. This facilitates using the teeth and/or dental alveolus as unique registration points (fixed points) to register the location of the mouthpiece/guide during oral surgery. It is possible to use other body parts and attachment devices, chosen with sound engineering judgment, to assist with other types of surgeries or treatment operations. Moreover, although the example embodiment of FIG. 2A illustrates a mouthpiece-shaped guide member 11 to incorporate the sensor 12, the present general inventive concept is not limited to such a configuration, and various other types of sensor arrangements may be used in connection with a variety of other types of fixation devices, methods, or splints to track and maintain a movable reference point during surgery.
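Because the guide member rides with the mandible, a tracked instrument position is most useful when expressed in the guide's own coordinate frame, where it stays constant as the jaw moves. The specification does not spell out this computation; the following is a minimal sketch, assuming three non-collinear guide sensors, with all names hypothetical.

```python
import math

def guide_frame(s1, s2, s3):
    """Build an orthonormal frame from three non-collinear guide
    sensor positions: origin at s1, x-axis toward s2, z-axis normal
    to the plane of the three sensors."""
    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def unit(a):
        m = math.sqrt(sum(x * x for x in a))
        return tuple(x / m for x in a)

    x = unit(sub(s2, s1))
    z = unit(cross(x, sub(s3, s1)))
    y = cross(z, x)
    return s1, (x, y, z)

def in_guide_frame(tip, s1, s2, s3):
    """Express the instrument tip in guide coordinates; the result is
    invariant when the whole guide-plus-tip assembly moves rigidly."""
    origin, axes = guide_frame(s1, s2, s3)
    d = tuple(t - o for t, o in zip(tip, origin))
    return tuple(sum(di * ai for di, ai in zip(d, axis)) for axis in axes)
```

The point of the design is the invariance: if the mandible (and hence the guide and a tip fixed relative to it) rotates or translates, the local coordinates returned by `in_guide_frame` do not change.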
For example, it is possible to incorporate a sensor device into a locating pin or other fastening device, such as a surgical screw, and to attach the pin or screw to the targeted movable region of the patient to track the movable reference during a particular medical or dental (i.e., surgical) procedure. - Referring to
FIGS. 2A and 2B, exemplary guide members 11 and 11′ adapted with sensor arrangements are illustrated. - It is also possible to integrate RFID sensors, and/or other types of sensors, such as Bluetooth-enabled sensors, into a mesh-like bite plate device, where the sensors are disposed or integrated within the mesh construct of the device itself. The RFID sensors can be powered by solar cells or other energy harvesting devices, such as RF harvesting devices. The integrated device can then be attached to a movable region of interest, such as the patient's lower jaw, to track movements thereof during an operative procedure. The present general inventive concept is not limited to the exemplary configurations illustrated and described herein. To the contrary, a variety of other configurations and combinations of dental/medical devices can be adapted with a variety of different sensor technologies (e.g., swarming technology) to carry out the techniques of the present general inventive concept. For example, it is possible to utilize various combinations of sensor technologies, such as EM and/or optical, during a single operative procedure, depending on the particular components and instruments chosen and adapted for use.
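The calibration described earlier, comparing scanned MRI/CT landmark coordinates with sensor-measured reference points, amounts to a rigid registration between the scan frame and the tracker frame. The specification leaves the computation unspecified; as a sketch, here is a 2-D least-squares rigid fit (hypothetical names throughout; a real system would work in 3-D, e.g. via the Kabsch algorithm).

```python
import math

def register_rigid_2d(scan_pts, tracker_pts):
    """Estimate the rotation and translation mapping scan-frame
    landmark coordinates onto tracker-frame coordinates, by a
    least-squares fit over matched point pairs. Returns a function
    that applies the fitted transform to a new point."""
    n = len(scan_pts)
    csx = sum(p[0] for p in scan_pts) / n
    csy = sum(p[1] for p in scan_pts) / n
    ctx = sum(p[0] for p in tracker_pts) / n
    cty = sum(p[1] for p in tracker_pts) / n
    num = den = 0.0
    for (sx, sy), (tx, ty) in zip(scan_pts, tracker_pts):
        ax, ay = sx - csx, sy - csy
        bx, by = tx - ctx, ty - cty
        num += ax * by - ay * bx  # cross terms -> sine of rotation
        den += ax * bx + ay * by  # dot terms   -> cosine of rotation
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    tx0 = ctx - (c * csx - s * csy)  # translation after rotation
    ty0 = cty - (s * csx + c * csy)

    def apply(p):
        x, y = p
        return (c * x - s * y + tx0, s * x + c * y + ty0)

    return apply
```

Given at least two non-coincident landmark pairs, the returned transform can then map any scan-frame target point (for example, a planned implant site) into tracker coordinates.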
- Referring to the example embodiment of
FIG. 2A, there is illustrated a perspective view of a typical movable guide member 11 adapted to include an array of sensor members 12 a, 12 b, and 12 c to communicate with a propagating signal emitted from the emitting device 17, in accordance with an example embodiment of the present general inventive concept. In this example embodiment, the sensors 12 a, 12 b, and 12 c reflect the propagating signal emitted from the emitting device 17 to a detection unit 17 c. The detection unit 17 c can continuously acquire the positions of the sensors and inform the control unit 16 of the location of the sensors in real time. The control system 16 can compute the position of the movable guide member 11 using a known multi-triangulation method based on information received from the sensors, and can display on display monitor 8 an image displaying the position of the movable guide member 11 with respect to various other components, structures, and reference points of the navigation system 10. - Referring to
FIGS. 1 and 2A, the sensors 12 a, 12 b, and 12 c can be coupled to an outer surface of the guide member 11 to help maintain a consistent line-of-sight between the sensors and the light emitting device 17. Although FIGS. 1 and 2A depict an oral surgery configuration, those skilled in the art will appreciate that the present general inventive concept is not limited to the embodiments of FIGS. 1 and 2A, and that many other shapes and sizes of guide members 11 and sensors could be used. - In the case of dental implants, for example, it is possible to mount a
sensor array 12 to the movable guide member 11 to facilitate tracking of the guide member 11 as the mandible is moved, enabling the surgeon to maintain consistent and proper positioning of the surgical component 13 with respect to the mandible even when the mandible is moved during surgery. - In the embodiment of
FIG. 1, the surgeon attaches the movable guide member 11 and sensor 12 to the target point, such as the patient's mandible 19 as illustrated in FIG. 1. During a surgical procedure, the control unit 16 can track the location of the movable guide member 11 and the surgical component 13 in real time, enabling the surgeon to maintain proper positioning of the surgical component 13 with respect to the target point even when the movable guide member 11 is moved during surgery. - During a surgical procedure, the surgeon may move the
surgical component 13 with respect to the targeted surgical region of the patient, for example the mandible 19 area as illustrated in FIG. 1. As the surgeon moves the surgical component 13, the control unit 16 can track the location of the surgical component 13 via the sensors 14 mounted on the surgical component 13. The control system 16 can interpret the response signals of the sensors 14 to compute the position of the surgical component 13 using a known multi-triangulation method, and can display on display monitor 8 an image displaying the position of the surgical component 13 with respect to the targeted region of the patient. These techniques enable a surgeon to track the relative positions of the movable guide member 11 and surgical component 13 in the targeted surgical field, even when the movable guide member 11 is moved during the surgical process. Using the present general inventive concepts, it is thus possible to utilize known GPS triangulation methods to determine the location of sensors on both the patient's body and the surgical component, independent of information regarding the shape or size of the sensors, to calculate the location thereof. - Referring to
FIG. 1, in the case where the emitting device 17 emits infrared light signals, it is important that the sensors 12 and 14 remain within a line of sight of the emitting device 17 in order to maintain accurate position information of the movable guide member 11 and surgical component 13 in the control unit 16 as the surgical component 13 and guide member 11 are moved during surgery. However, in cases where the emitting device does not emit light signals but instead emits EM or other types of RF or wireless signals, it is not as important to maintain the sensors within a direct line of sight. -
FIG. 2B is a perspective view of a guide member including sensor members in accordance with another example embodiment of the present general inventive concept, for example, in a case where the emitting device 17 emits EM or other RF-based signals. - Referring to
FIG. 2B, in a case where the emitting device 17 emits EM or other RF-based signals, the sensors of the movable guide member 11′ can include an array of detectors, such as radio frequency identification (RFID) sensors 12 a′, 12 b′, and 12 c′, to communicate with the EM signals emitted from the emitting device 17. Unlike the configuration of FIG. 2A, the RFID sensors 12 a′, 12 b′, and 12 c′ can be mounted internally with respect to the guide member 11′, as illustrated in FIG. 2B. The RFID sensors can be mounted within the internal structure of the guide member 11′ because it is not as important to maintain a direct line-of-sight between the sensors and the emitting device 17, due to the penetrating characteristics of EM and other types of RF signals. In operation, the RFID sensors 12 a′, 12 b′, and 12 c′ interact with the electromagnetic field generated by the emitting device 17, and the control unit 16 can recognize any disruptions in the magnetic field caused by the RFID sensors. This enables the system's computer, which has special tracking software, to recognize the RFID sensors and their locations in the surgical field using a known multi-triangulation concept based on the interaction of the RFID sensors 12 a′, 12 b′, and 12 c′ with the electromagnetic field. Similar to the embodiment of FIG. 2A, the control unit 16 can compute the position of the movable guide member 11′ in real time based on this information, and can display on display monitor 8 an image displaying the position of the movable guide member 11′ with respect to various other components, structures, and reference points of the navigation system 10. -
FIG. 3 is a perspective view of an exemplary surgical component 13 including a sensor array 14 configured in accordance with an example embodiment of the present general inventive concept. - Referring to
FIG. 3, the surgical component 13 includes a sensor array 14 including sensors to communicate with the emitting device 17 to track the location of the surgical component in the surgical field, in the manners discussed above. As with the sensors 12, the sensors 14 can take a variety of forms and configurations. - To facilitate attachment of the
sensor array 14 to the surgical component, the sensor array may be mounted in the form of a ring-like shape to fit around a shaft or neck region of the surgical component 13, as illustrated in FIG. 3. Such a configuration is easily adaptable to any number of differently shaped and sized surgical components. However, those skilled in the art will appreciate that the specific means of mounting the sensors to the various components can be chosen with sound engineering judgment, and a variety of mounting shapes and configurations could be used without departing from the broader scope of the present general inventive concept. For example, the sensors 14 can be integrated with the surgical component 13 as a single body to communicate with the propagating signal without sacrificing proper positioning of the surgical component 13 with respect to the surgical field. Using the responses of the sensors 14, the control unit 16 can calculate the position of the surgical component 13 relative to the movable reference region and can track and compare the relative movements of the guide member 11 with respect to the surgical component 13. It is possible to include a slot or other type of holding means in one or more of the exemplary devices of the navigation system to hold a microSD card or other memory device to store or upload data to/from the navigation system. - Referring to
FIG. 4, it is possible to configure the sensors 12 and 14 to communicate with each other, in addition to the emitter device 17 and/or detection unit 17 c, to provide additional information about the relative positions of the respective guide member 11 and surgical component 13. In this regard, the sensors 12 and 14 can communicate with each other as well as with the device 17 and/or detection unit 17 c to track location information of the respective sensors. For example, one of the sensors 12 could be configured to include an EM source and a light reflector sensor, and the other sensor 14 could be configured to include an RFID receptor to interact with the EM field generated by sensor 12. In such a case, the emitter device 17 and detection unit 17 c could be adapted to track the location of sensor 12 by characterizing the light reflected by sensor 12, and the control unit 16 could be adapted to track the relative distance between the sensors 12 and 14 based on the response of sensor 14. A variety of other types and combinations of sensors could also be used. -
FIG. 4 is a simple diagram illustrating a light source and light detector in communication with sensor arrays 12 and 14. As illustrated in FIG. 4, the sensors 12 and 14 can communicate with the power source 17 and/or detection unit 17 c to provide information regarding the location of the respective devices, as indicated by the dotted lines extending between the sensors and the power source 17 and detection unit 17 c. It is also possible that the sensors 12 and 14 communicate with each other, as indicated by the dotted line extending between the sensor arrays 12 and 14. The responses of the sensors 12 and 14 can be detected by the detection unit 17 c and fed to the control unit 16 (FIG. 1) to calculate and display location information about the relative positions of the sensors. Moreover, the use of RFID, Bluetooth, IR, EM, LED, or other types of sensors can be interchanged, mixed, or combined for use with different devices and applications, without departing from the broader principles and scope of the present general inventive concept. For example, swarming technology can be used to implement a variety of different sensor technologies (e.g., EM and/or optical) on a variety of different surgical components and regions of interest to track movements thereof during single or multiple operative procedures of a patient. - It is also possible to utilize thermography in conjunction with the navigation techniques of the present general inventive concept to identify other structures in and around the surgical region of interest, such as nerves, arteries, veins, and the like. For example, after the RFID sensors track and identify the location of teeth or other structures in a surgical region of interest, such as the mandible, it is possible to identify the location of nerves, arteries, or veins in the mandible using thermography, thus providing additional navigational information to supplement the information provided by the multi-triangulation techniques of the present general inventive concept.
In other words, it is possible to incorporate thermal imaging cameras into, or in combination with, the exemplary sensors of the present general inventive concept in order to detect variations in the infrared radiation of various body parts and to display thermographic images thereof. In this way, if the surgeon knows that an artery or nerve runs along with a vein, thermography can be used to identify where the canal is, thus providing additional location information beyond that provided by the RFID or other sensors. Accordingly, not only can the multi-triangulation concepts of the present general inventive concept be used to indicate where a bony indentation is in the bone, but thermography concepts can also be incorporated into the navigation system of the present general inventive concept to help identify and locate the nerve, artery, and/or vein during surgery.
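At its simplest, thermographic localization of this kind reduces to finding where a frame of infrared temperature samples is consistently elevated. The following is a deliberately crude sketch of that idea (hypothetical names; real vessel detection would require calibrated thermal imaging and far more robust processing):

```python
def warmest_column(ir_frame):
    """Given a 2-D grid of temperature samples (rows of equal length),
    return the index of the column with the highest mean temperature,
    a crude proxy for a warm linear structure, such as a superficial
    vessel, running vertically through the frame."""
    n_rows = len(ir_frame)
    n_cols = len(ir_frame[0])
    col_means = [sum(row[c] for row in ir_frame) / n_rows
                 for c in range(n_cols)]
    return col_means.index(max(col_means))
```

In a navigation context, the detected warm track would then be mapped into the same coordinate frame as the triangulated sensor positions so the two information sources can be overlaid.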
-
FIG. 5 is a perspective view of a system environment including a pair of navigation goggles 50 and a digital scanning wand 51 for use in accordance with example embodiments of the present general inventive concept. The scanning wand 51 can be used to superimpose measurements onto the patient scan data, such as CT scan data. The measurements from the scanning wand 51 can be used to supplement or replace patient scan data to enable the surgeon to determine location information of surgical sites of interest that may be modified or moved relative to the original scan data. For example, the navigation goggles 50 can interface with the navigation system, via a wired or wireless connection, to enable the surgeon to visualize location information of surgical sites of interest in real time during surgery. -
FIG. 6 illustrates an exemplary set of navigation goggles 50 configured in accordance with an example embodiment of the present general inventive concept. Referring to FIG. 6, the goggles 50 facilitate 3D viewing of the surgical field with an overlay of the scan. The goggles can include sensors to sense the blinking of the eyelids and eye movements, to function in part with verbal commands and buttons on the instruments to control various aspects of the surgical field, including the 3D viewing experience of the goggles. The goggles 50 can include various overlays to display navigation data, such as locations of surgical components and/or surgical sites in 3-dimensional space, angular information, target points, and the like. Thus, the location information provided by the navigation system can be processed and fed to the navigation goggles 50 in various forms to assist the surgeon in visualizing and locating surgical components and surgical sites as the operation is being performed. For example, it is possible for the surgeon to visualize tumors or other surgical sites, to see the depths of invasion, and to superimpose data from the digital wand and/or CT scan while cutting or performing other operations on the patient. -
FIG. 7 is a perspective view of a system environment including exemplary dressings 70 configured for use in accordance with example embodiments of the present general inventive concept. Similar to the surgical components 13 and guide members 11, the dressings 70 can include suitable sensors, such as RFID sensors, to communicate location information concerning the placement of the dressings 70. The dressings 70 can be placed to reference various aspects of surgical and non-surgical wound dimensions, wherein the wound's orientation and sensors are able to detect the condition of the wound in conjunction with navigation. The dressings 70 can include a solar cell or other energy harvesting device to power the sensors, but the present general inventive concept is not limited to any particular type of sensor or power source. Thus, by strategically placing one or more dressings 70 at various locations of interest on or around the patient, location information can be communicated from the dressings 70 to the navigation system using GPS triangulation techniques relative to the sensors of each dressing, thus providing location information of each dressing relative to other surgical components or surgical sites of interest. The location information can then be processed by the control unit and displayed in various formats to the surgeon via display monitor 8 (FIG. 1) and/or navigation goggles 50 (FIG. 6). -
FIG. 8 illustrates an exemplary wound dressing 80 including a tripartite sensor arrangement (similar to those described above) to facilitate GPS triangulation calculations and location data of the dressing 80 relative to other surgical components, surgical sites, and/or other anatomical regions of interest. -
FIG. 9 illustrates an exemplary wound dressing 90 including a plurality of treatment devices to aid in the navigation, detection, and/or treatment of a variety of parameters to assist in operations on a wound or surgical site, according to an example embodiment of the present general inventive concept. Exemplary treatment devices are illustrated in a circuit fashion in FIG. 9, with a key indicating some exemplary parameters for use of the treatment devices, although the present general inventive concept is not limited to the illustrated parameters, and a variety of other parameters could be used without departing from the scope and spirit of the present general inventive concept. - The treatment devices of
FIG. 9 can be implemented in combination with RFID or other navigation sensors to provide navigation and treatment information respecting a particular wound. For example, as illustrated in FIG. 9, it is possible to provide one or more gas sensors to detect gases such as NO, O2, CO2, or other gases in or around a particular wound area. This information can be communicated to the navigation system to provide a monitoring component for a particular wound area. Other parameters can also be monitored, for example, temperature, pH, bacteria level, pressure, and the like. It is also possible to provide one or more ultraviolet (UV) devices to detect and/or deliver UV energy to targeted areas of the wound, based on results of the other parameter measurements and/or detections. Treatment devices may also be targeted to various regions of the dressing using navigation information provided by RFID or other GPS devices of the wound dressing. - While the present general inventive concept has been illustrated by description of example embodiments, and while the illustrative embodiments have been described by referring to the drawings, it is not the intention of the applicant to restrict or in any way limit the scope of the appended claims to the illustrative examples. Additional advantages and modifications of the present general inventive concept will readily appear to those skilled in the art. The present general inventive concept in its broader aspects is therefore not limited to the specific details, representative apparatus and methods, and illustrative examples illustrated and described. Accordingly, departures may be made from such details without departing from the spirit or scope of applicant's general inventive concept.
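As a closing illustration of the wound-parameter monitoring described for FIG. 9, the sketch below shows how dressing telemetry might be checked against acceptable ranges. All parameter names and numeric ranges here are hypothetical, illustrative values only, not clinical thresholds, and none are taken from the specification.

```python
# Illustrative limits only -- not clinical values.
WOUND_LIMITS = {
    "temp_c": (30.0, 38.5),   # local temperature, degrees Celsius
    "ph": (5.5, 7.8),         # wound-bed pH
    "o2_pct": (15.0, 100.0),  # oxygen concentration, percent
}

def flag_readings(readings, limits=WOUND_LIMITS):
    """Return the sorted names of parameters whose readings fall
    outside their configured (low, high) range. Unknown parameter
    names raise KeyError, surfacing misconfigured telemetry."""
    alerts = []
    for name, value in readings.items():
        lo, hi = limits[name]
        if not lo <= value <= hi:
            alerts.append(name)
    return sorted(alerts)
```

In the system described above, such flags would be forwarded alongside the dressing's location data, so an out-of-range reading can be shown at the dressing's position on the display monitor or goggles.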
Claims (28)
1. A navigation system to track positions of surgical components, comprising:
a power source to emit a tracking signal during an operation of a patient;
a first sensor mounted to a movable region of the patient to respond to the emitted tracking signal; and
a control unit to track a position of the movable region relative to a fixed region of the patient as the movable region moves with respect to the fixed region, based on the response of the first sensor.
2. The navigation system of claim 1, further comprising:
a second sensor mounted to a surgical component to respond to the emitted tracking signal such that the control unit tracks a position of the surgical component relative to the movable region as the surgical component and movable region move with respect to the fixed region, based on the responses of the first and second sensors.
3. The navigation system of claim 2, wherein the first and second sensors each comprise at least three receptors to interact with the emitted tracking signal, and the control unit tracks the position of the surgical component relative to the movable region using a triangulation calculation based on the interaction of the at least three receptors.
4. The navigation system of claim 2, further comprising:
a detection unit to detect the responses of the first and second sensors such that the control unit tracks the movement of the movable region and the surgical component based on the detected responses.
5. The navigation system of claim 4, wherein the first and second sensors each comprise at least three reflectors to reflect the emitted tracking signal, and the control unit tracks the position of the surgical component relative to the movable region using a triangulation calculation based on the reflected signals of the at least three reflectors.
6. The navigation system of claim 2, wherein the first sensor comprises an emitting unit to emit a second tracking signal to the second sensor, and the second sensor comprises a receptor unit to respond to the second tracking signal such that the control unit tracks the movement of the surgical component relative to the movable region based on the response of the receptor unit to the second tracking signal.
7. The navigation system of claim 1, wherein the first sensor comprises at least three RFID, Bluetooth, LED, or WiFi receptors to interact with the emitted tracking signal, and the control unit tracks the position of the movable region using a triangulation calculation based on the interaction of the at least three receptors.
8. The navigation system of claim 1, further comprising:
a surgical aid component fixedly mounted to the movable region, wherein the first sensor is coupled to an outer surface of the surgical aid component and is oriented to maintain a visible line of sight with the emitted tracking signal.
9. A navigation system to track positions of surgical components during surgery of a patient, comprising:
a detection unit to detect an optical signal;
a first sensor mounted to a movable region of the patient to emit a first optical signal to be detected by the detection unit; and
a control unit to track a position of the movable region relative to a fixed region of the patient as the movable region moves with respect to the fixed region, based on the detected first optical signal.
10. The navigation system of claim 9, further comprising:
a second sensor mounted to a surgical component to emit a second optical signal to be detected by the detection unit such that the control unit tracks a position of the surgical component relative to the movable region as the surgical component and movable region move with respect to the fixed region, based on the detected first and second optical signals.
11. The navigation system of claim 10, wherein the first and second sensors each comprise at least three optical emitters to respectively emit first, second, and third light signals to be detected by the detection unit, such that the control unit tracks the position of the surgical component relative to the movable region using a triangulation calculation based on the detected first, second, and third light signals.
12. The navigation system of claim 10, wherein the first sensor comprises an emitting unit to emit a tracking signal to the second sensor, and the second sensor comprises a receptor unit to respond to the tracking signal such that the control unit tracks the movement of the surgical component relative to the movable region based on the response of the receptor unit to the tracking signal.
13. The navigation system of claim 9, further comprising:
a surgical component fixedly mounted to the movable region, wherein the first sensor is coupled to an outer surface of the surgical component to maintain a visible line of sight with the detection unit as the movable region is moved during the surgery.
14. A method of tracking positions of surgical components during a surgical process of a patient, comprising:
emitting tracking signals to a targeted region of the surgical process;
coupling a first sensor to a movable region of the patient such that the first sensor responds to the emitted tracking signals; and
tracking a position of the movable region relative to a fixed region of the patient as the movable region moves with respect to the fixed region, based on the response of the first sensor.
15. The method of claim 14, wherein a location of the fixed region is based on a scanned image of the patient.
16. The method of claim 14, further comprising:
coupling a second sensor to a surgical component to be used in the surgery such that the second sensor responds to the emitted tracking signals;
tracking a position of the surgical component relative to the movable region as the surgical component and movable region move with respect to the fixed region, based on the responses of the first and second sensors; and
displaying an image of the relative positions of the surgical component and movable region.
17. The method of claim 16, wherein the displaying of the image is performed by a set of navigation goggles to be worn by a surgeon.
18. The method of claim 16, wherein the first sensor comprises an emitting unit to emit a second tracking signal to the second sensor, and the second sensor comprises a receptor unit to respond to the second tracking signal such that the control unit tracks the movement of the surgical component relative to the movable region based on the response of the receptor unit to the second tracking signal.
19. The method of claim 14, wherein the coupling of the first sensor to the movable region of the patient comprises:
fixedly mounting a surgical aid component to the movable region; and
coupling the first sensor to the surgical aid component.
20. The method of claim 19, wherein the first sensor is coupled to an outer surface of the surgical aid component and is oriented to maintain a visible line of sight with the emitted tracking signals as the movable region moves with respect to the fixed region during the surgical process.
21. The method of claim 16, wherein the first sensor comprises at least three RFID, Bluetooth, LED, or WiFi receptors to interact with the emitted tracking signals, and the control unit tracks the position of the movable region using a triangulation calculation based on the interaction of the at least three receptors.
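With three receptors at known positions, the range-based variant of claim 21 is the classic trilateration computation: subtracting one squared-range equation from the others cancels the quadratic terms and leaves a linear system in the unknown position. A minimal 2D sketch (in practice ranges would be derived from RF signal timing or strength; all names are illustrative):

```python
import numpy as np

def trilaterate_2d(anchors, dists):
    """Position (x, y) from distances to three non-collinear anchors.

    anchors: three known (x, y) receptor positions
    dists:   measured distances to each anchor
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Subtract the first squared-range equation from the other two:
    # the x^2 + y^2 terms cancel, leaving A @ p = b.
    A = np.array([[2 * (x2 - x1), 2 * (y2 - y1)],
                  [2 * (x3 - x1), 2 * (y3 - y1)]])
    b = np.array([d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2,
                  d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2])
    return np.linalg.solve(A, b)
```

In 3D the same subtraction trick needs a fourth anchor for a linear solve (or a sign choice resolved from geometry).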
22. A navigation system to track positions of surgical components during surgery of a patient, comprising:
a power source to emit a tracking signal during surgery of a patient;
a first sensor mounted to a region of the patient to generate a first response signal to the emitted tracking signal;
a second sensor mounted to a surgical component to generate a second response signal to the emitted tracking signal; and
a control unit to track a position of the surgical component relative to the region as the surgical component and region move with respect to a fixed region of the patient,
wherein the tracked position is based on a triangulation calculation relative to the first and second response signals independent of a shape dimension of the first and second sensors.
23. The navigation system of claim 22, wherein the first sensor comprises a digital scanner to read data pertaining to a region of interest of the patient to adjust existing CT scan data of the patient.
24. The navigation system of claim 22, further comprising a set of navigation goggles worn by a surgeon to display in real time the position of the surgical component and/or region during surgery.
25. A navigation system to track positions of surgical components, comprising:
a power source to emit a tracking signal during an operation of a patient;
a first component mounted to a region of interest of the patient, the first component including a first sensor to respond to the emitted tracking signal to provide location information of the first component;
a second component including a second sensor to respond to the emitted tracking signal to provide location information of the second component; and
a control unit to track the locations of the first and second components relative to a fixed region of the patient as the first or second components move with respect to the fixed region based on the responses of the first and second sensors, independent of a shape dimension of the first or second sensors.
26. A wound care device to monitor and treat wounds of a patient, comprising:
a dressing to cover a wound of a patient;
at least one detector to measure a characteristic parameter of the wound, and to transmit a signal representative of the measured characteristic parameter; and
a control unit to receive the transmitted signal and to output a response indicative of the measured characteristic parameter to treat the wound.
27. The wound care device of claim 26, wherein the dressing includes at least one delivery device to deliver a treatment element to a selected region of the wound based on a location of the measured characteristic parameter.
28. The wound care device of claim 27, further comprising an energy harvesting device to power the at least one detector and the at least one delivery device.
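The sense-then-respond loop of claims 26-28 can be summarized as: measure a characteristic parameter per wound region, compare it against a treatment criterion, and drive the delivery device only where the criterion is met. A minimal sketch (the parameter, threshold, and all names are hypothetical; the claims do not specify which characteristic is measured):

```python
from dataclasses import dataclass

@dataclass
class RegionReading:
    region_id: str
    moisture: float  # hypothetical characteristic parameter, 0.0 to 1.0

def regions_to_treat(readings, dry_threshold=0.3):
    """Select the wound regions whose measured parameter calls for treatment.

    Mirrors claim 27: delivery is targeted at the location of the
    measured characteristic, not at the dressing as a whole.
    """
    return [r.region_id for r in readings if r.moisture < dry_threshold]
```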
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/214,783 US20120316486A1 (en) | 2010-08-20 | 2011-08-22 | Surgical Component Navigation Systems And Methods |
US14/879,612 US10639204B2 (en) | 2010-08-20 | 2015-10-09 | Surgical component navigation systems and methods |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/860,635 US20120046536A1 (en) | 2010-08-20 | 2010-08-20 | Surgical Instrument Navigation Systems and Methods |
US13/214,783 US20120316486A1 (en) | 2010-08-20 | 2011-08-22 | Surgical Component Navigation Systems And Methods |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/860,635 Continuation-In-Part US20120046536A1 (en) | 2010-08-20 | 2010-08-20 | Surgical Instrument Navigation Systems and Methods |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/879,612 Continuation US10639204B2 (en) | 2010-08-20 | 2015-10-09 | Surgical component navigation systems and methods |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120316486A1 (en) | 2012-12-13 |
Family
ID=47293757
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/214,783 Abandoned US20120316486A1 (en) | 2010-08-20 | 2011-08-22 | Surgical Component Navigation Systems And Methods |
US14/879,612 Active 2031-09-21 US10639204B2 (en) | 2010-08-20 | 2015-10-09 | Surgical component navigation systems and methods |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/879,612 Active 2031-09-21 US10639204B2 (en) | 2010-08-20 | 2015-10-09 | Surgical component navigation systems and methods |
Country Status (1)
Country | Link |
---|---|
US (2) | US20120316486A1 (en) |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9554869B1 (en) * | 2016-01-08 | 2017-01-31 | Eped Inc. | Bite tray having fiducial markers for head scan registration and method of use |
JP2019527566A (en) | 2016-05-13 | 2019-10-03 | スミス アンド ネフュー ピーエルシーSmith & Nephew Public Limited Company | Wound monitoring and treatment device using sensor |
US11839433B2 (en) | 2016-09-22 | 2023-12-12 | Medtronic Navigation, Inc. | System for guided procedures |
WO2018162732A1 (en) | 2017-03-09 | 2018-09-13 | Smith & Nephew Plc | Apparatus and method for imaging blood in a target region of tissue |
EP3592212A1 (en) | 2017-03-09 | 2020-01-15 | Smith & Nephew PLC | Wound dressing, patch member and method of sensing one or more wound parameters |
JP7235673B2 (en) | 2017-04-11 | 2023-03-08 | スミス アンド ネフュー ピーエルシー | Component placement and stress relief for sensor-enabled wound dressings |
CA3062989A1 (en) | 2017-05-15 | 2018-11-22 | Smith & Nephew Plc | Wound analysis device and method |
CA3066073A1 (en) | 2017-06-23 | 2018-12-27 | Smith & Nephew Plc | Positioning of sensors for sensor enabled wound monitoring or therapy |
GB201809007D0 (en) | 2018-06-01 | 2018-07-18 | Smith & Nephew | Restriction of sensor-monitored region for sensor-enabled wound dressings |
GB201804502D0 (en) | 2018-03-21 | 2018-05-02 | Smith & Nephew | Biocompatible encapsulation and component stress relief for sensor enabled negative pressure wound therapy dressings |
CA3072006A1 (en) | 2017-08-10 | 2019-02-14 | Smith & Nephew Plc | Positioning of sensors for sensor enabled wound monitoring or therapy |
JP2020533093A (en) | 2017-09-10 | 2020-11-19 | スミス アンド ネフュー ピーエルシーSmith & Nephew Public Limited Company | Systems and methods for inspecting encapsulation, as well as components within wound dressings equipped with sensors |
GB201718870D0 (en) | 2017-11-15 | 2017-12-27 | Smith & Nephew Inc | Sensor enabled wound therapy dressings and systems |
GB201804971D0 (en) | 2018-03-28 | 2018-05-09 | Smith & Nephew | Electrostatic discharge protection for sensors in wound therapy |
US11596553B2 (en) | 2017-09-27 | 2023-03-07 | Smith & Nephew Plc | Ph sensing for sensor enabled negative pressure wound monitoring and therapy apparatuses |
EP3687396A1 (en) | 2017-09-28 | 2020-08-05 | Smith & Nephew plc | Neurostimulation and monitoring using sensor enabled wound monitoring and therapy apparatus |
WO2019096828A1 (en) | 2017-11-15 | 2019-05-23 | Smith & Nephew Plc | Integrated sensor enabled wound monitoring and/or therapy dressings and systems |
US20190175059A1 (en) * | 2017-12-07 | 2019-06-13 | Medtronic Xomed, Inc. | System and Method for Assisting Visualization During a Procedure |
EP3849401A1 (en) | 2018-09-12 | 2021-07-21 | Smith & Nephew plc | Device, apparatus and method of determining skin perfusion pressure |
CN109700533B (en) * | 2018-12-17 | 2021-07-30 | 上海交通大学医学院附属第九人民医院 | Fixed jaw position navigation registration guide plate and registration method thereof |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6450978B1 (en) * | 1998-05-28 | 2002-09-17 | Orthosoft, Inc. | Interactive computer-assisted surgical system and method thereof |
US6640128B2 (en) * | 2000-12-19 | 2003-10-28 | Brainlab Ag | Method and device for the navigation-assisted dental treatment |
US20040087852A1 (en) * | 2001-02-06 | 2004-05-06 | Edward Chen | Computer-assisted surgical positioning method and system |
US20050085714A1 (en) * | 2003-10-16 | 2005-04-21 | Foley Kevin T. | Method and apparatus for surgical navigation of a multiple piece construct for implantation |
US20070106152A1 (en) * | 2005-09-23 | 2007-05-10 | Kantrowitz Allen B | Fiducial marker system for subject movement compensation during medical treatment |
US20080039717A1 (en) * | 2006-08-11 | 2008-02-14 | Robert Frigg | Simulated bone or tissue manipulation |
US7457443B2 (en) * | 2001-05-31 | 2008-11-25 | Image Navigation Ltd. | Image guided implantology methods |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5269785A (en) | 1990-06-28 | 1993-12-14 | Bonutti Peter M | Apparatus and method for tissue removal |
AU3950595A (en) | 1994-10-07 | 1996-05-06 | St. Louis University | Surgical navigation systems including reference and localization frames |
US5772594A (en) | 1995-10-17 | 1998-06-30 | Barrick; Earl F. | Fluoroscopic image guided orthopaedic surgery system with intraoperative registration |
US5921992A (en) | 1997-04-11 | 1999-07-13 | Radionics, Inc. | Method and system for frameless tool calibration |
US6434507B1 (en) | 1997-09-05 | 2002-08-13 | Surgical Navigation Technologies, Inc. | Medical instrument and method for use with computer-assisted image guided surgery |
US6096050A (en) | 1997-09-19 | 2000-08-01 | Surgical Navigation Specialist Inc. | Method and apparatus for correlating a body with an image of the body |
US6226548B1 (en) | 1997-09-24 | 2001-05-01 | Surgical Navigation Technologies, Inc. | Percutaneous registration apparatus and method for use in computer-assisted surgical navigation |
US6348058B1 (en) | 1997-12-12 | 2002-02-19 | Surgical Navigation Technologies, Inc. | Image guided spinal surgery guide, system, and method for use thereof |
US6081741A (en) | 1998-06-05 | 2000-06-27 | Vector Medical, Inc. | Infrared surgical site locating device and method |
US6381029B1 (en) | 1998-12-23 | 2002-04-30 | Etrauma, Llc | Systems and methods for remote viewing of patient images |
US5989023A (en) | 1998-12-31 | 1999-11-23 | John D. Summer | Intraoral jaw tracking device |
US6470207B1 (en) | 1999-03-23 | 2002-10-22 | Surgical Navigation Technologies, Inc. | Navigational guidance via computer-assisted fluoroscopic imaging |
US6491699B1 (en) | 1999-04-20 | 2002-12-10 | Surgical Navigation Technologies, Inc. | Instrument guidance method and system for image guided surgery |
US7386339B2 (en) | 1999-05-18 | 2008-06-10 | Mediguide Ltd. | Medical imaging and navigation system |
US7366562B2 (en) * | 2003-10-17 | 2008-04-29 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
US6474341B1 (en) | 1999-10-28 | 2002-11-05 | Surgical Navigation Technologies, Inc. | Surgical communication and power system |
US7819861B2 (en) * | 2001-05-26 | 2010-10-26 | Nuortho Surgical, Inc. | Methods for electrosurgical electrolysis |
DE60232316D1 (en) | 2001-02-27 | 2009-06-25 | Smith & Nephew Inc | DEVICE FOR TOTAL KNEE CONSTRUCTION |
EP1392174B1 (en) | 2001-03-26 | 2010-07-14 | ALL-OF-INNOVATION Gesellschaft mit beschränkter Haftung | Method and device system for removing material or for working material |
US20040171930A1 (en) | 2003-02-04 | 2004-09-02 | Zimmer Technology, Inc. | Guidance system for rotary surgical instrument |
US7559935B2 (en) | 2003-02-20 | 2009-07-14 | Medtronic, Inc. | Target depth locators for trajectory guide for introducing an instrument |
US7398116B2 (en) | 2003-08-11 | 2008-07-08 | Veran Medical Technologies, Inc. | Methods, apparatuses, and systems useful in conducting image guided interventions |
US7392076B2 (en) | 2003-11-04 | 2008-06-24 | Stryker Leibinger Gmbh & Co. Kg | System and method of registering image data to intra-operatively digitized landmarks |
US20050113659A1 (en) | 2003-11-26 | 2005-05-26 | Albert Pothier | Device for data input for surgical navigation system |
DE102004042489B4 (en) | 2004-08-31 | 2012-03-29 | Siemens Ag | Medical examination or treatment facility with associated method |
CA2613277C (en) | 2005-06-28 | 2016-05-10 | Stryker Corporation | Powered surgical tool with control module that contains a sensor for remotely monitoring the tool power generating unit |
- 2011-08-22: US 13/214,783 (US20120316486A1), status: Abandoned
- 2015-10-09: US 14/879,612 (US10639204B2), status: Active
Cited By (49)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110160583A1 (en) * | 2009-12-31 | 2011-06-30 | Orthosensor | Orthopedic Navigation System with Sensorized Devices |
US20110160738A1 (en) * | 2009-12-31 | 2011-06-30 | Orthosensor | Operating room surgical field device and method therefore |
US20110160572A1 (en) * | 2009-12-31 | 2011-06-30 | Orthosensor | Disposable wand and sensor for orthopedic alignment |
US9452023B2 (en) | 2009-12-31 | 2016-09-27 | Orthosensor Inc. | Operating room surgical field device and method therefore |
US9452022B2 (en) | 2009-12-31 | 2016-09-27 | Orthosensor Inc | Disposable wand and sensor for orthopedic alignment |
US9011448B2 (en) | 2009-12-31 | 2015-04-21 | Orthosensor Inc. | Orthopedic navigation system with sensorized devices |
US20120319859A1 (en) * | 2010-01-20 | 2012-12-20 | Creative Team Instruments Ltd. | Orientation detector for use with a hand-held surgical or dental tool |
US8920167B2 (en) * | 2010-09-21 | 2014-12-30 | Implantdent Co., Ltd. | Surgical guide preparation tool and method for preparing surgical guide |
US20130171587A1 (en) * | 2010-09-21 | 2013-07-04 | Implantdent Co., Ltd. | Surgical guide preparation tool and method for preparing surgical guide |
US20130289347A1 (en) * | 2011-01-12 | 2013-10-31 | Olympus Corporation | Endoscopic system |
US20130296651A1 (en) * | 2011-01-24 | 2013-11-07 | Olympus Corporation | Endoscope system |
US9615729B2 (en) * | 2011-01-24 | 2017-04-11 | Olympus Corporation | Endoscope detecting system |
US11712464B2 (en) | 2012-09-06 | 2023-08-01 | Norwegian University Of Science And Technology (Ntnu) | Intervention device |
US20140080086A1 (en) * | 2012-09-20 | 2014-03-20 | Roger Chen | Image Navigation Integrated Dental Implant System |
US20140147807A1 (en) * | 2012-11-27 | 2014-05-29 | National Chung Cheng University | Computer-aided positioning and navigation system for dental implant |
US20150140505A1 (en) * | 2012-11-27 | 2015-05-21 | National Chung Cheng University | Computer-aided positioning and navigation system for dental implant |
US10064700B2 (en) * | 2013-02-14 | 2018-09-04 | Zvi Fudim | Surgical guide kit apparatus and method |
EP2967288A4 (en) * | 2013-03-14 | 2016-11-30 | X Nav Technologies Llc | Image guided navigation system |
WO2014152519A2 (en) | 2013-03-14 | 2014-09-25 | X-Nav Technologies, LLC | Image guided navigation system |
US9844324B2 (en) | 2013-03-14 | 2017-12-19 | X-Nav Technologies, LLC | Image guided navigation system |
KR101762366B1 (en) * | 2013-03-14 | 2017-07-28 | 엑스-네브 테크놀로지스, 엘엘씨 | Image guided navigation system |
US9675796B2 (en) | 2013-11-10 | 2017-06-13 | Brainsgate Ltd. | Implant and delivery system for neural stimulator |
US10512771B2 (en) | 2013-11-10 | 2019-12-24 | Brainsgate Ltd. | Implant and delivery system for neural stimulator |
US20150265367A1 (en) * | 2014-03-19 | 2015-09-24 | Ulrich Gruhler | Automatic registration of the penetration depth and the rotational orientation of an invasive instrument |
US10548578B2 (en) * | 2014-03-19 | 2020-02-04 | Karl Storz Se & Co. Kg | Automatic registration of the penetration depth and the rotational orientation of an invasive instrument |
US20160015476A1 (en) * | 2014-07-15 | 2016-01-21 | Synaptive Medical (Barbados) Inc. | Medical Device Control Interface |
US9827060B2 (en) * | 2014-07-15 | 2017-11-28 | Synaptive Medical (Barbados) Inc. | Medical device control interface |
US9402691B2 (en) | 2014-09-16 | 2016-08-02 | X-Nav Technologies, LLC | System for determining and tracking movement during a medical procedure |
US9943374B2 (en) | 2014-09-16 | 2018-04-17 | X-Nav Technologies, LLC | Image guidance system for detecting and tracking an image pose |
US10143827B2 (en) | 2014-09-30 | 2018-12-04 | Integra Lifesciences Switzerland Sàrl | Optoelectronic sensing of a subcutaneous implant setting |
WO2016096984A1 (en) * | 2014-12-18 | 2016-06-23 | Norwegian University Of Science And Technology (Ntnu) | Intervention guidance device |
US20160184068A1 (en) * | 2014-12-24 | 2016-06-30 | Ingram Chodorow | Disposable surgical intervention guides, methods, and kits |
US9962234B2 (en) | 2014-12-24 | 2018-05-08 | Isethco Llc | Disposable surgical intervention guides, methods, and kits |
WO2016105592A1 (en) * | 2014-12-24 | 2016-06-30 | Chodorow Ingram | Disposable surgical intervention guides, methods, and kits |
US10136968B2 (en) * | 2014-12-24 | 2018-11-27 | Isethco Llc | Disposable surgical intervention guides, methods, and kits |
US20170202636A1 (en) * | 2015-02-04 | 2017-07-20 | Jerry T. Huang | Investigation and control device of drive system |
EP3090699A1 (en) * | 2015-05-04 | 2016-11-09 | Weigl, Paul | System and method for the collection and provision of three-dimensional characteristic data of the bone, soft tissue and mouth situation of a patient |
US10271907B2 (en) | 2015-05-13 | 2019-04-30 | Brainsgate Ltd. | Implant and delivery system for neural stimulator |
US20180368767A1 (en) * | 2015-07-03 | 2018-12-27 | Witooth Dental Services And Technologies, S.L. | Intraoral device |
US11000228B2 (en) * | 2015-07-03 | 2021-05-11 | Witooth Dental Services And Technologies, S.L. | Intraoral device |
CN104966394A (en) * | 2015-07-10 | 2015-10-07 | 新博医疗技术有限公司 | Wireless communication system for coordinate measurement |
CN105395252A (en) * | 2015-12-10 | 2016-03-16 | 哈尔滨工业大学 | Wearable three-dimensional image navigation device for vascular intervention operation and realizing man-machine interaction |
US20170333135A1 (en) * | 2016-05-18 | 2017-11-23 | Fei Gao | Operational system on a workpiece and method thereof |
EP3558159A4 (en) * | 2016-12-23 | 2020-11-25 | Planmeca Oy | Tracking pieces for tracking movements of hard tissue of a jaw |
CN110267616A (en) * | 2017-02-27 | 2019-09-20 | 史密夫和内修有限公司 | Operation guiding system supports array |
TWI629974B (en) * | 2017-05-26 | 2018-07-21 | 醫百科技股份有限公司 | Surgical guidance system |
US10687902B2 (en) | 2017-05-26 | 2020-06-23 | Eped, Inc. | Surgical navigation system and auxiliary positioning assembly thereof |
CN110815202A (en) * | 2018-08-07 | 2020-02-21 | 杭州海康机器人技术有限公司 | Obstacle detection method and device |
US20220104885A1 (en) * | 2020-10-06 | 2022-04-07 | Michael J. Hartman | Patient specific dynamic navigation tracker arm mount apparatus and method |
Also Published As
Publication number | Publication date |
---|---|
US10639204B2 (en) | 2020-05-05 |
US20160030132A1 (en) | 2016-02-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10639204B2 (en) | Surgical component navigation systems and methods | |
WO2012024672A2 (en) | Surgical component navigation systems and methods | |
US11583344B2 (en) | Devices, systems and methods for natural feature tracking of surgical tools and other objects | |
CN107847278B (en) | Targeting system for providing visualization of a trajectory for a medical instrument | |
JP5741885B2 (en) | System and method for non-contact determination and measurement of the spatial position and / or orientation of an object, in particular a medical instrument calibration and test method including a pattern or structure relating to a medical instrument | |
EP2436333B1 (en) | Surgical navigation system | |
RU2434600C2 (en) | Surgical system controlled by images | |
US7747312B2 (en) | System and method for automatic shape registration and instrument tracking | |
TWI625116B (en) | Ultrasound ct registration for positioning | |
US8483434B2 (en) | Technique for registering image data of an object | |
JP2018061835A (en) | Pre-operative registration of anatomical images with position-tracking system using ultrasound | |
US20110190637A1 (en) | Medical measuring system, method for surgical intervention as well as use of a medical measuring system | |
KR20150127031A (en) | System for establishing virtual constraint boundaries | |
US11701180B2 (en) | Surgical instrument system | |
US11510738B2 (en) | Surgical instrument system | |
KR101923927B1 (en) | Image registration system and method using subject-specific tracker | |
US20220398744A1 (en) | Tracking system for robotized computer-assisted surgery | |
KR20230059157A (en) | Apparatus and method for registering live and scan images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: MANHATTAN TECHNOLOGIES, LLC, TENNESSEE; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; Assignors: CHEUNG, ANDREW; CAMPBELL, JOSHUA; Reel/Frame: 026999/0327; Effective date: 20110921 |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
AS | Assignment | Owner name: X-NAV TECHNOLOGIES, LLC, PENNSYLVANIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; Assignor: MANHATTAN TECHNOLOGIES, LLC; Reel/Frame: 043947/0960; Effective date: 20171024 |