US20140073907A1 - System and method for image guided medical procedures - Google Patents

System and method for image guided medical procedures

Info

Publication number
US20140073907A1
Authority
US
United States
Prior art keywords
image
imaging
imaging modality
anatomical region
volumetric
Prior art date
Legal status
Abandoned
Application number
US13/835,479
Inventor
Dinesh Kumar
Amit VOHRA
Daniel S. Sperling
Current Assignee
Convergent Life Sciences Inc
Original Assignee
Convergent Life Sciences Inc
Application filed by Convergent Life Sciences Inc filed Critical Convergent Life Sciences Inc
Priority to US13/835,479 priority Critical patent/US20140073907A1/en
Assigned to Convergent Life Sciences, Inc. reassignment Convergent Life Sciences, Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SPERLING, DANIEL S., KUMAR, DINESH, VOHRA, AMIT
Priority to PCT/US2013/055561 priority patent/WO2014031531A1/en
Publication of US20140073907A1 publication Critical patent/US20140073907A1/en
Assigned to Convergent Life Sciences, Inc. reassignment Convergent Life Sciences, Inc. CORRECTIVE ASSIGNMENT TO CORRECT THE SPELLING OF ASSIGNOR NAME PREVIOUSLY RECORDED AT REEL: 030014 FRAME: 0107. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT. Assignors: SPERLING, DANNY S, KUMAR, DINESH, VOHRA, AMIT


Classifications

    • A61B10/00 Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • A61B10/02 Instruments for taking cell samples or for biopsy
    • A61B10/0241 Pointed or sharp biopsy instruments for prostate
    • A61B19/5212
    • A61B18/20 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, using laser
    • A61B18/22 Using laser, the beam being directed along or through a flexible conduit, e.g. an optical fibre; Couplings or hand-pieces therefor
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B90/361 Image-producing devices, e.g. surgical cameras
    • A61B90/39 Markers, e.g. radio-opaque or breast lesion markers
    • A61N5/0601 Apparatus for use inside the body
    • A61N5/0625 Warming the body, e.g. hyperthermia treatment
    • A61B2017/00274 Prostate operation, e.g. prostatectomy, TURP, BPH treatment
    • A61B2018/2005 Laser with beam delivery through an interstitially insertable device, e.g. needle
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365 Augmented reality, i.e. correlating a live optical image with another image
    • A61B2090/378 Surgical systems with images on a monitor during operation, using ultrasound
    • A61N2005/0612 Apparatus for use inside the body using probes penetrating tissue; interstitial probes
    • A61N2005/063 Radiation therapy using light comprising light transmitting means, e.g. optical fibres
    • A61N2005/1012 Templates or grids for guiding the introduction of sources

Definitions

  • the present disclosure relates to systems and methods for image guided medical and surgical procedures.
  • U.S. Pat. Pub. 2009/0054772 (EP20050781862), expressly incorporated herein by reference, entitled “Focused ultrasound therapy system”, provides a method for performing a High Intensity Focused Ultrasound (HIFU) procedure for a specific clinical application.
  • Basic image registration for fusion from a diagnostic modality such as Computed Tomography (CT) or Magnetic Resonance Imaging (MRI) to ultrasound is performed only through body positioning, referred to as “immobilization”, so that registration amounts to no more than horizontal translation and a zoom factor.
  • U.S. Pat. Pub. 2007/0167762, expressly incorporated herein by reference, entitled “Ultrasound System for interventional treatment”, provides an ultrasound system into which a “wide-area” image such as CT or MRI can be loaded and fused with the ultrasound image, using a manual definition of the position of lesions and the needle insertion position at the time of the procedure.
  • U.S. Pub. App. 2010/02906853, expressly incorporated herein by reference, entitled “Fusion of 3D volumes with CT reconstruction”, discloses a method for registration of an ultrasound device in three dimensions to a C-arm scan, the method including acquiring a baseline volume, acquiring images in which the ultrasound device is disposed, locating the device within the images, registering the location of the device to the baseline volume, acquiring an ultrasound volume from the ultrasound device, registering the ultrasound volume to the baseline volume, and performing fusion imaging to display a view of the ultrasound device in the baseline volume.
  • a mutual information based method is provided to register and display a 3D ultrasound image fused with a CT image.
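A rough illustration of the mutual-information idea referenced above: the metric is computed from the joint intensity histogram of the two aligned volumes. The simple numpy estimator below is a sketch for intuition, not the method of the cited application; the names and bin count are arbitrary.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    """Mutual information between two co-registered image volumes.

    Illustrative estimator: real registration systems typically use
    more robust variants (e.g., Mattes MI) with smoothing and sampling.
    """
    # Joint histogram of corresponding voxel intensities.
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1)  # marginal distribution of img_a intensities
    py = pxy.sum(axis=0)  # marginal distribution of img_b intensities
    nz = pxy > 0          # avoid log(0) terms
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))
```

The metric rises as one image becomes more predictable from the other, which is why it is useful across modalities whose raw intensities do not correspond.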
  • U.S. Pub. App. 2011/0178389 expressly incorporated herein by reference, entitled “Fused image modalities guidance” discloses a system and method for registration of medical images, which registers a previously obtained volume(s) onto an ultrasound volume during an ultrasound procedure, to produce a multimodal image, which may be used to guide a medical procedure.
  • the multimodal image includes MRI information presented in the framework of a Trans Rectal Ultrasound (TRUS) image during a TRUS procedure.
  • Prostate cancer is one of the most common types of cancer affecting men. It is a slow growing cancer, which is easily treatable if identified at an early stage. A prostate cancer diagnosis often leads to surgery or radiation therapy. Such treatments are costly and can cause serious side effects, including incontinence and erectile dysfunction. Unlike many other types of cancer, prostate cancer is not always lethal and often is unlikely to spread or cause harm. Many patients who are diagnosed with prostate cancer receive radical treatment even though it would not prolong the patient's life, ease pain, or significantly improve the patient's health.
  • Prostate cancer may be diagnosed by taking a biopsy of the prostate, which is conventionally conducted under the guidance of ultrasound imaging.
  • Ultrasound imaging has high spatial resolution, and is relatively inexpensive and portable. However, ultrasound imaging has relatively low tissue discrimination ability. Accordingly, ultrasound imaging provides adequate imaging of the prostate organ, but it does not provide adequate imaging of tumors within the organ due to the similarity of cancer tissue and benign tissues, as well as the lack of tissue uniformity. Due to the inability to visualize the cancerous portions within the organ with ultrasound, the entire prostate must be considered during the biopsy. Thus, in the conventional prostate biopsy procedure, a urologist relies on the guidance of two-dimensional ultrasound to systematically remove tissue samples from various areas throughout the entire prostate, including areas that are free from cancer.
  • Magnetic Resonance Imaging has long been used to evaluate the prostate and surrounding structures. MRI is in some ways superior to ultrasound imaging because it has very good soft tissue contrast. There are several types of MRI techniques, including T2 weighted imaging, diffusion weighted imaging, and dynamic contrast imaging. Standard T2-weighted imaging does not discriminate cancer from other processes with acceptable accuracy. Diffusion-weighted imaging and dynamic contrast imaging may be integrated with traditional T2-weighted imaging to produce multi-parametric MRI. The use of multi-parametric MRI has been shown to improve sensitivity over any single parameter and may enhance overall accuracy in cancer diagnosis.
  • As with ultrasound imaging, MRI also has limitations. For instance, it has a relatively long imaging time, requires specialized and costly facilities, and is not well-suited for performance by a urologist at a urology center. Furthermore, performing direct prostate biopsy within MRI machines is not practical for a urologist at a urology center.
  • To overcome these shortcomings and maximize the usefulness of the MRI and ultrasound imaging modalities, methods and devices have been developed for digitizing medical images generated by multiple imaging modalities (e.g., ultrasound and MRI) and fusing or integrating multiple images to form a single composite image.
  • This composite image includes information from each of the original images that were fused together.
  • a fusion or integration of Magnetic Resonance (MR) images with ultrasound-generated images has been useful in the analysis of prostate cancer within a patient.
  • Image-guided biopsy systems, such as the Artemis produced by Eigen, have been developed (see, e.g., U.S. Pub. App. Nos. …).
  • a urologist cannot, however, profitably implement an image-guided biopsy system in his or her practice while contemporaneously attempting to learn to perform MRI scans. Furthermore, even if a urologist invested the time and money in purchasing MRI equipment and learning to perform MRI scans, the urologist would still be unable to perform the MRI-ultrasound fusion because a radiologist is needed for the performance of advanced MRI assessment and manipulation techniques which are outside the scope of a urologist's expertise.
  • MRI is generally considered to offer the best soft tissue contrast of all imaging modalities.
  • anatomical MRI (e.g., T1, T2) and functional MRI (e.g., dynamic contrast-enhanced (DCE), magnetic resonance spectroscopic imaging (MRSI) and diffusion-weighted imaging (DWI)) protocols are available.
  • DCE improves specificity over T2 imaging in detecting cancer. It measures the vascularity of tissue based on the flow of blood and permeability of vessels. Tumors can be detected based on their early enhancement and early washout of the contrast agent. DWI measures the water diffusion in tissues. Increased cellular density in tumors reduces the signal intensity on apparent diffusion maps.
  • using imaging modalities other than trans-rectal ultrasound (TRUS) for biopsy and/or therapy typically presents a number of logistic problems. For instance, directly using MRI to navigate during biopsy or therapy can be complicated (e.g., requiring use of nonmagnetic materials) and expensive (e.g., MRI operating costs). This need for specially designed tracking equipment, access to an MRI machine, and limited availability of machine time has resulted in very limited use of direct MRI-guided biopsy or therapy. CT imaging is likewise expensive, has limited access, and poses a radiation risk for operators and the patient.
  • one known solution is to register a pre-acquired image (e.g., an MRI or CT image), with a 3D TRUS image acquired during a procedure. Regions of interest identifiable in the pre-acquired image volume may be tied to corresponding locations within the TRUS image such that they may be visualized during/prior to biopsy target planning or therapeutic application.
  • This solution allows a radiologist to acquire, analyze and annotate MRI/CT scan at the image acquisition facility while a urologist can still perform the procedure using live ultrasound in his/her clinic.
  • the present technology provides a method for combining information from a plurality of medical imaging modalities, such as Positron Emission Tomography (PET), Computed Tomography (CT), Magnetic Resonance Imaging (MRI), Magnetic Resonance Spectroscopic Imaging (MRSI), ultrasound, echocardiograms and elastography, supplemented by information obtained in advance by at least one other modality, which is properly registered to the real time image despite soft tissue movement, deformation, or change in size.
  • the real time image is of a soft tissue organ or gland such as prostate, skin, heart, lung, kidney, liver, bladder, ovaries, and thyroid, and the supplemented real time image is used for a medical image guided procedure.
  • the real time image may also be used for orthopedic or musculoskeletal procedures, or exercise physiology.
  • a further real-time imaging type is endoscopy, or more generally, videography, which is in growing use, especially for minimally invasive procedures.
  • the medical procedure may be a needle based procedure, such as but not limited to biopsy, laser ablation, brachytherapy, stem cell injection for ischemia of the heart, cryotherapy, and/or direct injection of a photothermal or photodynamic agent.
  • the medical professional seeks to treat a highly localized portion of an organ, while either avoiding toxic or damaging therapy to adjacent structures or avoiding waste of a valuable agent.
  • the available real-time medical imaging modalities for guiding the localized treatment visualize the organ, but do not clearly delineate the portion of the organ to be treated.
  • non-real time imaging modalities are available for defining locations sought to be treated with the localized treatment.
  • in the time between the non-real time imaging and the real time procedure, the organ can shift, deform (especially as a result of the procedure itself), or change in size, thus substantially distorting the relationship between the real time image used to guide the procedure and the non-real time diagnostic or tissue localization image.
  • a further complication is that the non-real time image may have a different intrinsic coordinate system from the real time imaging, leading to artifacts. Therefore, the present technology seeks to address these issues by compensating for differences in the patient's anatomy between acquisition of the non-real time image and the real time procedure, using, for example, general anatomical information, landmarks common to both images, and tissue and procedure models.
  • Typical medical procedures comprise image-guided non-needle based procedures such as but not limited to HIFU, IMRT, and robotic surgery.
  • the pre-operative imaging modality may thus be used to identify a target object or gland, and suspicious lesions of the object or gland, for targeted biopsy, targeted therapy, targeted dose delivery or a combination of the above.
  • the pre-operative imaging modality may be used to identify and annotate surrounding structures that need to be spared in order to minimize the impact of procedure on quality of life.
  • such structures may be nerve bundles, the urethra, rectum and bladder identified in a magnetic resonance (MR) image.
  • the pre-operative imaging modality may be used to identify and uniquely label anatomical landmarks for manual, semi-automated or automated registration.
  • anatomical landmarks may include the urethra at the prostate base, the urethra at the apex, the verumontanum, calcifications and cysts.
  • the pre-operative imaging modality may be used to identify and uniquely label anatomical structures for manual, semi-automated or automated registration.
  • such structures may be urethra, seminal vesicles, ejaculatory ducts, bladder and rectum.
  • a targeted biopsy may be performed for a malignancy to determine the extent of malignancy and best treatment option.
  • Needle guidance procedures may be provided where the pre-operative imaging modality is analyzed to plan the complete procedure or part of the procedure, such that the anatomical locations of targets for needle placement are planned in advance, and the anatomical locations are guided by the real time imaging modality.
  • the needle locations and trajectories may be identified in advance based on the non-real time, pre-operative imaging modality, such that the target region is adequately sampled for biopsy to maximize the accuracy while minimizing number of samples for each target region.
  • the needle locations and trajectories may be identified in advance, such that a target region is effectively treated in a therapeutic procedure within the target area, while minimizing the damage to the surrounding tissue and structures.
  • the trajectory may be optimized in a prostate procedure such that the needle insertion minimizes damage to important structures such as rectum and nerve bundles.
  • the duration of needle placement at each location in a therapeutic procedure may be optimized using a pre-operative imaging modality, to effectively design a treatment for the target region locally while sparing the surrounding tissues and structures.
  • Anatomical landmarks and/or structures identified in the pre-operative image may also be identified in the intra-operative (live) image and labeled consistently.
  • the pre-operative image may also identify surfaces and boundaries, which can be defined or modeled as, for example, triangulated meshes.
  • the surfaces may represent the entire anatomical structure/object or a part thereof.
  • a boundary may have no real anatomical correlate, and be defined virtually; however, an advantage arises if the virtual boundary can be consistently and accurately identified in both the pre-operative imaging and the real-time intra-operative imaging, since this facilitates registration and alignment of the images.
  • the virtual features of the images may be based on generic anatomy, e.g., of humans or animals, or patient-specific.
  • Labeled surfaces and landmarks in pre-operative and intra-operative images may be used for rigid registration.
  • if the bladder is labeled “1” in the pre-operative image, it is registered with the object labeled “1” in the intra-operative image. More generally, regions of an image are classified or segmented, and that classification or segment definition from the pre-operative imaging is applied to the intra-operative real time imaging.
  • the correspondence may be “hard-correspondence” or “soft-correspondence”, i.e., the landmarks may have absolute correspondence or a “fuzzy” correspondence.
  • “soft-correspondence” permits or facilitates automated or semi-automated labeling of objects, since the real-time imaging is typically not used by a fully automated system to perform a procedure, and the skilled medical professional can employ judgment, especially if the labeling indicates a possible degree of unreliability, in relying on the automated labeling.
  • a urologist in a prostate procedure can review the fused image in real time to determine whether there is sufficient consistency to proceed and rely on the pre-operative imaging information, or whether only the intra-operative real-time imaging is to be employed.
  • it may be that the pre-operative imaging labeling boundaries are imprecise, and therefore the medical professional might wish to treat such boundaries as being advisory and not absolute.
  • the landmark and object registration may be performed rigidly using a simultaneous landmark and surface registration algorithm.
  • a rigid registration may be optionally followed by an affine registration.
  • An elastic registration method may follow the rigid or affine registration.
  • An elastic registration method may be at least one of intensity based, binary mask based, surfaces- and landmarks-based method or a combination of these methods.
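The rigid, then affine, then elastic cascade described in the preceding items could be prototyped roughly as follows with SimpleITK; the metric, optimizer and B-spline grid settings are illustrative assumptions, not parameters from this disclosure.

```python
import SimpleITK as sitk

def register_cascade(fixed, moving):
    """Rigid -> affine -> elastic (B-spline) registration sketch.

    Assumes `fixed` (e.g., a 3D TRUS volume) and `moving` (e.g., an MRI
    volume) are sitk.Image objects; mutual information suits the
    multi-modal case where intensities do not correspond.
    """
    def run(initial_transform):
        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
        reg.SetOptimizerAsRegularStepGradientDescent(
            learningRate=1.0, minStep=1e-4, numberOfIterations=200)
        reg.SetInterpolator(sitk.sitkLinear)
        reg.SetInitialTransform(initial_transform, inPlace=False)
        return reg.Execute(fixed, moving)

    # 1. Rigid: rotation + translation, initialized on image geometry.
    rigid = run(sitk.CenteredTransformInitializer(
        fixed, moving, sitk.Euler3DTransform(),
        sitk.CenteredTransformInitializerFilter.GEOMETRY))
    moving = sitk.Resample(moving, fixed, rigid, sitk.sitkLinear, 0.0)

    # 2. Affine: adds scale and shear on top of the rigid result.
    affine = run(sitk.AffineTransform(3))
    moving = sitk.Resample(moving, fixed, affine, sitk.sitkLinear, 0.0)

    # 3. Elastic: free-form deformation on a coarse B-spline grid.
    elastic = run(sitk.BSplineTransformInitializer(
        fixed, transformDomainMeshSize=[8, 8, 8]))
    return sitk.Resample(moving, fixed, elastic, sitk.sitkLinear, 0.0)
```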
  • a deformation model computed from a number of training datasets may be used for image registration.
  • the deformation model models the deformation of the object of interest; for example, a prostate deforms upon application of an external tool such as an ultrasound transducer or endo-rectal coil.
  • the training datasets may include sets of corresponding planning images and live modality images for same patient.
  • pre-operative imaging is obtained under conditions that model a soft tissue deformation that might occur during the real-time imaging.
  • the correspondence may be further refined by identifying and defining mismatching corresponding features between the pre-procedure and intra-procedure images.
  • a calcification may be seen in both MRI (pre-procedure) and ultrasound (intra-procedure) images, and if these anatomical landmarks mismatch slightly, a user may identify these landmarks visually and select them with a mouse click; alternatively, an automated indication of mismatch may be generated.
  • An algorithm can then refine the correspondence such that the boundaries of the object of interest do not move while the deformation inside the object gets updated. The deformation inside the object of interest may thus follow an elastic deformation model based on the new landmarks constrained by object boundaries.
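A minimal sketch of such boundary-constrained refinement, assuming a thin-plate-spline interpolant (scipy's RBFInterpolator) stands in for the elastic deformation model; the function and variable names are hypothetical.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def refine_deformation(boundary_pts, src_landmarks, dst_landmarks, query_pts):
    """Update the interior deformation from newly matched landmark pairs
    while holding the object boundary fixed.

    Boundary points carry zero displacement, so re-matching an interior
    feature (e.g., a calcification seen in both MRI and ultrasound)
    moves interior tissue without moving the gland surface.
    """
    # Constraint sites: boundary (displacement 0) plus mismatched landmarks.
    sites = np.vstack([boundary_pts, src_landmarks])
    disp = np.vstack([np.zeros_like(boundary_pts),
                      dst_landmarks - src_landmarks])
    field = RBFInterpolator(sites, disp, kernel='thin_plate_spline')
    # Apply the refined displacement to arbitrary interior points.
    return query_pts + field(query_pts)
```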
  • An image registration method may therefore be provided that maps a region of interest from a pre-procedural (planning) image to an intra-procedural (live) image, along with a complete plan such that the plan and the region of interest are mapped and deformed to conform to the shape of the object during the procedure.
  • the technology provides a method of image fusion, where the mapped plan may be displayed as one or more overlays on a live imaging modality display during an image guided procedure.
  • the fusion need not be an overlay, and may be supplemental information through a different modality, such as voice or sonic feedback, force-feedback or proprioceptive feedback, a distinct display (without overlay), or the like.
  • different types of information may be distinguished by color, intensity, depth (on a stereoscopic display), icons, or other known means.
  • the plan may be indicated by static images or graphics, animated graphics, and/or acoustic information (e.g., voice synthesis feedback).
  • a planning image can also be overlaid on the live imaging modality during an image guided procedure such that the images can be toggled back and forth, or displayed together in a real-time “fused” display.
  • the mapped plan may be further adjusted to account for a new shape of object revealed during real-time imaging. This may be done using an automated method, semi-automatic method, or manual method or a combination thereof.
  • a pre-procedure planning image may be used to plan the procedure such that the plan is embedded in an electronic, printed, or interactive web-based report.
  • the present technology uses information clearly identified in the imaging modalities, including landmarks, objects and intensity information, to perform registration using a combination of rigid, affine and non-rigid elastic registration.
  • the modeling of the objects within an image may thus comprise a segmentation of anatomical features.
  • the method may further comprise transforming coordinate systems of various imaging modalities.
  • the system may further comprise at least one modeling processor configured to perform real-time model updates of a patient soft-tissue to ensure that a pre-operative image remains accurately registered with an intra-operative image.
  • the annotated regions of the medical imaging scan or the plan may be generated by a computer-aided diagnosis or therapeutic planning system. At least a portion of the pre-operative imaging may be conducted at a location remote from the therapeutic or diagnostic procedure, and the information conveyed between the two through the Internet, preferably over a virtual private network. A true private network may also be used, or simply encrypted files communicated over otherwise public channels.
  • the physical separation of the imaging modalities facilitates professional specialization, since experts at different aspects of the process need not be collocated.
  • the present technology permits porting information from a planning image frame of reference to a live imaging modality for guiding a medical procedure.
  • the plan may be defined as a region of interest and needle placement or a method to plan a treatment or biopsy, for example.
  • the present technology may employ not only object boundaries, but also surrounding or internal information for registration, and thus it may be employed in applications where there is significant internal deformation that cannot be modeled using boundaries alone.
  • the term “image fusion” is sometimes used to describe the process of registering two images that are acquired via different imaging modalities or at different time instances.
  • the registration/fusion of images obtained from different modalities creates a number of complications.
  • the shape of soft tissues in two images may change between acquisitions of each image.
  • a diagnostic or therapeutic procedure can alter the shape of the object that was previously imaged.
  • the frame of reference (FOR) of the acquired images is typically different. That is, multiple MRI volumes are obtained in high resolution transverse, coronal or sagittal planes respectively, with lower resolution in the slice direction. These planes are usually in rough alignment with the patient's head-toe, anterior-posterior or left-right orientations.
  • TRUS images are often acquired while a patient lies on his side in a fetal position, by reconstructing multiple rotated 2D sample frames into a 3D volume.
  • the 2D image frames are obtained at various instances of rotation of the TRUS probe after insertion into the rectal canal.
  • the probe is inserted at an angle (approximately 30-45 degrees) to the patient's head-toe orientation.
  • the gland in MRI and TRUS will need to be rigidly aligned because their relative orientations are unknown at scan time.
  • well-defined and invariant anatomical landmarks may be used to register the images, though since the margins of landmarks themselves vary with imaging modality, the registration may be imperfect or require discretion in interpretation.
  • a further difficulty with these different modalities is that the intensity of objects in the images does not necessarily correspond. For instance, structures that appear bright in one modality (e.g., MRI) may appear dark in another modality (e.g., ultrasound). Thus, the logistical process of overlaying or merging the images requires perceptual optimization. In addition, structures identified in one image (soft tissue in MRI) may be entirely absent in another image. TRUS imaging causes further deformation of the gland due to pressure exerted by the TRUS transducer on the prostate. As a result, rigid registration is not sufficient to account for the differences between MRI and TRUS images. Finally, the resolution of the images may also impact registration quality.
  • the prostate may be modeled as an elastic object having a gland boundary or surface model that defines the volume of the prostate.
  • the boundary can then be used as a reference for aligning both images.
  • each point of the volume defined within the gland boundary of the prostate in one image should correspond to a point within a volume defined by a gland boundary of the prostate in the other image, and vice versa.
  • the data in each data set may be transformed, assuming elastic deformation of the prostate gland.
  • the characteristics of soft tissue under shear and strain, compression, heating and/or inflammation, bleeding, coagulation, biopsy sampling, tissue resection, etc., as well as normal physiological changes for healthy and pathological tissue, over time are modeled, and therefore these various effects accounted for during the pre-operative imaging and real-time intraprocedural imaging.
  • a system and method are provided for use in medical imaging of a prostate of a patient.
  • the utility includes obtaining a first 3D image volume from an MRI imaging device.
  • this first 3D image volume is acquired from data storage. That is, the first 3D image volume is acquired at a time prior to a current procedure.
  • a first shape or surface model may be obtained from the MRI image (e.g., a triangulated mesh describing the gland).
  • the surface model can be manually or automatically extracted from all co-registered MRI image modalities. That is, multiple MRI images may themselves be registered with each other as a first step.
  • the 3D image processing may be automated, so that a technician need not be solely occupied by the image processing, which may take seconds or minutes.
  • the MRI images may be T1, T2, DCE (dynamic contrast-enhanced), DWI (diffusion-weighted imaging), ADC (apparent diffusion coefficient) or other protocols.
  • in a computer aided tomography (CAT) scan, the surface of the prostate may not represent a high contrast feature, and therefore other aspects of the image may be used; typically, the CAT scan is used to identify radiodense features, such as calcifications or brachytherapy seeds, and therefore the goal of the image registration process would be to ensure that these features are accurately located in the fused image model.
  • a co-registered CT image with PET scan can also provide diagnostic information that can be mapped to TRUS frame of reference for image guidance.
  • the pre-operative imaging comprises use of the same imaging modality as used intra-operatively, generally along with an additional imaging technology.
  • an ultrasound volume of the patient's prostate may be obtained, for example, through rotation of a TRUS probe, and the gland boundary is segmented in an ultrasound image.
  • the ultrasound images acquired at various angular positions of the TRUS probe during rotation can be reconstructed to a rectangular grid uniformly through intensity interpolation to generate a 3D TRUS volume.
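As a geometric illustration of that reconstruction, the sketch below resamples a stack of rotated frames onto a uniform Cartesian grid by converting each output voxel to cylindrical coordinates about the probe axis; the acquisition geometry, array layout and names are assumptions for illustration only.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def reconstruct_trus_volume(frames, d_theta, d_r, d_z, out_shape, voxel):
    """Resample rotated 2D TRUS frames onto a rectangular grid.

    Assumes frames[k, ir, iz] holds the echo at rotation angle
    k*d_theta, radius ir*d_r and axial position iz*d_z, with the
    probe axis along z.
    """
    nx, ny, nz = out_shape
    # Cartesian voxel centres, probe axis passing through (0, 0).
    x = (np.arange(nx) - nx / 2.0) * voxel
    y = (np.arange(ny) - ny / 2.0) * voxel
    z = np.arange(nz) * voxel
    X, Y, Z = np.meshgrid(x, y, z, indexing='ij')
    # Cylindrical coordinates -> fractional frame indices.
    k = (np.arctan2(Y, X) % (2 * np.pi)) / d_theta  # angular index
    ir = np.hypot(X, Y) / d_r                       # radial index
    iz = Z / d_z                                    # axial index
    # Trilinear intensity interpolation across neighbouring frames
    # (angular wrap-around is ignored in this sketch).
    return map_coordinates(frames, [k, ir, iz], order=1, mode='nearest')
```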
  • Other ultrasound methods may be employed without departing from the scope of the technology.
  • the MRI or CAT scan volume is registered to the 3D TRUS volume (or vice versa), and a registered image of the 3D TRUS volume is generated in the same frame of reference (FOR) as the MRI or CAT scan image.
  • this registration occurs prior to a diagnostic or therapeutic intervention.
  • both data sets may be fully processed, with the registration of the 3D TRUS volume information completed.
  • a fully fused volume model is available.
  • the deviation of a prior 3D TRUS scan from a subsequent one will be small, so features from the real-time scan can be aligned with those of the prior imaging procedure.
  • the fused image from the MRI (or CAT) scan provides better localization of the suspect pathological tissue, and therefore guidance of the diagnostic biopsy or therapeutic intervention.
  • the suspect voxels from the MRI are highlighted in the TRUS image, which during a procedure would be presented in 2D on a display screen to guide the urologist.
  • the process therefore seeks to register three sets of data: the MRI (or other scan) information, the pre-operative 3D TRUS information, and the real time TRUS used during the procedure.
  • the pre-operative 3D TRUS and the intra-operative TRUS use identical apparatus, and therefore provide maximum similarity, either minimizing artifacts or presenting the same artifacts.
  • the 3D TRUS pre-operative scan can be obtained using the same TRUS scanner immediately before the operation, though it is preferred that the registration of the images proceed under the expertise of a radiologist or medical scanning technician, who may not be immediately available during that period.
  • a plan may be defined manually, semi-automatically, or in certain cases, automatically.
  • the plan, for example in a prostate biopsy procedure, defines both the location of the samples to be acquired and the path to be taken by the biopsy instrument in order to avoid undue damage to tissues.
  • the plan may be updated in real-time. For example, if the goal of the plan is to sample a volume of tissue at 1.5 mm spatial distances, but the accuracy of sampling is ±0.5 mm, then subsequent sampling targets may be defined adaptively based on the actual sampling locations during the procedure.
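One plausible form of such adaptive re-planning, using the 1.5 mm spacing and ±0.5 mm accuracy figures above: greedily pick the next target as the planned grid point farthest from every sample actually acquired so far. This greedy rule is an illustration, not the disclosed planner.

```python
import numpy as np

def next_target(planned_grid, actual_samples, spacing=1.5):
    """Adaptively pick the next biopsy target (positions in mm).

    Chooses the planned point farthest from all samples acquired so
    far, so that ~±0.5 mm placement errors do not leave coverage gaps.
    """
    if len(actual_samples) == 0:
        return planned_grid[0]
    # Distance from each planned point to its nearest actual sample.
    d = np.linalg.norm(planned_grid[:, None, :] -
                       np.asarray(actual_samples)[None, :, :], axis=2)
    nearest = d.min(axis=1)
    if nearest.max() < spacing:  # region already adequately sampled
        return None
    return planned_grid[np.argmax(nearest)]
```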
  • the course of treatment, including both the location of the laser and its excitation parameters, may be determined based on the actual location of a fiber optic tip, a measured temperature, and perhaps an intra-operatively determined physiological response to the therapy, such as changes in circulation pattern, swelling, and the like.
  • the registered image and the geometric transformation that relates, for example, a MRI scan volume with an ultrasound volume can be used to guide a medical procedure such as, for example, biopsy or brachytherapy.
  • regions of interest identified on the MRI scan are usually defined by a radiologist based on information available in MRI prior to biopsy, and may be a few points, point clouds representing regions, or triangulated meshes.
  • the 3D TRUS may also reveal features of interest for biopsy, which may also be marked as regions of interest. Because of the importance of registration of the regions of interest in the MRI scan with the TRUS used intra-operatively, in a manual or semi-automated pre-operative image processing method, the radiologist can override or control the image fusion process according to his or her discretion.
  • a segmented MRI and a 3D TRUS are obtained from a patient for the prostate gland.
  • the MRI and TRUS data are registered and transformations applied to form a fused image in which voxels of the MRI and TRUS images physically correspond to one another. Regions of interest are then identified either from the source images or from the fused image. The regions of interest are then communicated to the real-time ultrasound system, which tracks the earlier TRUS image. Because the ultrasound image is used for real time guidance, typically the transformation/alignment takes place on the MRI data, which can then be superposed or integrated with the ultrasound data.
  • the real-time TRUS display is supplemented with the MRI (or CAT or other scan) data, and an integrated display presented to the operating urologist.
  • haptic feedback may be provided so that the urologist can “feel” features when using a tracker.
  • the MRI or CAT scan data may be used to provide a coordinate frame of reference for the procedure, and the TRUS image modified in real-time to reflect an inverse of the ultrasound distortion. That is, the MRI or CAT data typically has a precise and undistorted geometry.
  • the ultrasound image may be geometrically distorted by phase velocity variations in the propagation of the ultrasound waves through the tissues, and to a lesser extent, by reflections and resonances. Since the biopsy instrument itself is rigid, it will correspond more closely to the MRI or CAT model than the TRUS model, and therefore a urologist seeking to acquire a biopsy sample may have to make corrections in course if guided by the TRUS image.
  • if the TRUS image is normalized to the MRI coordinate system, then such corrections may be minimized. This requires that the TRUS data be modified according to the fused image volume model in real time.
  • graphics processors (GPUs or APUs), multicore CPUs, FPGAs, and other computing technologies make this feasible.
  • the urologist is presented with a 3D display of the patient's anatomy, supplemented by and registered to the real-time TRUS data.
  • Such 3D displays may be effectively used with haptic feedback.
  • the first is a frame of reference transformation, due to the fact that the MRI image is created as a set of slices in parallel planes (a rectangular coordinate system), which will generally differ from the image plane of the TRUS, defined by the probe angle (a cylindrical coordinate system, with none of the cylindrical axes aligned with a coordinate axis of the MRI).
  • the second transformation represents the elastic deformation of the objects within the image, to properly align surfaces, landmarks, etc.
  • the segmentation and/or digitizing may be carried out semi-automatically (manual control over automated image processing tasks) or automatically using computer software.
  • computer software which may be suitable includes 3D Slicer (www.slicer.org), an open source software package capable of automatic image segmentation, manual editing of images, fusion and co-registering of data using rigid and non-rigid algorithms, and tracking of devices for image-guided procedures.
  • a further object provides a system for combining information from a plurality of medical imaging modalities, comprising: an input port configured to receive at least two first volumetric images using a first volumetric imaging modality of an anatomical region representing respectively different states of elastic deformation, and at least two second volumetric images using a second volumetric imaging modality, of the anatomical region representing respectively different states of elastic deformation; at least one processor configured to define an elastic soft tissue model for at least a portion of the anatomical region encompassed by the first volumetric image, and to label features of the anatomical region based on at least the first volumetric image and the soft tissue model; and a memory configured to store the defined elastic soft tissue model.
  • the first imaging modality may comprise at least one of positron emission tomography, computed tomography, magnetic resonance imaging, magnetic resonance spectroscopic imaging, and elastography.
  • the anatomical region may comprise an organ selected from the group consisting of prostate, heart, lung, kidney, liver, bladder, ovaries, and thyroid.
  • the therapeutic intervention may be one or more selected from the group consisting of laser ablation, brachytherapy, stem cell injection for ischemia of the heart, cryotherapy, direct injection of a photothermal or photodynamic agent, and radiotherapy.
  • the differentially visualized anatomical region may be at least one anatomical structure to be spared in an invasive procedure, selected from the group consisting of a nerve bundle, a urethra, a rectum and a bladder.
  • the registered features may comprise anatomical landmarks selected from the group consisting of a urethra, a urethra at a prostate base, a urethra at an apex, a verumontanum, a calcification and a cyst, a seminal vesicle, an ejaculatory duct, a bladder and a rectum.
  • the method may further comprise acquiring a tissue sample from a location determined based on at least the first imaging modality and the second imaging modality.
  • the method may further comprise delivering a therapeutic intervention at a location determined based on at least the first imaging modality and the second imaging modality.
  • the method may further comprise performing an image-guided at least partially automated procedure selected from the group consisting of laser ablation, high intensity focused ultrasound, cryotherapy, radio frequency, brachytherapy, IMRT, and robotic surgery.
  • the differentially visualized anatomical region may comprise at least one of a suspicious lesion for targeted biopsy, a suspicious lesion for targeted therapy, and a lesion for targeted dose delivery.
  • the method may further comprise automatically defining a plan comprising a target and an invasive path to reach the target.
  • the plan may be defined based on the first imaging modality, and is adapted in real time based on at least the second imaging modality.
  • the plan may comprise a plurality of targets.
  • a plurality of anatomical features may be consistently labeled in the first volumetric image and the second volumetric image.
  • the soft tissue model may comprise an elastic triangular mesh approximating a surface of an organ.
  • the anatomical landmark registration may be performed rigidly using a simultaneous landmark and surface registration algorithm.
  • An affine registration may be performed.
  • the registering may comprise an elastic registration based on at least one of an intensity, a binary mask, and surfaces and landmarks.
  • the model may be derived from a plurality of training datasets representing different states of deformation of an organ of a respective human using the first imaging modality and the second imaging modality.
  • the method may further comprise identifying a mismatch of corresponding anatomical features of the first volumetric image and the second volumetric image, and updating the registration to converge the corresponding anatomical features to reduce the mismatch based on corrections of an elastic deformation model constrained by object boundaries.
  • FIG. 1 shows a typical workflow for a surgeon using a fusion platform for mapping a plan from a pre-procedural planning image to the intra-procedural live image;
  • FIG. 4 shows an object process diagram for non-rigid elastic image registration, using rigid and/or affine registration as an initialization, wherein multiple labeled objects are used to compute the correspondence while satisfying the regularization constraints;
  • FIG. 5 shows an object process diagram for planning a laser ablation of the prostate gland, in which a radiologist/radiation oncologist analyzes multiparametric MRI (mpMRI) images of a prostate and plans the location of needle tip, trajectory and duration of needle application; and
  • FIGS. 6A and 6B show a conceptual diagram for planning a laser ablation of the prostate gland, in which FIG. 6A shows the target lesion identified by an expert in sagittal and transverse images, and FIG. 6B shows the plan for laser ablation in the two orthogonal directions.
  • the present invention will be described with respect to a process, which may be carried out through interaction with a user or automatically.
  • imaging systems including but not limited to MRI, ultrasound, PET, CT, SPECT, X-ray, and the like may be used for either pre-operative or intra-operative imaging, but a preferred scheme employs a fusion of MRI and/or CT and/or PET and ultrasound imaging for the pre-operative imaging, and trans-rectal ultrasound for intra-operative real time imaging in a prostate diagnosis or therapeutic procedure.
  • one or more pre-procedure “planning” images are used to plan a procedure, and one or more intra-procedure “live” images are used to guide the procedure.
  • prostate biopsy and ablation are typically done under ultrasound guidance. While speed of imaging and cost make ultrasound an ideal imaging modality for guiding biopsy, ultrasound images alone are ineffective at identifying prostate cancer.
  • Multi-parametric MRI (mpMRI) consists of various protocols for MR imaging, including T2-weighted imaging, diffusion-weighted imaging (DWI), dynamic contrast-enhanced (DCE) imaging and MR spectroscopic imaging (MRSI).
  • Radiologists are best placed to analyze MRI images for detecting and grading prostate cancer. However, it remains challenging to take the information from radiologists and present it to the urologists or surgeons who use ultrasound as the imaging method for performing a biopsy. Likewise, MRI is ideal for identifying the sensitive surrounding structures that must be spared in order to preserve quality of life after the procedure.
  • Kumar et al.'s method uses a prostate surface-based non-rigid image registration method.
  • the method uses only triangulated prostate boundaries as input to registration and performs a point-wise image registration only at the surface boundaries and then interpolates the information from boundaries to inside the object.
  • Significant drawbacks include the lack of information from surrounding structures and the significant skill, knowledge and effort required to provide a good manual image registration between MRI and ultrasound, which is very challenging, especially when surgeons are not skilled at reading and interpreting MR images.
  • the results can be variable since there can be significant difference in orientation and shape of gland between MRI and transrectal ultrasound.
  • the prostate internal structures and details are also completely ignored. Therefore, any internal twisting, rotation or non-rigid distortion is not accounted for, which may lead to poor results especially when an endo-rectal coil is used in MRI.
  • the plan is mapped as a region of interest, leaving it up to the surgeon to interpret how to properly sample a certain region. Also, in case of misregistration, there is no way disclosed to edit or refine the registration.
  • the method plans location, trajectory and depth of needle insertion optimized such that there is maximum likelihood of sampling the malignancy while minimizing number of biopsy cores.
  • FIG. 2 shows a method according to the present technology for rigid registration.
  • I1(x) and I2(x) represent the planning and live images, respectively, with x being the coordinate system.
  • w_i's are relative weights for the different costs, and Sim(A,B) represents the similarity cost between two objects A and B.
  • the cost could be the sum of the squared intensity differences or a mutual information based metric; in the case of binary objects, the cost may be relative overlap. In the case of surfaces, the cost could be a symmetric distance between corresponding points.
  • R represents the rigid transformation matrix that includes rotation and translation in the 3D frame of reference.
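The combined objective can be written compactly as cost = Σ w_i · Sim_i. The terms below (SSD for intensities, a Dice-style overlap for binary objects, mean distance between corresponding surface points) are plausible instantiations of Sim(A,B) consistent with the description above, not necessarily the exact costs used; all inputs are assumed already resampled into a common frame of reference.

```python
import numpy as np

def registration_cost(i1, i2, masks1, masks2, surf1, surf2, w):
    """Weighted multi-term registration cost: sum_i w_i * Sim_i."""
    # Intensity term: squared differences, averaged over voxels.
    ssd = np.mean((i1 - i2) ** 2)
    # Binary-object term: 1 - relative (Dice-like) overlap per object.
    overlap = sum(1.0 - 2.0 * np.sum(a & b) / (np.sum(a) + np.sum(b))
                  for a, b in zip(masks1, masks2))
    # Surface term: mean distance between corresponding vertices.
    surf = np.mean(np.linalg.norm(surf1 - surf2, axis=1))
    return w[0] * ssd + w[1] * overlap + w[2] * surf
```

An optimizer would then search over the parameters of the transformation R (or A, in the affine case) to minimize this cost, re-evaluating each term after every candidate transformation.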
  • the needle placement is computed in advance such that the computed location, depth and trajectory maximize dosage/energy delivery at the malignancy while minimizing exposure to surrounding region.
  • FIG. 3 shows a method for affine registration.
  • I1(x) and I2(x) represent the planning and live images, respectively, with x being the coordinate system.
  • w_i's are relative weights for the different costs, and Sim(A,B) represents the similarity cost between two objects A and B.
  • the cost could be the sum of squared intensity differences or a mutual information based metric; in the case of binary objects, the cost may be relative overlap. In the case of surfaces, the cost could be a symmetric distance between corresponding points.
  • A represents the affine transformation matrix that registers image I1 to the frame of reference of image I2.
  • the procedure is preferably performed under intra-procedural image guidance, with the information from pre-procedure mapped to an intra-procedure image using a combination of rigid, affine and elastic registration, as shown in FIG. 4 , which shows an object process diagram for non-rigid elastic image registration using rigid and/or affine registration as an initialization.
  • the method uses multiple labeled objects to compute the correspondence while satisfying the regularization constraints.
  • the surgeon identifies the same landmarks, features and structures as in the pre-procedure image and labels them consistently. This may be done automatically or manually after acquiring an initial intra-procedural scan.
  • the registration method uses the labels in pre-procedure and intra-procedure images to identify the structural correspondence and registers the images using a combination of rigid, affine and elastic registration.
  • the registration inputs are: the rigid or rigid+affine registered planning image I1′ having labeled objects Ω1,i′ for i ≥ 1 and landmarks Xj for j ≥ 1; and the intra-operative image I2 with labeled objects Ω2,i and landmarks Yj for i, j ≥ 1.
  • FIG. 5 shows an object process diagram for planning a laser ablation of the prostate gland.
  • a radiologist/radiation oncologist performs analysis of mpMRI images of prostate and plans the location of needle tip, trajectory and duration of needle application.
  • the landmarks and features are used as “soft-landmark” or “hard-landmark” points, and the structures as surface meshes (FIG. 6).
  • Soft-landmarks represent the landmarks that may not correspond exactly with each other and there may be some tolerance or level of confidence that will be refined during registration.
  • Hard-landmarks are landmarks that are assumed to match exactly; their correspondence is not allowed to change during registration.
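One plausible way to encode this distinction in a registration cost is sketched below, under assumed names: hard landmarks are enforced exactly (any residual invalidates the candidate transform), while soft landmarks contribute a confidence-weighted penalty that is refined during the search.

```python
# Soft/hard landmark cost sketch: hard landmarks must match exactly;
# soft landmarks contribute a weighted residual term.
import numpy as np

def landmark_cost(transform, hard_x, hard_y, soft_x, soft_y, soft_w, tol=1e-3):
    """transform: callable mapping (n, 3) points into the live frame."""
    hard_res = np.linalg.norm(transform(hard_x) - hard_y, axis=1)
    if hard_x.size and hard_res.max() > tol:   # hard correspondence violated
        return np.inf
    soft_res = np.linalg.norm(transform(soft_x) - soft_y, axis=1)
    return float((soft_w * soft_res ** 2).sum())
```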
  • FIGS. 6A and 6B show a conceptual diagram for planning a laser ablation of the prostate gland.
  • FIG. 6A shows the target lesion identified by an expert in sagittal and transverse images.
  • FIG. 6B shows the plan for laser ablation in the two orthogonal directions.
  • A and B represent the planned needles, which ablate the area shown in hatched lines. The ablated area covers the planned target.
  • the registration provides a surgeon with image fusion such that the information from pre-procedure or planning images is mapped to the frame of reference of the intra-procedure or live images.
  • the mapped information contains at least one structural image, the target area to be treated, and a plan for the procedure.
  • the plan may be in the form of needle location and trajectory along with the duration of needle application, if needed.
  • FIG. 1 shows the overall workflow of a surgeon, where the images planned by an expert (radiologist/radiation oncologist) are fused with a live imaging modality such as ultrasound for real-time guidance while taking advantage of diagnostic capabilities of the pre-procedural planning image.
  • the pre-procedure image is registered with the live image using a combination of rigid, affine and non-rigid elastic registration.
  • the registration provides a correspondence or a deformation map, which is used to map planning information from the frame of reference of the planning image to the live image.
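A minimal sketch of this mapping step follows, assuming the registration yields a dense voxel-wise displacement field `disp` of shape (X, Y, Z, 3) in the planning frame; the names are illustrative.

```python
# Map planned points (e.g., needle targets) from the planning frame to
# the live frame by pushing them through the deformation map.
import numpy as np

def map_plan_points(plan_pts, disp):
    """plan_pts: (n, 3) voxel coords in I1; disp: (X, Y, Z, 3) displacement field."""
    idx = np.round(plan_pts).astype(int)                      # nearest-voxel lookup
    return plan_pts + disp[idx[:, 0], idx[:, 1], idx[:, 2]]   # coords in I2
```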
  • the method permits a radiologist, radiation oncologist or an oncological image specialist to analyze pre-operative images, and to identify and label various structures, including the objects of interest, such as the prostate in the examples detailed above.
  • the structures identified and labeled by the imaging specialist could include external and internal structures and landmarks such as bladder, urethra, rectum, seminal vesicles, nerve bundles, fibromuscular stroma and prostate zones. These structures are identified and stored as either points, binary masks or surface meshes. Each such structure is labeled uniquely.
  • the method includes an automatically (or semi-automatically) generated plan for the entire procedure.
  • FIGS. 2, 3 and 4 represent the rigid, affine and non-rigid elastic registration methods, respectively.
  • An expert or a computer algorithm identifies and labels various anatomical structures and landmarks in the planning image.
  • image I₁(x) represents the structural planning image.
  • the structural image could be a transversely acquired T2-weighted MRI image.
  • the subscript 1 corresponds to the planning or pre-procedural image.
  • objects may also be represented by surfaces, in which case, the objects will consist of the vertices and triangles joining the vertices.
  • the expert provides the plan for a procedure on the structural image.
  • a surgeon loads the planning image I₁ along with the object labels or surface meshes, landmarks and the plan.
  • the planning image I₁ is projected to the intra-procedure image I₂ acquired during the procedure.
  • the labels and landmarks may be defined in the image I₂ either manually or automatically.
  • the labels in the target image I₂ are automatically computed by letting the planning image I₁ deform to the shape of the target image I₂.
  • the object maps defined in the planning image also participate in the registration, such that segmentation (object labeling) and registration (computation of correspondence) happen at the same time in the target image.
  • FIG. 4 shows one way of performing the registration between the pre-procedure planning image and the intra-operative image.
  • the method uses the object maps along with the intensity information and the landmark correspondences to compute the correspondence between the images.
  • the resulting deformation map is used to map the plan from the frame of reference of the planning image to the intra-procedural image.
  • FIGS. 5, 6A and 6B represent an embodiment where the plan is a needle-based laser ablation plan.
  • the radiologist or radiation oncologist analyzes the MRI image and automatically or manually computes a target region, along with labeling the surrounding sensitive tissue, i.e., the safety zone.
  • the automated method embedded in the current method computes the trajectory, location, energy settings and duration of laser application such that the target region is completely ablated while the safety zone is spared.
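The sketch below illustrates one naive form such an automated planner could take, assuming a simple spherical ablation-zone model whose radius grows with application time; the masks, candidate grid and growth constant are all hypothetical placeholders for a real dosimetry model, not the embedded method itself.

```python
# Naive ablation-planning sketch: grid-search tip positions and durations,
# accepting the shortest plan whose modeled zone covers the target mask
# and misses the safety-zone mask entirely.
import numpy as np

def plan_ablation(target, safety, candidate_tips, durations, growth=0.5):
    """target, safety: boolean voxel masks; candidate_tips: (n, 3) positions."""
    coords = np.indices(target.shape).reshape(3, -1).T.astype(float)
    best = None
    for tip in candidate_tips:
        r2 = ((coords - tip) ** 2).sum(1).reshape(target.shape)
        for t in sorted(durations):
            zone = r2 <= (growth * t) ** 2           # modeled ablated sphere
            covers = not (target & ~zone).any()      # target fully inside zone
            spares = not (zone & safety).any()       # safety zone untouched
            if covers and spares:
                if best is None or t < best[2]:
                    best = (tip, growth * t, t)      # tip, radius, duration
                break                                # longer t only grows the zone
    return best
```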
  • MRI data, which may include post-segmented MR image data, pre-segmented interpreted MRI data, the original MRI scans, suspicion index data, and/or instructions or a plan, may be communicated to a urologist.
  • the MRI data may be stored in a DICOM format, in another industry-standard format, or in a proprietary format unique to the imaging modality or processing platform generating the medical images.
  • the urology center where the MRI data is received may contain an image-guided biopsy or therapy system such as the Artemis, UroStation (Koelis, La Tronche, France), or BiopSee (MedCom GmbH, Darmstadt, Germany).
  • the image-guided biopsy system may comprise hardware and/or software configured to work in conjunction with a urology center's preexisting hardware and/or software.
  • a mechanical tracking arm may be connected to a preexisting ultrasound machine, and a computer programmed with suitable software may be connected to the ultrasound machine or the arm.
  • a tracking arm on the system may be attached to an ultrasound probe and an ultrasound scan performed.
  • a two-dimensional (2-D) or 3D model of the prostate may be generated using the ultrasonic images produced by the scan, and segmentation of the model may be performed. Pre-processed ultrasound image data and post-processed ultrasound image data may be transmitted to the urology center. Volumetry may also be performed, including geometric or planimetric volumetry. Segmentation and/or volumetry may be performed manually or automatically by the image-guided biopsy system. Preselected biopsy sites (e.g., selected by the radiologist during the analysis) may be incorporated into and displayed on the model. All of the ultrasound data generated by these processes may be electronically stored on the urology center's server via a communications link.
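For the volumetry step, a planimetric computation can be as simple as the following sketch, which sums the segmented gland area on each slice and scales by the voxel geometry; the parameter names are illustrative.

```python
# Planimetric volumetry sketch: per-slice segmented areas times slice thickness.
import numpy as np

def planimetric_volume(seg_mask, pixel_area_mm2, slice_thickness_mm):
    """seg_mask: (slices, rows, cols) boolean gland segmentation."""
    slice_areas_mm2 = seg_mask.sum(axis=(1, 2)) * pixel_area_mm2
    return float(slice_areas_mm2.sum() * slice_thickness_mm / 1000.0)  # in cc
```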
  • processing of the MRI data or ultrasound data may be carried out manually, automatically, or semi-automatically.
  • This may be accomplished through the use of segmentation software, such as Segasist Prostate Auto-Contouring, which may be included in the image-guided biopsy system.
  • Such software may also be used to perform various types of contour modification, including manual delineation, smoothing, rotation, translation, and edge snapping.
  • the software is capable of being trained or calibrated, whereby it observes, captures, and saves the user's contouring and editing preferences over time and applies this knowledge to contour new images.
  • This software need not be hosted locally, but rather, may be hosted on a remote server or in a cloud computing environment.
  • MRI data may be integrated with the image-guided biopsy system.
  • the fusion process may be aided by the use of the instructions included with the MRI data.
  • the fusion process may include registration of the MR and ultrasonic images, which may include manual or automatic selection of fixed anatomical landmarks in each image modality. Such landmarks may include the base and apex of the prostatic urethra.
  • the two images may be substantially aligned and then one image superimposed onto the other.
  • Registration may also be performed with models of the regions of interest. These models of the regions of interest, or target areas, may also be superimposed on the digital prostate model.
  • the fusion process thus seeks to anatomically align the 3D models obtained by the radiological imaging, e.g., MRI, with the 3D models obtained by the ultrasound imaging, using anatomical landmarks as anchors and performing a warping of at least one of the models to conform to the other.
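A minimal sketch of the landmark-anchored alignment underlying such fusion uses the standard Kabsch/Procrustes solution for the rigid part (the subsequent warping/elastic step is omitted here); the inputs are corresponding landmark coordinates picked in each modality.

```python
# Kabsch/Procrustes sketch: best rigid transform aligning MR landmarks
# to their ultrasound counterparts (e.g., urethra base and apex).
import numpy as np

def kabsch_align(mr_pts, us_pts):
    """mr_pts, us_pts: (n, 3) corresponding landmarks, n >= 3."""
    mr_c, us_c = mr_pts.mean(0), us_pts.mean(0)
    h = (mr_pts - mr_c).T @ (us_pts - us_c)      # cross-covariance of centered sets
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))       # guard against reflection
    rot = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    trans = us_c - rot @ mr_c
    return rot, trans                            # maps MR frame -> US frame
```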
  • the radiological analysis is preserved, such that information from the analysis relevant to suspicious regions or areas of interest is conveyed to the urologist.
  • the fused models are then provided for use with the real-time ultrasound system, to guide the urologist in obtaining biopsy samples or performing a therapeutic procedure.
  • the 3D MR image is integrated or fused with real-time ultrasonic images, based on a 3D ultrasound model obtained prior to the procedure (perhaps immediately prior). This allows the regions of interest to be viewed under real-time ultrasonic imaging so that they can be targeted during biopsy or therapy.
  • biopsy tracking and targeting using image fusion may be performed by the urologist for diagnosis and management of prostate cancer.
  • Targeted biopsies may be more effective and efficient for revealing cancer than non-targeted, systematic biopsies.
  • Such methods are particularly useful in diagnosing the ventral prostate gland, where malignancy may not always be detected with biopsy.
  • the ventral prostate gland, as well as other areas of the prostate, often harbors malignancy in spite of a negative biopsy.
  • Targeted biopsy addresses this problem by providing a more accurate diagnosis method. This may be particularly true when the procedure involves the use of multimodal MRI. Additionally, targeting of the suspicious areas may reduce the need for taking multiple biopsy samples or performing saturation biopsy.
  • Saturation biopsy is a multicore biopsy procedure in which a greater number of samples are obtained from throughout the prostate than with a standard biopsy. Twenty or more samples may be obtained during saturation biopsy, and sometimes more than one hundred. This procedure may increase tumor detection in high-risk cases.
  • the benefits of such a procedure are often outweighed by its drawbacks, such as the inherent trauma to the prostate, the higher incidence of side effects, the additional use of analgesia or anesthesia, and the high cost of processing the large number of samples.
  • focused saturation biopsy may be performed to exploit the benefits of a saturation biopsy while minimizing the drawbacks.
  • a physician may sample four or more cores, all from the suspected area. This procedure avoids the need for high-concentration sampling in healthy areas of the prostate. Further, this procedure will not only improve detection, but will enable one to determine the extent of the disease.
  • image-guided biopsy systems such as the Artemis may also be used in accordance with the current technology for performing an improved non-targeted, systematic biopsy under 3D ultrasonic guidance.
  • the biopsy locations are not always symmetrically distributed and may be clustered.
  • non-targeted systematic biopsy may be performed under the guidance of 3D ultrasonic imaging. This may allow for more even distribution of biopsy sites and wider sampling over conventional techniques.
  • the image data may be used as a map to assist the image-guided biopsy system in navigation of the biopsy needle, as well as tracking and recording the navigation.
  • the process described above may further include making treatment decisions and carrying out the treatment of prostate cancer using the image-guided biopsy system.
  • the current invention provides physicians with information that can help them and patients make decisions about the course of care, whether it be watchful waiting, hormone therapy, targeted thermal ablation, nerve sparing robotic surgery, or radiation therapy.
  • computed tomography (CT) scans may be fused with MRI data to provide more accurate prediction of the correct staging, more precise target volume identification, and improved target delineation.
  • MRI, in combination with biopsy, will enhance patient selection for focal ablation by helping to localize clinically significant tumor foci.
  • HIFU may be used for treatment of prostate cancer in conjunction with the methods and apparatus previously described.
  • An example of a commercially available HIFU system is the Sonablate 500 by Focus Surgery, Inc. (Indianapolis, Ind.), which is a HIFU therapy device that operates under the guidance of 3D ultrasound imaging.
  • Such treatment systems can be improved by being configured to operate under the guidance of a fused MRI-ultrasound image.
  • temperatures in the tissue being ablated may be closely monitored, the subsequent zone of necrosis (thermal lesion) visualized, and the result used to update a real-time tissue model.
  • Temperature monitoring for the visualization of a treated region may reduce recurrence rates of local tumor after therapy.
  • Techniques for the foregoing may include microwave radiometry, ultrasound, impedance tomography, MRI, monitoring shifts in diagnostic pulse-echo ultrasound, and real-time, in vivo monitoring of the spatial distribution of heating and temperature elevation, either by measuring the local propagation velocity of sound through an elemental volume of the tissue structure or by analyzing changes in backscattered energy.
  • Other traditional methods of monitoring tissue temperature include thermometry, such as ultrasound thermometry and the use of a thermocouple.
  • MRI may also be used to monitor treatment, ensure tissue destruction, and avoid overheating surrounding structures. Further, because ultrasonic imaging is not always adequate for accurately defining areas that have been treated, MRI may be used to evaluate the success of the procedure. For instance, MRI may be used for assessment of extent of necrosis shortly after therapy and for long-term surveillance for residual or recurrent tumor that may then undergo targeted biopsy.
  • post-operative image fusion, that is, performing an imaging procedure after completion of an interventional procedure, and fusing or integrating it with pre-operative and/or intra-operative imaging data to help understand the post-operative anatomy. For example, after aggressive therapy, a standard anatomical model of soft tissue may no longer be accurate, but by integrating the therapeutic intervention data, a more accurate understanding, imaging, and image analysis may be provided.
  • a diagnostic and treatment image generation system includes at least one database containing image data from two different modalities, such as MRI and ultrasound data, and an image-guided biopsy and/or therapy system.
  • the diagnostic and treatment image generation system may also include a computer programmed to aid in the transmission of the image data and/or the fusion of the data using the image-guided biopsy system.
  • a computer readable storage medium has a non-transitory computer program stored thereon, to control an automated system to carry out various methods disclosed herein.

Abstract

A system and method combines information from a plurality of medical imaging modalities, such as PET, CT, MRI, MRSI, Ultrasound, Echo Cardiograms, Photoacoustic Imaging and Elastography for a medical image guided procedure, such that a pre-procedure image using one of these imaging modalities is fused with an intra-procedure imaging modality used for real time image guidance for a medical procedure for any soft tissue organ or gland such as prostate, skin, heart, lung, kidney, liver, bladder, ovaries, and thyroid, wherein the soft tissue deformation and changes between the two imaging instances are modeled and accounted for automatically.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • The present application is a non-provisional of U.S. Provisional Patent Application 61/691,758, filed Aug. 12, 2012, the entirety of which is expressly incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present disclosure relates to systems and methods for image guided medical and surgical procedures.
  • 2. Description of the Art
  • U.S. Pat. Pub. 2009/0054772 (EP20050781862), expressly incorporated herein by reference, entitled "Focused ultrasound therapy system", provides a method for performing a High Intensity Focused Ultrasound (HIFU) procedure for a specific clinical application. Basic image registration is performed for fusion from a diagnostic modality such as Computed Tomography (CT) or Magnetic Resonance Imaging (MRI) to ultrasound only through body positioning, referred to as "immobilization", resulting in image registration via only horizontal movement and a zoom factor. See also U.S. Pat. No. 8,224,420, expressly incorporated herein by reference, which provides a mechanical positioning means for moving said ultrasound energy applicator to position the applicator so that the energy application zone intersects said magnetic resonance volume within said region of the subject.
  • U.S. Pat. Pub. 2007/0167762, expressly incorporated herein by reference, entitled "Ultrasound System for interventional treatment", provides an ultrasound system into which a "wide-area" image such as a CT or MRI image can be loaded and fused with the ultrasound image, using a manual definition of the lesion positions and the needle insertion position at the time of the procedure.
  • U.S. Pub. App. 2010/02906853, expressly incorporated herein by reference, entitled "Fusion of 3D volumes with CT reconstruction", discloses a method for registration of an ultrasound device in three dimensions to a C-arm scan, the method including acquiring a baseline volume, acquiring images in which the ultrasound device is disposed, locating the device within the images, registering the location of the device to the baseline volume, acquiring an ultrasound volume from the ultrasound device, registering the ultrasound volume to the baseline volume, and performing fusion imaging to display a view of the ultrasound device in the baseline volume. Thus, a mutual-information-based method is provided to register and display a 3D ultrasound image fused with a CT image.
  • U.S. Pub. App. 2011/0178389, expressly incorporated herein by reference, entitled “Fused image modalities guidance” discloses a system and method for registration of medical images, which registers a previously obtained volume(s) onto an ultrasound volume during an ultrasound procedure, to produce a multimodal image, which may be used to guide a medical procedure. In one arrangement, the multimodal image includes MRI information presented in the framework of a Trans Rectal Ultrasound (TRUS) image during a TRUS procedure.
  • Prostate cancer is one of the most common types of cancer affecting men. It is a slow growing cancer, which is easily treatable if identified at an early stage. A prostate cancer diagnosis often leads to surgery or radiation therapy. Such treatments are costly and can cause serious side effects, including incontinence and erectile dysfunction. Unlike many other types of cancer, prostate cancer is not always lethal and often is unlikely to spread or cause harm. Many patients who are diagnosed with prostate cancer receive radical treatment even though it would not prolong the patient's life, ease pain, or significantly increase the patient's health.
  • Prostate cancer may be diagnosed by taking a biopsy of the prostate, which is conventionally conducted under the guidance of ultrasound imaging. Ultrasound imaging has high spatial resolution, and is relatively inexpensive and portable. However, ultrasound imaging has relatively low tissue discrimination ability. Accordingly, ultrasound imaging provides adequate imaging of the prostate organ, but it does not provide adequate imaging of tumors within the organ due to the similarity of cancer tissue and benign tissues, as well as the lack of tissue uniformity. Due to the inability to visualize the cancerous portions within the organ with ultrasound, the entire prostate must be considered during the biopsy. Thus, in the conventional prostate biopsy procedure, a urologist relies on the guidance of two-dimensional ultrasound to systematically remove tissue samples from various areas throughout the entire prostate, including areas that are free from cancer.
  • Magnetic Resonance Imaging (MRI) has long been used to evaluate the prostate and surrounding structures. MRI is in some ways superior to ultrasound imaging because it has very good soft tissue contrast. There are several types of MRI techniques, including T2 weighted imaging, diffusion weighted imaging, and dynamic contrast imaging. Standard T2-weighted imaging does not discriminate cancer from other processes with acceptable accuracy. Diffusion-weighted imaging and dynamic contrast imaging may be integrated with traditional T2-weighted imaging to produce multi-parametric MRI. The use of multi-parametric MRI has been shown to improve sensitivity over any single parameter and may enhance overall accuracy in cancer diagnosis.
  • As with ultrasound imaging, MRI also has limitations. For instance, it has a relatively long imaging time, requires specialized and costly facilities, and is not well-suited for performance by a urologist at a urology center. Furthermore, performing direct prostate biopsy within MRI machines is not practical for a urologist at a urology center.
  • To overcome these shortcomings and maximize the usefulness of the MRI and ultrasound imaging modalities, methods and devices have been developed for digitizing medical images generated by multiple imaging modalities (e.g., ultrasound and MRI) and fusing or integrating multiple images to form a single composite image. This composite image includes information from each of the original images that were fused together. A fusion or integration of Magnetic Resonance (MR) images with ultrasound-generated images has been useful in the analysis of prostate cancer within a patient. Image-guided biopsy systems, such as the Artemis produced by Eigen (See, e.g., U.S. Pub. App. Nos. 2012/0087557, 2011/0184684, 2011/0178389, 2011/0081057, 2010/0207942, 2010/0172559, 2010/0004530, 2010/0004526, 2010/0001996, 2009/0326555, 2009/0326554, 2009/0326363, 2009/0324041, 2009/0227874, and U.S. Pat. Nos. 8,278,913, 8,175,350, 8,064,664, 7,942,829, 7,942,060, 7,875,039, 7,856,130, 7,832,114, and 7,804,989, expressly incorporated herein by reference), and UroStation developed by Koelis (see, e.g., U.S. Pub. App. Nos. 2012/0245455, 2011/0081063, and U.S. Pat. No. 8,369,592, expressly incorporated herein by reference), have been invented to aid in fusing MRI and ultrasonic modalities. These systems are three-dimensional (3D) image-guided prostate biopsy systems that provide tracking of biopsy sites within the prostate.
  • Until now, however, such systems have not been adequate for enabling MRI-ultrasound fusion to be performed by a urologist at a urology center. The use of such systems for MRI-ultrasound fusion necessarily requires specific MRI data, including MRI scans, data related to the assessment of those scans, and data produced by the manipulation of such data. Such MRI data, however, is not readily available to urologists and it would be commercially impractical for such MRI data to be generated at a urology center. This is due to many reasons, including urologists' lack of training or expertise, as well as the lack of time, to do so. Also, it is uncertain whether a urologist can profitably implement an image-guided biopsy system in his or her practice while contemporaneously attempting to learn to perform MRI scans. Furthermore, even if a urologist invested the time and money in purchasing MRI equipment and learning to perform MRI scans, the urologist would still be unable to perform the MRI-ultrasound fusion because a radiologist is needed for the performance of advanced MRI assessment and manipulation techniques which are outside the scope of a urologist's expertise.
  • MRI is generally considered to offer the best soft tissue contrast of all imaging modalities. Both anatomical (e.g., T1, T2) and functional MRI, e.g., dynamic contrast-enhanced (DCE), magnetic resonance spectroscopic imaging (MRSI) and diffusion-weighted imaging (DWI), can help visualize and quantify regions of the prostate based on specific attributes. Zonal structures within the gland cannot be visualized clearly on T1 images; however, a hemorrhage after a biopsy can appear as high signal intensity, helping to distinguish normal from pathologic tissue. In T2 images, zone boundaries can be easily observed. The peripheral zone appears higher in intensity relative to the central and transition zones. Cancers in the peripheral zone are characterized by their lower signal intensity compared to neighboring regions. DCE improves specificity over T2 imaging in detecting cancer. It measures the vascularity of tissue based on the flow of blood and the permeability of vessels. Tumors can be detected based on their early enhancement and early washout of the contrast agent. DWI measures water diffusion in tissues. Increased cellular density in tumors reduces the signal intensity on apparent diffusion coefficient maps.
  • The use of imaging modalities other than trans-rectal ultrasound (TRUS) for biopsy and/or therapy typically poses a number of logistic problems. For instance, directly using MRI to navigate during biopsy or therapy can be complicated (e.g., requiring use of nonmagnetic materials) and expensive (e.g., MRI operating costs). This need for specially designed tracking equipment, access to an MRI machine, and limited availability of machine time has resulted in very limited use of direct MRI-guided biopsy or therapy. CT imaging is likewise expensive and has limited access, and poses a radiation risk for operators and patients.
  • Accordingly, one known solution is to register a pre-acquired image (e.g., an MRI or CT image), with a 3D TRUS image acquired during a procedure. Regions of interest identifiable in the pre-acquired image volume may be tied to corresponding locations within the TRUS image such that they may be visualized during/prior to biopsy target planning or therapeutic application. This solution allows a radiologist to acquire, analyze and annotate MRI/CT scan at the image acquisition facility while a urologist can still perform the procedure using live ultrasound in his/her clinic.
  • Consequently, there exists a need for improved systems and methods for performing image fusion for image-guided medical procedures.
  • SUMMARY
  • The present technology provides a method for combining information from a plurality of medical imaging modalities, such as Positron Emission Tomography (PET), Computed Tomography (CT), Magnetic Resonance Imaging (MRI), Magnetic Resonance Spectroscopic Imaging (MRSI), Ultrasound, Echo Cardiograms and Elastography, in which a real time image is supplemented by information obtained in advance by at least one other modality, which is properly registered to the real time image despite soft tissue movement, deformation, or change in size. Advantageously, the real time image is of a soft tissue organ or gland such as prostate, skin, heart, lung, kidney, liver, bladder, ovaries, and thyroid, and the supplemented real time image is used for a medical image guided procedure. The real time image may also be used for orthopedic or musculoskeletal procedures, or exercise physiology. A further real-time imaging type is endoscopy, or more generally, videography, which is in growing use, especially for minimally invasive procedures.
  • The medical procedure may be a needle based procedure, such as but not limited to biopsy, laser ablation, brachytherapy, stem cell injection for ischemia of the heart, cryotherapy, and/or direct injection of a photothermal or photodynamic agent. In these cases, for example, the medical professional seeks to treat a highly localized portion of an organ, while either avoiding a toxic or damaging therapy to adjacent structures, or to avoid waste of a valuable agent. However, the available real-time medical imaging modalities for guiding the localized treatment visualize the organ, but do not clearly delineate the portion of the organ to be treated. On the other hand, non-real time imaging modalities are available for defining locations sought to be treated with the localized treatment. In the case of soft tissues, in the time between the non-real time imaging and the real time procedure, the organ can shift, deform (especially as a result of the procedure itself), or change in size, thus substantially distorting the relationship between the real time image used to guide the procedure and the non-real time diagnostic or tissue localization image. A further complication is that the non-real time image may have a different intrinsic coordinate system from the real time imaging, leading to artifacts. Therefore, the present technology seeks to address these issues by compensating for differences in the patient's anatomy between acquisition of the non-real time image and the real time procedure, using, for example, general anatomical information, landmarks common to both images, and tissue and procedure models.
  • Typical medical procedures comprise image-guided non-needle based procedures such as but not limited to HIFU, IMRT, and robotic surgery.
  • The pre-operative imaging modality may thus be used to identify a target object or gland, and suspicious lesions of the object or gland, for targeted biopsy, targeted therapy, targeted dose delivery or a combination of the above.
  • The pre-operative imaging modality may be used to identify and annotate surrounding structures that need to be spared in order to minimize the impact of procedure on quality of life. In a specific embodiment, in a prostate related procedure, such structures may be nerve bundles, the urethra, rectum and bladder identified in a magnetic resonance (MR) image.
  • The pre-operative imaging modality may be used to identify and uniquely label anatomical landmarks for manual, semi-automated or automated registration. In a specific embodiment, in a prostate related procedure, such anatomical landmarks may be urethra at prostate base, urethra at apex, verumontanum, calcifications and cysts.
  • The pre-operative imaging modality may be used to identify and uniquely label anatomical structures for manual, semi-automated or automated registration. In a specific embodiment of the invention, in a prostate related procedure, such structures may be urethra, seminal vesicles, ejaculatory ducts, bladder and rectum.
  • A targeted biopsy may be performed for a malignancy to determine the extent of malignancy and best treatment option.
  • Needle guidance procedures may be provided where the pre-operative imaging modality is analyzed to plan the complete procedure or part of the procedure, such that the anatomical locations of targets for needle placement are planned in advance, and guidance to those anatomical locations is provided by the real time imaging modality.
  • The needle locations and trajectories may be identified in advance based on the non-real time, pre-operative imaging modality, such that the target region is adequately sampled for biopsy to maximize accuracy while minimizing the number of samples for each target region.
  • The needle locations and trajectories may be identified in advance, such that a target region is effectively treated in a therapeutic procedure within the target area, while minimizing the damage to the surrounding tissue and structures. The trajectory may be optimized in a prostate procedure such that the needle insertion minimizes damage to important structures such as rectum and nerve bundles.
  • The duration of needle placement at each location in a therapeutic procedure may be optimized using a pre-operative imaging modality, to effectively design a treatment for the target region locally while sparing the surrounding tissues and structures.
  • Anatomical landmarks and/or structures identified in the pre-operative image may also be identified in the intra-operative (live) image and labeled consistently. The pre-operative image may also identify surfaces and boundaries, which can be defined or modeled as, for example, triangulated meshes. The surfaces may represent the entire anatomical structure/object or a part thereof. In some cases, a boundary may have no real anatomical correlate, and be defined virtually; however, an advantage arises if the virtual boundary can be consistently and accurately identified in both the pre-operative imaging and the real-time intra-operative imaging, since this facilitates registration and alignment of the images. The virtual features of the images may be based on generic anatomy, e.g., of humans or animals, or patient-specific. Labeled surfaces and landmarks in pre-operative and intra-operative images may be used for rigid registration. In a specific embodiment, if the bladder is labeled as "1" in the pre-operative image, it is registered with the object labeled "1" in the intra-operative image. More generally, regions of an image are classified or segmented, and that classification or segment definition from the pre-operative imaging is applied to the intra-operative real time imaging.
  • There may be a plurality of landmarks and objects that are registered concurrently. In a specific embodiment for prostate procedures, if the bladder, prostate, rectum, urethra and seminal vesicles are labeled "1", "2", "3", "4" and "5", respectively, the intra-operative image employs the same labels to concurrently register the corresponding objects. The correspondence may be "hard-correspondence" or "soft-correspondence", i.e., the landmarks may have absolute correspondence or a "fuzzy" correspondence. The availability of "soft-correspondence" permits or facilitates automated or semi-automated labeling of objects, since the real-time imaging is typically not used by a fully automated system to perform a procedure, and the skilled medical professional can employ judgment, especially if the labeling indicates a possible degree of unreliability, in relying on the automated labeling. Thus, a urologist in a prostate procedure can review the fused image in real time to determine whether there is sufficient consistency to proceed and rely on the pre-operative imaging information, or whether only the intra-operative real-time imaging is to be employed. Likewise, in some cases the pre-operative labeling boundaries are imprecise, and therefore the medical professional might wish to treat such boundaries as advisory and not absolute.
  • The landmark and object registration may be performed rigidly using a simultaneous landmark and surface registration algorithm. A rigid registration may optionally be followed by an affine registration. An elastic registration method may follow the rigid or affine registration. An elastic registration method may be at least one of an intensity-based, binary-mask-based, or surface- and landmark-based method, or a combination of these methods. A deformation model computed from a number of training datasets may be used for image registration. The deformation model models the deformation of the object of interest; for example, a prostate deforms upon application of an external tool such as an ultrasound transducer or endo-rectal coil. The training datasets may include sets of corresponding planning images and live modality images for the same patient. Thus, one aspect of the technology provides that pre-operative imaging is obtained under conditions that model a soft tissue deformation that might occur during the real-time imaging. The correspondence may be further refined by identifying and defining mismatching corresponding features between the pre-procedure and intra-procedure images. In a specific embodiment, in a prostate, a calcification may be seen in both MRI (pre-procedure) and ultrasound (intra-procedure) images, and if these anatomical landmarks mismatch slightly, a user may identify these landmarks visually and select them by a click of a mouse; alternately, an automated indication of mismatch may be generated. An algorithm can then refine the correspondence such that the boundaries of the object of interest do not move while the deformation inside the object gets updated. The deformation inside the object of interest may thus follow an elastic deformation model based on the new landmarks constrained by object boundaries.
  • An image registration method may therefore be provided that maps a region of interest from a pre-procedural (planning) image to an intra-procedural (live) image, along with a complete plan such that the plan and the region of interest are mapped and deformed to conform to the shape of the object during the procedure.
  • The technology provides a method of image fusion, where the mapped plan may be displayed as one or more overlays on a live imaging modality display during an image guided procedure. In some cases, the fusion need not be an overlay, and may be supplemental information through a different modality, such as voice or sonic feedback, force-feedback or proprioceptive feedback, a distinct display (without overlay), or the like. In the case of an overlay, different types of information may be distinguished by color, intensity, depth (on a stereoscopic display), icons, or other known means. The plan may be indicated by static images or graphics, animated graphics, and/or acoustic information (e.g., voice synthesis feedback).
  • A planning image can also be overlaid on the live imaging modality during an image guided procedure such that the images can be toggled back and forth, or displayed together in a real-time “fused” display.
  • The mapped plan may be further adjusted to account for a new shape of object revealed during real-time imaging. This may be done using an automated method, semi-automatic method, or manual method or a combination thereof.
  • A pre-procedure planning image may be used to plan the procedure such that the plan is embedded in an electronic, printed, or interactive web-based report.
  • The present technology uses landmarks, objects and intensity information identified in the imaging modalities to perform registration using a combination of rigid, affine and non-rigid elastic registration.
  • The modeling of the objects within an image may thus comprise a segmentation of anatomical features.
  • The method may further comprise transforming coordinate systems of various imaging modalities. The system may further comprise at least one modeling processor configured to perform real-time model updates of a patient soft-tissue to ensure that a pre-operative image remains accurately registered with an intra-operative image.
  • The annotated regions of the medical imaging scan or the plan may be generated by a computer-aided diagnosis or therapeutic planning system. At least a portion of the pre-operative imaging may be conducted at a location remote from the therapeutic or diagnostic procedure, and the information conveyed between the two through the Internet, preferably over a virtual private network. A true private network may also be used, or simply encrypted files communicated over otherwise public channels. The physical separation of the imaging modalities facilitates professional specialization, since experts at different aspects of the process need not be collocated.
  • The present technology permits porting information from a planning image frame of reference to a live imaging modality for guiding a medical procedure. The plan may be defined as a region of interest and needle placement or a method to plan a treatment or biopsy, for example.
  • The present technology may employ not only object boundaries, but also surrounding or internal information for registration, and thus it may be employed in applications where there is significant internal deformation that cannot be modeled using boundaries alone.
  • The phrase "image fusion" is sometimes used to define the process of registering two images that are acquired via different imaging modalities or at different time instances. The registration/fusion of images obtained from different modalities creates a number of complications. The shape of soft tissues in two images may change between acquisitions of each image. Likewise, a diagnostic or therapeutic procedure can alter the shape of the object that was previously imaged. Further, in the case of prostate imaging, the frame of reference (FOR) of the acquired images is typically different. That is, multiple MRI volumes are obtained in high resolution transverse, coronal or sagittal planes, respectively, with a lower resolution in the slice direction. These planes are usually in rough alignment with the patient's head-toe, anterior-posterior or left-right orientations. In contrast, TRUS images are often acquired while a patient lies on his side in a fetal position, by reconstructing multiple rotated 2D sample frames into a 3D volume. The 2D image frames are obtained at various instances of rotation of the TRUS probe after insertion into the rectal canal. The probe is inserted at an angle (approximately 30-45 degrees) to the patient's head-toe orientation. As a result, the gland in MRI and TRUS will need to be rigidly aligned because their relative orientations are unknown at scan time. Typically, well-defined and invariant anatomical landmarks may be used to register the images, though since the margins of landmarks themselves vary with imaging modality, the registration may be imperfect or require discretion in interpretation. A further difficulty with these different modalities is that the intensities of objects in the images do not necessarily correspond. For instance, structures that appear bright in one modality (e.g., MRI) may appear dark in another modality (e.g., ultrasound). Thus, the logistical process of overlaying or merging the images requires perceptual optimization. In addition, structures identified in one image (soft tissue in MRI) may be entirely absent in another image. TRUS imaging causes further deformation of the gland due to pressure exerted by the TRUS transducer on the prostate. As a result, rigid registration is not sufficient to account for differences between MRI and TRUS images. Finally, the resolution of the images may also impact registration quality.
  • Due to the FOR differences, image intensity differences between MRI and TRUS images, and/or the potential for the prostate to change shape between imaging by the MRI and TRUS scans, one of the few known correspondences between the prostate images acquired by MRI and TRUS is the boundary/surface model of the prostate. That is, the prostate is an elastic object that has a gland boundary or surface model that defines the volume of the prostate. By defining the gland surface boundary in the dataset for each modality, the boundary can then be used as a reference for aligning both images. Thus, each point of the volume defined within the gland boundary of the prostate in one image should correspond to a point within a volume defined by a gland boundary of the prostate in the other image, and vice versa.
  • In seeking to register the surfaces, the data in each data set may be transformed, assuming elastic deformation of the prostate gland. Thus, the characteristics of soft tissue under shear and strain, compression, heating and/or inflammation, bleeding, coagulation, biopsy sampling, tissue resection, etc., as well as normal physiological changes for healthy and pathological tissue, over time, are modeled, and therefore these various effects accounted for during the pre-operative imaging and real-time intraprocedural imaging.
  • According to a first aspect, a system and method is provided for use in medical imaging of a prostate of a patient. The utility includes obtaining a first 3D image volume from an MRI imaging device. Typically, this first 3D image volume is acquired from data storage. That is, the first 3D image volume is acquired at a time prior to a current procedure. A first shape or surface model may be obtained from the MRI image (e.g., a triangulated mesh describing the gland). The surface model can be manually or automatically extracted from all co-registered MRI image modalities. That is, multiple MRI images may themselves be registered with each other as a first step. The 3D image processing may be automated, so that a technician need not be solely occupied by the image processing, which may take seconds or minutes. The MRI images may be T1, T2, DCE (dynamic contrast-enhanced), DWI (diffusion weighted imaging), ADC (apparent diffusion coefficient) or other.
  • Similarly, data from other imaging modalities, e.g., computer aided (or axial) tomography (CAT) scans can also be registered. In the case of a CAT scan, the surface of the prostate may not represent a high contrast feature, and therefore other aspects of the image may be used; typically, the CAT scan is used to identify radiodense features, such as calcifications, or brachytherapy seeds, and therefore the goal of the image registration process would be to ensure that these features are accurately located in the fused image model. A co-registered CT image with PET scan can also provide diagnostic information that can be mapped to TRUS frame of reference for image guidance.
  • In one embodiment, the pre-operative imaging comprises use of the same imaging modality as used intra-operatively, generally along with an additional imaging technology. Thus, an ultrasound volume of the patient's prostate may be obtained, for example, through rotation of a TRUS probe, and the gland boundary is segmented in an ultrasound image. The ultrasound images acquired at various angular positions of the TRUS probe during rotation can be reconstructed to a uniform rectangular grid through intensity interpolation to generate a 3D TRUS volume. Of course, other ultrasound methods may be employed without departing from the scope of the technology. The MRI or CAT scan volume is registered to the 3D TRUS volume (or vice versa), and a registered image of the 3D TRUS volume is generated in the same frame of reference (FOR) as the MRI or CAT scan image. According to a preferred aspect, this registration occurs prior to a diagnostic or therapeutic intervention. The advantage here is that both data sets may be fully processed, with the registration of the 3D TRUS volume information completed. Thus, during a later real-time TRUS guided diagnostic or therapeutic procedure, a fully fused volume model is available. In general, the deviation of a prior 3D TRUS scan from a subsequent one will be small, so features from the real-time scan can be aligned with those of the prior imaging procedure. The fused image from the MRI (or CAT) scan provides better localization of the suspect pathological tissue, and therefore guidance of the diagnostic biopsy or therapeutic intervention. Therefore, the suspect voxels from the MRI are highlighted in the TRUS image, which during a procedure would be presented in 2D on a display screen to guide the urologist. The process therefore seeks to register three sets of data: the MRI (or other scan) information, the pre-operative 3D TRUS information, and the real time TRUS used during the procedure. Ideally, the pre-operative 3D TRUS and the intra-operative TRUS are identical apparatus, and therefore would provide maximum similarity and either minimize artifacts or present the same artifacts. Indeed, the 3D TRUS pre-operative scan can be obtained using the same TRUS scanner immediately pre-operatively, though it is preferred that the registration of the images proceed under the expertise of a radiologist or medical scanning technician, who may not be immediately available during that period.
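The reconstruction of rotated 2-D TRUS frames into a Cartesian 3-D grid might look like the following hedged sketch, which assumes a side-fire geometry with all frames sharing the probe axis and uses simple linear interpolation across angle and in-plane position; real systems would account for the specific probe geometry.

```python
# TRUS volume reconstruction sketch: for each Cartesian output voxel,
# find its rotation angle and in-plane position, then interpolate from
# the stack of acquired frames.
import numpy as np
from scipy.ndimage import map_coordinates

def reconstruct_trus(frames, angles_rad, out_shape, spacing=1.0):
    """frames: (n_angles, H, W) 2-D frames sharing the probe (y) axis."""
    zs, ys, xs = np.indices(out_shape).astype(float)
    zc = zs - out_shape[0] / 2.0                  # center on the probe axis
    xc = xs - out_shape[2] / 2.0
    theta = np.arctan2(zc, xc)                    # rotation angle of each voxel
    radius = np.hypot(zc, xc) / spacing           # in-plane column index
    frame_idx = np.interp(theta, angles_rad, np.arange(len(angles_rad)))
    coords = np.stack([frame_idx, ys, radius])    # (3, *out_shape) lookups
    return map_coordinates(frames, coords, order=1)
```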
  • A plan may be defined manually, semi-automatically, or in certain cases, automatically. The plan, for example in a prostate biopsy procedure, defines both the location of the samples to be acquired, as well as the path to be taken by the biopsy instrument in order to avoid undue damage to tissues. In some cases, the plan may be updated in real-time. For example, if the goal of the plan is to sample a volume of tissue at 1.5 mm spatial distances, but the accuracy of sampling is ±0.5 mm, then subsequent sampling targets may be defined adaptively based on the actual sampling location during the procedure. Likewise, in laser therapy, the course of treatment, including both the location of the laser and its excitation parameters, may be determined based on both the actual location of a fiber optic tip, as well as a measured temperature, and perhaps an intra-operatively determined physiological response to the therapy, such as changes in circulation pattern, swelling, and the like.
  • The registered image and the geometric transformation that relates, for example, a MRI scan volume with an ultrasound volume can be used to guide a medical procedure such as, for example, biopsy or brachytherapy.
  • These regions of interest identified on the MRI scan are usually defined by a radiologist based on information available in the MRI prior to biopsy, and may be a few points, point clouds representing regions, or triangulated meshes. Likewise, the 3D TRUS may also reveal features of interest for biopsy, which may also be marked as regions of interest. Because of the importance of registration of the regions of interest in the MRI scan with the TRUS used intra-operatively, in a manual or semi-automated pre-operative image processing method, the radiologist can override or control the image fusion process according to his or her discretion.
  • In a preferred embodiment, a segmented MRI and 3D TRUS is obtained from a patient for the prostate gland. The MRI and TRUS data is registered and transformations applied to form a fused image in which voxels of the MRI and TRUS images physically correspond to one another. Regions of interest are then identified either from the source images or from the fused image. The regions of interest are then communicated to the real-time ultrasound system, which tracks the earlier TRUS image. Because the ultrasound image is used for real time guidance, typically the transformation/alignment takes place on the MRI data, which can then be superposed or integrated with the ultrasound data.
  • During the procedure, the real-time TRUS display is supplemented with the MRI (or CAT or other scan) data, and an integrated display presented to the operating urologist. In some cases, haptic feedback may be provided so that the urologist can “feel” features when using a tracker.
  • It is noted that, as an alternative, the MRI or CAT scan data may be used to provide a coordinate frame of reference for the procedure, and the TRUS image modified in real-time to reflect an inverse of the ultrasound distortion. That is, the MRI or CAT data typically has a precise and undistorted geometry. On the other hand, the ultrasound image may be geometrically distorted by phase velocity variations in the propagation of the ultrasound waves through the tissues, and to a lesser extent, by reflections and resonances. Since the biopsy instrument itself is rigid, it will correspond more closely to the MRI or CAT model than the TRUS model, and therefore a urologist seeking to acquire a biopsy sample may have to make corrections in course if guided by the TRUS image. If the TRUS image, on the other hand, is normalized to the MRI coordinate system, then such corrections may be minimized. This requires that the TRUS data be modified according to the fused image volume model in real time. However, modern graphics processors (GPU or APU, multicore CPU, FPGA) and other computing technologies make this feasible.
  • According to another aspect, the urologist is presented with a 3D display of the patient's anatomy, supplemented by and registered to the real-time TRUS data. Such 3D displays may be effectively used with haptic feedback.
  • It is noted that two different image transformations are at play; the first is a frame of reference transformation, due to the fact that the MRI image is created as a set of slices in parallel planes (a rectangular coordinate system) which will generally differ from the image plane of the TRUS, defined by the probe angle (a cylindrical coordinate system, with none of the cylindrical axes aligned with a coordinate axis of the MRI). The second transformation represents the elastic deformation of the objects within the image to properly align surfaces, landmarks, etc.
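In code, composing the two transformations might look like the following sketch (assumed names throughout): a rigid frame-of-reference change followed by point-wise application of the elastic displacement field.

```python
# Compose the frame-of-reference transform with the elastic deformation.
import numpy as np

def mri_to_trus(points, rot, trans, disp_fn):
    """points: (n, 3) MRI coords; disp_fn: callable giving (n, 3) displacements."""
    in_trus = points @ rot.T + trans        # rigid FOR change (MRI -> TRUS)
    return in_trus + disp_fn(in_trus)       # elastic correction in the TRUS frame
```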
  • The segmentation and/or digitizing may be carried out semi-automatically (manual control over automated image processing tasks) or automatically using computer software. One example of computer software which may be suitable includes 3D Slicer (www.slicer.org), an open source software package capable of automatic image segmentation, manual editing of images, fusion and co-registering of data using rigid and non-rigid algorithms, and tracking of devices for image-guided procedures.
  • See, e.g. (each of which is expressly incorporated herein by reference):
  • Caskey C F, Hlawitschka M, Qin S, Mahakian L M, Cardiff R D, et al. “An Open Environment CT-US Fusion for Tissue Segmentation during Interventional Guidance”, PLoS ONE 6(11): e27372. doi:10.1371/journal.pone.0027372 (Nov. 23, 2011) www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0027372; Shogo Nakano, Miwa Yoshida, Kimihito Fujii, Kyoko Yorozuya, Yukako Mouri, Junko Kousaka, Takashi Fukutomi, Junko Kimura, Tsuneo Ishiguchi, Kazuko Ohno, Takao Mizumoto, and Michiko Harao, “Fusion of MRI and Sonography Image for Breast Cancer Evaluation Using Real-time Virtual Sonography with Magnetic Navigation: First Experience”, Jpn. J. Clin. Oncol. (2009) 39(9): 552-559, first published online Aug. 4, 2009, doi:10.1093/jjco/hyp087; Porter, Brian C., et al. “Three-dimensional registration and fusion of ultrasound and MRI using major vessels as fiducial markers.” Medical Imaging, IEEE Transactions on 20.4 (2001): 354-359; Kaplan, Irving, et al. “Real time MRI-ultrasound image guided stereotactic prostate biopsy.” Magnetic resonance imaging 20.3 (2002): 295-299; Jung, E. M., et al. “New real-time image fusion technique for characterization of tumor vascularisation and tumor perfusion of liver tumors with contrast-enhanced ultrasound, spiral CT or MRI: first results.” Clinical hemorheology and microcirculation 43.1 (2009): 57-69; Lindseth, Frank, et al. “Multimodal image fusion in ultrasound-based neuronavigation: improving overview and interpretation by integrating preoperative MRI with intra-operative 3D ultrasound.” Computer Aided Surgery 8.2 (2003): 49-69; Xu, Sheng, et al. “Real-time MRI-TRUS fusion for guidance of targeted prostate biopsies.” Computer Aided Surgery 13.5 (2008): 255-264; Singh, Anurag K., et al. “Initial clinical experience with real-time transrectal ultrasonography-magnetic resonance imaging fusion-guided prostate biopsy.” BJU international 101.7 (2007): 841-845; Pinto, Peter A., et al. “Magnetic resonance imaging/ultrasound fusion guided prostate biopsy improves cancer detection following transrectal ultrasound biopsy and correlates with multiparametric magnetic resonance imaging.” The Journal of urology 186.4 (2011): 1281-1285; Reynier, Christophe, et al. “MRI/TRUS data fusion for prostate brachytherapy. Preliminary results.” arXiv preprint arXiv:0801.2666 (2008); Schlaier, J. R., et al. “Image fusion of MR images and real-time ultrasonography: evaluation of fusion accuracy combining two commercial instruments, a neuronavigation system and a ultrasound system.” Acta neurochirurgica 146.3 (2004): 271-277; Wein, Wolfgang, Barbara Roper, and Nassir Navab. “Automatic registration and fusion of ultrasound with CT for radiotherapy.” Medical Image Computing and Computer-Assisted Intervention—MICCAI 2005 (2005): 303-311; Krücker, Jochen, et al. “Fusion of real-time transrectal ultrasound with preacquired MRI for multimodality prostate imaging.” Medical Imaging. International Society for Optics and Photonics, 2007; Singh, Anurag K., et al. “Simultaneous integrated boost of biopsy proven, MRI defined dominant intra-prostatic lesions to 95 Gray with IMRT: early results of a phase I NCI study.” Radiat Oncol 2 (2007): 36; Hadaschik, Boris A., et al. “A novel stereotactic prostate biopsy system integrating pre-interventional magnetic resonance imaging and live ultrasound fusion.” The Journal of urology (2011); Narayanan, R., et al. “MRI-ultrasound registration for targeted prostate biopsy.” Biomedical Imaging: From Nano to Macro, 2009. ISBI'09. IEEE International Symposium on. IEEE, 2009; Natarajan, Shyam, et al. “Clinical application of a 3D ultrasound-guided prostate biopsy system.” Urologic Oncology: Seminars and Original Investigations. Vol. 29. No. 3. Elsevier, 2011; Daanen, V., et al. “MRI/TRUS data fusion for brachytherapy.” The International Journal of Medical Robotics and Computer Assisted Surgery 2.3 (2006): 256-261; Sannazzari, G. L., et al. “CT-MRI image fusion for delineation of volumes in three-dimensional conformal radiation therapy in the treatment of localized prostate cancer.” British journal of radiology 75.895 (2002): 603-607; Kadoury, Samuel, et al. “Realtime TRUS/MRI fusion targeted-biopsy for prostate cancer: a clinical demonstration of increased positive biopsy rates.” Prostate Cancer Imaging. Computer-Aided Diagnosis, Prognosis, and Intervention (2010): 52-62; Comeau, Roch M., et al. “Intraoperative ultrasound for guidance and tissue shift correction in image-guided neurosurgery.” Medical Physics 27 (2000): 787; Turkbey, Baris, et al. “Documenting the location of prostate biopsies with image fusion.” BJU international 107.1 (2010): 53-57; Constantinos, S. P., Marios S. Pattichis, and Evangelia Micheli-Tzanakou. “Medical imaging fusion applications: An overview.” Signals, Systems and Computers, 2001. Conference Record of the Thirty-Fifth Asilomar Conference on. Vol. 2. IEEE, 2001; Xu, Sheng, et al. “Closed-loop control in fused MR-TRUS image-guided prostate biopsy.” Medical Image Computing and Computer-Assisted Intervention—MICCAI 2007 (2007): 128-135; Turkbey, Baris, et al. “Documenting the location of systematic transrectal ultrasound-guided prostate biopsies: correlation with multi-parametric MRI.” Cancer imaging: the official publication of the International Cancer Imaging Society 11 (2011): 31; Tang, Annie M., et al. “Simultaneous ultrasound and MRI system for breast biopsy: compatibility assessment and demonstration in a dual modality phantom.” Medical Imaging, IEEE Transactions on 27.2 (2008): 247-254; Wong, Alexander, and William Bishop. “Efficient least squares fusion of MRI and CT images using a phase congruency model.” Pattern Recognition Letters 29.3 (2008): 173-180; Ewertsen, Caroline, et al. “Biopsy guided by real-time sonography fused with MRI: a phantom study.” American Journal of Roentgenology 190.6 (2008): 1671-1674; and Khoo, V. S., and D. L. Joon. “New developments in MRI for target volume delineation in radiotherapy.” British journal of radiology 79. Special Issue 1 (2006): S2-S15.
  • See also U.S. Pat. Nos. 5,227,969; 5,299,253; 5,389,101; 5,411,026; 5,447,154; 5,531,227; 5,810,007; 6,200,255; 6,256,529; 6,325,758; 6,327,490; 6,360,116; 6,405,072; 6,512,942; 6,539,247; 6,561,980; 6,662,036; 6,694,170; 6,996,430; 7,079,132; 7,085,400; 7,171,255; 7,187,800; 7,201,715; 7,251,352; 7,266,176; 7,313,430; 7,379,769; 7,438,685; 7,520,856; 7,582,461; 7,619,059; 7,634,304; 7,658,714; 7,662,097; 7,672,705; 7,727,752; 7,729,744; 7,804,989; 7,831,082; 7,831,293; 7,850,456; 7,850,626; 7,856,130; 7,925,328; 7,942,829; 8,000,442; 8,016,757; 8,027,712; 8,050,736; 8,052,604; 8,057,391; 8,064,664; 8,067,536; 8,068,650; 8,077,936; 8,090,429; 8,111,892; 8,180,020; 8,135,198; 8,137,274; 8,137,279; 8,167,805; 8,175,350; 8,187,270; 8,189,738; 8,197,409; 8,206,299; 8,211,017; 8,216,161; 8,249,317; 8,275,182; 8,277,379; 8,277,398; 8,295,912; 8,298,147; 8,320,653; 8,337,434; and US Patent Application No. 2011/0178389, each of which is expressly incorporated herein by reference.
  • It is therefore an object to provide a method for combining information from a plurality of medical imaging modalities, comprising: acquiring a first volumetric image using a first volumetric imaging modality of an anatomical region; defining an elastic soft tissue model for at least a portion of the anatomical region encompassed by the first volumetric image; labeling features of the anatomical region based on at least the first volumetric image and the soft tissue model, comprising at least features of the anatomical region which are visualized by both the first imaging modality and a second imaging modality, and features of the anatomical region which are poorly visualized in the second imaging modality; acquiring a second volumetric image of the anatomical region using the second imaging modality comprising a real time image; registering the features of the anatomical region which are visualized by both the first imaging modality and the second imaging modality, and the features of the anatomical region which are poorly visualized in the second imaging modality, with respect to the soft tissue model, such that the features of the anatomical region which are visualized by both imaging modalities are linked, compensating for at least one distortion of the portion of the anatomical region between a first time of the first volumetric image and a second time of the second volumetric image; and presenting an output based on at least the features of the anatomical region which are poorly visualized in the second imaging modality in the real time image, compensated based on at least the registered features and the soft tissue model.
  • It is also an object to provide a method for combining information from a plurality of medical imaging modalities, comprising: acquiring volumetric images using a first volumetric imaging modality of an anatomical region of a person under a plurality of states of deformation; acquiring volumetric images using a second volumetric imaging modality of the anatomical region of the person under a plurality of states of deformation; defining an elastic soft tissue model for the anatomical region comprising model parameters representing tissue compliance and surface properties; labeling features of the anatomical region based on at least the volumetric images of the first imaging modality, the volumetric images of the second imaging modality, and the soft tissue model, wherein the labeling aligns corresponding features and compensates for rigid, elastic and affine transforms of the anatomical region between times for acquiring the volumetric images of the first imaging modality and the volumetric images of the second imaging modality; and presenting an output based on at least the labeled features of the anatomical region.
  • A further object provides a system for combining information from a plurality of medical imaging modalities, comprising: an input port configured to receive at least two first volumetric images using a first volumetric imaging modality of an anatomical region representing respectively different states of elastic deformation, and at least two second volumetric images using a second volumetric imaging modality, of the anatomical region representing respectively different states of elastic deformation; at least one processor configured to define an elastic soft tissue model for at least a portion of the anatomical region encompassed by the first volumetric image, and to label features of the anatomical region based on at least the first volumetric image and the soft tissue model; and a memory configured to store the defined elastic soft tissue model.
  • The first imaging modality may comprise at least one of positron emission tomography, computed tomography, magnetic resonance imaging, magnetic resonance spectrography imaging, and elastography. The anatomical region may comprise an organ selected from the group consisting of prostate, heart, lung, kidney, liver, bladder, ovaries, and thyroid. The therapeutic intervention may comprise one or more selected from the group consisting of laser ablation, brachytherapy, stem cell injection for ischemia of the heart, cryotherapy, direct injection of a photothermal or photodynamic agent, and radiotherapy. The differentially visualized anatomical region may be at least one anatomical structure to be spared in an invasive procedure, selected from the group consisting of a nerve bundle, a urethra, a rectum and a bladder. The registered features may comprise anatomical landmarks selected from the group consisting of a urethra, a urethra at a prostate base, a urethra at an apex, a verumontanum, a calcification and a cyst, a seminal vesicle, an ejaculatory duct, a bladder and a rectum.
  • The method may further comprise acquiring a tissue sample from a location determined based on at least the first imaging modality and the second imaging modality.
  • The method may further comprise delivering a therapeutic intervention at a location determined based on at least the first imaging modality and the second imaging modality.
  • The method may further comprise performing an image-guided at least partially automated procedure selected from the group consisting of laser ablation, high intensity focused ultrasound, cryotherapy, radio frequency, brachytherapy, IMRT, and robotic surgery.
  • The differentially visualized anatomical region may comprise at least one of a suspicious lesion for targeted biopsy, a suspicious lesion for targeted therapy, and a lesion for targeted dose delivery.
  • The method may further comprise automatically defining a plan comprising a target and an invasive path to reach the target.
  • The plan may be defined based on the first imaging modality, and adapted in real time based on at least the second imaging modality. The plan may comprise a plurality of targets.
  • A plurality of anatomical features may be consistently labeled in the first volumetric image and the second volumetric image. The soft tissue model may comprise an elastic triangular mesh approximating a surface of an organ. The anatomical landmark registration may be performed rigidly using a simultaneous landmark and surface registration algorithm. An affine registration may be performed. The registering may comprise an elastic registration based on at least one of an intensity, a binary mask, and surfaces and landmarks.
  • The model may be derived from a plurality of training datasets representing different states of deformation of an organ of a respective human using the first imaging modality and the second imaging modality.
  • The method may further comprise identifying a mismatch of corresponding anatomical features of the first volumetric image and the second volumetric image, and updating the registration to converge the corresponding anatomical features to reduce the mismatch based on corrections of an elastic deformation model constrained by object boundaries.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows a typical workflow for a surgeon using a fusion platform for mapping a plan from a pre-procedural planning image to the intra-procedural live image;
  • FIG. 2 shows a method for rigid registration, in which I1(x) and I2(x) represent the planning and live images, respectively with x being the coordinate system, Ω1,i and Ω2,i represent domains of the objects labeled in images I1 and I2, respectively such that i=1, 2, 3, . . . represent object labels 1, 2, 3, etc., and wi's are relative weights for different costs and Sim(A,B) represents the similarity cost between two objects A and B, and R represents the rigid transformation matrix that includes rotation and translation in 3D frame of reference;
  • FIG. 3 shows a method for affine registration, in which I1(x) and I2(x) represent the planning and live images, respectively with x being the coordinate system, Ω1,i and Ω2,i represent domains of the objects labeled in images I1 and I2, respectively such that i=1, 2, 3, . . . represent object labels 1, 2, 3, etc., and wi's are relative weights for different costs and Sim(A,B) represents the similarity cost between two objects A and B;
  • FIG. 4 shows an object process diagram for non-rigid elastic image registration, using rigid and/or affine registration as an initialization, wherein multiple labeled objects are used to compute the correspondence while satisfying the regularization constraints;
  • FIG. 5 shows an object process diagram for planning a laser ablation of the prostate gland, in which a radiologist/radiation oncologist analyzes multiparametric MRI (mpMRI) images of a prostate and plans the location of needle tip, trajectory and duration of needle application; and
  • FIGS. 6A and 6B show a conceptual diagram for planning a laser ablation of the prostate gland, in which FIG. 6A shows the target lesion identified by an expert in sagittal and transverse images, and FIG. 6B shows the plan for laser ablation in the two orthogonal directions.
  • DESCRIPTION OF THE EMBODIMENTS
  • The present invention will be described with respect to a process, which may be carried out through interaction with a user or automatically. One skilled in the art will appreciate that various types of imaging systems, including but not limited to MRI, ultrasound, PET, CT, SPECT, X-ray, and the like may be used for either pre-operative or intra-operative imaging, but that a preferred scheme employs a fusion of MRI and/or CT and/or PET and ultrasound imaging for the pre-operative imaging, and trans-urethral ultrasound for intra-operative real time imaging in a prostate diagnosis or therapeutic procedure.
  • According to an embodiment of the present technology, one or more pre-procedure “planning” images are used to plan a procedure, and one or more intra-procedure “live” images are used to guide the procedure. For example, prostate biopsy and ablation are typically done under ultrasound guidance. While speed of imaging and cost make ultrasound an ideal imaging modality for guiding biopsy, ultrasound images are insufficient and ineffective at identifying prostate cancer. Multi-parametric MRI (mpMRI) has been shown to be very sensitive and specific for identifying, localizing and grading prostate cancer. mpMRI consists of various MR imaging protocols, including T2-weighted imaging, diffusion-weighted imaging (DWI), dynamic contrast-enhanced (DCE) imaging and MR spectroscopic imaging (MRSI). Radiologists are best placed to analyze the MRI images for detecting and grading prostate cancer. However, it remains challenging to take the information from radiologists and present it to urologists or surgeons who use ultrasound as the imaging method for performing a biopsy. Likewise, MRI is ideal for identifying the sensitive surrounding structures that must be spared in order to preserve quality of life after the procedure.
  • Recent advances in clinical research and accurate ablation have increased interest in focal ablation of the prostate, where the location of malignancy is known and the malignancy is treated locally, with the surroundings remaining intact. For example, high-intensity focused ultrasound ablation of the prostate is performed under ultrasound guidance. However, due to limitations of ultrasound, it is hard to correlate the findings in pre-procedure MRI with the intra-procedure ultrasound. As a result, a much larger area is treated to ensure that the malignancy is treated properly. In other words, most such users perform a “cognitive” registration, i.e., use their own knowledge and interpretation of prostate anatomy to guide such a procedure while using an ineffective imaging method. The same challenge applies in robotic surgery, where the nerve bundles are not seen very clearly under live optical imaging. As a result, nerve sparing remains a challenge in robotic surgery. Again, MR imaging provides the necessary information, but there are insufficient tools available to apply that information to a surgical method.
  • Although methods exist for performing MRI-TRUS image fusion, these methods suffer from significant drawbacks. For example, Kumar et al.'s method (see U.S. Pub. App. 2010/02906853) uses a prostate surface-based non-rigid image registration method. The method uses only triangulated prostate boundaries as input to registration, performs a point-wise image registration only at the surface boundaries, and then interpolates the information from the boundaries to the inside of the object. Significant drawbacks include the lack of information from surrounding structures, so that significant skill, knowledge and effort are required to provide a good manual image registration between MRI and ultrasound; this is very challenging, especially when surgeons are not very skilled at reading and interpreting MR images. As a result, the results can be variable, since there can be a significant difference in orientation and shape of the gland between MRI and transrectal ultrasound. In addition to ignoring outside structures for orienting or rigidly registering the prostate, the prostate's internal structures and details are also completely ignored. Therefore, any internal twisting, rotation or non-rigid distortion is not accounted for, which may lead to poor results, especially when an endo-rectal coil is used in MRI. In addition, the plan is mapped as a region of interest, leaving it up to the surgeon to interpret how to properly sample a certain region. Also, in case of misregistration, there is no way disclosed to edit or refine the registration.
  • In a specific embodiment of the invention, for a fusion guided biopsy procedure (see FIG. 2), the method plans the location, trajectory and depth of needle insertion, optimized such that there is maximum likelihood of sampling the malignancy while minimizing the number of biopsy cores.
  • FIG. 2 shows a method according to the present technology for rigid registration. In FIG. 2, I1(x) and I2(x) represent the planning and live images, respectively, with x being the coordinate system. Ω1,i and Ω2,i represent domains of the objects labeled in images I1 and I2, respectively, such that i=1, 2, 3, . . . represent object labels 1, 2, 3, etc. For example, i=1, 2 and 3 may correspond to prostate, bladder and rectum, respectively. The wi's are relative weights for the different costs, and Sim(A,B) represents the similarity cost between two objects A and B. For example, for intensity-based metrics, the cost could be the sum of squared intensity differences or a mutual information based metric; for binary objects, the cost may be relative overlap; and for surfaces, the cost could be a symmetric distance between corresponding points. R represents the rigid transformation matrix that includes rotation and translation in the 3D frame of reference.
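  • For illustration only, this weighted multi-object cost can be sketched in a few lines of Python with NumPy. The names here (ssd, overlap_cost, rigid_cost) are hypothetical and not part of the disclosed system; the sketch only shows how per-object Sim terms combine under the weights wi:

```python
import numpy as np

def ssd(a, b):
    # Sum of squared intensity differences -- one possible Sim(A, B).
    return float(np.sum((a.astype(float) - b.astype(float)) ** 2))

def overlap_cost(mask_a, mask_b):
    # One minus relative (Dice) overlap -- a Sim(A, B) for binary objects.
    inter = np.logical_and(mask_a, mask_b).sum()
    total = mask_a.sum() + mask_b.sum()
    if total == 0:
        return 0.0  # both masks empty: nothing to mismatch
    return 1.0 - 2.0 * inter / total

def rigid_cost(i1_warped, i2, warped_masks, live_masks, w_intensity, w_objects):
    # Weighted sum of an intensity term and per-object overlap terms,
    # evaluated on the planning image already warped by a candidate R.
    cost = w_intensity * ssd(i1_warped, i2)
    for m1, m2, w in zip(warped_masks, live_masks, w_objects):
        cost += w * overlap_cost(m1, m2)
    return cost
```

In practice, R would be parameterized by six degrees of freedom (three rotations, three translations) and a generic optimizer would minimize a cost of this form.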
  • Likewise, in another embodiment for a fusion guided focal ablation (see FIG. 3), the needle placement is computed in advance such that the computed location, depth and trajectory maximize dosage/energy delivery at the malignancy while minimizing exposure to the surrounding region.
  • FIG. 3 shows a method for affine registration. In FIG. 3, I1(x) and I2(x) represent the planning and live images, respectively, with x being the coordinate system. Ω1,i and Ω2,i represent domains of the objects labeled in images I1 and I2, respectively, such that i=1, 2, 3, . . . represent object labels 1, 2, 3, etc. For example, i=1, 2 and 3 may correspond to prostate, bladder and rectum, respectively. The wi's are relative weights for the different costs, and Sim(A,B) represents the similarity cost between two objects A and B. For example, for intensity-based metrics, the cost could be the sum of squared intensity differences or a mutual information based metric; for binary objects, the cost may be relative overlap; and for surfaces, the cost could be a symmetric distance between corresponding points. A represents the affine transformation matrix that registers image I1 to the frame of reference of image I2.
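  • The only structural difference between the rigid registration of FIG. 2 and the affine registration of FIG. 3 is the transformation matrix: R is constrained to rotation plus translation (six degrees of freedom), while A may additionally encode scale and shear (twelve degrees of freedom). A minimal sketch with hypothetical names, assuming NumPy:

```python
import numpy as np

def homogeneous(linear_3x3, translation):
    # Build a 4x4 homogeneous transform; pass a pure rotation matrix for
    # the rigid R, or an unconstrained 3x3 linear part for the affine A.
    T = np.eye(4)
    T[:3, :3] = linear_3x3
    T[:3, 3] = translation
    return T

def apply_transform(T, points):
    # Map an (N, 3) array of I1 coordinates into the frame of reference of I2.
    homog = np.hstack([points, np.ones((len(points), 1))])
    return (homog @ T.T)[:, :3]
```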
  • The procedure is preferably performed under intra-procedural image guidance, with the information from the pre-procedure image mapped to an intra-procedure image using a combination of rigid, affine and elastic registration, as shown in FIG. 4, which shows an object process diagram for non-rigid elastic image registration using rigid and/or affine registration as an initialization. The method uses multiple labeled objects to compute the correspondence while satisfying the regularization constraints. During the procedure, the surgeon identifies the same landmarks, features and structures as in the pre-procedure image and labels them consistently. This may be done automatically or manually after acquiring an initial intra-procedural scan. The registration method then uses the labels in the pre-procedure and intra-procedure images to identify the structural correspondence and registers the images using a combination of rigid, affine and elastic registration.
  • According to the algorithm detailed in FIG. 4, two inputs are provided: a rigid or rigid-plus-affine registered planning image I1′, having labeled objects Ω1,i′ for i≧1 and landmarks Xj for j≧1; and the intra-operative image I2, with labeled objects Ω2,i and landmarks Yj for j≧1.
  • The algorithm initializes T=I (the identity transformation) and then alternates two minimization steps. First, it minimizes with respect to Titer the intensity-based cost
  • Σi w1,i ∫x∈Titer(Ω1,i)∪Ω2,i Sim(Titer(I1), I2) dx + Σi w3,i ∫x∈Titer(Ω1,i)∪Ω2,i Reg(Titer) dx,
  • and updates T based on the intensity cost. Second, it minimizes the landmark-based cost
  • Σj w2,j Sim(Titer(Xj), Yj) + Σj w2,j Reg(Titer),
  • and updates Titer based on the landmarks. If the costs have converged, the final transform is T=Titer, and the algorithm outputs the registered image T(I1′) together with the mapped plan and labeled objects; if not, the two minimizations are iterated.
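  • A minimal sketch of this alternating scheme, assuming SciPy is available and using hypothetical cost callables for the intensity and landmark terms (each already including its regularization):

```python
import numpy as np
from scipy.optimize import minimize

def alternate_registration(intensity_cost, landmark_cost, t0,
                           n_outer=20, tol=1e-6):
    # Alternate the two minimizations over the transform parameters t
    # until the combined cost stops improving, as in the FIG. 4 loop.
    t = np.asarray(t0, dtype=float)
    prev = np.inf
    for _ in range(n_outer):
        t = minimize(intensity_cost, t, method="Powell").x  # intensity step
        t = minimize(landmark_cost, t, method="Powell").x   # landmark step
        total = intensity_cost(t) + landmark_cost(t)
        if abs(prev - total) < tol:  # convergence test
            break
        prev = total
    return t  # final transform parameters
```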
  • There are two different methods to perform the registrations. The first uses the landmarks and features as “soft-correspondence” or “hard-correspondence” points and the structures as binary images, as shown in FIG. 5, which shows an object process diagram for planning a laser ablation of the prostate gland; a radiologist/radiation oncologist analyzes the mpMRI images of the prostate and plans the location of the needle tip, trajectory and duration of needle application. The second uses the landmarks and features as “soft-landmark” or “hard-landmark” points and the structures as surface meshes (FIG. 6). “Soft landmarks” represent landmarks that may not correspond exactly with each other; there may be some tolerance or level of confidence that will be refined during registration. “Hard landmarks” refer to landmarks that are assumed to match exactly, and whose correspondence is not allowed to change during registration.
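  • The soft/hard distinction can be expressed as a cost term: soft landmarks contribute a weighted penalty the optimizer may trade off, while hard landmarks act as constraints. A sketch under these assumptions (all names hypothetical):

```python
import numpy as np

def landmark_term(mapped_pts, target_pts, weights=None, hard_idx=(), eps=1e-6):
    # mapped_pts: planning landmarks after the current transform, (N, 3).
    # Hard landmarks must coincide (within eps), so a violation makes the
    # candidate transform inadmissible; soft landmarks add a weighted
    # squared-distance penalty that registration is free to refine.
    mapped = np.asarray(mapped_pts, dtype=float)
    target = np.asarray(target_pts, dtype=float)
    d2 = np.sum((mapped - target) ** 2, axis=1)
    if any(d2[i] > eps for i in hard_idx):
        return np.inf  # hard correspondence violated
    w = np.ones(len(d2)) if weights is None else np.asarray(weights, dtype=float)
    soft = [i for i in range(len(d2)) if i not in set(hard_idx)]
    return float(np.sum(w[soft] * d2[soft]))
```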
  • FIGS. 6A and 6B show a conceptual diagram for planning a laser ablation of the prostate gland. FIG. 6A shows the target lesion identified by an expert in sagittal and transverse images. FIG. 6B shows the plan for laser ablation in the two orthogonal directions. A and B represent the planned needles, which ablate the area shown in hatched lines. The ablated area covers the planned target.
  • The registration provides a surgeon with image fusion such that the information from pre-procedure or planning images is mapped to the frame of reference of the intra-procedure or live images. The mapped information contains at least one structural image, target area to be treated and a plan for the procedure. The plan may be in the form of needle location and trajectory along with the duration of needle application, if needed.
  • FIG. 1 shows the overall workflow of a surgeon, where the images planned by an expert (radiologist/radiation oncologist) are fused with a live imaging modality such as ultrasound for real-time guidance, while taking advantage of the diagnostic capabilities of the pre-procedural planning image. The pre-procedure image is registered with the live image using a combination of rigid, affine and non-rigid elastic registration. The registration provides a correspondence or a deformation map, which is used to map planning information from the frame of reference of the planning image to the live image. The method permits a radiologist, radiation oncologist or an oncological image specialist to analyze pre-operative images, and to identify and label various structures including the objects of interest, such as the prostate in the examples detailed above. The structures identified and labeled by the imaging specialist could include external and internal structures and landmarks such as the bladder, urethra, rectum, seminal vesicles, nerve bundles, fibromuscular stroma and prostate zones. These structures are identified and stored as points, binary masks or surface meshes. Each such structure is labeled uniquely. In addition, the method includes an automatically (or semi-automatically) generated plan for the entire procedure.
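  • For illustration, once a registration has produced a dense correspondence/deformation map, mapping the plan into the live frame is a lookup plus interpolation. A sketch assuming SciPy, with a hypothetical field layout of shape (3, Z, Y, X) defined on the planning grid, whose three channels store the corresponding live-image coordinates for every planning-image voxel:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def map_plan_points(plan_points_vox, deformation):
    # plan_points_vox: (N, 3) voxel coordinates of plan targets/trajectories
    # in the planning image. deformation: (3, Z, Y, X) correspondence map
    # giving live-image coordinates at each planning-image voxel.
    coords = np.asarray(plan_points_vox, dtype=float).T  # shape (3, N)
    mapped = np.stack(
        [map_coordinates(deformation[c], coords, order=1)  # trilinear lookup
         for c in range(3)],
        axis=1,
    )
    return mapped  # (N, 3) coordinates in the live-image frame
```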
  • FIGS. 2, 3 and 4 represent the rigid, affine and non-rigid elastic registration methods. An expert or a computer algorithm identifies and labels various anatomical structures and landmarks in the planning image. Let image I1(x) represent the structural planning image. In one embodiment, the structural image could be a T2-weighted transversally acquired MRI image. The subscript 1 corresponds to the planning or pre-procedural image. Let Ω1,i represent the object labeled i, where i=1, 2, 3, . . . represent a unique label for an anatomical object. For example, if bladder is labeled as 1 in planning image, Ω1,1 consists of all the voxels corresponding to bladder in the image I1. Alternatively, objects may also be represented by surfaces, in which case, the objects will consist of the vertices and triangles joining the vertices. Let Xi represent the point landmarks in the planning image, where i=1, 2, 3, . . . represents the index of the point landmarks identified in the planning image either manually or using an automated method. In addition, the expert provides the plan for a procedure on the structural image.
  • During the procedure, a surgeon loads the planning image I1 along with the object labels or surface meshes, landmarks and the plan. The planning image I1 is projected to the intra-procedure image I2 acquired during the procedure. The labels and landmarks may be defined in the image I2 either manually or automatically. In one embodiment, the labels in the target image I2 are automatically computed by letting the planning image I1 deform to the shape of the target image I2. The object maps defined in the planning image also participate in the registration, such that segmentation (object labeling) and registration (computation of correspondence) happen at the same time in the target image.
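  • One way to realize this automatic labeling is to warp the planning-image object map through the recovered correspondence using nearest-neighbor sampling, so the integer labels survive intact. A sketch assuming SciPy, with a hypothetical live-to-planning coordinate field:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def propagate_labels(plan_labels, live_to_plan):
    # plan_labels: integer object map defined on the planning image I1.
    # live_to_plan: (3, Z, Y, X) field giving, for each live-image voxel,
    # the corresponding planning-image coordinate (the inverse mapping).
    coords = live_to_plan.reshape(3, -1)
    sampled = map_coordinates(plan_labels, coords, order=0)  # nearest neighbor
    return sampled.reshape(live_to_plan.shape[1:])  # labels on the live grid
```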
  • FIG. 4 shows one way of performing the registration between the pre-procedure planning image and the intra-operative image. The method uses the object maps along with the intensity information and the landmark correspondences to compute the correspondence between the images. The resulting deformation map is used to map the plan from frame of reference of the planning image to the intra-procedural image.
  • FIGS. 5, 6A and 6B represent an embodiment where the plan is a needle-based laser ablation plan. In this embodiment, the radiologist or radiation oncologist analyzes the MRI image and automatically or manually computes a target region, along with labeling the surrounding sensitive tissue, i.e., the safety zone. An automated method embedded in the current method computes the trajectory, location, energy settings and the duration of application of the laser such that the target region is completely ablated while the safety zone is spared.
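  • The spare-the-safety-zone objective can be illustrated as a search over candidate needle placements, each represented by a precomputed ablation-zone mask. The scoring below is hypothetical, not the disclosed planner; it simply rewards target coverage and heavily penalizes any overlap with the safety zone:

```python
import numpy as np

def score_candidate(target, safety, ablation_zone, penalty=1e6):
    # All arguments are boolean 3D masks on the planning-image grid.
    covered = np.logical_and(ablation_zone, target).sum()
    violated = np.logical_and(ablation_zone, safety).sum()
    return float(covered) - penalty * float(violated)

def choose_plan(target, safety, candidate_zones):
    # Pick the candidate needle placement whose ablation zone best covers
    # the target while sparing the labeled safety zone.
    scores = [score_candidate(target, safety, z) for z in candidate_zones]
    best = int(np.argmax(scores))
    return best, scores[best]
```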
  • MRI data, which may include post-segmented MR image data, pre-segmented interpreted MRI data, the original MRI scans, suspicion index data, and/or instructions or a plan, may be communicated to a urologist. The MRI data may be stored in a DICOM format, in another industry-standard format, or in a proprietary format unique to the imaging modality or processing platform generating the medical images.
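  • As a sketch of the receiving side, assuming the pydicom library is available, the fields a fusion workflow would typically check on an incoming DICOM series might be gathered as follows (the field selection is illustrative):

```python
import pydicom

def summarize_dicom(path):
    # Read one DICOM file and report fields relevant to fusion: the
    # modality, series description, and voxel geometry.
    ds = pydicom.dcmread(path)
    return {
        "modality": ds.get("Modality"),
        "series": ds.get("SeriesDescription"),
        "pixel_spacing": ds.get("PixelSpacing"),
        "slice_thickness": ds.get("SliceThickness"),
    }
```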
  • The urology center where the MRI data is received may contain an image-guided biopsy or therapy system such as the Artemis, UroStation (Koelis, La Tronche, France), or BiopSee (MedCom GmbH, Darmstadt, Germany). Alternatively, the image-guided biopsy system may comprise hardware and/or software configured to work in conjunction with a urology center's preexisting hardware and/or software. For example, a mechanical tracking arm may be connected to a preexisting ultrasound machine, and a computer programmed with suitable software may be connected to the ultrasound machine or the arm. In this way, the equipment already found in a urology center can be adapted to serve as an image-guided biopsy system of the type described in this disclosure. A tracking arm on the system may be attached to an ultrasound probe, and an ultrasound scan is performed.
  • A two-dimensional (2-D) or 3D model of the prostate may be generated using the ultrasonic images produced by the scan, and segmentation of the model may be performed. Pre-processed ultrasound image data and post-processed ultrasound image data may be transmitted to the urology center. Volumetry may also be performed, including geometric or planimetric volumetry. Segmentation and/or volumetry may be performed manually or automatically by the image guided biopsy system. Preselected biopsy sites (e.g., selected by the radiologist during the analysis) may be incorporated into and displayed on the model. All of this ultrasound data generated from these processes may be electronically stored on the urology center's server via a communications link.
  • As described above, processing of the MRI data or ultrasound data, including segmentation and volumetry, may be carried out manually, automatically, or semi-automatically. This may be accomplished through the use of segmentation software, such as Segasist Prostate Auto-Contouring, which may be included in the image-guided biopsy system. Such software may also be used to perform various types of contour modification, including manual delineation, smoothing, rotation, translation, and edge snapping. Further, the software is capable of being trained or calibrated, in which it observes, captures, and saves the user's contouring and editing preferences over time and applies this knowledge to contour new images. This software need not be hosted locally, but rather, may be hosted on a remote server or in a cloud computing environment. At the urology center, MRI data may be integrated with the image-guided biopsy system.
  • The fusion process may be aided by the use of the instructions included with the MRI data. The fusion process may include registration of the MR and ultrasonic images, which may include manual or automatic selection of fixed anatomical landmarks in each image modality. Such landmarks may include the base and apex of the prostatic urethra. The two images may be substantially aligned and then one image superimposed onto the other. Registration may also be performed with models of the regions of interest. These models of the regions of interest, or target areas, may also be superimposed on the digital prostate model.
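  • Landmark-based rigid alignment of this kind is commonly solved in closed form by the Kabsch/Procrustes method; a sketch with NumPy follows (at least three non-collinear landmark pairs are needed for a unique rotation, and the function names are hypothetical):

```python
import numpy as np

def kabsch_rigid(P, Q):
    # Least-squares rigid alignment of landmark set P onto Q, both (N, 3),
    # e.g. urethra points picked in the MR and ultrasound volumes.
    Pc = P - P.mean(axis=0)
    Qc = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # guard against reflection
    t = Q.mean(axis=0) - P.mean(axis=0) @ R.T
    return R, t  # so that Q ~ P @ R.T + t
```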
  • The fusion process thus seeks to anatomically align the 3D models obtained by the radiological imaging, e.g., MRI, with the 3D models obtained by the ultrasound imaging, using anatomical landmarks as anchors and performing a warping of at least one of the models to conform to the other. The radiological analysis is preserved, such that information from the analysis relevant to suspicious regions or areas of interest is conveyed to the urologist. The fused models are then provided for use with the real-time ultrasound system, to guide the urologist in obtaining biopsy samples or performing a therapeutic procedure.
  • Through the use of the described methods and systems, the 3D MR image is integrated or fused with real-time ultrasonic images, based on a 3D ultrasound model obtained prior to the procedure (perhaps immediately prior). This allows the regions of interest to be viewed under real-time ultrasonic imaging so that they can be targeted during biopsy or therapy.
  • In this way, biopsy tracking and targeting using image fusion may be performed by the urologist for diagnosis and management of prostate cancer. Targeted biopsies may be more effective and efficient for revealing cancer than non-targeted, systematic biopsies. Such methods are particularly useful in diagnosing the ventral prostate gland, where malignancy may not always be detected with biopsy. The ventral prostate gland, as well as other areas of the prostate, often harbors malignancy in spite of negative biopsy. Targeted biopsy addresses this problem by providing a more accurate diagnosis method. This may be particularly true when the procedure involves the use of multimodal MRI. Additionally, targeting of the suspicious areas may reduce the need for taking multiple biopsy samples or performing saturation biopsy.
  • The described methods and systems may also be used to perform saturation biopsy. Saturation biopsy is a multicore biopsy procedure in which a greater number of samples are obtained from throughout the prostate than with a standard biopsy. Twenty or more samples may be obtained during saturation biopsy, and sometimes more than one hundred. This procedure may increase tumor detection in high-risk cases. However, the benefits of such a procedure are often outweighed by its drawbacks, such as the inherent trauma to the prostate, the higher incidence of side effects, the additional use of analgesia or anesthesia, and the high cost of processing the large number of samples. Through use of the methods and systems of the current invention, focused saturation biopsy may be performed to exploit the benefits of a saturation biopsy while minimizing the drawbacks. After target areas suspicious of tumor are identified, a physician may sample four or more cores, all from the suspected area. This procedure avoids the need for high-concentration sampling in healthy areas of the prostate. Further, this procedure will not only improve detection, but will enable one to determine the extent of the disease.
  • These methods and systems of the current invention also enable physicians to later revisit the suspected areas for resampling over time in order to monitor the cancer's progression. Through active surveillance, physicians can assess the seriousness of the cancer and whether further treatment would be of benefit to the patient. Since many prostate cancers do not pose serious health threats, a surveillance program may often provide a preferable alternative to radical treatment, helping patients to avoid the risk of side effects associated with treatment.
  • In addition to MRI-ultrasound fusion, image-guided biopsy systems such as the Artemis may also be used in accordance with the current technology for performing an improved non-targeted, systematic biopsy under 3D ultrasonic guidance. When using conventional, unguided, systematic biopsy, the biopsy locations are not always symmetrically distributed and may be clustered. However, by attaching the image-guided biopsy system to an ultrasound probe, non-targeted systematic biopsy may be performed under the guidance of 3D ultrasonic imaging. This may allow for more even distribution of biopsy sites and wider sampling over conventional techniques. During biopsies performed using either MRI-ultrasound fusion or 3D ultrasonic guidance, the image data may be used as a map to assist the image-guided biopsy system in navigation of the biopsy needle, as well as tracking and recording the navigation.
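  • The evenly distributed systematic sampling that 3D guidance enables can be illustrated by restricting a regular grid to the segmented prostate. This is a sketch with hypothetical names, not the disclosed planner:

```python
import numpy as np

def systematic_sites(prostate_mask, spacing):
    # Regular 3D grid of candidate biopsy sites, keeping only points that
    # fall inside the segmented prostate -- the even spatial coverage that
    # 3D ultrasonic guidance makes repeatable.
    zz, yy, xx = np.mgrid[0:prostate_mask.shape[0]:spacing,
                          0:prostate_mask.shape[1]:spacing,
                          0:prostate_mask.shape[2]:spacing]
    pts = np.stack([zz.ravel(), yy.ravel(), xx.ravel()], axis=1)
    inside = prostate_mask[pts[:, 0], pts[:, 1], pts[:, 2]].astype(bool)
    return pts[inside]  # (M, 3) voxel coordinates of planned sites
```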
  • The process described above may further include making treatment decisions and carrying out the treatment of prostate cancer using the image-guided biopsy system. The current invention provides physicians with information that can help them and patients make decisions about the course of care, whether it be watchful waiting, hormone therapy, targeted thermal ablation, nerve sparing robotic surgery, or radiation therapy. While computed tomography (CT) may be used, it can overestimate prostate volume by 35%. However, CT scans may be fused with MRI data to provide more accurate prediction of the correct staging, more precise target volume identification, and improved target delineation. For example, MRI, in combination with biopsy, will enhance patient selection for focal ablation by helping to localize clinically significant tumor foci.
  • While ultrasound at low intensities is commonly used for diagnostic and imaging applications, it can be used at higher intensities for therapeutic applications due to its ability to interact with biological tissues both thermally and mechanically. Thus, a further embodiment of the current invention contemplates the use of HIFU for treatment of prostate cancer in conjunction with the methods and apparatus previously described. An example of a commercially available HIFU system is the Sonablate 500 by Focus Surgery, Inc. (Indianapolis, Ind.), which is a HIFU therapy device that operates under the guidance of 3D ultrasound imaging. Such treatment systems can be improved by being configured to operate under the guidance of a fused MRI-ultrasound image.
  • During ablative therapy, temperatures in the tissue being ablated may be closely monitored and the subsequent zone of necrosis (thermal lesion) visualized, and used to update a real-time tissue model. Temperature monitoring for the visualization of a treated region may reduce recurrence rates of local tumor after therapy. Techniques for the foregoing may include microwave radiometry, ultrasound, impedance tomography, MRI, monitoring shifts in diagnostic pulse-echo ultrasound, and the real-time and in vivo monitoring of the spatial distribution of heating and temperature elevation, by measuring the local propagation velocity of sound through an elemental volume of such tissue structure, or through analysis of changes in backscattered energy. Other traditional methods of monitoring tissue temperature include thermometry, such as ultrasound thermometry and the use of a thermocouple.
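  • For instance, the change-in-backscattered-energy approach can be sketched as a per-voxel ratio between a reference echo frame and a heated frame. This is illustrative and uncalibrated; real thermometry requires modality-specific calibration:

```python
import numpy as np

def backscattered_energy_change(rf_reference, rf_heated, eps=1e-12):
    # Change in backscattered energy (dB) between a pre-heating reference
    # RF frame and a frame acquired during ablation; spatial shifts in this
    # quantity track local temperature elevation.
    e_ref = np.abs(np.asarray(rf_reference, dtype=float)) ** 2 + eps
    e_hot = np.abs(np.asarray(rf_heated, dtype=float)) ** 2 + eps
    return 10.0 * np.log10(e_hot / e_ref)
```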
  • MRI may also be used to monitor treatment, ensure tissue destruction, and avoid overheating surrounding structures. Further, because ultrasonic imaging is not always adequate for accurately defining areas that have been treated, MRI may be used to evaluate the success of the procedure. For instance, MRI may be used for assessment of extent of necrosis shortly after therapy and for long-term surveillance for residual or recurrent tumor that may then undergo targeted biopsy. Thus, another aspect of the technology provides post-operative image fusion, that is, performing an imaging procedure after completion of an interventional procedure, and fusing or integrating pre-operative and/or intra-operative imaging data to help understand the post-operative anatomy. For example, after aggressive therapy, a standard anatomical model of soft tissue may no longer be accurate, but by integrating the therapeutic intervention data, a more accurate understanding, imaging, and image analysis may be provided.
  • According to another aspect of the invention, a diagnostic and treatment image generation system includes at least one database containing image data from two different modalities, such as MRI and ultrasound data, and an image-guided biopsy and/or therapy system. The diagnostic and treatment image generation system may also include a computer programmed to aid in the transmission of the image data and/or the fusion of the data using the image-guided biopsy system.
  • In accordance with yet another aspect of the present invention, a non-transitory computer readable storage medium has a computer program stored thereon, to control an automated system to carry out various methods disclosed herein.
  • The present invention has been described in terms of the preferred embodiment, and it is recognized that equivalents, alternatives, and modifications, aside from those expressly stated, are possible and within the scope of the invention.

Claims (22)

What is claimed is:
1. A method for combining information from a plurality of medical imaging modalities, comprising:
acquiring a first volumetric image using a first volumetric imaging modality of an anatomical region;
defining an elastic soft tissue model for at least a portion of the anatomical region encompassed by the first volumetric image;
labeling features of the anatomical region based on at least the first volumetric image and the soft tissue model, comprising at least features of the anatomical region which are visualized by both the first imaging modality and a second imaging modality, and features of the anatomical region which are poorly visualized in the second imaging modality;
acquiring a second volumetric image of the anatomical region using the second imaging modality comprising a real time image;
registering the features of the anatomical region which are visualized by both the first imaging modality and a second imaging modality, and the features of the anatomical region which are poorly visualized in the second imaging modality, with respect to the soft tissue model, such that the features of the anatomical region which are visualized by both the first imaging modality and a second imaging modality are linked, compensating for at least one distortion of the portion of the anatomical region between a first time of the first volumetric image and a second time of the second volumetric image; and
presenting an output based on at least the features of the anatomical region which are poorly visualized in the second imaging modality in the real time image, compensated based on at least the registered features and the soft tissue model.
2. The method according to claim 1, wherein the first imaging modality comprises at least one selected from the group consisting of positron emission tomography, computed tomography, magnetic resonance imaging, magnetic resonance spectrography imaging, photoacoustic imaging, high frequency ultrasound, and elastography.
3. The method according to claim 1, wherein the anatomical region comprises at least one organ selected from the group consisting of prostate, skin, heart, lung, kidney, liver, bladder, ovaries, and thyroid.
4. The method according to claim 1, further comprising acquiring a tissue sample from a location determined based on at least the first imaging modality and the second imaging modality.
5. The method according to claim 1, further comprising delivering a therapeutic intervention at a location determined based on at least the first imaging modality and the second imaging modality.
6. The method according to claim 5, wherein the therapeutic intervention includes one or more selected from the group consisting of laser ablation, radiofrequency ablation, high intensity focused ultrasound, brachytherapy, stem cell injection for ischemia of the heart, cryotherapy, direct injection of a photothermal or photodynamic agent, and radiotherapy.
7. The method according to claim 1, further comprising performing at least one image-guided at least partially automated procedure selected from the group consisting of high intensity focused ultrasound, IMRT, and robotic surgery.
8. The method according to claim 1, wherein the differentially visualized anatomical region comprises at least one selected from the group consisting of a suspicious lesion for targeted biopsy, a suspicious lesion for targeted therapy, and a lesion for targeted dose delivery.
9. The method according to claim 1, wherein the differentially visualized anatomical region is at least one anatomical structure to be spared in an invasive procedure, selected from the group consisting of a nerve bundle, a urethra, a rectum and a bladder.
10. The method according to claim 1, wherein the registered features comprise at least one anatomical landmark selected from the group consisting of a urethra, a urethra at a prostate base, a urethra at an apex, a verumontanum, a calcification and a cyst, a seminal vesicle, an ejaculatory duct, a bladder and a rectum.
11. The method according to claim 1, further comprising automatically defining a plan comprising a target and an invasive path to reach the target.
12. The method according to claim 11, wherein the plan is defined based on the first imaging modality, and is adapted in real time based on at least the second imaging modality.
13. The method according to claim 11, wherein the plan comprises a plurality of targets.
14. The method according to claim 1, wherein a plurality of anatomical features are consistently labeled in the first volumetric image and the second volumetric image.
15. The method according to claim 1, wherein the soft tissue model comprises an elastic triangular mesh approximating a surface of an organ.
16. The method according to claim 1, wherein the anatomical landmark registration is performed rigidly using a simultaneous landmark and surface registration algorithm.
17. The method according to claim 16, further comprising performing an affine registration.
18. The method according to claim 1, wherein the registering comprises an elastic registration based on at least one parameter selected from the group consisting of an intensity, a binary mask, and surfaces and landmarks.
19. The method according to claim 1, wherein the model is derived from a plurality of training datasets representing different states of deformation of an organ of a respective human using the first imaging modality and the second imaging modality.
20. The method according to claim 1, further comprising identifying a mismatch of corresponding anatomical features of the first volumetric image and the second volumetric image, and updating the registration to converge the corresponding anatomical features to reduce the mismatch based on corrections of an elastic deformation model constrained by object boundaries.
21. A method for combining information from a plurality of medical imaging modalities, comprising:
acquiring volumetric images using a first volumetric imaging modality of an anatomical region of a person under a plurality of states of deformation;
acquiring volumetric images using a second volumetric imaging modality of the anatomical region of the person under a plurality of states of deformation;
defining an elastic soft tissue model for the anatomical region comprising model parameters representing tissue compliance and surface properties;
labeling features of the anatomical region based on at least the volumetric images of the first imaging modality, the volumetric images of the second imaging modality, and the soft tissue model, wherein the labeling aligns corresponding features and compensates for rigid, elastic and affine transform of the anatomical region between times for acquiring the volumetric images of the first imaging modality and the volumetric images of the second imaging modality; and
presenting an output based on at least the labeled features of the anatomical region.
22. A system for combining information from a plurality of medical imaging modalities, comprising:
an input port configured to receive at least two first volumetric images using a first volumetric imaging modality of an anatomical region representing respectively different states of elastic deformation, and at least two second volumetric images using a second volumetric imaging modality, of the anatomical region representing respectively different states of elastic deformation;
at least one processor configured to define an elastic soft tissue model for at least a portion of the anatomical region encompassed by the first volumetric image, and to label features of the anatomical region based on at least the first volumetric image and the soft tissue model; and
a memory configured to store the defined elastic soft tissue model.
US13/835,479 2012-08-21 2013-03-15 System and method for image guided medical procedures Abandoned US20140073907A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/835,479 US20140073907A1 (en) 2012-09-12 2013-03-15 System and method for image guided medical procedures
PCT/US2013/055561 WO2014031531A1 (en) 2012-08-21 2013-08-19 System and method for image guided medical procedures

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261700273P 2012-09-12 2012-09-12
US13/835,479 US20140073907A1 (en) 2012-09-12 2013-03-15 System and method for image guided medical procedures

Publications (1)

Publication Number Publication Date
US20140073907A1 true US20140073907A1 (en) 2014-03-13

Family

ID=50233964

Family Applications (4)

Application Number Title Priority Date Filing Date
US13/835,479 Abandoned US20140073907A1 (en) 2012-08-21 2013-03-15 System and method for image guided medical procedures
US14/024,025 Active 2034-12-21 US10835215B2 (en) 2012-09-12 2013-09-11 Method and apparatus for laser ablation under ultrasound guidance
US14/024,009 Abandoned US20140081253A1 (en) 2012-09-12 2013-09-11 Method and apparatus for laser ablation under ultrasound guidance
US17/065,468 Pending US20210100535A1 (en) 2012-09-12 2020-10-07 Method and apparatus for laser ablation under ultrasound guidance

Family Applications After (3)

Application Number Title Priority Date Filing Date
US14/024,025 Active 2034-12-21 US10835215B2 (en) 2012-09-12 2013-09-11 Method and apparatus for laser ablation under ultrasound guidance
US14/024,009 Abandoned US20140081253A1 (en) 2012-09-12 2013-09-11 Method and apparatus for laser ablation under ultrasound guidance
US17/065,468 Pending US20210100535A1 (en) 2012-09-12 2020-10-07 Method and apparatus for laser ablation under ultrasound guidance

Country Status (2)

Country Link
US (4) US20140073907A1 (en)
WO (1) WO2014043201A1 (en)

Cited By (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110101980A1 (en) * 2009-10-29 2011-05-05 Yoshiharu Ohiwa Magnetic resonance imaging apparatus
US20130259355A1 (en) * 2012-03-30 2013-10-03 Yiannis Kyriakou Method for determining an artifact-reduced three-dimensional image data set and x-ray device
US20140228707A1 (en) * 2013-02-11 2014-08-14 Definiens Ag Coregistering Images of Needle Biopsies Using Multiple Weighted Landmarks
US20140286549A1 (en) * 2013-03-21 2014-09-25 Canon Kabushiki Kaisha Object information acquiring apparatus and control method thereof
US20140306126A1 (en) * 2011-11-21 2014-10-16 INSERM (Institut National de la Santé et de la Recherche Médicale) Prostate phantom, system for planning a focal therapy of a prostate cancer comprising such prostate phantom and method for planning a focal therapy of a prostate cancer implementing such system
US20140321726A1 (en) * 2013-04-25 2014-10-30 Samsung Medison Co., Ltd. Method and apparatus for image registration
CN104268885A (en) * 2014-10-07 2015-01-07 电子科技大学 MRI and MRSI data fusion method based on NMF
US20150117727A1 (en) * 2013-10-31 2015-04-30 Toshiba Medical Systems Corporation Medical image data processing apparatus and method
DE102014212089A1 (en) * 2014-06-24 2015-07-23 Siemens Aktiengesellschaft Method for monitoring the image of a minimally invasive procedure, image processing device and ultrasound image recording device
DE102014219581A1 (en) * 2014-09-26 2015-09-17 Siemens Aktiengesellschaft A method, apparatus and computer program for registering a medical image with an anatomical structure
US9177378B2 (en) 2013-02-11 2015-11-03 Definiens Ag Updating landmarks to improve coregistration as regions of interest are corrected
US20160027178A1 (en) * 2014-07-23 2016-01-28 Sony Corporation Image registration system with non-rigid registration and method of operation thereof
US20160038081A1 (en) * 2013-04-05 2016-02-11 Koninklijke Philips N.V. Real time energy depositing therapy system controlled by magnetic resonance rheology
RU2578184C1 (en) * 2015-03-06 2016-03-20 Государственное бюджетное образовательное учреждение высшего профессионального образования Первый Московский государственный медицинский университет им. И.М. Сеченова Министерства здравоохранения Российской Федерации (ГБОУ ВПО Первый МГМУ им. И.М. Сеченова Минздрава России) Method for dynamic magnetic resonance diagnosis of malignant tumours of ovaries
US20160128654A1 (en) * 2014-02-25 2016-05-12 JointPoint, Inc. Systems and Methods for Intra-Operative Image Analysis
US20160253804A1 (en) * 2013-10-30 2016-09-01 Koninklijke Philips N.V. Assisting apparatus for assisting in registering an imaging device with a position and shape determination device
CN106501182A (en) * 2016-09-22 2017-03-15 南京大学 A kind of method of the intrinsic Zymography nondestructive measurement elasticity of utilization optoacoustic
US20170084022A1 (en) * 2015-09-23 2017-03-23 Analogic Corporation Real Time Image Based Risk Assessment for an Instrument along a Path to a Target in an object
US20170217102A1 (en) * 2016-01-29 2017-08-03 Siemens Medical Solutions Usa, Inc. Multi-Modality Image Fusion for 3D Printing of Organ Morphology and Physiology
WO2017165801A1 (en) * 2016-03-24 2017-09-28 The Regents Of The University Of California Deep-learning-based cancer classification using a hierarchical classification framework
US20170281233A1 (en) * 2013-02-19 2017-10-05 Stryker European Holdings I, Llc Software for use with deformity correction
US9805248B2 (en) 2014-08-29 2017-10-31 Definiens Ag Applying pixelwise descriptors to a target image that are generated by segmenting objects in other images
US9858665B2 (en) 2015-04-03 2018-01-02 Regents Of The University Of Minnesota Medical imaging device rendering predictive prostate cancer visualizations using quantitative multiparametric MRI models
US20180047183A1 (en) * 2015-06-30 2018-02-15 Brainlab Ag Medical Image Fusion with Reduced Search Space
WO2018175094A1 (en) * 2017-03-21 2018-09-27 Canon U.S.A., Inc. Methods, apparatuses and storage mediums for ablation planning and performance
WO2018183217A1 (en) * 2017-03-25 2018-10-04 Bianco Fernando J System and method for prostate cancer treatment under local anesthesia
US10154884B2 (en) 2016-06-02 2018-12-18 Stryker European Holdings I, Llc Software for use with deformity correction
CN109256023A (en) * 2018-11-28 2019-01-22 中国科学院武汉物理与数学研究所 A kind of measurement method of pulmonary airways microstructure model
US10255997B2 (en) 2016-07-12 2019-04-09 Mindshare Medical, Inc. Medical analytics system
RU2685920C1 (en) * 2019-01-10 2019-04-23 Федеральное государственное бюджетное учреждение "Национальный медицинский исследовательский центр хирургии им. А.В. Вишневского" Минздрава России Method for intraoperative evaluation of renal blood supply after extracorporal nephrectomy in conditions of pharmaco-cold ischemia without ureter crossing with orthotopic vessel replacement by ultrasonic examination data
RU2695007C1 (en) * 2018-12-27 2019-07-18 федеральное государственное бюджетное образовательное учреждение высшего образования "Первый Санкт-Петербургский государственный медицинский университет имени академика И.П. Павлова" Министерства здравоохранения Российской Федерации Method of treating echinococcal hepatic cyst of the type ce2b, ce3b
US20190295268A1 (en) * 2018-03-25 2019-09-26 Varian Medical Systems International Ag Deformable image registration with awareness of computed tomography (ct) reconstruction area
US10438096B2 (en) 2016-12-27 2019-10-08 Definiens Ag Identifying and excluding blurred areas of images of stained tissue to improve cancer scoring
US10610305B2 (en) 2016-05-22 2020-04-07 DePuy Synthes Products, Inc. Systems and methods for intra-operative image acquisition and calibration
JP2020512166A (en) * 2017-03-20 2020-04-23 エグザクト イメージング インコーポレイテッド Method and system for visually assisting an operator of an ultrasound system
WO2020095078A1 (en) 2018-11-09 2020-05-14 Semmelweis Egyetem Exoskeleton for assisting surgical positioning, method for producing the exoskeleton
WO2020234409A1 (en) * 2019-05-22 2020-11-26 Koninklijke Philips N.V. Intraoperative imaging-based surgical navigation
US20210068900A1 (en) * 2019-09-11 2021-03-11 Ardeshir Rastinehad Method for providing clinical support for surgical guidance during robotic surgery
US10959780B2 (en) * 2015-06-26 2021-03-30 Therenva Method and system for helping to guide an endovascular tool in vascular structures
CN112654324A (en) * 2018-07-26 2021-04-13 柯惠有限合伙公司 System and method for providing assistance during surgery
CN113040873A (en) * 2019-12-27 2021-06-29 深圳市理邦精密仪器股份有限公司 Image processing method of ultrasound image, ultrasound apparatus, and storage medium
US11051902B2 (en) * 2015-09-09 2021-07-06 Koninklijke Philips N.V. System and method for planning and performing a repeat interventional procedure
CN113393427A (en) * 2021-05-28 2021-09-14 上海联影医疗科技股份有限公司 Plaque analysis method, plaque analysis device, computer equipment and storage medium
US20210353212A1 (en) * 2020-05-15 2021-11-18 Canon Medical Systems Corporation Magnetic resonance imaging apparatus, method, and storage medium
CN113842210A (en) * 2021-08-02 2021-12-28 Kui Ying Method and device for simulating microwave ablation surgery of vertebral tumors
US20220008143A1 (en) * 2019-10-11 2022-01-13 Beyeonics Surgical Ltd. System and method for improved electronic assisted medical procedures
US11257210B2 (en) * 2018-06-25 2022-02-22 The Royal Institution For The Advancement Of Learning / Mcgill University Method and system of performing medical treatment outcome assessment or medical condition diagnostic
US11291521B2 (en) * 2018-11-14 2022-04-05 Seung Joon IM Surgery assisting device using augmented reality
US11304683B2 (en) 2019-09-13 2022-04-19 General Electric Company Biopsy workflow using multimodal imaging
US20220304751A1 (en) * 2020-03-17 2022-09-29 Boe Technology Group Co., Ltd. Optical scale and method for coordinate system registration
CN115363715A (en) * 2022-08-30 2022-11-22 Hengyang Central Hospital Gastrointestinal drainage puncture device
US11543482B2 (en) * 2018-10-16 2023-01-03 Koninklijke Philips N.V. Magnetic resonance imaging using motion-compensated image reconstruction
US11631171B2 (en) 2019-01-10 2023-04-18 Regents Of The University Of Minnesota Automated detection and annotation of prostate cancer on histopathology slides
US11633146B2 (en) 2019-01-04 2023-04-25 Regents Of The University Of Minnesota Automated co-registration of prostate MRI data
US11642174B2 (en) 2014-02-25 2023-05-09 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
US11701090B2 (en) 2017-08-16 2023-07-18 Mako Surgical Corp. Ultrasound bone registration with learning-based segmentation and sound speed calibration
US11887306B2 (en) 2021-08-11 2024-01-30 DePuy Synthes Products, Inc. System and method for intraoperatively determining image alignment

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2727868T3 (en) 2011-09-22 2019-10-21 Univ George Washington Systems for visualizing ablated tissue
JP5926806B2 (en) 2011-09-22 2016-05-25 The George Washington University System and method for visualizing ablated tissue
US20150141847A1 (en) 2013-11-20 2015-05-21 The George Washington University Systems and methods for hyperspectral analysis of cardiac tissue
DE102014002762A1 (en) * 2014-03-04 2015-09-10 Storz Endoskop Produktions Gmbh Measuring device and measuring method for detecting an ambient temperature of a device as well as device and method for medical insufflation
US10420608B2 (en) 2014-05-20 2019-09-24 Verily Life Sciences Llc System for laser ablation surgery
KR102612185B1 (en) 2014-11-03 2023-12-08 460Medical, Inc. Systems and methods for assessment of contact quality
JP2017537681A (en) 2014-11-03 2017-12-21 The George Washington University Damage evaluation system and method
US10779904B2 (en) 2015-07-19 2020-09-22 460Medical, Inc. Systems and methods for lesion formation and assessment
KR20180100421A (en) * 2016-01-26 2018-09-10 The Regents of the University of California A system for out-of-site laser therapy
JP7287888B2 (en) 2016-06-27 2023-06-06 Galary, Inc. A generator, a catheter with electrodes, and a method of treating a lung passageway
US11464568B2 (en) 2017-05-10 2022-10-11 Best Medical International, Inc. Customizable saturation biopsy
EP3496038A1 (en) * 2017-12-08 2019-06-12 Koninklijke Philips N.V. Registration of static pre-procedural planning data to dynamic intra-procedural segmentation data
EP4164476A1 (en) * 2020-06-12 2023-04-19 Medtronic Navigation, Inc. System and method for correlating proton resonance frequency thermometry with tissue temperatures
WO2021262355A1 (en) * 2020-06-22 2021-12-30 Medtronic Navigation, Inc. Protecting non-target tissue during ablation procedures and related systems and methods
CN112809207B (en) * 2021-02-23 2022-08-05 Qingdao University of Technology Multi-degree-of-freedom focused ultrasound-assisted laser processing device
DE102022205706A1 (en) * 2022-06-03 2023-12-14 Richard Wolf Gmbh System for carrying out transcutaneous photodynamic therapy (PDT) in an organ or organ segment of an organic body
CN116492049B (en) * 2023-06-29 2023-10-03 Beijing Zhiyu Medical Technology Co., Ltd. Method and device for determining the conformal ablation range of the prostate

Family Cites Families (185)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5542915A (en) 1992-08-12 1996-08-06 Vidamed, Inc. Thermal mapping catheter with ultrasound probe
US5421819A (en) 1992-08-12 1995-06-06 Vidamed, Inc. Medical probe device
US5435805A (en) 1992-08-12 1995-07-25 Vidamed, Inc. Medical probe device with optical viewing capability
US5385544A (en) 1992-08-12 1995-01-31 Vidamed, Inc. BPH ablation method and apparatus
US5370675A (en) 1992-08-12 1994-12-06 Vidamed, Inc. Medical probe device and method
IL87649A (en) 1988-09-01 1992-07-15 Elscint Ltd Hyperthermic power delivery system
US5109859A (en) 1989-10-04 1992-05-05 Beth Israel Hospital Association Ultrasound guided laser angioplasty
US5409453A (en) 1992-08-12 1995-04-25 Vidamed, Inc. Steerable medical probe with stylets
US5769812A (en) 1991-07-16 1998-06-23 Heartport, Inc. System for cardiac procedures
US5584803A (en) 1991-07-16 1996-12-17 Heartport, Inc. System for cardiac procedures
US6406486B1 (en) 1991-10-03 2002-06-18 The General Hospital Corporation Apparatus and method for vasodilation
WO1993006780A1 (en) 1991-10-03 1993-04-15 The General Hospital Corporation Apparatus and method for vasodilation
US6325792B1 (en) 1991-11-06 2001-12-04 Casimir A. Swinger Ophthalmic surgical laser and method
US5720718A (en) 1992-08-12 1998-02-24 Vidamed, Inc. Medical probe apparatus with enhanced RF, resistance heating, and microwave ablation capabilities
US5630794A (en) 1992-08-12 1997-05-20 Vidamed, Inc. Catheter tip and method of manufacturing
US5514131A (en) 1992-08-12 1996-05-07 Stuart D. Edwards Method for the ablation treatment of the uvula
US5556377A (en) 1992-08-12 1996-09-17 Vidamed, Inc. Medical probe apparatus with laser and/or microwave monolithic integrated circuit probe
US5672153A (en) 1992-08-12 1997-09-30 Vidamed, Inc. Medical probe device and method
US5720719A (en) 1992-08-12 1998-02-24 Vidamed, Inc. Ablative catheter with conformable body
US5470308A (en) 1992-08-12 1995-11-28 Vidamed, Inc. Medical probe with biopsy stylet
US5456662A (en) 1993-02-02 1995-10-10 Edwards; Stuart D. Method for reducing snoring by RF ablation of the uvula
US6346074B1 (en) 1993-02-22 2002-02-12 Heartport, Inc. Devices for less invasive intracardiac interventions
US5630837A (en) 1993-07-01 1997-05-20 Boston Scientific Corporation Acoustic ablation
US5451221A (en) 1993-12-27 1995-09-19 Cynosure, Inc. Endoscopic light delivery system
WO1995029737A1 (en) 1994-05-03 1995-11-09 Board Of Regents, The University Of Texas System Apparatus and method for noninvasive doppler ultrasound-guided real-time control of tissue damage in thermal therapy
US5476461A (en) 1994-05-13 1995-12-19 Cynosure, Inc. Endoscopic light delivery system
US5582171A (en) 1994-07-08 1996-12-10 Insight Medical Systems, Inc. Apparatus for doppler interferometric imaging and imaging guidewire
US5545195A (en) 1994-08-01 1996-08-13 Boston Scientific Corporation Interstitial heating of tissue
US6423055B1 (en) 1999-07-14 2002-07-23 Cardiofocus, Inc. Phototherapeutic wave guide apparatus
US8025661B2 (en) 1994-09-09 2011-09-27 Cardiofocus, Inc. Coaxial catheter instruments for ablation with radiant energy
US6572609B1 (en) 1999-07-14 2003-06-03 Cardiofocus, Inc. Phototherapeutic waveguide apparatus
US5498258A (en) 1994-09-13 1996-03-12 Hakky; Said I. Laser resectoscope with laser induced mechanical cutting means
US5849005A (en) 1995-06-07 1998-12-15 Heartport, Inc. Method and apparatus for minimizing the risk of air embolism when performing a procedure in a patient's thoracic cavity
US6575969B1 (en) 1995-05-04 2003-06-10 Sherwood Services Ag Cool-tip radiofrequency thermosurgery electrode system for tumor ablation
EP0835075A4 (en) 1995-06-30 1999-06-23 Boston Scient Corp Ultrasound imaging catheter with a cutting element
US5753207A (en) 1995-08-21 1998-05-19 Beth Israel Deaconess Medical Center, Inc. Use of paramagnetic compounds to measure temperature and pH in vivo
US6615071B1 (en) 1995-09-20 2003-09-02 Board Of Regents, The University Of Texas System Method and apparatus for detecting vulnerable atherosclerotic plaque
AU709432B2 (en) 1995-09-20 1999-08-26 California Institute Of Technology Detecting thermal discrepancies in vessel walls
US6763261B2 (en) 1995-09-20 2004-07-13 Board Of Regents, The University Of Texas System Method and apparatus for detecting vulnerable atherosclerotic plaque
US5749848A (en) 1995-11-13 1998-05-12 Cardiovascular Imaging Systems, Inc. Catheter system having imaging, balloon angioplasty, and stent deployment capabilities, and method of use for guided stent deployment
US5800389A (en) 1996-02-09 1998-09-01 Emx, Inc. Biopsy device
US6203524B1 (en) 1997-02-10 2001-03-20 Emx, Inc. Surgical and pharmaceutical site access guide and methods
US5769843A (en) 1996-02-20 1998-06-23 Cormedica Percutaneous endomyocardial revascularization
US6047216A (en) 1996-04-17 2000-04-04 The United States Of America Represented By The Administrator Of The National Aeronautics And Space Administration Endothelium preserving microwave treatment for atherosclerosis
US5944687A (en) 1996-04-24 1999-08-31 The Regents Of The University Of California Opto-acoustic transducer for medical applications
US6022309A (en) 1996-04-24 2000-02-08 The Regents Of The University Of California Opto-acoustic thrombolysis
US5792070A (en) 1996-08-30 1998-08-11 Urologix, Inc. Rectal thermosensing unit
US7603166B2 (en) 1996-09-20 2009-10-13 Board Of Regents University Of Texas System Method and apparatus for detection of vulnerable atherosclerotic plaque
US5906636A (en) 1996-09-20 1999-05-25 Texas Heart Institute Heat treatment of inflamed tissue
US6459925B1 (en) 1998-11-25 2002-10-01 Fischer Imaging Corporation User interface system for mammographic imager
US5776062A (en) 1996-10-15 1998-07-07 Fischer Imaging Corporation Enhanced breast imaging/biopsy system employing targeted ultrasound
US5853368A (en) 1996-12-23 1998-12-29 Hewlett-Packard Company Ultrasound imaging catheter having an independently-controllable treatment structure
US6056742A (en) 1997-02-03 2000-05-02 Eclipse Surgical Technologies, Inc. Revascularization with laser outputs
US6024703A (en) 1997-05-07 2000-02-15 Eclipse Surgical Technologies, Inc. Ultrasound device for axial ranging
US6514249B1 (en) 1997-07-08 2003-02-04 Atrionix, Inc. Positioning system and method for orienting an ablation element within a pulmonary vein ostium
US6013072A (en) 1997-07-09 2000-01-11 Intraluminal Therapeutics, Inc. Systems and methods for steering a catheter through body tissue
US6500121B1 (en) 1997-10-14 2002-12-31 Guided Therapy Systems, Inc. Imaging, therapy, and temperature monitoring ultrasonic system
US6050943A (en) 1997-10-14 2000-04-18 Guided Therapy Systems, Inc. Imaging, therapy, and temperature monitoring ultrasonic system
US6375634B1 (en) 1997-11-19 2002-04-23 Oncology Innovations, Inc. Apparatus and method to encapsulate, kill and remove malignancies, including selectively increasing absorption of x-rays and increasing free-radical damage to residual tumors targeted by ionizing and non-ionizing radiation therapy
US6368318B1 (en) 1998-01-23 2002-04-09 The Regents Of The University Of California Opto-acoustic recanalization delivery system
US6440127B2 (en) * 1998-02-11 2002-08-27 Cosman Company, Inc. Method for performing intraurethral radio-frequency urethral enlargement
US6517534B1 (en) 1998-02-11 2003-02-11 Cosman Company, Inc. Peri-urethral ablation
US6199554B1 (en) 1998-03-27 2001-03-13 The Brigham And Women's Hospital, Inc. Method and apparatus for combining injury-mediated therapy and drug delivery
DK1098641T3 (en) 1998-07-27 2016-08-15 St Jude Pharmaceuticals Inc Chemically induced intracellular hyperthermia
US6149596A (en) 1998-11-05 2000-11-21 Bancroft; Michael R. Ultrasonic catheter apparatus and method
US6428532B1 (en) 1998-12-30 2002-08-06 The General Hospital Corporation Selective tissue targeting by difference frequency of two wavelengths
US6432102B2 (en) 1999-03-15 2002-08-13 Cryovascular Systems, Inc. Cryosurgical fluid supply
US6630142B2 (en) 1999-05-03 2003-10-07 Zymogenetics, Inc. Method of treating fibroproliferative disorders
US7426409B2 (en) 1999-06-25 2008-09-16 Board Of Regents, The University Of Texas System Method and apparatus for detecting vulnerable atherosclerotic plaque
US6387088B1 (en) 1999-06-30 2002-05-14 John H. Shattuck Photoionization enabled electrochemical material removal techniques for use in biomedical fields
US7935108B2 (en) 1999-07-14 2011-05-03 Cardiofocus, Inc. Deflectable sheath catheters
GB2352401B (en) 1999-07-20 2001-06-06 Ajoy Inder Singh Atheroma ablation
US6642274B1 (en) 1999-09-09 2003-11-04 Gary W. Neal Methods and compositions for preventing and treating prostate disorders
FR2798296B1 (en) 1999-09-13 2002-05-31 Centre Nat Rech Scient Assembly for heat treatment of biological tissues and method for implementing same
US6542767B1 (en) 1999-11-09 2003-04-01 Biotex, Inc. Method and system for controlling heat delivery to a target
KR100771149B1 (en) 1999-12-10 2007-10-30 iScience Interventional Corporation Treatment of ocular disease
US6751490B2 (en) 2000-03-01 2004-06-15 The Board Of Regents Of The University Of Texas System Continuous optoacoustic monitoring of hemoglobin concentration and hematocrit
US6494844B1 (en) 2000-06-21 2002-12-17 Sanarus Medical, Inc. Device for biopsy and treatment of breast tumors
US6419635B1 (en) 2000-08-11 2002-07-16 General Electric Company In situ tumor temperature profile measuring probe and method
US6398711B1 (en) * 2000-08-25 2002-06-04 Neoseed Technology Llc Pivoting needle template apparatus for brachytherapy treatment of prostate disease and methods of use
US6436059B1 (en) 2000-09-12 2002-08-20 Claudio I. Zanelli Detection of IMD contact and alignment based on changes in frequency response characteristics
US6546276B1 (en) 2000-09-12 2003-04-08 Claudio I. Zanelli Ultrasonic based detection of interventional medical device contact and alignment
US6589174B1 (en) 2000-10-20 2003-07-08 Sunnybrook & Women's College Health Sciences Centre Technique and apparatus for ultrasound therapy
US6618620B1 (en) 2000-11-28 2003-09-09 Txsonics Ltd. Apparatus for controlling thermal dosing in a thermal treatment system
US7914453B2 (en) 2000-12-28 2011-03-29 Ardent Sound, Inc. Visual imaging system for ultrasonic probe
US6743226B2 (en) 2001-02-09 2004-06-01 Cosman Company, Inc. Adjustable trans-urethral radio-frequency ablation
US6559644B2 (en) 2001-05-30 2003-05-06 Insightec - Txsonics Ltd. MRI-based temperature mapping with error compensation
WO2003005889A2 (en) 2001-07-10 2003-01-23 Ams Research Corporation Surgical kit for treating prostate tissue
US6669693B2 (en) 2001-11-13 2003-12-30 Mayo Foundation For Medical Education And Research Tissue ablation device and methods of using
US7477925B2 (en) 2002-01-17 2009-01-13 Charlotte-Mecklenburg Hospital Authority Erythema measuring device and method
US7422568B2 (en) 2002-04-01 2008-09-09 The Johns Hopkins University Device, systems and methods for localized heating of a vessel and/or in combination with MR/NMR imaging of the vessel and surrounding tissue
US7359745B2 (en) 2002-05-15 2008-04-15 Case Western Reserve University Method to correct magnetic field/phase variations in proton resonance frequency shift thermometry in magnetic resonance imaging
US6997924B2 (en) 2002-09-17 2006-02-14 Biosense Inc. Laser pulmonary vein isolation
US7644715B2 (en) 2002-10-31 2010-01-12 Cooltouch, Incorporated Restless leg syndrome treatment
US7156816B2 (en) 2002-11-26 2007-01-02 Biosense, Inc. Ultrasound pulmonary vein isolation
US8088067B2 (en) 2002-12-23 2012-01-03 Insightec Ltd. Tissue aberration corrections in ultrasound therapy
US7201749B2 (en) 2003-02-19 2007-04-10 Biosense, Inc. Externally-applied high intensity focused ultrasound (HIFU) for pulmonary vein isolation
US7297154B2 (en) 2003-02-24 2007-11-20 Maxwell Sensors Inc. Optical apparatus for detecting and treating vulnerable plaque
US8383158B2 (en) 2003-04-15 2013-02-26 Abbott Cardiovascular Systems Inc. Methods and compositions to treat myocardial conditions
US7611462B2 (en) 2003-05-22 2009-11-03 Insightec-Image Guided Treatment Ltd. Acoustic beam forming in phased arrays including large numbers of transducer elements
WO2004106947A2 (en) 2003-05-23 2004-12-09 Johns Hopkins University Steady state free precession based magnetic resonance
CA2530298A1 (en) 2003-06-30 2005-01-20 Board Of Regents, The University Of Texas System Methods and apparatuses for fast chemical shift magnetic resonance imaging
US20050033315A1 (en) 2003-08-01 2005-02-10 Hankins Carol A. Apparatus and method for guiding a medical device
AU2004285412A1 (en) 2003-09-12 2005-05-12 Minnow Medical, Llc Selectable eccentric remodeling and/or ablation of atherosclerotic material
WO2005065337A2 (en) 2003-12-29 2005-07-21 Hankins Carol A Apparatus and method for guiding a medical device in multiple planes
US7470056B2 (en) 2004-02-12 2008-12-30 Industrial Measurement Systems, Inc. Methods and apparatus for monitoring a condition of a material
US7313155B1 (en) 2004-02-12 2007-12-25 Liyue Mu High power Q-switched laser for soft tissue ablation
US7508205B2 (en) 2004-04-29 2009-03-24 Koninklijke Philips Electronics N.V. Magnetic resonance imaging system, a method of magnetic resonance imaging and a computer program
US8235909B2 (en) 2004-05-12 2012-08-07 Guided Therapy Systems, L.L.C. Method and system for controlled scanning, imaging and/or therapy
US8137340B2 (en) 2004-06-23 2012-03-20 Applied Harmonics Corporation Apparatus and method for soft tissue ablation employing high power diode-pumped laser
US7632262B2 (en) 2004-07-19 2009-12-15 Nexeon Medical Systems, Inc. Systems and methods for atraumatic implantation of bio-active agents
EP2708191B1 (en) 2004-08-02 2021-03-03 V.V.T. Medical Ltd. Device for treating a vessel
US8409099B2 (en) 2004-08-26 2013-04-02 Insightec Ltd. Focused ultrasound system for surrounding a body tissue mass and treatment method
WO2006027783A2 (en) 2004-09-08 2006-03-16 Ramot At Tel Aviv University Ltd. MRI imaging and contrast method
US7742795B2 (en) 2005-03-28 2010-06-22 Minnow Medical, Inc. Tuned RF energy for selective treatment of atheroma and other target tissues and/or structures
US8396548B2 (en) 2008-11-14 2013-03-12 Vessix Vascular, Inc. Selective drug delivery in a lumen
US7758524B2 (en) 2004-10-06 2010-07-20 Guided Therapy Systems, L.L.C. Method and system for ultra-high frequency ultrasound treatment
US7367944B2 (en) 2004-12-13 2008-05-06 Tel Hashomer Medical Research Infrastructure And Services Ltd. Method and system for monitoring ablation of tissues
US7833281B2 (en) 2004-12-15 2010-11-16 Lehman Glen A Method and apparatus for augmentation of a sphincter
US8007440B2 (en) 2005-02-08 2011-08-30 Volcano Corporation Apparatus and methods for low-cost intravascular ultrasound imaging and for crossing severe vascular occlusions
US7699838B2 (en) 2005-02-16 2010-04-20 Case Western Reserve University System and methods for image-guided thermal treatment of tissue
JP4666142B2 (en) 2005-03-08 2011-04-06 Xenesys Inc. Heat exchanger outer shell structure
US8801701B2 (en) * 2005-03-09 2014-08-12 Sunnybrook Health Sciences Centre Method and apparatus for obtaining quantitative temperature measurements in prostate and other tissue undergoing thermal therapy treatment
US7771418B2 (en) 2005-03-09 2010-08-10 Sunnybrook Health Sciences Centre Treatment of diseased tissue using controlled ultrasonic heating
US8765116B2 (en) * 2005-03-24 2014-07-01 Medifocus, Inc. Apparatus and method for pre-conditioning/fixation and treatment of disease with heat activation/release with thermoactivated drugs and gene products
US8187621B2 (en) 2005-04-19 2012-05-29 Advanced Cardiovascular Systems, Inc. Methods and compositions for treating post-myocardial infarction damage
US7571336B2 (en) 2005-04-25 2009-08-04 Guided Therapy Systems, L.L.C. Method and system for enhancing safety with medical peripheral device by monitoring if host computer is AC powered
US7799019B2 (en) 2005-05-10 2010-09-21 Vivant Medical, Inc. Reinforced high strength microwave antenna
US7621890B2 (en) 2005-06-09 2009-11-24 Endocare, Inc. Heat exchange catheter with multi-lumen tube having a fluid return passageway
US7621889B2 (en) 2005-06-09 2009-11-24 Endocare, Inc. Heat exchange catheter and method of use
US8016757B2 (en) 2005-09-30 2011-09-13 University Of Washington Non-invasive temperature estimation technique for HIFU therapy monitoring using backscattered ultrasound
US7874986B2 (en) 2006-04-20 2011-01-25 Gynesonics, Inc. Methods and devices for visualization and ablation of tissue
US20070239011A1 (en) 2006-01-13 2007-10-11 Mirabilis Medica, Inc. Apparatus for delivering high intensity focused ultrasound energy to a treatment site internal to a patient's body
US20070225544A1 (en) 2006-02-17 2007-09-27 Vance Waseet Apparatuses and techniques for bioactive drug delivery in the prostate gland
CA2649119A1 (en) 2006-04-13 2007-12-13 Mirabilis Medica, Inc. Methods and apparatus for the treatment of menometrorrhagia, endometrial pathology, and cervical neoplasia using high intensity focused ultrasound energy
US8235901B2 (en) 2006-04-26 2012-08-07 Insightec, Ltd. Focused ultrasound system with far field tail suppression
US8155416B2 (en) * 2008-02-04 2012-04-10 INTIO, Inc. Methods and apparatuses for planning, performing, monitoring and assessing thermal ablation
US7871406B2 (en) 2006-08-04 2011-01-18 INTIO, Inc. Methods for planning and performing thermal ablation
US8556888B2 (en) * 2006-08-04 2013-10-15 INTIO, Inc. Methods and apparatuses for performing and monitoring thermal ablation
DE102006040420A1 (en) 2006-08-29 2008-03-13 Siemens Ag Device for implementing and monitoring thermal ablation (e.g. microwave ablation) to treat a patient's tumor, having a magnetic resonance system that produces images composed of voxels whose geometry is adapted to the shape of the ultrasonic focus
US20080146912A1 (en) 2006-12-18 2008-06-19 University Of Maryland, Baltimore Inter-communicator process for simultaneous MRI thermography and radio frequency ablation
US7742567B2 (en) 2007-04-11 2010-06-22 Searete Llc Compton scattered X-ray visualization, imaging, or information provider with time of flight computation
US7623625B2 (en) 2007-04-11 2009-11-24 Searete Llc Compton scattered X-ray visualization, imaging, or information provider with scattering event locating
US7711089B2 (en) 2007-04-11 2010-05-04 The Invention Science Fund I, Llc Scintillator aspects of compton scattered X-ray visualization, imaging, or information providing
US8496653B2 (en) 2007-04-23 2013-07-30 Boston Scientific Scimed, Inc. Thrombus removal
US8478380B2 (en) 2007-05-04 2013-07-02 Wisconsin Alumni Research Foundation Magnetic resonance thermometry in the presence of water and fat
PL2148628T3 (en) 2007-05-07 2018-03-30 Rathore, Jaswant A method and a system for laser photoablation within a lens
US8052604B2 (en) 2007-07-31 2011-11-08 Mirabilis Medica Inc. Methods and apparatus for engagement and coupling of an intracavitory imaging and high intensity focused ultrasound probe
US8251908B2 (en) 2007-10-01 2012-08-28 Insightec Ltd. Motion compensated image-guided focused ultrasound therapy system
DE102007048970A1 (en) 2007-10-12 2009-04-23 Siemens Ag B0 field drift correction in a magnetic resonance tomographic temperature map
US8439907B2 (en) 2007-11-07 2013-05-14 Mirabilis Medica Inc. Hemostatic tissue tunnel generator for inserting treatment apparatus into tissue of a patient
US8187270B2 (en) 2007-11-07 2012-05-29 Mirabilis Medica Inc. Hemostatic spark erosion tissue tunnel generator with integral treatment providing variable volumetric necrotization of tissue
US20100092424A1 (en) 2007-11-21 2010-04-15 Sanghvi Narendra T Method of diagnosis and treatment of tumors using high intensity focused ultrasound
US20110104052A1 (en) 2007-12-03 2011-05-05 The Johns Hopkins University Methods of synthesis and use of chemospheres
US8287602B2 (en) 2007-12-12 2012-10-16 Boston Scientific Scimed, Inc. Urinary stent
TWI406684B (en) 2008-01-16 2013-09-01 Univ Chang Gung Apparatus and method for real-time temperature measurement with a focused ultrasound system
DE102008014928B4 (en) 2008-03-19 2010-01-28 Siemens Aktiengesellschaft B0 field drift correction in a magnetic resonance tomographic temperature map
US8278053B2 (en) 2008-05-19 2012-10-02 The Board Of Trustees Of The Leland Stanford Junior University Methods of studying a biomarker, and methods of detecting a biomarker
JP5649571B2 (en) * 2008-07-15 2015-01-07 Koninklijke Philips N.V. Safe resection
US8216161B2 (en) 2008-08-06 2012-07-10 Mirabilis Medica Inc. Optimization and feedback control of HIFU power deposition through the frequency analysis of backscattered HIFU signals
CN102149430B (en) 2008-09-09 2015-01-14 Koninklijke Philips Electronics N.V. Therapy system for depositing energy
US20110190662A1 (en) 2008-10-01 2011-08-04 Beacon Endoscopic Corporation Rapid exchange FNA biopsy device with diagnostic and therapeutic capabilities
EP2349482B1 (en) 2008-10-24 2016-07-27 Mirabilis Medica Inc. Apparatus for feedback control of HIFU treatments
US20130023714A1 (en) 2008-10-26 2013-01-24 Board Of Regents, The University Of Texas System Medical and Imaging Nanoclusters
US8319495B1 (en) 2008-10-27 2012-11-27 Yudong Zhu Multi-port RF systems and methods for MRI
US8256953B2 (en) 2008-10-31 2012-09-04 Yuhas Donald E Methods and apparatus for measuring temperature and heat flux in a material using ultrasound
CN102271603A (en) 2008-11-17 2011-12-07 Minnow Medical, Inc. Selective accumulation of energy with or without knowledge of tissue topography
US8425424B2 (en) 2008-11-19 2013-04-23 Insightec Ltd. Closed-loop clot lysis
US8311641B2 (en) 2008-12-04 2012-11-13 General Electric Company Method and apparatus for generating a localized heating
US8870772B2 (en) * 2008-12-29 2014-10-28 Perseus-Biomed Inc. Method and system for tissue recognition
US8361066B2 (en) 2009-01-12 2013-01-29 Ethicon Endo-Surgery, Inc. Electrical ablation devices
CN101810468B (en) 2009-02-20 2012-11-14 Siemens AG Method for reducing thermometric error in magnetic resonance
DE102009024589A1 (en) 2009-06-10 2010-12-23 Siemens Aktiengesellschaft Thermotherapy apparatus and method for performing thermotherapy
US8396532B2 (en) 2009-06-16 2013-03-12 MRI Interventions, Inc. MRI-guided devices and MRI-guided interventional systems that can track and generate dynamic visualizations of the devices in near real time
US20110092880A1 (en) 2009-10-12 2011-04-21 Michael Gertner Energetic modulation of nerves
US9174065B2 (en) 2009-10-12 2015-11-03 Kona Medical, Inc. Energetic modulation of nerves
US8517962B2 (en) * 2009-10-12 2013-08-27 Kona Medical, Inc. Energetic modulation of nerves
US8295912B2 (en) 2009-10-12 2012-10-23 Kona Medical, Inc. Method and system to inhibit a function of a nerve traveling with an artery
US8469904B2 (en) 2009-10-12 2013-06-25 Kona Medical, Inc. Energetic modulation of nerves
US8368401B2 (en) 2009-11-10 2013-02-05 Insightec Ltd. Techniques for correcting measurement artifacts in magnetic resonance thermometry
US8482285B2 (en) 2010-01-19 2013-07-09 Insightec Ltd. Multibaseline PRF-shift magnetic resonance thermometry
US8427154B2 (en) 2010-04-12 2013-04-23 Rares Salomir Method and apparatus for magnetic resonance guided high intensity focused ultrasound focusing under simultaneous temperature monitoring
US8326010B2 (en) 2010-05-03 2012-12-04 General Electric Company System and method for nuclear magnetic resonance (NMR) temperature monitoring
DE102010061970B4 (en) 2010-11-25 2013-05-08 Siemens Aktiengesellschaft Method and device for determining MR system-related phase information
WO2013141974A1 (en) 2012-02-08 2013-09-26 Convergent Life Sciences, Inc. System and method for using medical image fusion

Cited By (98)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110101980A1 (en) * 2009-10-29 2011-05-05 Yoshiharu Ohiwa Magnetic resonance imaging apparatus
US9157977B2 (en) * 2009-10-29 2015-10-13 Kabushiki Kaisha Toshiba Magnetic resonance imaging apparatus with optimal excitation angle
US20140306126A1 (en) * 2011-11-21 2014-10-16 INSERM (Institut National de la Santé et de la Recherche Médicale) Prostate phantom, system for planning a focal therapy of a prostate cancer comprising such prostate phantom and method for planning a focal therapy of a prostate cancer implementing such system
US8992231B2 (en) * 2011-11-21 2015-03-31 Inserm (Institut National De La Sante Et De La Recherche Medicale) Prostate phantom, system for planning a focal therapy of a prostate cancer comprising such prostate phantom and method for planning a focal therapy of a prostate cancer implementing such system
US9218658B2 (en) * 2012-03-30 2015-12-22 Siemens Aktiengesellschaft Method for determining an artifact-reduced three-dimensional image data set and X-ray device
US20130259355A1 (en) * 2012-03-30 2013-10-03 Yiannis Kyriakou Method for determining an artifact-reduced three-dimensional image data set and x-ray device
US9060672B2 (en) * 2013-02-11 2015-06-23 Definiens Ag Coregistering images of needle biopsies using multiple weighted landmarks
US9177378B2 (en) 2013-02-11 2015-11-03 Definiens Ag Updating landmarks to improve coregistration as regions of interest are corrected
US20140228707A1 (en) * 2013-02-11 2014-08-14 Definiens Ag Coregistering Images of Needle Biopsies Using Multiple Weighted Landmarks
US11819246B2 (en) 2013-02-19 2023-11-21 Stryker European Operations Holdings Llc Software for use with deformity correction
US10881433B2 (en) 2013-02-19 2021-01-05 Stryker European Operations Holdings Llc Software for use with deformity correction
US10194944B2 (en) * 2013-02-19 2019-02-05 Stryker European Holdings I, Llc Software for use with deformity correction
US20170281233A1 (en) * 2013-02-19 2017-10-05 Stryker European Holdings I, Llc Software for use with deformity correction
US9572531B2 (en) * 2013-03-21 2017-02-21 Canon Kabushiki Kaisha Object information acquiring apparatus and control method thereof
US20140286549A1 (en) * 2013-03-21 2014-09-25 Canon Kabushiki Kaisha Object information acquiring apparatus and control method thereof
US10918283B2 (en) * 2013-04-05 2021-02-16 Koninklijke Philips N.V. Real time energy depositing therapy system controlled by magnetic resonance rheology
US20160038081A1 (en) * 2013-04-05 2016-02-11 Koninklijke Philips N.V. Real time energy depositing therapy system controlled by magnetic resonance rheology
US9355448B2 (en) * 2013-04-25 2016-05-31 Samsung Medison Co., Ltd. Method and apparatus for image registration
US20140321726A1 (en) * 2013-04-25 2014-10-30 Samsung Medison Co., Ltd. Method and apparatus for image registration
US20160253804A1 (en) * 2013-10-30 2016-09-01 Koninklijke Philips N.V. Assisting apparatus for assisting in registering an imaging device with a position and shape determination device
US9542529B2 (en) * 2013-10-31 2017-01-10 Toshiba Medical Systems Corporation Medical image data processing apparatus and method
US20150117727A1 (en) * 2013-10-31 2015-04-30 Toshiba Medical Systems Corporation Medical image data processing apparatus and method
US20160128654A1 (en) * 2014-02-25 2016-05-12 JointPoint, Inc. Systems and Methods for Intra-Operative Image Analysis
US10758198B2 (en) 2014-02-25 2020-09-01 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
US11534127B2 (en) 2014-02-25 2022-12-27 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
US11642174B2 (en) 2014-02-25 2023-05-09 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
US10765384B2 (en) * 2014-02-25 2020-09-08 DePuy Synthes Products, Inc. Systems and methods for intra-operative image analysis
DE102014212089A1 (en) * 2014-06-24 2015-07-23 Siemens Aktiengesellschaft Method for image-based monitoring of a minimally invasive procedure, image processing device, and ultrasound image recording device
US10426372B2 (en) * 2014-07-23 2019-10-01 Sony Corporation Image registration system with non-rigid registration and method of operation thereof
US20160027178A1 (en) * 2014-07-23 2016-01-28 Sony Corporation Image registration system with non-rigid registration and method of operation thereof
US9805248B2 (en) 2014-08-29 2017-10-31 Definiens Ag Applying pixelwise descriptors to a target image that are generated by segmenting objects in other images
US10474874B2 (en) 2014-08-29 2019-11-12 Definiens Ag Applying pixelwise descriptors to a target image that are generated by segmenting objects in other images
DE102014219581A1 (en) * 2014-09-26 2015-09-17 Siemens Aktiengesellschaft A method, apparatus and computer program for registering a medical image with an anatomical structure
CN104268885A (en) * 2014-10-07 2015-01-07 University of Electronic Science and Technology of China MRI and MRSI data fusion method based on NMF
RU2578184C1 (en) * 2015-03-06 2016-03-20 I.M. Sechenov First Moscow State Medical University of the Ministry of Health of the Russian Federation Method for dynamic magnetic resonance diagnosis of malignant ovarian tumours
US9858665B2 (en) 2015-04-03 2018-01-02 Regents Of The University Of Minnesota Medical imaging device rendering predictive prostate cancer visualizations using quantitative multiparametric MRI models
US10959780B2 (en) * 2015-06-26 2021-03-30 Therenva Method and system for helping to guide an endovascular tool in vascular structures
US10297042B2 (en) * 2015-06-30 2019-05-21 Brainlab Ag Medical image fusion with reduced search space
US20180047183A1 (en) * 2015-06-30 2018-02-15 Brainlab Ag Medical Image Fusion with Reduced Search Space
US11051902B2 (en) * 2015-09-09 2021-07-06 Koninklijke Philips N.V. System and method for planning and performing a repeat interventional procedure
US20210282886A1 (en) * 2015-09-09 2021-09-16 Koninklijke Philips N.V. System and method for planning and performing a repeat interventional procedure
US11950967B2 (en) * 2015-09-09 2024-04-09 Koninklijke Philips N.V. System and method for planning and performing a repeat interventional procedure
US20170084022A1 (en) * 2015-09-23 2017-03-23 Analogic Corporation Real Time Image Based Risk Assessment for an Instrument along a Path to a Target in an object
US10292678B2 (en) * 2015-09-23 2019-05-21 Analogic Corporation Real-time image based risk assessment for an instrument along a path to a target in an object
US20170217102A1 (en) * 2016-01-29 2017-08-03 Siemens Medical Solutions Usa, Inc. Multi-Modality Image Fusion for 3D Printing of Organ Morphology and Physiology
US10842379B2 (en) * 2016-01-29 2020-11-24 Siemens Healthcare Gmbh Multi-modality image fusion for 3D printing of organ morphology and physiology
WO2017165801A1 (en) * 2016-03-24 2017-09-28 The Regents Of The University Of California Deep-learning-based cancer classification using a hierarchical classification framework
US10939874B2 (en) 2016-03-24 2021-03-09 The Regents Of The University Of California Deep-learning-based cancer classification using a hierarchical classification framework
US10610305B2 (en) 2016-05-22 2020-04-07 DePuy Synthes Products, Inc. Systems and methods for intra-operative image acquisition and calibration
US10959782B2 (en) 2016-05-22 2021-03-30 DePuy Synthes Products, Inc. Systems and methods for intra-operative image acquisition and calibration
US11020186B2 (en) 2016-06-02 2021-06-01 Stryker European Operations Holdings Llc Software for use with deformity correction
US10154884B2 (en) 2016-06-02 2018-12-18 Stryker European Holdings I, Llc Software for use with deformity correction
US10603112B2 (en) 2016-06-02 2020-03-31 Stryker European Holdings I, Llc Software for use with deformity correction
US10251705B2 (en) 2016-06-02 2019-04-09 Stryker European Holdings I, Llc Software for use with deformity correction
US11553965B2 (en) 2016-06-02 2023-01-17 Stryker European Operations Holdings Llc Software for use with deformity correction
US10255997B2 (en) 2016-07-12 2019-04-09 Mindshare Medical, Inc. Medical analytics system
CN106501182A (en) * 2016-09-22 2017-03-15 Nanjing University Method for nondestructive measurement of elasticity using intrinsic photoacoustic spectroscopy
US10565479B1 (en) 2016-12-27 2020-02-18 Definiens Gmbh Identifying and excluding blurred areas of images of stained tissue to improve cancer scoring
US10438096B2 (en) 2016-12-27 2019-10-08 Definiens Ag Identifying and excluding blurred areas of images of stained tissue to improve cancer scoring
US11857370B2 (en) * 2017-03-20 2024-01-02 National Bank Of Canada Method and system for visually assisting an operator of an ultrasound system
CN111093548A (en) * 2017-03-20 2020-05-01 Exact Imaging Inc. Method and system for visually assisting an operator of an ultrasound system
JP2020512166A (en) * 2017-03-20 2020-04-23 Exact Imaging Inc. Method and system for visually assisting an operator of an ultrasound system
US20200008875A1 (en) * 2017-03-21 2020-01-09 Canon U.S.A., Inc. Methods, apparatuses and storage mediums for ablation planning and performance
WO2018175094A1 (en) * 2017-03-21 2018-09-27 Canon U.S.A., Inc. Methods, apparatuses and storage mediums for ablation planning and performance
GB2575389B (en) * 2017-03-25 2022-08-17 Smartblate Llc System and method for prostate cancer treatment under local anesthesia
US11583655B2 (en) * 2017-03-25 2023-02-21 SmartBlate, LLC System and method for prostate treatment under local anesthesia
WO2018183217A1 (en) * 2017-03-25 2018-10-04 Bianco Fernando J System and method for prostate cancer treatment under local anesthesia
GB2575389A (en) * 2017-03-25 2020-01-08 Smartblate Llc System and method for prostate cancer treatment under local anesthesia
US11701090B2 (en) 2017-08-16 2023-07-18 Mako Surgical Corp. Ultrasound bone registration with learning-based segmentation and sound speed calibration
US11574411B2 (en) * 2018-03-25 2023-02-07 Siemens Healthineers International Ag Deformable image registration based on an image mask generated based on a two-dimensional (2D) computed tomography (CT) image
US10902621B2 (en) * 2018-03-25 2021-01-26 Varian Medical Systems International Ag Deformable image registration based on masked computed tomography (CT) image
US20190295268A1 (en) * 2018-03-25 2019-09-26 Varian Medical Systems International Ag Deformable image registration with awareness of computed tomography (ct) reconstruction area
US11257210B2 (en) * 2018-06-25 2022-02-22 The Royal Institution For The Advancement Of Learning / Mcgill University Method and system of performing medical treatment outcome assessment or medical condition diagnostic
CN112654324A (en) * 2018-07-26 2021-04-13 Covidien LP System and method for providing assistance during surgery
US11543482B2 (en) * 2018-10-16 2023-01-03 Koninklijke Philips N.V. Magnetic resonance imaging using motion-compensated image reconstruction
WO2020095078A1 (en) 2018-11-09 2020-05-14 Semmelweis Egyetem Exoskeleton for assisting surgical positioning, method for producing the exoskeleton
DE112019005588T5 (en) 2018-11-09 2021-11-11 Semmelweis Egyetem Exoskeleton for assisting surgical positioning, methods of manufacturing the exoskeleton and a surgical method for using such an exoskeleton
US11291521B2 (en) * 2018-11-14 2022-04-05 Seung Joon IM Surgery assisting device using augmented reality
CN109256023A (en) * 2018-11-28 2019-01-22 Wuhan Institute of Physics and Mathematics, Chinese Academy of Sciences Measurement method for a pulmonary airway microstructure model
RU2695007C1 (en) * 2018-12-27 2019-07-18 Academician I.P. Pavlov First St. Petersburg State Medical University of the Ministry of Health of the Russian Federation Method of treating echinococcal hepatic cysts of types CE2b and CE3b
US11633146B2 (en) 2019-01-04 2023-04-25 Regents Of The University Of Minnesota Automated co-registration of prostate MRI data
RU2685920C1 (en) * 2019-01-10 2019-04-23 Federal State Budgetary Institution "A.V. Vishnevsky National Medical Research Center of Surgery" of the Ministry of Health of Russia Method for intraoperative ultrasound evaluation of renal blood supply after extracorporeal nephrectomy under pharmaco-cold ischemia without ureter transection, with orthotopic vessel replacement
US11631171B2 (en) 2019-01-10 2023-04-18 Regents Of The University Of Minnesota Automated detection and annotation of prostate cancer on histopathology slides
WO2020234409A1 (en) * 2019-05-22 2020-11-26 Koninklijke Philips N.V. Intraoperative imaging-based surgical navigation
US11903650B2 (en) * 2019-09-11 2024-02-20 Ardeshir Rastinehad Method for providing clinical support for surgical guidance during robotic surgery
US20210068900A1 (en) * 2019-09-11 2021-03-11 Ardeshir Rastinehad Method for providing clinical support for surgical guidance during robotic surgery
US11304683B2 (en) 2019-09-13 2022-04-19 General Electric Company Biopsy workflow using multimodal imaging
US11918424B2 (en) 2019-10-11 2024-03-05 Beyeonics Surgical Ltd. System and method for improved electronic assisted medical procedures
US11490986B2 (en) * 2019-10-11 2022-11-08 Beyeonics Surgical Ltd. System and method for improved electronic assisted medical procedures
US20220008143A1 (en) * 2019-10-11 2022-01-13 Beyeonics Surgical Ltd. System and method for improved electronic assisted medical procedures
CN113040873A (en) * 2019-12-27 2021-06-29 Edan Instruments, Inc. Image processing method for ultrasound images, ultrasound apparatus, and storage medium
US20220304751A1 (en) * 2020-03-17 2022-09-29 Boe Technology Group Co., Ltd. Optical scale and method for coordinate system registration
US11660042B2 (en) * 2020-05-15 2023-05-30 Canon Medical Systems Corporation Magnetic resonance imaging apparatus, method, and storage medium
US20210353212A1 (en) * 2020-05-15 2021-11-18 Canon Medical Systems Corporation Magnetic resonance imaging apparatus, method, and storage medium
CN113393427A (en) * 2021-05-28 2021-09-14 Shanghai United Imaging Healthcare Co., Ltd. Plaque analysis method, plaque analysis device, computer equipment, and storage medium
CN113842210A (en) * 2021-08-02 2021-12-28 Kui Ying Method and device for simulating microwave ablation surgery of vertebral tumors
US11887306B2 (en) 2021-08-11 2024-01-30 DePuy Synthes Products, Inc. System and method for intraoperatively determining image alignment
CN115363715A (en) * 2022-08-30 2022-11-22 Hengyang Central Hospital Gastrointestinal drainage puncture device

Also Published As

Publication number Publication date
US20140074078A1 (en) 2014-03-13
US20210100535A1 (en) 2021-04-08
WO2014043201A1 (en) 2014-03-20
US10835215B2 (en) 2020-11-17
US20140081253A1 (en) 2014-03-20

Similar Documents

Publication Publication Date Title
US20200085412A1 (en) System and method for using medical image fusion
US20140073907A1 (en) System and method for image guided medical procedures
US20210161507A1 (en) System and method for integrated biopsy and therapy
Wein et al. Automatic CT-ultrasound registration for diagnostic imaging and image-guided intervention
WO2014031531A1 (en) System and method for image guided medical procedures
JP5627677B2 (en) System and method for image-guided prostate cancer needle biopsy
Xu et al. Real-time MRI-TRUS fusion for guidance of targeted prostate biopsies
Hu et al. MR to ultrasound registration for image-guided prostate interventions
Jolesz Intraoperative imaging and image-guided therapy
US20110178389A1 (en) Fused image modalities guidance
CA2918295A1 (en) Mri image fusion methods and uses thereof
Takamoto et al. Feasibility of intraoperative navigation for liver resection using real-time virtual sonography with novel automatic registration system
Cool et al. Design and evaluation of a 3D transrectal ultrasound prostate biopsy system
Maris et al. Toward autonomous robotic prostate biopsy: a pilot study
Ma et al. Surgical navigation system for laparoscopic lateral pelvic lymph node dissection in rectal cancer surgery using laparoscopic-vision-tracked ultrasonic imaging
Smit et al. Ultrasound-based navigation for open liver surgery using active liver tracking
Schumann State of the art of ultrasound-based registration in computer assisted orthopedic interventions
Rapetti et al. Virtual reality navigation system for prostate biopsy
Pohlman et al. Two‐dimensional ultrasound‐computed tomography image registration for monitoring percutaneous hepatic intervention
US20130085383A1 (en) Systems, methods and computer readable storage media storing instructions for image-guided therapies
Kadoury et al. Realtime TRUS/MRI fusion targeted-biopsy for prostate cancer: a clinical demonstration of increased positive biopsy rates
Das et al. Magnetic Resonance Imaging-Transrectal Ultrasound Fusion Biopsy of the Prostate—An Update
Chen et al. Towards transcervical ultrasound image guidance for transoral robotic surgery
Alameddine et al. Image Fusion Principles: Theory
De Silva et al. Evaluating the utility of intraprocedural 3D TRUS image information in guiding registration for displacement compensation during prostate biopsy

Legal Events

Date Code Title Description
AS Assignment

Owner name: CONVERGENT LIFE SCIENCES, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUMAR, DINESH;VOHRA, AMIT;SPERLING, DANIEL S.;SIGNING DATES FROM 20130313 TO 20130315;REEL/FRAME:030014/0107

AS Assignment

Owner name: CONVERGENT LIFE SCIENCES, INC., NEW JERSEY

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE SPELLING OF ASSIGNOR NAME PREVIOUSLY RECORDED AT REEL: 030014 FRAME: 0107. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:KUMAR, DINESH;VOHRA, AMIT;SPERLING, DANNY S;SIGNING DATES FROM 20140703 TO 20140710;REEL/FRAME:033321/0595

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION