US20100305455A1 - Device for wavelength-selective imaging - Google Patents

Device for wavelength-selective imaging

Info

Publication number
US20100305455A1
Authority
US
United States
Prior art keywords
image
visible light
wavelengths
diagnostic
illuminated object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/853,757
Inventor
John V. Frangioni
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beth Israel Deaconess Medical Center Inc
Original Assignee
Beth Israel Deaconess Medical Center Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beth Israel Deaconess Medical Center Inc
Priority to US12/853,757
Assigned to Beth Israel Deaconess Medical Center (assignment of assignors interest; assignor: Frangioni, John V.)
Publication of US20100305455A1
Legal status: Abandoned

Classifications

    • H01L31/111: Devices sensitive to infrared, visible or ultraviolet radiation characterised by at least three potential barriers, e.g. photothyristor
    • A61B5/0059: Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • G01J3/36: Investigating two or more bands of a spectrum by separate detectors
    • G01N21/6428: Measuring fluorescence of fluorescent products of reactions or of fluorochrome labelled reactive substances, e.g. measuring quenching effects, using measuring "optrodes"
    • G01N21/6456: Spatially resolved fluorescence measurements; imaging
    • G02B23/12: Telescopes, periscopes or instruments for viewing the inside of hollow bodies, with means for image conversion or intensification
    • G02B5/201: Filters in the form of arrays
    • H01L27/14621: Imager structures; colour filter arrangements
    • H01L27/14625: Imager structures; optical elements or arrangements associated with the device
    • H01L27/14647: Multicolour imagers having a stacked pixel-element structure, e.g. npn, npnpn or MQW elements
    • H01L27/14806: Charge coupled imagers; structural or functional details thereof
    • H01L27/14837: Area CCD imagers; frame-interline transfer
    • H01L27/14843: Area CCD imagers; interline transfer
    • H01L27/14868: CCD or CID colour imagers
    • H01L31/101: Devices sensitive to infrared, visible or ultraviolet radiation
    • A61B5/7425: Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image

Definitions

  • fluorescent dyes may be adapted for sequestration or preferential uptake at a location of medical interest, such as a lesion. The location may then be exposed to a light source that stimulates fluorescence of the dye to permit visualization that enhances a feature of the location.
  • Other emerging techniques employ phosphorescent, chemoluminescent, or scintillant substances to generate photons at one or more wavelengths suitable for imaging. These techniques have proven useful for medical imaging and surveillance, with applications including lesion imaging, calcium deposit imaging, and blood flow imaging.
  • Such imaging techniques have been enhanced with simultaneous capture and rendering of visible light images. This may, for example, provide a navigational tool at a surgical site, with the diagnostic image and the visible light image superimposed for improved visualization.
  • Charge-coupled devices provide one well-known system for converting incident photons, or light, into a measurable electronic charge.
  • current CCD systems that combine visible light and emission wavelength imaging typically employ commercially available components, and require at least two separate cameras: a first camera to capture the visible light image and a second camera for capturing the diagnostic emission wavelength which is commonly, though by no means exclusively, in the near-infrared range.
  • a two-camera system imposes the cost of an additional camera, as well as optics for splitting the visual light wavelengths from the emission wavelength and directing each to a separate transducer. There is also additional software complexity and processing overhead in order to synchronize and superimpose image data streams from the two cameras.
  • An imaging device captures both a visible light image and a diagnostic image, the diagnostic image corresponding to emissions from an imaging medium within the object.
  • the visible light image (which may be color or grayscale) and the diagnostic image may be superimposed to display regions of diagnostic significance within a visible light image.
  • a number of imaging media may be used according to an intended application for the imaging device, and an imaging medium may have wavelengths above, below, or within the visible light spectrum.
  • the devices described herein may be advantageously packaged within a single integrated device or other solid state device, and/or employed in an integrated, single-camera medical imaging system, as well as many non-medical imaging systems that would benefit from simultaneous capture of visible-light wavelength images along with images at other wavelengths.
  • the system includes a device that captures photon intensity from an illuminated object, the device being exposed to an image through a filter wheel including one or more filters that selectively pass wavelengths of light to form a visible light image of the object and a filter that selectively passes wavelengths of light to form a diagnostic image of the object, the diagnostic image corresponding to emissions from an imaging medium within the object.
  • the system includes a plurality of devices that capture photon intensity from an illuminated object, the devices being exposed to an image through a beam splitter and filters that selectively pass incident photons along a number of paths according to wavelength, each one of the plurality of devices that capture photon intensity being selectively exposed to an image including wavelengths passed along one of the number of paths, at least one of the paths selectively passing wavelengths to form a diagnostic image of the object, the diagnostic image corresponding to emissions from an imaging medium within the object, and at least one of the paths selectively passing wavelengths to form a visible light image of the object.
  • the system includes a device that captures photon intensity from an illuminated object at a plurality of pixel locations, each one of the plurality of pixel locations covered by a filter, at least one of the filters selectively passing wavelengths to form a visible light image of the object at a corresponding pixel location and at least one of the filters selectively passing wavelengths of light to form a diagnostic image of the object at a corresponding pixel location, the diagnostic image corresponding to emissions from an imaging medium within the object.
  • the system includes a device that captures photon intensity from an illuminated object at a plurality of pixel locations, each one of the plurality of pixel locations including a plurality of successive diode junctions formed at the boundary of nested p-type and n-type semiconductor wells, each diode junction selectively detecting incident light over a range of wavelengths, at least one of the diode junctions detecting wavelengths of a visible light image of the object at that pixel location and at least one of the diode junctions detecting wavelengths of a diagnostic image of the object at that pixel location, the diagnostic image corresponding to emissions from an imaging medium within the object.
  • the device that captures photon intensity may be a charge-coupled device.
  • the device may consist of an integrated circuit.
  • the imaging medium may be a fluorescent dye, a phosphorescent substance, a chemoluminescent substance, and/or a scintillant substance.
  • the imaging medium may be a substance introduced into the object.
  • the imaging medium may be a substance inherently present within the object.
  • the object may be an object within a surgical field.
  • the visible light image may be monochromatic.
  • the visible light image may include red, blue and green wavelengths of light.
  • the visible light image may include cyan, magenta, and yellow wavelengths of light.
  • the diagnostic image may include a near-infrared wavelength.
  • the diagnostic image may include an infrared wavelength.
  • the diagnostic image may include a plurality of diagnostic images, each at a different range of wavelengths.
  • the diagnostic image may be formed from one or more diagnostic wavelengths in the visible light range, the object being illuminated with a light source that is depleted in the diagnostic wavelength range.
  • the visible light image and diagnostic image may be processed and displayed in a medical imaging system.
  • the medical imaging system may include a display for rendering a composite image including a superposition of the visible light image and the diagnostic image.
  • the medical imaging system may include one or more inputs for controlling at least one of a field of view of the object, a focus of the object, or a zoom of the object.
  • the medical imaging system may include a surgical tool.
  • the visible light image and diagnostic image may be processed and displayed in at least one of a machine vision system, an astronomy system, a military system, a geology system, or an industrial system.
  • the system may be packaged in a camera.
  • the camera may include a visible light image output, a diagnostic image output, and a combined image output, the combined image output providing a superposition of the visible light image and the diagnostic image.
  • the system may capture moving video, or the system may capture still images.
  • the system may include a solid state device that captures a visible light image of an object under illumination in digital form and a diagnostic image of the object in digital form, the diagnostic image corresponding to an intensity of emission from an imaging medium within the object.
  • the system may include a single camera that captures a visible light image of an object under illumination and a diagnostic image of the object, the diagnostic image corresponding to an intensity of emission from the object, the camera configured to provide a digital version of the visible light image and a digital version of the diagnostic image to an external display system.
  • a method may include the steps of illuminating an object to provide an image; capturing an image of the object that includes a visible light image and a diagnostic image, the diagnostic image corresponding to emissions from an imaging medium within the object; and storing the image.
  • Capturing an image may include passing the image through a filter wheel that exposes an image capture device to the image through a plurality of filters, at least one of the plurality of filters selectively passing wavelengths of light to form a visible light image of the object and at least one of the plurality of filters selectively passing wavelengths of light to form a diagnostic image of the object.
  • Capturing an image may include passing the image through a beam splitter and filters that selectively pass incident photons along a number of paths according to wavelength and exposing each one of a plurality of devices that capture photon intensity to an image including wavelengths passed along one of the number of paths, at least one of the paths selectively passing wavelengths to form a diagnostic image of the object, and at least one of the paths selectively passing wavelengths to form a visible light image of the object.
  • Capturing an image may include capturing the image at a plurality of pixel locations, each one of the plurality of pixel locations covered by a filter, at least one of the filters selectively passing wavelengths to form a visible light image of the object at a corresponding pixel location and at least one of the filters selectively passing wavelengths of light to form a diagnostic image of the object at a corresponding pixel location.
  • Capturing the image may include capturing the image at a plurality of pixel locations, each one of the plurality of pixel locations including a plurality of successive diode junctions formed at the boundary of nested p-type and n-type semiconductor wells, each diode junction selectively detecting incident light over a range of wavelengths, at least one of the diode junctions detecting wavelengths of a visible light image of the object and at least one of the diode junctions detecting wavelengths of a diagnostic image of the object at that pixel location.
  • FIG. 1 is a block diagram of a prior art imaging system
  • FIG. 2 is a block diagram of an imaging system with an integrated image capture device
  • FIG. 3 depicts an embodiment of an image capture device
  • FIG. 4 depicts an embodiment of an image capture device
  • FIG. 5 is a side view of an image capture device on a single, integrated semiconductor device
  • FIG. 6 is a top view of an image capture device on a single, integrated semiconductor device.
  • FIG. 7 is a side view of an image capture device on a single, integrated semiconductor device.
  • Described herein is an image capture device for simultaneously capturing visible-light and near-infrared images.
  • the methods and systems described herein can be suitably adapted to a range of medical imaging applications where visible light tissue images may be usefully combined with diagnostic image information obtained from other specified wavelengths.
  • the systems may be applicable to a wide range of diagnostic or surgical applications where a target pathology, tissue type, or cell may be labeled with a fluorescent dye or other fluorescent substance.
  • the systems described herein may be adapted to any imaging application where a visible light image may be usefully enhanced with an image of one or more features that are functionally marked to emit photons outside the visible light range by a dye or other material that emits photons at a known wavelength, either inherently or in response to another known wavelength.
  • the systems may also be employed, for example, in a machine vision system, or in a variety of other military, industrial, geological, astronomical or other imaging systems.
  • FIG. 1 is a block diagram of a prior art imaging system.
  • the imaging system 100 may include a light source 102 directed at a surgical field 104 , a near-infrared camera 106 receiving near-infrared light from the surgical field 104 through a dichroic mirror 108 , and a visible light camera 110 receiving visible light reflected from the dichroic mirror 108 .
  • a processing and display system 112 receives data from the cameras 106 and 110.
  • a dye source (not shown) containing a dye may also be included for introduction into an object in the surgical field 104 , such as through injection into the bloodstream of a patient.
  • the light source 102 may include a visible light source and an excitation light source that illuminate the surgical field 104 .
  • the light source 102 is depleted in the wavelength region where the dye or other emitting substance emits light, so that the light source 102 does not interfere with a diagnostic image obtained from the dye.
  • the near-infrared camera 106 captures a diagnostic image from an illuminated object within the surgical field at a wavelength or range of wavelengths that pass through the dichroic mirror 108 , while the visible light camera 110 captures a visible light image from the illuminated object within the surgical field.
  • the visible light image may be captured and rendered in a number of manners, such as “RGB” (red-green-blue), “CMY” (cyan-magenta-yellow), or monochromatic grayscale.
  • the diagnostic image corresponds to emissions from the dye or other imaging medium introduced to the object, such that the resulting image is based, for example, upon the distribution of the dye within the object in the surgical field.
  • visible light typically includes light wavelengths from 400 nm to 700 nm
  • near-infrared light may include one or more wavelengths of about 810 nm, or more generally, wavelengths from about 700 nm to about 1000 nm.
  • the emission wavelength may be, for example, other near-infrared wavelengths, an infrared wavelength, or a far red wavelength. More generally, the emission wavelength may be any wavelength of emission that can be generated by an imaging medium introduced into an imaging subject and usefully captured for imaging with the devices described herein.
  • the imaging medium may include, for example, fluorescent dyes that emit in response to a stimulus wavelength, or a substance that emits photons at a known wavelength without stimulus, such as phosphorescent, chemoluminescent, or scintillant substances.
  • fluorescent substances such as semiconductor nanocrystals (a.k.a. “quantum dots”) may be used to emit photons at one or more specific wavelengths, such as infrared wavelengths (e.g., 1320 nm).
  • the imaging medium may include substances inherently present in the object being imaged, such as fluorescent or phosphorescent endogenous biological substances. In certain embodiments, the imaging medium may emit light in a range within the visible light spectrum.
  • imaging medium refers to any of the imaging media described above, or any other medium capable of emitting photons useful in locating regions of functional or diagnostic significance.
  • each diagnostic image may be shifted to a suitable visible light wavelength for purposes of display.
  • this pseudo-coloring employs a color specifically selected to provide a substantial color contrast with the object of the visible light image.
  • a color may be selected in advance, such as bright lime green for a diagnostic image over living tissue, or the color may be determined automatically by an algorithm designed to determine average background color and choose a suitable contrasting color.
  • the visible light and diagnostic images may be combined by the image processing and display system 112 and presented on a display where they may be used, for example, by a surgeon conducting a surgical procedure.
  • the processing and display system 112 may include any suitable hardware and software to combine and render the diagnostic and visible light images in any manner useful for a user of the system 100 , such as a composite image formed from a superposition of the diagnostic and visible light images, or side-by-side rendering of the diagnostic and visible light images.
  • the processing and display system 112 may be part of a medical imaging system that also includes, for example, inputs to control visual navigation of the surgical field, such as field of view (e.g., X and Y panning), zoom, and focus, or inputs for controlling a surgical tool associated with the system 100 . Similar controls may be provided for the non-medical applications noted above, with certain adaptations as appropriate, such as azimuth and elevation in place of X/Y panning for astronomical applications.
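As a minimal, non-authoritative sketch of the pseudo-coloring and superposition described above (Python with NumPy; the function names, threshold, and default lime-green color are illustrative choices, not part of the patent), the following assumes the visible light and diagnostic images are already registered and normalized to the range [0, 1]:

```python
import numpy as np

def pseudocolor_overlay(visible_rgb, diagnostic, color=(0.2, 1.0, 0.2), threshold=0.1):
    """Superimpose a pseudo-colored diagnostic image on a visible light image.

    visible_rgb: float array, shape (H, W, 3), values in [0, 1]
    diagnostic:  float array, shape (H, W), normalized emission intensity in [0, 1]
    color:       display color for the diagnostic signal (e.g., bright lime green)
    threshold:   emission intensities below this are treated as background
    """
    overlay = visible_rgb.copy()
    mask = diagnostic > threshold                 # regions of diagnostic significance
    alpha = diagnostic[..., None]                 # blend weight taken from emission intensity
    tint = np.asarray(color)[None, None, :]       # pseudo-color for the diagnostic image
    blended = (1.0 - alpha) * visible_rgb + alpha * tint
    overlay[mask] = blended[mask]
    return overlay

def contrasting_color(visible_rgb):
    """Pick a display color that contrasts with the average background color
    (one simple stand-in for the automatic color-selection algorithm mentioned above)."""
    mean = visible_rgb.reshape(-1, 3).mean(axis=0)
    return tuple(1.0 - mean)                      # complement of the average background color
```

The composite returned by `pseudocolor_overlay` corresponds to the superimposed rendering discussed above; a side-by-side display would simply show `visible_rgb` and the pseudo-colored diagnostic image separately.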
  • FIG. 2 is a block diagram of an imaging system with an integrated image capture device.
  • the imaging system 200 may include a light source 202 directed at a surgical field 204 , an image capture device 206 receiving light from the surgical field 204 , and a processing and display system 208 .
  • a dye source (not shown) containing an imaging medium such as a dye may also be included for introduction into an object in the surgical field 204 , such as through injection into the bloodstream of a patient.
  • the imaging system 200 may be in many respects like the imaging system 100 described above with reference to FIG. 1 . It will readily be appreciated that the imaging system 200 differs in at least one respect—the use of a single image capture system 206 .
  • the system 200 advantageously incorporates visible light and diagnostic wavelength imaging into a single device, the image capture system 206 .
  • This removes the need for additional external hardware, such as the dichroic mirror 108 of FIG. 1 , or additional hardware and/or software in the processing and display system 208 to perform additional processing for images from two separate image capture devices, which processing may range from image registration to matching of frame rates, image sizes, and other features of images from disparate cameras.
  • the image capture system 206 may be packaged as a single solid state device suitable for integration into a larger system, or as a camera with inputs for remote operation and/or outputs including a visible light output, a diagnostic image output, and a combined output that superimposes the diagnostic and visible light images.
  • the image capture system 206 may provide for capture of two or more wavelengths of diagnostic significance through adaptations of the systems described below. Thus two or more diagnostic images may be displayed, and/or superimposed on a visible light image in order to simultaneously visualize two or more features or characteristics of interest.
  • the image capture system 206 may provide still images, or may provide moving images, such as in a real-time display of a surgical field.
  • A number of image-sensing architectures using charge-coupled devices (“CCDs”) are known and may be usefully employed with the systems described herein, including full-frame-transfer (“FF”), frame-transfer (“FT”), interline transfer (“IL”), frame-interline transfer, accordion, charge injection, and MOS X, Y addressable devices. All such devices are intended to fall within the meaning of “charge-coupled device” as that term is used herein. While all of these devices are useful for converting incident photons into measurable electronic charges, they are inherently monochromatic in nature. As such, color-imaging applications have been devised for these CCDs that selectively image different wavelengths. These techniques for wavelength selection may be adapted to the present system as described in greater detail below.
  • FIG. 3 depicts an embodiment of an image capture device.
  • the device 300 applies an adaptation of mechanical color wheels used for some conventional red-green-blue (“RGB”) imaging systems.
  • an image of an illuminated object may be focused through a lens and captured in four successive exposures, each synchronized with a filter wheel 302 having desired optical characteristics.
  • this includes a red filter 304 that selectively passes red light, a green filter 306 that selectively passes green light, a blue filter 308 that selectively passes blue light, and a near-infrared filter 310 that selectively passes near-infrared light.
  • the CCD 312 is exposed to the image through the red, green, and blue filters 304 , 306 , 308 collectively to capture a visible light image, and exposed to the image through the near-infrared filter 310 that selectively passes near-infrared emissions to capture a diagnostic image of interest, such as emission from a fluorescent dye.
  • Each exposure of the CCD 312 is sequentially read into data storage (not shown) where it can be reconstructed into a complete image.
  • a number of other wavelengths may be selectively passed by the fourth filter to obtain a diagnostic image of the object, including infrared wavelengths or other wavelengths of interest, as generally described above.
  • additional filters may be added to the color wheel so that two or more emission wavelengths may be captured within the same image.
  • a color wheel with five or more filters is contemplated by the systems described herein.
  • a method according to the above system may include capturing an image that passes through the filter wheel 302 on the CCD 312 or other image capture device to obtain a visible light image and a diagnostic image.
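As a rough illustration of the filter-wheel sequence described above, the Python sketch below steps through four exposures and assembles a visible light image plus a near-infrared diagnostic image. The `rotate_to_filter` and `expose_ccd` callables are hypothetical stand-ins for real filter wheel and CCD drivers, which the document does not specify:

```python
import numpy as np

FILTERS = ["red", "green", "blue", "near_infrared"]

def capture_with_filter_wheel(rotate_to_filter, expose_ccd):
    """Capture four successive exposures, one per filter position, and assemble
    a visible light image (R, G, B) and a diagnostic near-infrared image.

    rotate_to_filter(name) -> None    : moves the wheel so the named filter covers the CCD
    expose_ccd() -> 2-D numpy array   : one monochromatic exposure read from the CCD
    """
    frames = {}
    for name in FILTERS:
        rotate_to_filter(name)        # synchronize the wheel position with the next exposure
        frames[name] = expose_ccd()   # read the exposure into storage
    visible = np.dstack([frames["red"], frames["green"], frames["blue"]])
    diagnostic = frames["near_infrared"]
    return visible, diagnostic
```

Adding further filters to the wheel, as contemplated above, would simply extend the `FILTERS` list and return one additional frame per emission wavelength.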
  • FIG. 4 depicts an embodiment of an image capture device.
  • a multi-chip device 400 may employ optics to split an image into separate image planes.
  • a focused image is provided to the device 400 , such as by passing an image of an illuminated object through a lens.
  • a plurality of CCDs 402 or other image capture devices for measuring photon intensity are exposed to the image through a beam splitter 404 , with a CCD 402 placed in each image plane exiting the beam splitter 404 .
  • a filter may also be provided for each CCD 402 to selectively expose the CCD 402 to a range of wavelengths, so that the image is selectively passed along a number of paths according to wavelengths.
  • CCDs 402 may be operated synchronously to capture different incident wavelengths at or near the same point in time.
  • the beam splitter 404 directs the image to four different CCDs 402 : a first CCD with a filter that passes red light, a second CCD with a filter that passes green light, a third CCD with a filter that passes blue light, and a fourth CCD with a filter that passes near-infrared light.
  • the first three CCDs produce a visible light image
  • the fourth CCD produces a diagnostic image according to an imaging medium introduced into the object.
  • other wavelengths may be passed by the fourth filter, including infrared wavelengths or other wavelengths of interest.
  • additional light paths may be provided by the beam splitter with additional filtered CCDs for each path, so that two or more emission wavelengths may be captured within the same image.
  • a method according to the above system may include capturing an image that passes through the beam splitter 404 and filters that selectively pass incident photons along a number of paths according to wavelength, with each CCD (or other image capture device) capturing either a visible light image or a diagnostic image of the illuminated object.
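The beam splitter arrangement can be sketched in the same hypothetical style: each filtered path has its own detector, and the detectors are triggered together so the different wavelengths are captured at or near the same point in time. The detector callables below are assumptions for illustration, not an actual device API:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def capture_split_paths(detectors):
    """Expose all detectors at (nearly) the same instant and collect one frame per path.

    detectors: dict mapping a path name ("red", "green", "blue", "near_infrared")
               to a callable that triggers and reads one exposure as a 2-D array.
    """
    with ThreadPoolExecutor(max_workers=len(detectors)) as pool:
        futures = {name: pool.submit(read) for name, read in detectors.items()}
        frames = {name: future.result() for name, future in futures.items()}
    visible = np.dstack([frames["red"], frames["green"], frames["blue"]])
    diagnostic = frames["near_infrared"]
    return visible, diagnostic
```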
  • FIG. 5 is a side view of an image capture device on a single, integrated semiconductor device.
  • the figure shows a CCD array 500 with wavelength selection using an integral filter array.
  • the CCD array 500 may include lenses 502 , a filter array 504 , gates 506 , photodiodes 508 , a substrate 510 , vertical charge-coupled devices (“VCCDs”) 512 , and an insulation layer 514 .
  • the photodiodes 508 serve to detect the intensity of incident photons at pixel locations within a focused image, while filters in the filter array 504 with appropriate characteristics are arranged over the photodiodes 508 such that different photodiodes are exposed to different wavelengths of incident light.
  • the filter array 504 may include, for example, red filters that selectively pass red wavelengths (labeled “R”), green filters that selectively pass green wavelengths (labeled “G”), blue filters that selectively pass blue wavelengths (labeled “B”), and near-infrared filters that selectively pass near-infrared wavelengths (labeled “I”).
  • the VCCDs 512 may be formed vertically between the photodiodes 508 for transferring signals produced by photoelectric conversion in the photodiodes 508 .
  • the insulation layer 514 may be formed over the entire surface of the semiconductor substrate 510 (including the photodiodes 508 and the VCCD 512 ), and the plurality of gates 506 may be formed on the insulation layer 514 above each VCCD 512 for controlling the transfer of the photodiode signals.
  • a metal shielding layer for shielding light may be deposited over the gates 506, except for the light-receiving regions of the photodiodes 508.
  • a flat insulation film may then be deposited over the entire surface of the semiconductor substrate including the metal shielding layer.
  • the filter array 504 , which passes either red (“R”), green (“G”), blue (“B”) (collectively for forming a visible light image), or near-infrared (“I”) wavelengths (for a diagnostic image), may then be formed over each photodiode 508 corresponding to a pixel to be imaged from an illuminated object.
  • a top coating layer may be deposited on the filter layer 504 .
  • a lens 502 may be formed on the top coating layer for concentrating photons on each photodiode.
  • the filter array 504 separates the spectrum of incident light to selectively pass only the light of a predetermined wavelength, or range of wavelengths, to reach each of the photodiodes.
  • the metal shielding layer restricts incident light to the photodiodes 508 .
  • the incident light is converted into an electric signal in the photodiodes 508 , and transferred out to a processor under control of the gates 506 .
  • the filter array 504 may be dyed or otherwise masked or processed so that each photodiode 508 is exposed to a specific wavelength or range of wavelengths. It will be appreciated that the device of FIG. 5 is an example only, and that a number of different CCD topologies may be used with an integral filter array 504 , and may be suitably adapted to the systems described herein. It should also be appreciated that other photoactive substances may be included in place of, or in addition to, the filters in the filter array 504 , in order to enhance response at certain wavelengths, or to effect a shift in wavelength to a more suitable frequency for measurement by the photodiodes. All such variations are intended to fall within the scope of this description.
  • FIG. 6 is a top view of an image capture device on a single, integrated semiconductor device.
  • the figure depicts one possible arrangement of filters 602 for use with the systems described herein.
  • filters 602 are arranged to expose photodiodes to red (labeled “R”), green (labeled “G”), blue (labeled “B”), and near-infrared (labeled “I”) wavelengths.
  • other wavelengths may be passed by the fourth filter (“I”), including infrared wavelengths or other wavelengths of interest.
  • additional filters may be disposed upon the CCD, with suitable adjustments to the arrangement of filters, so that two or more emission wavelengths may be captured within the same image.
  • a system with an integral filter for five or more wavelengths is contemplated by the systems described above.
  • a method according to the above system may include capturing an image that passes through a filter array that selectively passes wavelengths of either a visible light image or a diagnostic image to a pixel location in a charge coupled device.
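For the integral filter array of FIGS. 5 and 6, each captured raw frame is a mosaic in which every pixel location saw only the wavelengths passed by its filter. The sketch below is an assumption-laden illustration: it presumes a repeating 2x2 R/G/B/I pattern (the text does not fix the exact layout) and separates the raw frame into quarter-resolution red, green, blue, and near-infrared planes; a practical imager would then interpolate (demosaic) each plane back to full resolution:

```python
import numpy as np

def split_rgbi_mosaic(raw):
    """Separate a raw frame from an RGBI filter mosaic into per-wavelength planes.

    Assumes a repeating 2x2 pattern:   R G
                                       B I
    raw: 2-D numpy array of photodiode intensities with even height and width.
    Returns quarter-resolution red, green, blue, and near-infrared planes.
    """
    red   = raw[0::2, 0::2]
    green = raw[0::2, 1::2]
    blue  = raw[1::2, 0::2]
    nir   = raw[1::2, 1::2]
    return red, green, blue, nir
```

For display, the three visible planes can be stacked into an RGB image while the near-infrared plane is pseudo-colored and superimposed, as sketched earlier.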
  • FIG. 7 is a side view of an image capture device on a single, integrated semiconductor device.
  • the device 700 may include a number of nested p-type and n-type wells, with a p-n diode junction formed at each well boundary that is sensitive to incident photons of a particular wavelength range. By measuring current across these p-n junctions while the device 700 is exposed to light, photon intensity over a number of contiguous wavelengths may be detected at the same location at the same time.
  • the device 700 includes an n-type substrate 702 , a p-well 704 within the n-type substrate, an n-well 706 within the p-well 704 , a p-well 708 within the n-well 706 , and an n-drain 710 within the p-well 708 .
  • a first detector 712 measures photocurrent across a first p-n junction 714 and is generally sensitive to blue wavelengths.
  • a second detector 716 measures photocurrent across a second p-n junction 718 , and is generally sensitive to green wavelengths.
  • a third detector 720 measures photocurrent across a third p-n junction 722 , and is generally sensitive to red wavelengths.
  • a fourth detector 724 measures photocurrent across a fourth p-n junction 726 , and is generally sensitive to an emission wavelength from an imaging medium within an object.
  • each active pixel region senses photocharge by integrating the photocurrent on the capacitance of a photodiode and the associated circuit node, and then buffering the resulting voltage through a readout amplifier.
  • a shallow, n-type, lightly-doped drain above the first p-type well may be employed to maximize blue response in the first photodiode.
  • the above quadruple-well system may be advantageously adapted to imaging systems where an emission wavelength is adjacent to, or nearly adjacent to, the visible light spectrum.
  • the near-infrared spectrum for example, is adjacent to the red wavelength spectrum, and may be measured with the device described above.
  • a method according to the above system may include capturing photon intensity from an illuminated object at a pixel location at a number of different wavelengths by measuring photocurrent at a plurality of successive diode junctions formed at the boundary of successively nested p-type and n-type semiconductor wells.
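Because each junction in the quadruple-well device responds over a range of wavelengths rather than a single band, the four photocurrents measured at a pixel would typically be combined through a calibration step to estimate per-band intensities. The following sketch illustrates one such step with a hypothetical responsivity matrix and a linear solve per pixel; the numbers are placeholders rather than measured device data, and the approach is an illustrative assumption, not a procedure taken from the document:

```python
import numpy as np

# Hypothetical responsivity matrix: row i gives the relative response of junction i
# (from shallowest to deepest) to blue, green, red, and near-infrared light.
# Real values would come from calibration of the fabricated device.
RESPONSIVITY = np.array([
    [0.80, 0.15, 0.04, 0.01],   # junction 1: mostly blue
    [0.10, 0.75, 0.12, 0.03],   # junction 2: mostly green
    [0.05, 0.15, 0.70, 0.10],   # junction 3: mostly red
    [0.01, 0.04, 0.15, 0.80],   # junction 4: mostly near-infrared emission
])

def unmix_pixel(photocurrents):
    """Estimate blue, green, red, and near-infrared intensities at one pixel
    from the four measured junction photocurrents."""
    return np.linalg.solve(RESPONSIVITY, np.asarray(photocurrents, dtype=float))
```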
  • the near-infrared spectrum lies in a range that is particularly useful for certain medical imaging applications, due to the low absorption and autofluorescence of living-tissue components in this range.
  • in this range, the absorbances of hemoglobin, lipids, and water reach a cumulative minimum.
  • This so-called “near-infrared window” provides a useful spectrum for excitation and emission wavelengths in living-tissue imaging applications, and a number of fluorescent dyes using these wavelengths have been developed for medical imaging applications.
  • the quadruple-well device described above not only employs a convenient range of wavelengths adjacent to visible light, it accommodates a number of dyes that are known to be safe and effective for tissue imaging, such as the IR-786 or the carboxylic acid form of IRDye-78, available from LI-COR, Inc.
  • the single-CCD filter wheel may introduce significant time delays between different wavelength images, and may not perform well in high-speed imaging applications.
  • the multi-chip approach requires more CCD elements and additional processing in order to maintain registration and calibration between separately obtained images, all of which may significantly increase costs.
  • different applications of the systems described herein may have different preferred embodiments.
  • the system may be deployed as an aid to cardiac surgery, where it may be used intraoperatively for direct visualization of cardiac blood flow, for direct visualization of myocardium at risk for infarction, and for image-guided placement of gene therapy and other medicinals to areas of interest.
  • the system may be deployed as an aid to oncological surgery, where it may be used for direct visualization of tumor cells in a surgical field or for image-guided placement of gene therapy and other medicinals to an area of interest.
  • the system may be deployed as an aid to general surgery for direct visualization of any function amenable to imaging with fluorescent dyes, including blood flow and tissue viability.
  • the system may be used for sensitive detection of malignant cells or other skin conditions, and for non-surgical diagnosis of dermatological diseases using near-infrared ligands and/or antibodies.
  • the CCD systems described herein may be used as imaging hardware in conjunction with open-surgical applications, and may also be integrated into a laparoscope, an endoscope, or any other medical device that employs an imaging system.
  • the systems may have further application in other non-medical imaging systems that combine visible light and non-visible light imaging.
  • the system described herein includes a wavelength-selective solid state device, such as a CCD or other semiconductor device, a semiconductor chip that includes the solid state device, a camera employing the solid state device, an imaging system employing the solid state device, and methods of imaging that employ the solid state device.
  • a camera using the imaging devices described above may include a lens, user inputs or a wired or wireless remote control input, the imaging device, processing to filter, store and otherwise manage captured images including functions such as superposition of diagnostic and visible light images and pseudo-coloring of the diagnostic image, and one or more outputs for providing the images to a remote device or system.
  • certain imaging technologies are more suitable for capturing certain wavelengths, including technologies such as gas photodiode or microplasma photodetectors, and certain substances or combinations of substances, such as indium, gallium, or germanium, may provide enhanced responsiveness over certain wavelength ranges.
  • Some of these are consistent with CMOS manufacturing and may be realized directly on a wafer with visible-light-imaging circuitry and other processing circuitry, or manufactured with micro-electro-mechanical systems technology and packaged within the same chip as related circuitry, or the solid state system may be provided as a chipset that is assembled and provided on a suitable circuit board. All such technologies as may be useful for visible light imaging and/or diagnostic imaging over the wavelengths described above may be used for the solid state devices described above, and are intended to fall within the scope of the invention.
  • cameras using the devices described herein may receive an image through a single lens, and provide both visible light and diagnostic images on a single output.
  • visible light and diagnostic images may be obtained from a single, solid-state device, reducing the requirement for moving parts, additional lenses and expensive optics, and post-processing associated with combining images from different sources.
  • a machine vision system may employ a fluorescent dye that selectively adheres to surfaces of a certain texture, or aggregates in undesirable surface defects.
  • a diagnostic image of the dye may assist in identifying and/or repairing these locations.

Abstract

An imaging device captures both a visible light image and a diagnostic image, the diagnostic image corresponding to emissions from an imaging medium within the object. The visible light image (which may be color or grayscale) and the diagnostic image may be superimposed to display regions of diagnostic significance within a visible light image. A number of imaging media may be used according to an intended application for the imaging device, and an imaging medium may have wavelengths above, below, or within the visible light spectrum. The devices described herein may be advantageously packaged within a single integrated device or other solid state device, and/or employed in an integrated, single-camera medical imaging system, as well as many non-medical imaging systems that would benefit from simultaneous capture of visible-light wavelength images along with images at other wavelengths.

Description

    RELATED APPLICATIONS
  • This application is a continuation of U.S. application Ser. No. 10/517,280 filed Jun. 24, 2005, which is a national stage filing under 35 U.S.C. 371 of International Appl. PCT/US03/16285 filed May 22, 2003, which claims priority to U.S. Appl. 60/382,524 filed May 22, 2002, each of which are incorporated by reference herein. International Appl. PCT/US03/16285 was published under PCT Article 21(2) in English.
  • GOVERNMENT INTERESTS
  • The United States Government has certain rights in this invention pursuant to Department of Energy Grant #DE-FG02-01ER63188.
  • BACKGROUND OF THE INVENTION
  • A number of medical imaging techniques have emerged for capturing still or moving pictures using dyes that can be safely introduced into living tissue. For example, fluorescent dyes may be adapted for sequestration or preferential uptake at a location of medical interest, such as a lesion. The location may then be exposed to a light source that stimulates fluorescence of the dye to permit visualization that enhances a feature of the location. Other emerging techniques employ phosphorescent, chemoluminescent, or scintillant substances to generate photons at one or more wavelengths suitable for imaging. These techniques have proven useful for medical imaging and surveillance, with applications including lesion imaging, calcium deposit imaging, and blood flow imaging.
  • Such imaging techniques have been enhanced with simultaneous capture and rendering of visible light images. This may, for example, provide a navigational tool at a surgical site, with the diagnostic image and the visible light image superimposed for improved visualization. Charge-coupled devices (“CCDs”) provide one well-known system for converting incident photons, or light, into a measurable electronic charge. As a significant disadvantage, current CCD systems that combine visible light and emission wavelength imaging typically employ commercially available components, and require at least two separate cameras: a first camera to capture the visible light image and a second camera for capturing the diagnostic emission wavelength which is commonly, though by no means exclusively, in the near-infrared range. A two-camera system imposes the cost of an additional camera, as well as optics for splitting the visual light wavelengths from the emission wavelength and directing each to a separate transducer. There is also additional software complexity and processing overhead in order to synchronize and superimpose image data streams from the two cameras.
  • There remains a need for an integrated device that captures images from visible light wavelengths and diagnostic emission wavelengths.
  • SUMMARY OF THE INVENTION
  • An imaging device captures both a visible light image and a diagnostic image, the diagnostic image corresponding to emissions from an imaging medium within the object. The visible light image (which may be color or grayscale) and the diagnostic image may be superimposed to display regions of diagnostic significance within a visible light image. A number of imaging media may be used according to an intended application for the imaging device, and an imaging medium may have wavelengths above, below, or within the visible light spectrum. The devices described herein may be advantageously packaged within a single integrated device or other solid state device, and/or employed in an integrated, single-camera medical imaging system, as well as many non-medical imaging systems that would benefit from simultaneous capture of visible-light wavelength images along with images at other wavelengths.
  • In one aspect, the system includes a device that captures photon intensity from an illuminated object, the device being exposed to an image through a filter wheel including one or more filters that selectively pass wavelengths of light to form a visible light image of the object and a filter that selectively passes wavelengths of light to form a diagnostic image of the object, the diagnostic image corresponding to emissions from an imaging medium within the object.
  • In another aspect, the system includes a plurality of devices that capture photon intensity from an illuminated object, the devices being exposed to an image through a beam splitter and filters that selectively pass incident photons along a number of paths according to wavelength, each one of the plurality of devices that capture photon intensity being selectively exposed to an image including wavelengths passed along one of the number of paths, at least one of the paths selectively passing wavelengths to form a diagnostic image of the object, the diagnostic image corresponding to emissions from an imaging medium within the object, and at least one of the paths selectively passing wavelengths to form a visible light image of the object.
  • In another aspect, the system includes a device that captures photon intensity from an illuminated object at a plurality of pixel locations, each one of the plurality of pixel locations covered by a filter, at least one of the filters selectively passing wavelengths to form a visible light image of the object at a corresponding pixel location and at least one of the filters selectively passing wavelengths of light to form a diagnostic image of the object at a corresponding pixel location, the diagnostic image corresponding to emissions from an imaging medium within the object.
  • In another aspect, the system includes a device that captures photon intensity from an illuminated object at a plurality of pixel locations, each one of the plurality of pixel locations including a plurality of successive diode junctions formed at the boundary of nested p-type and n-type semiconductor wells, each diode junction selectively detecting incident light over a range of wavelengths, at least one of the diode junctions detecting wavelengths of a visible light image of the object at that pixel location and at least one of the diode junctions detecting wavelengths of a diagnostic image of the object at that pixel location, the diagnostic image corresponding to emissions from an imaging medium within the object.
  • The device that captures photon intensity may be a charge-coupled device. The device may consist of an integrated circuit. The imaging medium may be a fluorescent dye, a phosphorescent substance, a chemoluminescent substance, and/or a scintillant substance. The imaging medium may be a substance introduced into the object. The imaging medium may be a substance inherently present within the object. The object may be an object within a surgical field. The visible light image may be monochromatic. The visible light image may include red, blue, and green wavelengths of light. The visible light image may include cyan, magenta, and yellow wavelengths of light. The diagnostic image may include a near-infrared wavelength. The diagnostic image may include an infrared wavelength. The diagnostic image may include a plurality of diagnostic images, each at a different range of wavelengths. The diagnostic image may be formed from one or more diagnostic wavelengths in the visible light range, the object being illuminated with a light source that is depleted in the diagnostic wavelength range.
  • The visible light image and diagnostic image may be processed and displayed in a medical imaging system. The medical imaging system may include a display for rendering a composite image including a superposition of the visible light image and the diagnostic image. The medical imaging system may include one or more inputs for controlling at least one of a field of view of the object, a focus of the object, or a zoom of the object. The medical imaging system may include a surgical tool. The visible light image and diagnostic image may be processed and displayed in at least one of a machine vision system, an astronomy system, a military system, a geology system, or an industrial system.
  • The system may be packaged in a camera. The camera may include a visible light image output, a diagnostic image output, and a combined image output, the combined image output providing a superposition of the visible light image and the diagnostic image. The system may capture moving video, or the system may capture still images.
  • In another aspect, the system may include a solid state device that captures a visible light image of an object under illumination in digital form and a diagnostic image of the object in digital form, the diagnostic image corresponding to an intensity of emission from an imaging medium within the object.
  • In another aspect, the system may include a single camera that captures a visible light image of an object under illumination and a diagnostic image of the object, the diagnostic image corresponding to an intensity of emission from the object, the camera configured to provide a digital version of the visible light image and a digital version of the diagnostic image to an external display system.
  • In another aspect, a method may include the steps of illuminating an object to provide an image; capturing an image of the object that includes a visible light image and a diagnostic image, the diagnostic image corresponding to emissions from an imaging medium within the object; and storing the image.
  • Capturing an image may include passing the image through a filter wheel that exposes an image capture device to the image through a plurality of filters, at least one of the plurality of filters selectively passing wavelengths of light to form a visible light image of the object and at least one of the plurality of filters selectively passing wavelengths of light to form a diagnostic image of the object. Capturing an image may include passing the image through a beam splitter and filters that selectively pass incident photons along a number of paths according to wavelength and exposing each one of a plurality of devices that capture photon intensity to an image including wavelengths passed along one of the number of paths, at least one of the paths selectively passing wavelengths to form a diagnostic image of the object, and at least one of the paths selectively passing wavelengths to form a visible light image of the object.
  • Capturing an image may include capturing the image at a plurality of pixel locations, each one of the plurality of pixel locations covered by a filter, at least one of the filters selectively passing wavelengths to form a visible light image of the object at a corresponding pixel location and at least one of the filters selectively passing wavelengths of light to form a diagnostic image of the object at a corresponding pixel location. Capturing the image may include capturing the image at a plurality of pixel locations, each one of the plurality of pixel locations including a plurality of successive diode junctions formed at the boundary of nested p-type and n-type semiconductor wells, each diode junction selectively detecting incident light over a range of wavelengths, at least one of the diode junctions detecting wavelengths of a visible light image of the object and at least one of the diode junctions detecting wavelengths of a diagnostic image of the object at that pixel location.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention will be appreciated more fully from the following further description thereof, with reference to the accompanying drawings, wherein:
  • FIG. 1 is a block diagram of a prior art imaging system;
  • FIG. 2 is a block diagram of an imaging system with an integrated image capture device;
  • FIG. 3 depicts an embodiment of an image capture device;
  • FIG. 4 depicts an embodiment of an image capture device;
  • FIG. 5 is a side view of an image capture device on a single, integrated semiconductor device;
  • FIG. 6 is a top view of an image capture device on a single, integrated semiconductor device; and
  • FIG. 7 is a side view of an image capture device on a single, integrated semiconductor device.
  • DETAILED DESCRIPTION OF CERTAIN EMBODIMENTS OF THE INVENTION
  • To provide an overall understanding of the invention, certain illustrative embodiments will now be described, including an image capture device for simultaneously capturing visible-light and near-infrared images. It will be understood that the methods and systems described herein can be suitably adapted to a range of medical imaging applications where visible light tissue images may be usefully combined with diagnostic image information obtained from other specified wavelengths. For example, the systems may be applicable to a wide range of diagnostic or surgical applications where a target pathology, tissue type, or cell may be labeled with a fluorescent dye or other fluorescent substance. More generally, the systems described herein may be adapted to any imaging application where a visible light image may be usefully enhanced with an image of one or more features that are functionally marked to emit photons outside the visible light range by a dye or other material that emits photons at a known wavelength, either inherently or in response to another known wavelength. The systems may also be employed, for example, in a machine vision system, or in a variety of other military, industrial, geological, astronomical or other imaging systems. These and other applications of the systems described herein are intended to fall within the scope of the invention.
  • FIG. 1 is a block diagram of a prior art imaging system. The imaging system 100 may include a light source 102 directed at a surgical field 104, a near-infrared camera 106 receiving near-infrared light from the surgical field 104 through a dichroic mirror 108, and a visible light camera 110 receiving visible light reflected from the dichroic mirror 108. A processing and display system 112 receives data from the cameras 106, 110. A dye source (not shown) containing a dye may also be included for introduction into an object in the surgical field 104, such as through injection into the bloodstream of a patient.
  • The light source 102 may include a visible light source and an excitation light source that illuminate the surgical field 104. Preferably, the light source 102 is depleted in the wavelength region where the dye or other emitting substance emits light, so that the light source 102 does not interfere with a diagnostic image obtained from the dye.
  • The near-infrared camera 106 captures a diagnostic image from an illuminated object within the surgical field at a wavelength or range of wavelengths that pass through the dichroic mirror 108, while the visible light camera 110 captures a visible light image from the illuminated object within the surgical field. The visible light image may be captured and rendered in a number of manners, such as “RGB” (red-green-blue), “CMY” (cyan-magenta-yellow), or monochromatic grayscale. The diagnostic image corresponds to emissions from the dye or other imaging medium introduced to the object, such that the resulting image is based, for example, upon the distribution of the dye within the object in the surgical field.
  • It will be appreciated that visible light typically includes light wavelengths from 400 nm to 700 nm, while near-infrared light may include one or more wavelengths of about 810 nm, or more generally, wavelengths from about 700 nm to about 1000 nm. In other variations, the emission wavelength may be, for example, other near-infrared wavelengths, an infrared wavelength, or a far red wavelength. More generally, the emission wavelength may be any wavelength of emission that can be generated by an imaging medium introduced into an imaging subject and usefully captured for imaging with the devices described herein.
  • The imaging medium may include, for example, fluorescent dyes that emit in response to a stimulus wavelength, or a substance that emits photons at a known wavelength without stimulus, such as phosphorescent, chemoluminescent, or scintillant substances. As one useful example, fluorescent substances, such as semiconductor nanocrystals (a.k.a. “quantum dots”) may be used to emit photons at one or more specific wavelengths, such as infrared wavelengths (e.g., 1320 nm). The imaging medium may include substances inherently present in the object being imaged, such as fluorescent or phosphorescent endogenous biological substances. In certain embodiments, the imaging medium may emit light in a range within the visible light spectrum. This may be usefully employed as a diagnostic imaging source, provided the light source 102 is adequately depleted, such as through filtering, in a corresponding wavelength range so that the light source 102 does not produce reflected light within the diagnostic image wavelength range. The term “imaging medium” as used herein, refers to any of the imaging media described above, or any other medium capable of emitting photons useful in locating regions of functional or diagnostic significance. A “diagnostic image” as that term is used herein, refers to any image formed by detecting emissions from the imaging media described above.
  • Once captured, each diagnostic image may be shifted to a suitable visible light wavelength for purposes of display. In one embodiment, this pseudo-coloring employs a color specifically selected to provide a substantial color contrast with the object of the visible light image. A color may be selected in advance, such as bright lime green for a diagnostic image over living tissue, or the color may be determined automatically by an algorithm designed to determine average background color and choose a suitable contrasting color. The visible light and diagnostic images may be combined by the image processing and display system 112 and presented on a display where they may be used, for example, by a surgeon conducting a surgical procedure. The processing and display system 112 may include any suitable hardware and software to combine and render the diagnostic and visible light images in any manner useful for a user of the system 100, such as a composite image formed from a superposition of the diagnostic and visible light images, or side-by-side rendering of the diagnostic and visible light images.
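  • By way of illustration only, the following sketch shows one way the pseudo-coloring and superposition described above might be carried out in software. The function name, the fixed lime-green display color, and the intensity-weighted alpha blend are assumptions made for the example, not features required by the system.

```python
import numpy as np

def pseudocolor_and_overlay(visible_rgb, diagnostic, color=(50, 205, 50)):
    """Map a monochrome diagnostic image onto a contrasting display color and
    blend it over the visible light image in proportion to emission intensity.

    visible_rgb : (H, W, 3) uint8 visible light image
    diagnostic  : (H, W) array of emission intensities (any range)
    color       : display color for the diagnostic signal (default lime green)
    """
    # Normalize the diagnostic intensities to the range 0..1.
    d = diagnostic.astype(np.float32)
    lo, hi = float(d.min()), float(d.max())
    d = (d - lo) / max(hi - lo, 1e-6)

    # Pseudo-color: scale the chosen display color by the emission intensity.
    pseudo = d[..., None] * np.array(color, dtype=np.float32)

    # Superimpose: where the diagnostic signal is strong the pseudo-color
    # dominates; elsewhere the visible light image shows through unchanged.
    alpha = d[..., None]
    composite = (1.0 - alpha) * visible_rgb.astype(np.float32) + alpha * pseudo
    return composite.clip(0, 255).astype(np.uint8)
```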
  • The processing and display system 112 may be part of a medical imaging system that also includes, for example, inputs to control visual navigation of the surgical field, such as field of view (e.g., X and Y panning), zoom, and focus, or inputs for controlling a surgical tool associated with the system 100. Similar controls may be provided for the non-medical applications noted above, with certain adaptations as appropriate, such as azimuth and elevation in place of X/Y panning for astronomical applications.
  • It will be appreciated that certain of the terms and concepts introduced above are applicable to some or all of the embodiments described below, such as the terms diagnostic image and imaging medium, as well as the nature of the processing and display system, except as specifically noted below. It will also be appreciated that adaptations may be made. For example, where a single, integrated camera is provided for capturing both a visible light image and a diagnostic image, some of the image processing for pseudo-coloring the diagnostic image and superimposing the diagnostic image onto the visible light image may be provided by the camera, with an output of the processed image provided in any suitable format to a computer, display, or medical imaging system.
  • FIG. 2 is a block diagram of an imaging system with an integrated image capture device. The imaging system 200 may include a light source 202 directed at a surgical field 204, an image capture device 206 receiving light from the surgical field 204, and a processing and display system 208. A dye source (not shown) containing an imaging medium such as a dye may also be included for introduction into an object in the surgical field 204, such as through injection into the bloodstream of a patient. The imaging system 200 may be in many respects like the imaging system 100 described above with reference to FIG. 1. It will readily be appreciated that the imaging system 200 differs in at least one respect: the use of a single image capture system 206.
  • The system 200 advantageously incorporates visible light and diagnostic wavelength imaging into a single device, the image capture system 206. This removes the need for additional external hardware, such as the dichroic mirror 108 of FIG. 1, or additional hardware and/or software in the processing and display system 208 to perform additional processing for images from two separate image capture devices, which processing may range from image registration to matching of frame rates, image sizes, and other features of images from disparate cameras. The image capture system 206 may be packaged as a single solid state device suitable for integration into a larger system, or as a camera with inputs for remote operation and/or outputs including a visible light output, a diagnostic image output, and a combined output that superimposes the diagnostic and visible light images.
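  • As a rough, hypothetical sketch of the three-output packaging just described, a single capture from such a camera may be modeled as a simple structure; the type and field names below are illustrative and are not drawn from the embodiments.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class CameraOutputs:
    """The three outputs exposed by the integrated camera for one capture."""
    visible: np.ndarray      # (H, W, 3) visible light image
    diagnostic: np.ndarray   # (H, W) emission-wavelength (diagnostic) image
    combined: np.ndarray     # (H, W, 3) pseudo-colored superposition for display
```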
  • In certain embodiments, the image capture system 206 may provide for capture of two or more wavelengths of diagnostic significance through adaptations of the systems described below. Thus two or more diagnostic images may be displayed, and/or superimposed on a visible light image in order to simultaneously visualize two or more features or characteristics of interest. The image capture system 206 may provide still images, or may provide moving images, such as in a real-time display of a surgical field.
  • A number of technologies may be suitably adapted to the image capture system 206. Charge-Coupled Devices (“CCDs”), for example, are known for use in capturing digital images. These devices may be fabricated on silicon substrates and packaged as chips, employing various CCD technologies. For example, full-frame-transfer (“FF”) and frame-transfer (“FT”) devices employ MOS photocapacitors as detectors, while interline transfer (“IL”) devices use photodiodes and photocapacitors for each detector. These architectures are among the more commonly employed architectures in current CCD cameras. Each CCD technology has its own advantages and disadvantages, resulting in trade-offs between, for example, cost, design complexity, and performance. These technologies are generally adaptable to the systems described herein, and carry with them the corresponding design trade-offs, as will be appreciated by those of skill in the art.
  • Other image-sensing architectures using charge-coupled devices are known, and may be usefully employed with the systems described herein, including frame-interline transfer devices, accordion devices, charge injection devices, and MOS X, Y addressable devices. All such devices are intended to fall within the meaning of “charge-coupled device” as that term is used herein. While all of these devices are useful for converting incident photons into measurable electronic charges, they are inherently monochromatic in nature. As such, color-imaging applications have been devised for these CCDs that selectively image different wavelengths. These techniques for wavelength selection may be adapted to the present system as described in greater detail below.
  • FIG. 3 depicts an embodiment of an image capture device. The device 300 applies an adaptation of mechanical color wheels used for some conventional red-green-blue (“RGB”) imaging systems. An image of an illuminated object may be focused through a lens and captured in four successive exposures, each synchronized with a filter wheel 302 having desired optical characteristics. In the depicted embodiment, this includes a red filter 304 that selectively passes red light, a green filter 306 that selectively passes green light, a blue filter 308 that selectively passes blue light, and a near-infrared filter 310 that selectively passes near-infrared light. The CCD 312 is exposed to the image through the red, green, and blue filters 304, 306, 308 collectively to capture a visible light image, and exposed to the image through the near-infrared filter 310, which selectively passes near-infrared emissions, to capture a diagnostic image of interest, such as emission from a fluorescent dye.
  • Each exposure of the CCD 312 is sequentially read into data storage (not shown) where it can be reconstructed into a complete image. It will be appreciated that a number of other wavelengths may be selectively passed by the fourth filter to obtain a diagnostic image of the object, including infrared wavelengths or other wavelengths of interest, as generally described above. It will further be appreciated that additional filters may be added to the color wheel so that two or more emission wavelengths may be captured within the same image. Thus a color wheel with five or more filters is contemplated by the systems described herein.
  • In another aspect, a method according to the above system may include capturing an image that passes through the filter wheel 302 on the CCD 312 or other image capture device to obtain a visible light image and a diagnostic image.
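  • A minimal sketch of this filter-wheel acquisition, assuming hypothetical rotate_to() and expose() calls for the wheel and the CCD, is shown below; the four positions mirror the filters of FIG. 3.

```python
import numpy as np

FILTER_POSITIONS = ["red", "green", "blue", "near_infrared"]

def capture_filter_wheel_frame(wheel, ccd):
    """Capture one complete frame as four successive exposures, each taken
    with a different filter of the wheel positioned in front of the CCD."""
    exposures = {}
    for position in FILTER_POSITIONS:
        wheel.rotate_to(position)           # hypothetical wheel control
        exposures[position] = ccd.expose()  # hypothetical monochrome readout

    # The first three exposures collectively form the visible light image;
    # the fourth is the diagnostic (emission wavelength) image.
    visible = np.dstack([exposures["red"], exposures["green"], exposures["blue"]])
    diagnostic = exposures["near_infrared"]
    return visible, diagnostic
```

Because the four exposures are sequential, moving objects can shift between them, which is one of the trade-offs noted later for high-speed imaging.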
  • FIG. 4 depicts an embodiment of an image capture device. As shown in the figure, a multi-chip device 400 may employ optics to split an image into separate image planes. A focused image is provided to the device 400, such as by passing an image of an illuminated object through a lens. A plurality of CCDs 402 or other image capture devices for measuring photon intensity are exposed to the image through a beam splitter 404, with a CCD 402 placed in each image plane exiting the beam splitter 404. A filter may also be provided for each CCD 402 to selectively expose the CCD 402 to a range of wavelengths, so that the image is selectively passed along a number of paths according to wavelengths. It will be appreciated that other similar approaches may be used to apply differing wavelengths to a collection of CCDs 402 in a multi-chip CCD system, such as a prism or a wavelength separating optical device or devices. In such systems, the CCDs 402 may be operated synchronously to capture different incident wavelengths at or near the same point in time.
  • In FIG. 4, the beam splitter 404 directs the image to four different CCDs 402: a first CCD with a filter that passes red light, a second CCD with a filter that passes green light, a third CCD with a filter that passes blue light, and a fourth CCD with a filter that passes near-infrared light. The first three CCDs produce a visible light image, while the fourth CCD produces a diagnostic image according to an imaging medium introduced into the object. However, it will be appreciated that other wavelengths may be passed by the fourth filter, including infrared wavelengths or other wavelengths of interest. It will further be appreciated that additional light paths may be provided by the beam splitter, with additional filtered CCDs for each path, so that two or more emission wavelengths may be captured within the same image. Thus a system with five or more CCDs is contemplated by the systems described herein.
  • In another aspect, a method according to the above system may include capturing an image that passes through the beam splitter 404 and filters that selectively pass incident photons along a number of paths according to wavelength, with each CCD (or other image capture device) capturing either a visible light image or a diagnostic image of the illuminated object.
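  • The synchronous multi-CCD capture of FIG. 4 might be organized along the following lines; the expose() method and the per-path names are assumptions made for this sketch.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def capture_multi_ccd_frame(ccds):
    """Trigger the four filtered CCDs of FIG. 4 at (nearly) the same instant
    and assemble their outputs into a visible light image and a diagnostic image.

    ccds : dict mapping "red", "green", "blue", and "near_infrared" to objects
           with a hypothetical expose() method returning a 2-D intensity array.
    """
    # Expose all CCDs concurrently so the wavelength images are captured at
    # or near the same point in time.
    with ThreadPoolExecutor(max_workers=len(ccds)) as pool:
        futures = {name: pool.submit(ccd.expose) for name, ccd in ccds.items()}
        images = {name: f.result() for name, f in futures.items()}

    visible = np.dstack([images["red"], images["green"], images["blue"]])
    diagnostic = images["near_infrared"]
    return visible, diagnostic
```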
  • FIG. 5 is a side view of an image capture device on a single, integrated semiconductor device. The figure shows a CCD array 500 with wavelength selection using an integral filter array. The CCD array 500 may include lenses 502, a filter array 504, gates 506, photodiodes 508, a substrate 510, vertical charge-coupled devices (“VCCDs”) 512, and an insulation layer 514.
  • In the CCD array 500, the photodiodes 508 serve to detect the intensity of incident photons at pixel locations within a focused image, while the filter array 504, with appropriate optical characteristics, is arranged over the photodiodes 508 such that different photodiodes are exposed to different wavelengths of incident light. The filter array 504 may include, for example, red filters that selectively pass red wavelengths (labeled “R”), green filters that selectively pass green wavelengths (labeled “G”), blue filters that selectively pass blue wavelengths (labeled “B”), and near-infrared filters that selectively pass near-infrared wavelengths (labeled “I”). The VCCDs 512 may be formed vertically between the photodiodes 508 for transferring signals produced by photoelectric conversion in the photodiodes 508. The insulation layer 514 may be formed over the entire surface of the semiconductor substrate 510 (including the photodiodes 508 and the VCCDs 512), and the plurality of gates 506 may be formed on the insulation layer 514 above each VCCD 512 for controlling the transfer of the photodiode signals. A metal shielding layer for shielding light may be deposited over the gates 506, except for the light-receiving regions of the photodiodes 508. A flat insulation film may then be deposited over the entire surface of the semiconductor substrate, including the metal shielding layer.
  • The filter array 504, which passes either red (“R”), green (“G”), or blue (“B”) wavelengths (collectively for forming a visible light image) or near-infrared (“I”) wavelengths (for a diagnostic image), may then be formed over each photodiode 508 corresponding to a pixel to be imaged from an illuminated object. A top coating layer may be deposited on the filter array 504. Finally, a lens 502 may be formed on the top coating layer for concentrating photons on each photodiode.
  • The filter array 504 separates the spectrum of incident light to selectively pass only the light of a predetermined wavelength, or range of wavelengths, to reach each of the photodiodes. The metal shielding layer restricts incident light to the photodiodes 508. The incident light is converted into an electric signal in the photodiodes 508, and transferred out to a processor under control of the gates 506.
  • The filter array 504 may be dyed or otherwise masked or processed so that each photodiode 508 is exposed to a specific wavelength or range of wavelengths. It will be appreciated that the device of FIG. 5 is an example only, and that a number of different CCD topologies may be used with an integral filter array 504, and may be suitably adapted to the systems described herein. It should also be appreciated that other photoactive substances may be included in place of, or in addition to, the filters in the filter array 504, in order to enhance response at certain wavelengths, or to effect a shift in wavelength to a range more suitable for measurement by the photodiodes. All such variations are intended to fall within the scope of this description.
  • FIG. 6 is a top view of an image capture device on a single, integrated semiconductor device. The figure depicts one possible arrangement of filters 602 for use with the systems described herein. In this integral filter array 602, four filters are arranged to expose photodiodes to red (labeled “R”), green (labeled “G”), blue (labeled “B”), and near-infrared (labeled “I”) wavelengths. Each two-by-two group of photodiodes may form a pixel, with four wavelength measurements being detected for that pixel at different photodiodes.
  • It will be appreciated that a number of other wavelengths may be passed by the fourth filter (“I”), including infrared wavelengths or other wavelengths of interest. It will further be appreciated that additional filters may be disposed upon the CCD, with suitable adjustments to the arrangement of filters, so that two or more emission wavelengths may be captured within the same image. Thus a system with an integral filter for five or more wavelengths is contemplated by the systems described above.
  • In another aspect, a method according to the above system may include capturing an image that passes through a filter array that selectively passes wavelengths of either a visible light image or a diagnostic image to a pixel location in a charge coupled device.
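  • For illustration, the sketch below separates a raw readout having the two-by-two filter arrangement of FIG. 6 into visible light and diagnostic planes at quarter resolution; the assignment of each wavelength to a particular corner of the two-by-two cell is an assumption made for the example.

```python
import numpy as np

def split_rgbi_mosaic(raw):
    """Split a raw sensor readout with a repeating 2x2 R/G/B/I filter pattern
    into per-wavelength planes (assumed corner assignment: R, G / B, I)."""
    r  = raw[0::2, 0::2]   # red photodiodes
    g  = raw[0::2, 1::2]   # green photodiodes
    b  = raw[1::2, 0::2]   # blue photodiodes
    ir = raw[1::2, 1::2]   # near-infrared (diagnostic) photodiodes

    visible = np.dstack([r, g, b])
    diagnostic = ir
    return visible, diagnostic
```

Interpolating each plane back to full resolution, as is done for conventional color filter arrays, is omitted here for brevity.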
  • FIG. 7 is a side view of an image capture device on a single, integrated semiconductor device. As shown in the figure, the device 700 may include a number of nested p-type and n-type wells, with a p-n diode junction formed at each well boundary that is sensitive to incident photons of a particular wavelength range. By measuring current across these p-n junctions while the device 700 is exposed to light, photon intensity over a number of contiguous wavelengths may be detected at the same location at the same time.
  • More specifically, the device 700 includes an n-type substrate 702, a p-well 704 within the n-type substrate, an n-well 706 within the p-well 704, a p-well 708 within the n-well 706, and an n-drain 710 within the p-well 708. A first detector 712 measures photocurrent across a first p-n junction 714 and is generally sensitive to blue wavelengths. A second detector 716 measures photocurrent across a second p-n junction 718, and is generally sensitive to green wavelengths. A third detector 720 measures photocurrent across a third p-n junction 722, and is generally sensitive to red wavelengths. A fourth detector 724 measures photocurrent across a fourth p-n junction 726, and is generally sensitive to an emission wavelength from an imaging medium within an object.
  • A similar, triple-well structure is described, for example, in U.S. Pat. No. 5,965,875 to Merrill. In general, such a device operates on the principle that photons of longer wavelengths will penetrate more deeply into silicon before absorption. By alternately doping wells for p-type or n-type conductivity, a number of successive photodiodes are created at the p-n junctions of successive layers, each being sensitive to progressively longer wavelengths of photon emissions. Using well-known active pixel technology to sense photocurrents from these diodes (as shown by circuits labeled “iB”, “iG”, “iR”, and “iNI”), each active pixel region senses photocharge by integrating the photocurrent on the capacitance of a photodiode and the associated circuit node, and then buffering the resulting voltage through a readout amplifier. A shallow, n-type, lightly-doped drain above the first p-type well may be employed to maximize blue response in the first photodiode.
  • The above quadruple-well system may be advantageously adapted to imaging systems where an emission wavelength is adjacent to, or nearly adjacent to, the visible light spectrum. The near-infrared spectrum, for example, is adjacent to the red wavelength spectrum, and may be measured with the device described above.
  • In another aspect, a method according to the above system may include capturing photon intensity from an illuminated object at a pixel location at a number of different wavelengths by measuring photocurrent at a plurality of successive diode junctions formed at the boundary of successively nested p-type and n-type semiconductor wells.
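  • Because the spectral responses of stacked junctions overlap, the raw photocurrents at a pixel are commonly converted to per-band intensities by inverting a calibration (crosstalk) matrix. The sketch below illustrates such a linear unmixing step; the matrix values are invented for the example, a real matrix would come from calibrating the device, and the embodiments above do not prescribe this particular correction.

```python
import numpy as np

# Illustrative spectral-crosstalk matrix: row i gives the relative response of
# junction i (shallowest to deepest: blue, green, red, near-infrared) to light
# in each of the four target bands. These numbers are made up for the sketch.
RESPONSE = np.array([
    [0.80, 0.15, 0.04, 0.01],   # shallowest junction: mostly blue
    [0.15, 0.70, 0.12, 0.03],   # second junction: mostly green
    [0.04, 0.12, 0.70, 0.14],   # third junction: mostly red
    [0.01, 0.03, 0.14, 0.82],   # deepest junction: mostly near-infrared
])
UNMIX = np.linalg.inv(RESPONSE)

def unmix_pixel(photocurrents):
    """Convert the measured junction photocurrents (iB, iG, iR, iNI) at one
    pixel into estimated blue, green, red, and near-infrared intensities."""
    return UNMIX @ np.asarray(photocurrents, dtype=np.float32)
```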
  • The near-infrared spectrum lies in a range that is particularly useful for certain medical imaging applications, due to the low absorption and autofluorescence of living-tissue components in this range. Within a range of 700 nm to 900 nm, the absorbances of hemoglobin, lipids, and water reach a cumulative minimum. This so-called “near-infrared window” provides a useful spectrum for excitation and emission wavelengths in living-tissue imaging applications, and a number of fluorescent dyes using these wavelengths have been developed for medical imaging applications. Thus the quadruple-well device described above not only employs a convenient range of wavelengths adjacent to visible light, it also accommodates a number of dyes that are known to be safe and effective for tissue imaging, such as IR-786 or the carboxylic acid form of IRDye-78, available from LI-COR, Inc.
  • It will be appreciated that each of the systems described above presents trade-offs in terms of cost, speed, image quality, and processing complexity. For example, the single-CCD filter wheel may introduce significant time delays between different wavelength images, and may not perform well in high-speed imaging applications. By contrast, the multi-chip approach requires more CCD elements and additional processing in order to maintain registration and calibration between separately obtained images, all of which may significantly increase costs. As such, different applications of the systems described herein may have different preferred embodiments.
  • The systems described above have numerous surgical applications when used in conjunction with fluorescent dyes. For example, the system may be deployed as an aid to cardiac surgery, where it may be used intraoperatively for direct visualization of cardiac blood flow, for direct visualization of myocardium at risk for infarction, and for image-guided placement of gene therapy and other medicinals to areas of interest. The system may be deployed as an aid to oncological surgery, where it may be used for direct visualization of tumor cells in a surgical field or for image-guided placement of gene therapy and other medicinals to an area of interest. The system may be deployed as an aid to general surgery for direct visualization of any function amenable to imaging with fluorescent dyes, including blood flow and tissue viability. In dermatology, the system may be used for sensitive detection of malignant cells or other skin conditions, and for non-surgical diagnosis of dermatological diseases using near-infrared ligands and/or antibodies. More generally, the CCD systems described herein may be used as imaging hardware in conjunction with open-surgical applications, and may also be integrated into a laparoscope, an endoscope, or any other medical device that employs an imaging system. The systems may have further application in other non-medical imaging systems that combine visible light and non-visible light imaging.
  • In various embodiments, the system described herein includes a wavelength-selective solid state device, such as a CCD or other semiconductor device, a semiconductor chip that includes the solid state device, a camera employing the solid state device, an imaging system employing the solid state device, and methods of imaging that employ the solid state device. A camera using the imaging devices described above may include a lens, user inputs or a wired or wireless remote control input, the imaging device, processing to filter, store and otherwise manage captured images including functions such as superposition of diagnostic and visible light images and pseudo-coloring of the diagnostic image, and one or more outputs for providing the images to a remote device or system.
  • It will be appreciated that certain imaging technologies are more suitable for capturing certain wavelengths, including technologies such as gas photodiode or microplasma photodetectors, and certain substances or combinations of substances, such as indium, gallium, or germanium, may provide enhanced responsiveness over certain wavelength ranges. Some of these are consistent with CMOS manufacturing and may be realized directly on a wafer with visible-light-imaging circuitry and other processing circuitry, or manufactured with micro-electro-mechanical systems technology and packaged within the same chip as related circuitry, or the solid state system may be provided as a chipset that is assembled and provided on a suitable circuit board. All such technologies as may be useful for visible light imaging and/or diagnostic imaging over the wavelengths described above may be used for the solid state devices described above, and are intended to fall within the scope of the invention.
  • As a significant advantage, cameras using the devices described herein may receive an image through a single lens, and provide both visible light and diagnostic images on a single output. As another significant advantage, visible light and diagnostic images may be obtained from a single, solid-state device, reducing the requirement for moving parts, additional lenses and expensive optics, and post-processing associated with combining images from different sources.
  • While medical imaging applications have been described, it will be appreciated that the principles of the systems described above may be readily adapted to other applications, such as machine vision or a variety of other military, industrial, geological, astronomical or other imaging systems. For example, a machine vision system may employ a fluorescent dye that selectively adheres to surfaces of a certain texture, or aggregates in undesirable surface defects. A diagnostic image of the dye may assist in identifying and/or repairing these locations.
  • Thus, while the invention has been disclosed in connection with the preferred embodiments shown and described in detail, various modifications and improvements thereon will become readily apparent to those skilled in the art. It should be understood that all matter contained in the above description or shown in the accompanying drawings shall be interpreted as illustrative, and not in a limiting sense, and that the invention should be interpreted in the broadest sense allowable by law.

Claims (20)

1. A system comprising:
a plurality of solid state devices that capture photon intensity from an illuminated object, the solid state devices being exposed to an image of the illuminated object through a beam splitter and filters that selectively pass incident photons along a number of paths according to wavelength, each one of the solid state devices being selectively exposed to a portion of the image including wavelengths passed along one of the number of paths, at least one of the paths selectively passing infrared wavelengths to form a diagnostic image of the illuminated object at one of the solid state devices that monochromatically represents an intensity of infrared wavelengths from the illuminated object corresponding to emissions from an imaging medium within the illuminated object, and at least one other one of the paths selectively passing wavelengths to another one of the solid state devices to form a visible light image of the illuminated object;
an image processing system configured to pseudocolor the diagnostic image to provide a pseudocolored diagnostic image, and configured to superimpose the pseudocolored diagnostic image onto the visible light image to provide a processed image; and
a camera containing a lens, the plurality of solid state devices, and the image processing system, the camera further including one or more inputs for remote operation of the camera and a plurality of outputs for an external display system, the plurality of outputs including a visible light output for the visible light image, a diagnostic image output for the diagnostic image, and a combined output for the processed image.
2. The system of claim 1 wherein the imaging medium is at least one of a fluorescent dye, a phosphorescent substance, a chemoluminescent substance, or a scintillant substance.
3. The system of claim 1 wherein the imaging medium is a substance introduced into the illuminated object.
4. The system of claim 1 wherein the imaging medium is a substance inherently present within the illuminated object.
5. The system of claim 1 wherein the illuminated object is an object within a surgical field.
6. The system of claim 1 wherein the visible light image includes red, blue and green wavelengths of light.
7. The system of claim 1 wherein the visible light image includes cyan, magenta, and yellow wavelengths of light.
8. The system of claim 1 wherein the diagnostic image includes a near-infrared wavelength.
9. The system of claim 1 wherein the diagnostic image includes a plurality of diagnostic images, each at a different range of wavelengths.
10. The system of claim 1 wherein the diagnostic image is formed from one or more diagnostic wavelengths in the visible light range, the illuminated object being illuminated with a light source that is depleted in the diagnostic wavelength range.
11. The system of claim 1 wherein the visible light image and diagnostic image are processed and displayed in a medical imaging system.
12. The system of claim 11, further comprising a display adapted to receive and render the processed image from the camera.
13. The system of claim 11 wherein the one or more inputs for the camera control at least one of a field of view of the illuminated object, a focus of the illuminated object, or a zoom of the illuminated object.
14. The system of claim 1 wherein the visible light image and diagnostic image are processed and displayed in at least one of a machine vision system, an astronomy system, a military system, a geology system, or an industrial system.
15. The system of claim 1 wherein the camera is a video camera that captures moving video.
16. The system of claim 1 wherein the camera captures still images.
17. The system of claim 1 further comprising a visible light source positioned to illuminate the illuminated object.
18. The system of claim 17 wherein the visible light source is depleted in a region corresponding to the diagnostic image.
19. The system of claim 1 further comprising an excitation light source having an emission wavelength selected to excite the imaging medium and positioned to illuminate the illuminated object.
20. The system of claim 1 wherein the plurality of solid state devices include a plurality of charge-coupled devices.
US12/853,757 2002-05-22 2010-08-10 Device for wavelength-selective imaging Abandoned US20100305455A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/853,757 US20100305455A1 (en) 2002-05-22 2010-08-10 Device for wavelength-selective imaging

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US38252402P 2002-05-22 2002-05-22
PCT/US2003/016285 WO2003100925A2 (en) 2002-05-22 2003-05-22 Device for wavelength-selective imaging
US10/517,280 US7794394B2 (en) 2002-05-22 2003-05-22 Device for wavelength-selective imaging
US12/853,757 US20100305455A1 (en) 2002-05-22 2010-08-10 Device for wavelength-selective imaging

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US10/517,280 Continuation US7794394B2 (en) 2002-05-22 2003-05-22 Device for wavelength-selective imaging
PCT/US2003/016285 Continuation WO2003100925A2 (en) 2002-05-22 2003-05-22 Device for wavelength-selective imaging

Publications (1)

Publication Number Publication Date
US20100305455A1 true US20100305455A1 (en) 2010-12-02

Family

ID=29584421

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/517,280 Active - Reinstated 2027-03-12 US7794394B2 (en) 2002-05-22 2003-05-22 Device for wavelength-selective imaging
US12/853,757 Abandoned US20100305455A1 (en) 2002-05-22 2010-08-10 Device for wavelength-selective imaging

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/517,280 Active - Reinstated 2027-03-12 US7794394B2 (en) 2002-05-22 2003-05-22 Device for wavelength-selective imaging

Country Status (3)

Country Link
US (2) US7794394B2 (en)
AU (1) AU2003248559A1 (en)
WO (1) WO2003100925A2 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110261179A1 (en) * 2010-04-22 2011-10-27 General Electric Company Imaging system for fluorescence guided surgery based on fixed magnification lens and digital zoom
US20110261175A1 (en) * 2010-04-22 2011-10-27 General Electric Company Multiple channel imaging system and method for fluorescence guided surgery
US20120026339A1 (en) * 2010-07-28 2012-02-02 National University Corporation Kochi University White balance adjustment method and imaging device
JP2012178542A (en) * 2011-01-31 2012-09-13 Canon Inc Photoelectric conversion device
US20130038782A1 (en) * 2011-08-12 2013-02-14 Ocean Thin Films, Inc. Method to improve filter wheel imaging system data capture rate and add functionality through the use of an improved filter wheel design
US20130044126A1 (en) * 2011-08-16 2013-02-21 Fujifilm Corporation Image display method and apparatus
US8937646B1 (en) * 2011-10-05 2015-01-20 Amazon Technologies, Inc. Stereo imaging using disparate imaging devices
JP2016138789A (en) * 2015-01-27 2016-08-04 地方独立行政法人北海道立総合研究機構 Spectral imaging system
CN107003244A (en) * 2015-01-27 2017-08-01 株式会社日立高新技术 Multicolor fluorescence analysis device

Families Citing this family (68)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8620410B2 (en) * 2002-03-12 2013-12-31 Beth Israel Deaconess Medical Center Multi-channel medical imaging system
AU2003248559A1 (en) * 2002-05-22 2003-12-12 Beth Israel Deaconess Medical Center Device for wavelength-selective imaging
US20060055800A1 (en) * 2002-12-18 2006-03-16 Noble Device Technologies Corp. Adaptive solid state image sensor
JP4385284B2 (en) * 2003-12-24 2009-12-16 ソニー株式会社 Imaging apparatus and imaging method
KR100542370B1 (en) 2004-07-30 2006-01-11 한양대학교 산학협력단 Vision-based augmented reality system using invisible marker
WO2006020661A2 (en) * 2004-08-10 2006-02-23 Nicholson Bruce A Sodium screen digital traveling matte methods and apparatus
US20080260225A1 (en) * 2004-10-06 2008-10-23 Harold Szu Infrared Multi-Spectral Camera and Process of Using Infrared Multi-Spectral Camera
US7355182B2 (en) * 2004-10-06 2008-04-08 Harold Szu Infrared multi-spectral camera and process of using infrared multi-spectral camera
DE102005013044B4 (en) * 2005-03-18 2007-08-09 Siemens Ag Fluorescence scanner
JPWO2007074923A1 (en) * 2005-12-27 2009-06-04 オリンパス株式会社 Luminescence measuring device and luminescence measuring method
WO2007098392A2 (en) * 2006-02-21 2007-08-30 Glaxo Group Limited Method and system for chemical specific spectral analysis
JP5015496B2 (en) * 2006-06-01 2012-08-29 ルネサスエレクトロニクス株式会社 Solid-state imaging device, imaging method, and imaging system
JP2007336362A (en) * 2006-06-16 2007-12-27 Fujifilm Corp Information reader
US20080004533A1 (en) * 2006-06-30 2008-01-03 General Electric Company Optical imaging systems and methods
WO2008076467A2 (en) * 2006-07-03 2008-06-26 Beth Israel Deaconess Medical Center, Inc. Intraoperative imaging methods
CN101513065B (en) 2006-07-11 2012-05-09 索尼株式会社 Using quantum nanodots in motion pictures or video games
US9080977B2 (en) * 2006-10-23 2015-07-14 Xenogen Corporation Apparatus and methods for fluorescence guided surgery
US7767967B2 (en) * 2006-11-01 2010-08-03 Sony Corporation Capturing motion using quantum nanodot sensors
GB2443664A (en) * 2006-11-10 2008-05-14 Autoliv Dev An infra red object detection system de-emphasizing non relevant hot objects
US20090021598A1 (en) * 2006-12-06 2009-01-22 Mclean John Miniature integrated multispectral/multipolarization digital camera
DE502006007337D1 (en) * 2006-12-11 2010-08-12 Brainlab Ag Multi-band tracking and calibration system
SG178785A1 (en) * 2007-02-13 2012-03-29 Univ Singapore An imaging device and method
JP5380690B2 (en) * 2008-01-08 2014-01-08 オリンパスメディカルシステムズ株式会社 Endoscope objective optical system and endoscope system using the same
US7888763B2 (en) * 2008-02-08 2011-02-15 Omnivision Technologies, Inc. Backside illuminated imaging sensor with improved infrared sensitivity
JP4987790B2 (en) * 2008-04-15 2012-07-25 オリンパスメディカルシステムズ株式会社 Imaging device
EP2133918B1 (en) 2008-06-09 2015-01-28 Sony Corporation Solid-state imaging device, drive method thereof and electronic apparatus
US8084739B2 (en) 2008-07-16 2011-12-27 Infrared Newco., Inc. Imaging apparatus and methods
US8686365B2 (en) * 2008-07-28 2014-04-01 Infrared Newco, Inc. Imaging apparatus and methods
WO2010047807A1 (en) * 2008-10-22 2010-04-29 Tom Chang Light detection circuit for ambient light and proximity sensor
JP5271062B2 (en) * 2008-12-09 2013-08-21 富士フイルム株式会社 Endoscope apparatus and method of operating the same
JP2011072530A (en) * 2009-09-30 2011-04-14 Fujifilm Corp Imaging apparatus
EP2579084B1 (en) * 2010-06-03 2023-11-08 Nikon Corporation Microscope device
US8295572B2 (en) * 2010-12-10 2012-10-23 National Taiwan University Dual-spectrum heat pattern separation algorithm for assessing chemotherapy treatment response and early detection
KR101262507B1 (en) * 2011-04-11 2013-05-08 엘지이노텍 주식회사 Pixel, pixel array and method for manufacturing the pixel array and image sensor inclduding the pixel array
KR101942337B1 (en) 2011-05-12 2019-01-25 디퍼이 신테스 프로덕츠, 인코포레이티드 Pixel array area optimization using stacking scheme for hybrid image sensor with minimal vertical interconnects
RU2616653C2 (en) * 2012-06-05 2017-04-18 Хайпермед Имэджинг, Инк. Methods and device for coaxial image forming with multiple wavelengths
IN2015MN00020A (en) 2012-07-26 2015-10-16 Olive Medical Corp
CN104619237B (en) 2012-07-26 2018-03-30 德普伊辛迪斯制品公司 The pulse modulated illumination schemes of YCBCR in light deficiency environment
BR112015001369A2 (en) 2012-07-26 2017-07-04 Olive Medical Corp CMOS Minimum Area Monolithic Image Sensor Camera System
MX356890B (en) 2012-07-26 2018-06-19 Depuy Synthes Products Inc Continuous video in a light deficient environment.
CN103006180B (en) * 2012-12-01 2014-10-22 清华大学深圳研究生院 Integrated vein search projection device and system
EP2961310B1 (en) 2013-02-28 2021-02-17 DePuy Synthes Products, Inc. Videostroboscopy of vocal chords with cmos sensors
AU2014233486B2 (en) 2013-03-15 2018-11-15 DePuy Synthes Products, Inc. Viewing trocar with integrated prism for use with angled endoscope
WO2014144947A1 (en) 2013-03-15 2014-09-18 Olive Medical Corporation Super resolution and color motion artifact correction in a pulsed color imaging system
CA2906975A1 (en) 2013-03-15 2014-09-18 Olive Medical Corporation Minimize image sensor i/o and conductor counts in endoscope applications
US10362240B2 (en) 2013-03-15 2019-07-23 DePuy Synthes Products, Inc. Image rotation using software for endoscopic applications
BR112015022944A2 (en) 2013-03-15 2017-07-18 Olive Medical Corp calibration using distal cap
AU2014233475B2 (en) 2013-03-15 2018-07-26 DePuy Synthes Products, Inc. System and method for removing speckle from a scene lit by a coherent light source
US10517469B2 (en) 2013-03-15 2019-12-31 DePuy Synthes Products, Inc. Image sensor synchronization without input clock and data transmission clock
CA2906821A1 (en) 2013-03-15 2014-09-18 Olive Medical Corporation Scope sensing in a light controlled environment
CN105246395B (en) 2013-03-15 2019-01-22 德普伊新特斯产品公司 Comprehensive fixed pattern noise is eliminated
US9777913B2 (en) 2013-03-15 2017-10-03 DePuy Synthes Products, Inc. Controlling the integral light energy of a laser pulse
AU2014233518C1 (en) 2013-03-15 2019-04-04 DePuy Synthes Products, Inc. Noise aware edge enhancement
WO2014165805A2 (en) * 2013-04-04 2014-10-09 Children's National Medical Center Device and method for generating composite images for endoscopic surgery of moving and deformable anatomy
US9987093B2 (en) 2013-07-08 2018-06-05 Brainlab Ag Single-marker navigation
US9860520B2 (en) * 2013-07-23 2018-01-02 Sirona Dental Systems Gmbh Method, system, apparatus, and computer program for 3D acquisition and caries detection
CN104799874B (en) * 2014-01-28 2018-09-18 上海西门子医疗器械有限公司 Projecting method, device and system and the Medical Devices of medical image
US10084944B2 (en) 2014-03-21 2018-09-25 DePuy Synthes Products, Inc. Card edge connector for an imaging sensor
CN106461462B (en) 2014-03-21 2019-03-26 海佩尔梅德影像有限公司 Compact optical sensor
US9655519B2 (en) 2014-03-21 2017-05-23 Hypermed Imaging, Inc. Systems and methods for performing an imaging test under constrained conditions
US11206987B2 (en) * 2015-04-03 2021-12-28 Suzhou Caring Medical Co., Ltd. Method and apparatus for concurrent imaging at visible and infrared wavelengths
CN104799829A (en) * 2015-05-18 2015-07-29 郑州麦德杰医疗科技有限公司 Vein imaging system
US10798310B2 (en) 2016-05-17 2020-10-06 Hypermed Imaging, Inc. Hyperspectral imager coupled with indicator molecule tracking
US11595595B2 (en) 2016-09-27 2023-02-28 Rxsafe Llc Verification system for a pharmacy packaging system
US10187593B2 (en) 2016-09-27 2019-01-22 Rxsafe Llc Verification system for a pharmacy packaging system
US10852236B2 (en) * 2017-09-12 2020-12-01 Curadel, LLC Method of measuring plant nutrient transport using near-infrared imaging
JP6927407B2 (en) * 2018-02-23 2021-08-25 株式会社村田製作所 Biological signal sensor
CN111447423A (en) * 2020-03-25 2020-07-24 浙江大华技术股份有限公司 Image sensor, imaging apparatus, and image processing method

Citations (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4238760A (en) * 1978-10-06 1980-12-09 Recognition Equipment Incorporated Multi-spectrum photodiode devices
US4807026A (en) * 1986-03-19 1989-02-21 Olympus Optical Co., Ltd. Electronic image pickup device for endoscopes
US4898175A (en) * 1986-12-26 1990-02-06 Olympus Optical Co., Ltd. Out-body observing apparatus
US4953539A (en) * 1986-12-26 1990-09-04 Olympus Optical Co., Ltd. Endoscope apparatus
US4974076A (en) * 1986-11-29 1990-11-27 Olympus Optical Co., Ltd. Imaging apparatus and endoscope apparatus using the same
US5092331A (en) * 1989-01-30 1992-03-03 Olympus Optical Co., Ltd. Fluorescence endoscopy and endoscopic device therefor
US5174298A (en) * 1987-07-03 1992-12-29 General Electric Cgr S.A. Imaging process and system for transillumination with photon frequency marking
US5209220A (en) * 1989-10-05 1993-05-11 Olympus Optical Co., Ltd. Endoscope image data compressing apparatus
US5255087A (en) * 1986-11-29 1993-10-19 Olympus Optical Co., Ltd. Imaging apparatus and endoscope apparatus using the same
USRE34411E (en) * 1986-03-19 1993-10-19 Olympus Optical Co., Ltd. Electronic image pickup device for endoscopes
US5304809A (en) * 1992-09-15 1994-04-19 Luxtron Corporation Luminescent decay time measurements by use of a CCD camera
US5365084A (en) * 1991-02-20 1994-11-15 Pressco Technology, Inc. Video inspection system employing multiple spectrum LED illumination
US5465718A (en) * 1990-08-10 1995-11-14 Hochman; Daryl Solid tumor, cortical function, and nerve tissue imaging methods and device
US5469239A (en) * 1987-01-06 1995-11-21 Minolta Camera Kabushiki Kaisha Image sensing system
US5474910A (en) * 1993-10-15 1995-12-12 Alfano; Robert R. Method and device for detecting biological molecules and/or microorganisms within a desired area or space
US5550582A (en) * 1993-03-19 1996-08-27 Olympus Optical Co., Ltd. Endoscope-image processing apparatus for performing image processing of emphasis in endoscope image by pigment concentration distribution
US5660181A (en) * 1994-12-12 1997-08-26 Physical Optics Corporation Hybrid neural network and multiple fiber probe for in-depth 3-D mapping
US5697885A (en) * 1989-01-30 1997-12-16 Olympus Optical Co., Ltd. Endoscope for recording and displaying time-serial images
US5701903A (en) * 1994-06-23 1997-12-30 Asahi Kogaku Kogyo Kabushiki Kaisha Fluoroscopic apparatus
US5749830A (en) * 1993-12-03 1998-05-12 Olympus Optical Co., Ltd. Fluorescent endoscope apparatus
US5832931A (en) * 1996-10-30 1998-11-10 Photogen, Inc. Method for improved selectivity in photo-activation and detection of molecular diagnostic agents
US5845639A (en) * 1990-08-10 1998-12-08 Board Of Regents Of The University Of Washington Optical imaging methods
US5882301A (en) * 1995-12-13 1999-03-16 Yoshida; Akitoshi Measuring apparatus for intraocular substance employing light from eyeball
US5910816A (en) * 1995-06-07 1999-06-08 Stryker Corporation Imaging system with independent processing of visible an infrared light energy
US5965875A (en) * 1998-04-24 1999-10-12 Foveon, Inc. Color separation in an active pixel cell imaging array using a triple-well structure
US5983120A (en) * 1995-10-23 1999-11-09 Cytometrics, Inc. Method and apparatus for reflected imaging analysis
USRE36529E (en) * 1992-03-06 2000-01-25 The United States Of America As Represented By The Department Of Health And Human Services Spectroscopic imaging device employing imaging quality spectral filters
US6061591A (en) * 1996-03-29 2000-05-09 Richard Wolf Gmbh Arrangement and method for diagnosing malignant tissue by fluorescence observation
US6094281A (en) * 1993-01-01 2000-07-25 Canon Kabushiki Kaisha Image reading device with offset faces for visible and non-visible light sensors
US6122042A (en) * 1997-02-07 2000-09-19 Wunderman; Irwin Devices and methods for optically identifying characteristics of material objects
US6133953A (en) * 1997-02-20 2000-10-17 Sanyo Electric Co., Ltd. Color camera having a single imaging element and capable of restricting unwanted false color signal
US6166385A (en) * 1995-09-19 2000-12-26 Cornell Research Foundation, Inc. Multi-photon laser microscopy
US6181414B1 (en) * 1998-02-06 2001-01-30 Morphometrix Technologies Inc Infrared spectroscopy for medical imaging
US6198147B1 (en) * 1998-07-06 2001-03-06 Intel Corporation Detecting infrared and visible light
WO2001022870A1 (en) * 1999-09-24 2001-04-05 National Research Council Of Canada Method and apparatus for performing intra-operative angiography
US6241672B1 (en) * 1990-08-10 2001-06-05 University Of Washington Method and apparatus for optically imaging solid tumor tissue
US6293911B1 (en) * 1996-11-20 2001-09-25 Olympus Optical Co., Ltd. Fluorescent endoscope system enabling simultaneous normal light observation and fluorescence observation in infrared spectrum
US6326636B1 (en) * 1998-06-10 2001-12-04 Fuji Photo Film Co., Ltd. Radiation image read-out method and apparatus
US20020013512A1 (en) * 2000-05-25 2002-01-31 Fuji Photo Film Co., Ltd. Fluorescent endoscope apparatus
US20020014595A1 (en) * 2000-08-02 2002-02-07 Fuji Photo Film Co., Ltd Fluorescent-light image display method and apparatus therefor
US20020022768A1 (en) * 2000-07-27 2002-02-21 Asahi Kogaku Kogyo Kabushiki Kaisha Optical system for the light source device of a video endoscope system
US6364829B1 (en) * 1999-01-26 2002-04-02 Newton Laboratories, Inc. Autofluorescence imaging system for endoscopy
US20020049389A1 (en) * 1996-09-04 2002-04-25 Abreu Marcio Marc Noninvasive measurement of chemical substances
US20020049386A1 (en) * 2000-10-06 2002-04-25 Yang Victor X.D. Multi-spectral fluorescence imaging and spectroscopy device
US6455908B1 (en) * 2001-03-09 2002-09-24 Applied Optoelectronics, Inc. Multispectral radiation detectors using strain-compensating superlattices
US7794394B2 (en) * 2002-05-22 2010-09-14 Beth Israel Deaconess Medical Center Device for wavelength-selective imaging

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3436798A1 (en) 1984-10-06 1986-04-17 Goetze Ag, 5093 Burscheid Mechanical seal
US6727521B2 (en) 2000-09-25 2004-04-27 Foveon, Inc. Vertical color filter detector group and array
EP1566142A1 (en) 2004-02-19 2005-08-24 Nederlandse Organisatie Voor Toegepast-Natuurwetenschappelijk Onderzoek Tno Imaging of buried structures

Patent Citations (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4238760A (en) * 1978-10-06 1980-12-09 Recognition Equipment Incorporated Multi-spectrum photodiode devices
USRE34411E (en) * 1986-03-19 1993-10-19 Olympus Optical Co., Ltd. Electronic image pickup device for endoscopes
US4807026A (en) * 1986-03-19 1989-02-21 Olympus Optical Co., Ltd. Electronic image pickup device for endoscopes
US5255087A (en) * 1986-11-29 1993-10-19 Olympus Optical Co., Ltd. Imaging apparatus and endoscope apparatus using the same
US4974076A (en) * 1986-11-29 1990-11-27 Olympus Optical Co., Ltd. Imaging apparatus and endoscope apparatus using the same
US5105269A (en) * 1986-11-29 1992-04-14 Olympus Optical Co., Ltd. Imaging apparatus and endoscope apparatus with selectable wavelength ranges
US4953539A (en) * 1986-12-26 1990-09-04 Olympus Optical Co., Ltd. Endoscope apparatus
US4898175A (en) * 1986-12-26 1990-02-06 Olympus Optical Co., Ltd. Out-body observing apparatus
US5469239A (en) * 1987-01-06 1995-11-21 Minolta Camera Kabushiki Kaisha Image sensing system
US5174298A (en) * 1987-07-03 1992-12-29 General Electric Cgr S.A. Imaging process and system for transillumination with photon frequency marking
US5092331A (en) * 1989-01-30 1992-03-03 Olympus Optical Co., Ltd. Fluorescence endoscopy and endoscopic device therefor
US5697885A (en) * 1989-01-30 1997-12-16 Olympus Optical Co., Ltd. Endoscope for recording and displaying time-serial images
US6388702B1 (en) * 1989-01-30 2002-05-14 Olympus Optical Co., Ltd. Endoscope for recording and displaying time-serial images
US5209220A (en) * 1989-10-05 1993-05-11 Olympus Optical Co., Ltd. Endoscope image data compressing apparatus
US5465718A (en) * 1990-08-10 1995-11-14 Hochman; Daryl Solid tumor, cortical function, and nerve tissue imaging methods and device
US5845639A (en) * 1990-08-10 1998-12-08 Board Of Regents Of The University Of Washington Optical imaging methods
US6241672B1 (en) * 1990-08-10 2001-06-05 University Of Washington Method and apparatus for optically imaging solid tumor tissue
US5365084A (en) * 1991-02-20 1994-11-15 Pressco Technology, Inc. Video inspection system employing multiple spectrum LED illumination
USRE36529E (en) * 1992-03-06 2000-01-25 The United States Of America As Represented By The Department Of Health And Human Services Spectroscopic imaging device employing imaging quality spectral filters
US5304809A (en) * 1992-09-15 1994-04-19 Luxtron Corporation Luminescent decay time measurements by use of a CCD camera
US6094281A (en) * 1993-01-01 2000-07-25 Canon Kabushiki Kaisha Image reading device with offset faces for visible and non-visible light sensors
US5675378A (en) * 1993-03-19 1997-10-07 Olympus Optical Co., Ltd. Endoscope-image processing apparatus for performing image processing of emphasis in endoscope image by pigment concentration distribution
US5550582A (en) * 1993-03-19 1996-08-27 Olympus Optical Co., Ltd. Endoscope-image processing apparatus for performing image processing of emphasis in endoscope image by pigment concentration distribution
US5474910A (en) * 1993-10-15 1995-12-12 Alfano; Robert R. Method and device for detecting biological molecules and/or microorganisms within a desired area or space
US5749830A (en) * 1993-12-03 1998-05-12 Olympus Optical Co., Ltd. Fluorescent endoscope apparatus
US5701903A (en) * 1994-06-23 1997-12-30 Asahi Kogaku Kogyo Kabushiki Kaisha Fluoroscopic apparatus
US5660181A (en) * 1994-12-12 1997-08-26 Physical Optics Corporation Hybrid neural network and multiple fiber probe for in-depth 3-D mapping
US5910816A (en) * 1995-06-07 1999-06-08 Stryker Corporation Imaging system with independent processing of visible and infrared light energy
US6166385A (en) * 1995-09-19 2000-12-26 Cornell Research Foundation, Inc. Multi-photon laser microscopy
US6104939A (en) * 1995-10-23 2000-08-15 Cytometrics, Inc. Method and apparatus for reflected imaging analysis
US5983120A (en) * 1995-10-23 1999-11-09 Cytometrics, Inc. Method and apparatus for reflected imaging analysis
US5882301A (en) * 1995-12-13 1999-03-16 Yoshida; Akitoshi Measuring apparatus for intraocular substance employing light from eyeball
US6061591A (en) * 1996-03-29 2000-05-09 Richard Wolf Gmbh Arrangement and method for diagnosing malignant tissue by fluorescence observation
US20020049389A1 (en) * 1996-09-04 2002-04-25 Abreu Marcio Marc Noninvasive measurement of chemical substances
US5832931A (en) * 1996-10-30 1998-11-10 Photogen, Inc. Method for improved selectivity in photo-activation and detection of molecular diagnostic agents
US6293911B1 (en) * 1996-11-20 2001-09-25 Olympus Optical Co., Ltd. Fluorescent endoscope system enabling simultaneous normal light observation and fluorescence observation in infrared spectrum
US6122042A (en) * 1997-02-07 2000-09-19 Wunderman; Irwin Devices and methods for optically identifying characteristics of material objects
US6133953A (en) * 1997-02-20 2000-10-17 Sanyo Electric Co., Ltd. Color camera having a single imaging element and capable of restricting unwanted false color signal
US6181414B1 (en) * 1998-02-06 2001-01-30 Morphometrix Technologies Inc Infrared spectroscopy for medical imaging
US5965875A (en) * 1998-04-24 1999-10-12 Foveon, Inc. Color separation in an active pixel cell imaging array using a triple-well structure
US6326636B1 (en) * 1998-06-10 2001-12-04 Fuji Photo Film Co., Ltd. Radiation image read-out method and apparatus
US6198147B1 (en) * 1998-07-06 2001-03-06 Intel Corporation Detecting infrared and visible light
US6364829B1 (en) * 1999-01-26 2002-04-02 Newton Laboratories, Inc. Autofluorescence imaging system for endoscopy
WO2001022870A1 (en) * 1999-09-24 2001-04-05 National Research Council Of Canada Method and apparatus for performing intra-operative angiography
US20020013512A1 (en) * 2000-05-25 2002-01-31 Fuji Photo Film Co., Ltd. Fluorescent endoscope apparatus
US6468204B2 (en) * 2000-05-25 2002-10-22 Fuji Photo Film Co., Ltd. Fluorescent endoscope apparatus
US20020022768A1 (en) * 2000-07-27 2002-02-21 Asahi Kogaku Kogyo Kabushiki Kaisha Optical system for the light source device of a video endoscope system
US20020014595A1 (en) * 2000-08-02 2002-02-07 Fuji Photo Film Co., Ltd Fluorescent-light image display method and apparatus therefor
US20020049386A1 (en) * 2000-10-06 2002-04-25 Yang Victor X.D. Multi-spectral fluorescence imaging and spectroscopy device
US6455908B1 (en) * 2001-03-09 2002-09-24 Applied Optoelectronics, Inc. Multispectral radiation detectors using strain-compensating superlattices
US7794394B2 (en) * 2002-05-22 2010-09-14 Beth Israel Deaconess Medical Center Device for wavelength-selective imaging

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110261179A1 (en) * 2010-04-22 2011-10-27 General Electric Company Imaging system for fluorescence guided surgery based on fixed magnification lens and digital zoom
US20110261175A1 (en) * 2010-04-22 2011-10-27 General Electric Company Multiple channel imaging system and method for fluorescence guided surgery
US20120026339A1 (en) * 2010-07-28 2012-02-02 National University Corporation Kochi University White balance adjustment method and imaging device
US9900484B2 (en) * 2010-07-28 2018-02-20 Semiconductor Components Industries, Llc White balance adjustment method and imaging device for medical instrument
JP2012178542A (en) * 2011-01-31 2012-09-13 Canon Inc Photoelectric conversion device
US8767102B2 (en) * 2011-08-12 2014-07-01 Pixelteq, Inc. Method to improve filter wheel imaging system data capture rate and add functionality through the use of an improved filter wheel design
US20130038782A1 (en) * 2011-08-12 2013-02-14 Ocean Thin Films, Inc. Method to improve filter wheel imaging system data capture rate and add functionality through the use of an improved filter wheel design
US20130044126A1 (en) * 2011-08-16 2013-02-21 Fujifilm Corporation Image display method and apparatus
US8933964B2 (en) * 2011-08-16 2015-01-13 Fujifilm Corporation Image display method and apparatus
US8937646B1 (en) * 2011-10-05 2015-01-20 Amazon Technologies, Inc. Stereo imaging using disparate imaging devices
US9325968B2 (en) 2011-10-05 2016-04-26 Amazon Technologies, Inc. Stereo imaging using disparate imaging devices
JP2016138789A (en) * 2015-01-27 2016-08-04 地方独立行政法人北海道立総合研究機構 Spectral imaging system
CN107003244A (en) * 2015-01-27 2017-08-01 株式会社日立高新技术 Multicolor fluorescence analysis device

Also Published As

Publication number Publication date
WO2003100925A3 (en) 2004-04-15
US7794394B2 (en) 2010-09-14
WO2003100925A2 (en) 2003-12-04
AU2003248559A1 (en) 2003-12-12
AU2003248559A8 (en) 2003-12-12
US20050285038A1 (en) 2005-12-29

Similar Documents

Publication Publication Date Title
US7794394B2 (en) Device for wavelength-selective imaging
US11438539B2 (en) Imaging device including an imaging cell having variable sensitivity
US6571119B2 (en) Fluorescence detecting apparatus
CN104081528B (en) Multispectral sensor
US7708686B2 (en) Color filter imaging array and method of formation
CN101238974B (en) Camera device
US10670526B2 (en) DNA sequencing system with stacked BSI global shutter image sensor
JP6947036B2 (en) Imaging equipment, electronic equipment
US10335019B2 (en) Image pickup element and endoscope device
US20200386620A1 (en) Sensor, solid-state imaging apparatus, and electronic apparatus
US20030048493A1 (en) Two sensor quantitative low-light color camera
CN113260835A (en) System and method for high precision multi-aperture spectral imaging
WO2019058995A1 (en) Photoelectric conversion element and imaging device
JP7242655B2 (en) Image sensor driving method
US10602919B2 (en) Imaging device
CN114144106A (en) Controlling integrated energy of laser pulses in hyperspectral imaging systems
Takemoto et al. Multi-storied photodiode CMOS image sensor for multiband imaging with 3D technology
JPWO2018207817A1 (en) Solid-state imaging device, imaging system and object identification system
Blair et al. A 120 dB dynamic range logarithmic multispectral imager for near-infrared fluorescence image-guided surgery
JP2009010627A (en) Solid-state imaging apparatus and camera using the same
US10777611B1 (en) Image sensor
WO2023100725A1 (en) Light detection device, electronic apparatus, and light detection system
WO2023181919A1 (en) Imaging element, method for manufacturing imaging element, and optical detection device
WO2023234069A1 (en) Imaging device and electronic apparatus
WO2022149488A1 (en) Light detection device and electronic apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: BETH ISRAEL DEACONESS MEDICAL CENTER, MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FRANGIONI, JOHN V.;REEL/FRAME:025383/0235

Effective date: 20050607

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION