US20020102018A1 - System and method for color characterization using fuzzy pixel classification with application in color matching and color match location

Info

Publication number
US20020102018A1
US20020102018A1
Authority
US
United States
Prior art keywords
color
pixel
pixels
image
categories
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US09/737,531
Other versions
US20040228526A9
US7046842B2
Inventor
Siming Lin
Dinesh Nair
Darren Schmidt
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/375,453, now U.S. Pat. No. 6,757,428
Application filed by Individual filed Critical Individual
Priority to US09/737,531
Publication of US20020102018A1
Publication of US20040228526A9
Application granted
Publication of US7046842B2
Adjusted expiration
Status: Expired - Lifetime

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/001 Industrial image inspection using an image reference approach
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/254 Fusion techniques of classification results, e.g. of results related to same input data
    • G06F18/256 Fusion techniques of classification results, e.g. of results related to same input data of results relating to different input data, e.g. multimodal recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/255 Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/66 Trinkets, e.g. shirt buttons or jewellery items
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30141 Printed circuit board [PCB]

Definitions

  • the present invention relates to a method for characterizing colors in an image.
  • the invention also relates to a method for determining a measure of similarity between two color distributions of images or regions of interest using fuzzy pixel classification.
  • the invention also relates to a method for locating regions of a target image that match a template image with respect to color characterization.
  • Computer-implemented methods for characterizing the color information of an image or determining a measure of similarity between two color images have a wide array of applications in many fields.
  • color is a powerful descriptor that often simplifies object identification and information extraction from a scene.
  • Color characterization, location, and comparison is an important part of machine vision and is used in a large class of assembly and packaging inspection applications, e.g., to detect missing, misplaced, or damaged color components, defects on the surfaces of color objects, etc.
  • in a content-based image retrieval (CBIR) system, a plurality of color images may be indexed.
  • color information regarding each image may be extracted and stored.
  • a searching step may then be performed, where the stored color information is used to find one or more indexed images that match the color information of a template image.
  • RGB (red, green, and blue) and CMY (cyan, magenta, and yellow) are color spaces commonly used to store and process color images.
  • the Hue, Saturation, Intensity (HSI) or Hue, Saturation, Luminance (HSL) color space was developed to specify color in terms that are easier for humans to quantify.
  • the hue component is color as we normally think of it, such as orange, green, or violet (a rainbow is one way of visualizing the range of hues). Thus, hue represents the dominant color as perceived by an observer.
  • Saturation refers to the amount or richness of color present. Saturation is measured by the amount of white light mixed with a hue. In a pure spectrum, colors are fully saturated. Colors such as pink (red and white) and lavender (purple and white) are less saturated.
  • the intensity or light component refers to the amount of grayness present in the image.
  • colors represented in the HSI model space may be ideal for machine vision applications for several reasons.
  • HSI includes an intensity (luminance) component separated from the color information.
  • the hue and saturation components more closely represent how humans perceive color. It may therefore be desirable to characterize colors in HSI space for color measurement and color matching.
  • HSI is modeled with cylindrical coordinates.
  • One possible model is a double cone model, i.e., two cones placed end to end or an inverted cone below another cone (see FIG. 4).
  • the hue is represented as the angle theta, varying from 0 degrees to 360 degrees.
  • Saturation corresponds to the radius or radial distance, varying from 0 to 1.
  • Intensity varies along the z-axis with 0 being black and 1 being white.
  • the Intensity I (or Luminance L) may also be represented as a function of the R, G, and B values of a pixel, commonly their average, I = (R + G + B)/3.
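  • For illustration, the conversion from RGB to HSI can be sketched as follows (Python; the patent does not spell out its exact conversion formulas, so the common textbook formulas are assumed here):

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert 8-bit RGB values to HSI: hue in degrees [0, 360),
    saturation and intensity in [0, 1]. A sketch using the standard
    textbook formulas; the patent's exact conversion may differ."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    i = (r + g + b) / 3.0
    if i == 0.0:
        return 0.0, 0.0, 0.0              # pure black: hue and saturation undefined
    s = 1.0 - min(r, g, b) / i
    if s == 0.0:
        return 0.0, 0.0, i                # achromatic: hue undefined
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    theta = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
    h = theta if b <= g else 360.0 - theta
    return h, s, i
```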
  • Prior art systems use various techniques to measure and match colors. Those skilled in the art will be familiar with ‘thresholding’ an image. To threshold a color image, a threshold is applied to each of the three planes that make up the image. In RGB mode, to select a particular color, one will need to know the red, green and blue values that make up the color. In RGB mode it is not possible to separate color from intensity. Therefore, a characterization algorithm such as histogram intersection based on RGB space will be intensity sensitive.
  • the histogram intersection method computes the similarity of the model and target images from their quantized color histograms as:

    diff(H_M, H_T) = ( Σ_{k=1}^{N} min(H_M(k), H_T(k)) ) / ( Σ_{k=1}^{N} H_M(k) )

  • where H_M is the quantized color histogram of the model image, H_T is the quantized color histogram of the target image, diff is a function defining the similarity measure of the quantized histograms, N is the total number of bins, H_M(k) is the number of pixels from the model image in bin k, and H_T(k) is the number of pixels from the target image in bin k.
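  • As a sketch, the measure above can be computed directly from two bin-count arrays (Python with NumPy; variable names are illustrative):

```python
import numpy as np

def histogram_intersection(h_model, h_target):
    """Histogram intersection similarity: the overlap of the two
    quantized histograms, normalized by the model histogram's total
    pixel count."""
    h_model = np.asarray(h_model, dtype=float)
    h_target = np.asarray(h_target, dtype=float)
    return np.minimum(h_model, h_target).sum() / h_model.sum()
```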
  • the histogram intersection method does not take into account the color similarity between a bin and its neighbors. For example, if the model image has all the pixels located in bin k but the target image has all the pixels located in bin k+1, the similarity computed from the histogram intersection method is 0. When the number of bins is large, this will cause a very similar image to be classified as a completely different image with similarity 0. A more robust color similarity measure that takes the similarity of the neighboring bins into account is desirable.
  • Color constancy, which is the ability to have constant perception of a color over varying lighting conditions, as people do in most circumstances, is important when defining a similarity measure of color images. This is especially true for image retrieval and machine vision applications.
  • Swain and Ballard's histogram intersection method has been shown to be sensitive to lighting changes (J. Hafner, Efficient color histogram indexing for quadratic form distance functions, IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 17, no. 7, 1995).
  • a color characterization with color constancy capability is desirable.
  • U.S. Pat. No. 5,410,637 uses fuzzy logic to establish acceptable pass/fail tolerances for production or inventory samples.
  • the process first stores a series of training image samples which are labeled pass or fail according to visual inspections.
  • the initial value of the tolerance is a super ellipsoid determined by the high/low value of the pass samples in the training set.
  • a classifier template uses the super ellipsoid tolerances and ranks every sample in the training set.
  • the process then employs fuzzy logic to obtain an optimized tolerance which minimizes the sum of ranking error between the classifier template and the visual ranks.
  • the process essentially builds a pass/fail color classifier. This process cannot be used to measure the colors quantitatively in an image or to measure the quantitative color similarity between two objects in an image or in two separated images.
  • U.S. Pat. No. 5,085,325 implements a color sorting system and method.
  • the method creates a lookup table containing a series of 0's (accept) and 1's (reject) based on good and bad sample images.
  • the pixel value of the input image is used to address the lookup table; the output of the lookup table is either 1 or 0. If the number of rejects (1's) accumulated is larger than a specified number K, the input image is rejected.
  • This color sorting method is based on a pixel-by-pixel comparison. A large memory is required to store the lookup table.
  • U.S. Pat. No. 5,751,450 provides a method for measuring the color difference of two digital images as a single ‘distance.’
  • This ‘distance’ is an average of the color differences of all corresponding pixels of the two images. Similar to the Jones patent described above, the computational cost of this distance is very high.
  • the template image has to be stored in the computer memory for on-line color matching. If the size of the template is not the same as that of the target image, special alignment or resizing operations must be performed before the matching process can begin.
  • a further drawback of this approach is that it is impossible to have scale and rotation-invariant color matching based on the ‘distance’ measure.
  • U.S. Pat. No. 5,218,555 discloses a system for judging color difference between a single color and a reference color in CIE Lab color space.
  • the reference color value (L,a,b) is input from a computer keyboard, and the Euclidean distance between the reference color and the inspected color is computed. If the Euclidean distance is smaller than a preset threshold e1, then it is judged that there is no substantial color difference. If the Euclidean distance is larger than a preset threshold e2, then it is judged that there is a substantial color difference. If the Euclidean distance is between e1 and e2, then a fuzzy logic rule is applied to make a decision about the difference. This system takes the human uncertainty of judging color difference into account to achieve a better judgment on the difference of two colors.
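  • That two-threshold procedure can be sketched as follows (Python; `fuzzy_rule` is a hypothetical caller-supplied function standing in for the patent's fuzzy logic rules):

```python
import math

def judge_color_difference(lab_ref, lab_inspected, e1, e2, fuzzy_rule):
    """Judge the difference between a reference color and an inspected
    color in CIE Lab space: clear pass below e1, clear fail above e2,
    and a fuzzy decision in between."""
    d = math.dist(lab_ref, lab_inspected)   # Euclidean distance in Lab space
    if d < e1:
        return "no substantial difference"
    if d > e2:
        return "substantial difference"
    return fuzzy_rule(d)                    # borderline case: apply fuzzy rule
```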
  • a method for automatically determining color features of a template image and using those features to locate color match regions in a target image has thus far been lacking.
  • some prior art methods require users to manually select or specify color features of the template image to be used in searching, e.g., by choosing a dominant color of the template image.
  • users may be required to manually threshold the template image, and this threshold information may be used in searching.
  • a method to automatically determine color features of a template image and use these features to perform a color match search is desirable.
  • an object of the present invention is to provide an improved system and method for effectively and accurately characterizing color for machine vision applications.
  • Another object of the invention is to provide improved systems and methods for locating regions or objects of a target image having color information that matches, at least to a degree, the color information of a template image.
  • Another object of the invention is to provide improved systems and methods for effectively and accurately measuring the color similarity of multiple-color images.
  • Still another object of the invention is to provide a machine vision system for measuring multiple colors, including black and white color, while the color measuring system is intensity independent within a large range of intensity variation.
  • Still another object of the invention is to provide a machine vision system for measuring multiple colors with different saturation values in an image, while the color measuring system operates over a wide range of intensity variation and is intensity independent.
  • Still another object of the invention is to provide a machine vision system for color matching that may quantitatively measure the color difference between two images or between two regions of interest in the same image.
  • Still another object of the invention is to provide a machine vision system for color matching that is not required to calculate the color difference based on pixel-by-pixel comparisons.
  • Still another object of the invention is to provide a machine vision system for color matching that is intensity independent within a large range of intensity variation.
  • Still another object of the invention is to provide a machine vision system for color matching that can distinguish colors with different saturation values.
  • Still another object of the invention is to provide a machine vision system for color matching that compensates for black and white color distribution in images.
  • a color characterization method is described herein which operates to characterize the colors of an image or region of an image.
  • the image may be obtained in HSI format, or alternatively may be converted from another format to HSI.
  • an image may be acquired in HSI format, for example, by a National Instruments PCI-1411 color image acquisition board.
  • the color characterization divides the HSI space into n color categories (also referred to as subspaces or bins), where n is the number of color categories of interest. The number of different color categories in the color space may be dependent on a desired complexity of the color characterization.
  • the method determines a color category for the respective pixel based on values of the respective pixel, i.e., hue, saturation and intensity values, wherein the color category is one of a plurality of possible color categories or bins (or sub-spaces) in the HSI space.
  • the number of pixels assigned to each category is then counted and normalized by the total number of pixels in the selected region of interest (or entire image), i.e., the percentage of pixels in each color category characterizes the colors of the image or ROI.
  • the percentage of pixels in each color category may also be used as a quantitative measurement of the color distribution of the image.
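  • In outline, this characterization reduces to counting bin assignments and normalizing, as in the following sketch (Python; `bin_of` stands for whatever pixel-to-category rule is chosen, such as the binning sketch given later in this document):

```python
import numpy as np

def characterize_colors(hsi_pixels, n_bins, bin_of):
    """Assign each (h, s, i) pixel to a color category via bin_of and
    return the fraction of pixels in each bin -- the percentages that
    characterize the image or region of interest."""
    counts = np.zeros(n_bins)
    for h, s, i in hsi_pixels:
        counts[bin_of(h, s, i)] += 1
    return counts / counts.sum()
```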
  • fuzzy membership or other functions may be applied to determine a desired distribution of pixels among color space bins. For example, pixels may be assigned to multiple bins during the color characterization method, e.g., on a fractional weighted basis. This increased complexity may result in more accurate color match location results.
  • a fuzzy membership or other function may be applied, based on where the pixel falls within the bin and/or where the pixel falls within the color space, based on color information of the pixel. This function may determine a contribution that the pixel should make to one or more bins. For example, the function may determine a set of values to assign to each of the one or more bins.
  • the function may determine that a portion of the weight of the pixel should be contributed to the neighboring bin that the pixel is near.
  • the function may determine a contribution that the pixel should make to any number of bins, wherein the sum of these contributions is 100%.
  • Any of various types of fuzzy membership functions may be applied, including triangle, trapezoid, and step fuzzy membership functions.
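  • As an example of the fractional-assignment idea, a triangle membership function on the circular hue axis might split a pixel's weight between the two nearest bin centers (an illustrative sketch, not the patent's exact membership functions):

```python
import math

def fuzzy_hue_bin_weights(hue_deg, n_bins):
    """Triangle fuzzy membership on the hue circle: the pixel contributes
    to the two bins whose centers are nearest, with weights summing to 1.
    A pixel at a bin center contributes 100% to that bin; a pixel on a
    bin boundary contributes 50% to each neighboring bin."""
    bin_width = 360.0 / n_bins
    pos = hue_deg / bin_width - 0.5           # position relative to bin centers
    lower = int(math.floor(pos)) % n_bins
    frac = pos - math.floor(pos)              # fraction of the way to the next center
    return {lower: 1.0 - frac, (lower + 1) % n_bins: frac}
```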
  • Another embodiment of the invention comprises a color match location method that may use the color characterization method described above.
  • a target image may be searched in order to locate regions within the target image having matching color information.
  • a coarse-to-fine heuristic may be utilized, in which multiple passes of decreasing granularity are performed.
  • a first-pass search may operate to identify a list of candidate match regions. For example, the target image may be stepped across at a step interval, wherein color information of a target image region is characterized at each step, using the color characterization method described above. For each target image region, a measure of difference between the color characterization information of the target image region and the color characterization information of the template image may be calculated. If this difference is smaller than a threshold value, then the target image region may be added to a list of candidate regions.
  • a larger area (region) proximal to the candidate region may then be searched, e.g., by stepping through the proximal area using a smaller step size than was used in the first-pass search.
  • color information of a target image region within the proximal area may be characterized and compared to the template image color information.
  • the target image region within the area proximal to the initial candidate region that best matches the color information of the template image may be considered a second-pass candidate region.
  • the matching criteria used to determine whether a target image region is a second-pass candidate region are preferably stronger than the criteria used in the first-pass search, i.e., the value calculated as the difference between the color information of the target image region and the color information of the template image must be smaller than a smaller threshold value than was used in the first-pass search.
  • the process described above may be repeated for as many repetitions as desired.
  • on each successive pass, the step size used is preferably smaller and the measure of color difference must be smaller for a region to be considered a candidate, e.g., until a predetermined number of search passes are performed or until step sizes are as small as possible and/or matching criteria are as strict as possible.
  • any target image regions that still remain as candidate matching regions may be considered as final matches.
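  • The first, coarse pass of such a search can be sketched as a sliding-window loop (Python; the window size, step, threshold, difference measure, and `characterize` routine are illustrative assumptions, not fixed choices of the method):

```python
def find_candidate_regions(target_pixels, tmpl_hist, win_h, win_w,
                           step, threshold, characterize):
    """First-pass search: characterize a template-sized window at each
    step position and keep windows whose color difference from the
    template characterization is below the threshold. Later passes
    would re-search around each candidate with a smaller step and a
    stricter threshold. `characterize` maps a 2-D list of pixels to a
    list of per-bin percentages."""
    rows, cols = len(target_pixels), len(target_pixels[0])
    candidates = []
    for y in range(0, rows - win_h + 1, step):
        for x in range(0, cols - win_w + 1, step):
            window = [row[x:x + win_w] for row in target_pixels[y:y + win_h]]
            hist = characterize(window)
            # sum of absolute bin differences as a simple difference measure
            diff = sum(abs(a - b) for a, b in zip(hist, tmpl_hist))
            if diff < threshold:
                candidates.append((x, y, diff))
    return candidates
```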
  • the color match location method described above may be useful in many applications.
  • the method may be especially useful in applications that do not require an exact location of the template image within the target image to be determined with sub-pixel accuracy.
  • some applications may need to very quickly determine match locations to a degree of accuracy, but may not require the locations to be determined with the degree of preciseness that may be obtained if pattern information is also used in the matching.
  • This more coarse location determination may be suitable for many applications, e.g., to determine whether all color-coded pills are present in a blister pack.
  • the method may also be especially suitable for applications that do not require the spatial orientation of the matches to be determined.
  • the color characterization method may also be used to determine color similarity of a template image and a target image as a whole. For example, some applications do not require a target image to be searched for color match regions, but may simply require a determination of how closely the color information of the entire target image matches the color information of the template image.
  • FIG. 1 illustrates a computer system which performs color characterization and/or color matching according to one embodiment of the present invention
  • FIG. 2 illustrates an exemplary image acquisition (video capture) system for acquiring images
  • FIG. 3 is a high-level block diagram of the image acquisition system according to one embodiment
  • FIGS. 4, 5A, and 5B are graphical representations of HSI color space
  • FIG. 6 is a flowchart diagram illustrating one embodiment of a method for characterizing color information of a template image and/or a target image
  • FIG. 7 is a flowchart diagram illustrating one embodiment of a method for analyzing image pixels in order to determine a pixel distribution among HSI color bins
  • FIG. 8 is a flowchart diagram illustrating one embodiment of a method for locating regions of a target image that match a template image, with respect to color characterization;
  • FIG. 9 is a flowchart diagram illustrating one embodiment of a method for searching a target image to find regions having color information that match a template image
  • FIG. 10 illustrates an example of target image window movement during a first-pass search
  • FIG. 11 is a flowchart diagram illustrating one embodiment of a method for performing a first pass search through a target image
  • FIG. 12 is a flowchart diagram illustrating one embodiment of a method for performing pixel sharing or re-distribution after pixels have been assigned to HSI color bins;
  • FIGS. 13A, 13B, and 13C illustrate examples using fuzzy membership functions to determine a desired fractional pixel distribution among HSI color bins
  • FIG. 14 is a flowchart diagram illustrating one embodiment of a method of using a fuzzy membership function to characterize the color information of the image
  • FIG. 15 illustrates an example of a graphical user interface (GUI) associated with color match location software according to one embodiment of the present invention
  • FIG. 16 illustrates a target image in which color match locations are visually indicated
  • FIG. 17 illustrates a display of information representing the color characterization of an image.
  • FIG. 1 Computer System
  • FIG. 1 illustrates a computer system 102 which may perform color match location according to one embodiment of the present invention.
  • the computer system 102 may comprise one or more processors, a memory medium, display, and an input device or mechanism, such as a keyboard or mouse, and any other components necessary for a computer system.
  • the computer system 102 may perform a color characterization analysis of a template image and may use information determined in this analysis to locate regions of a target image which match the template image, with respect to color characterization. Images that are to be matched are preferably stored in the computer memory and/or received by the computer from an external device.
  • the computer system 102 preferably includes one or more software programs operable to perform the color match location.
  • the software programs may be stored in a memory medium of the computer system 102 .
  • the term “memory medium” is intended to include various types of memory, including an installation medium, e.g., a CD-ROM, or floppy disks 104 , a computer system memory such as DRAM, SRAM, EDO RAM, Rambus RAM, etc., or a non-volatile memory such as a magnetic medium, e.g., a hard drive, or optical storage.
  • the memory medium may comprise other types of memory as well, or combinations thereof.
  • the memory medium may be located in a first computer in which the programs are executed, or may be located in a second different computer which connects to the first computer over a network. In the latter instance, the second computer may provide the program instructions to the first computer for execution.
  • the computer system 102 may take various forms, including a personal computer system, mainframe computer system, workstation, network appliance, Internet appliance, personal digital assistant (PDA), television system or other device.
  • the term “computer system” can be broadly defined to encompass any device having a processor which executes instructions from a memory medium.
  • the software program(s) may be implemented in any of various ways, including procedure-based techniques, component-based techniques, and/or object-oriented techniques, among others.
  • the software program may be implemented using ActiveX controls, C++ objects, Java objects, Microsoft Foundation Classes (MFC), graphical programming techniques or other technologies or methodologies, as desired.
  • a CPU such as the host CPU, executing code and data from the memory medium comprises a means for performing color match location according to the methods or flowcharts described below.
  • FIG. 2 Machine Vision System
  • FIG. 2 illustrates a machine vision system or image acquisition system, which is an example of one application of the present invention.
  • the color match location techniques described herein may be used in various types of image processing, machine vision or motion control applications.
  • the computer 102 may be embodied in various form factors and/or architectures, e.g., a robot or embedded device, among others. It is also noted that the color match location techniques described herein may be performed in any of various manners, either in software, programmable logic, or hardware, or a combination thereof.
  • computer system 102 is coupled to a camera 112 and operates to receive one or more images.
  • the computer system 102 may be operable to perform a color characterization analysis to characterize the colors in a template image.
  • the term “template image” is used to refer to either an entire image, or a portion of an image, e.g., a region of interest (ROI).
  • the computer system 102 may also be operable to perform a search of a target image to locate target image regions that “match” the color characterization of the template image. As described below, the search may be performed to locate matching regions with any of various degrees of exactness, as appropriate for a particular application.
  • FIG. 3 Image Acquisition System Block Diagram
  • FIG. 3 is a high-level block diagram of the image acquisition system of FIG. 2 for acquiring an image for color characterization and/or color matching according to the present invention. It is noted that the block diagram of FIG. 3 is exemplary only, and other computer system architectures may be used as desired.
  • the present invention may be implemented in a “smart camera”, which integrates a sensor, analog to digital (A/D) converter, CPU, and communications devices together in a single unit.
  • the present invention may be embodied in other architectures, devices, or embodiments, as desired.
  • the host computer 102 preferably comprises a CPU 202 , a bus bridge 204 , system memory 206 , and a peripheral bus 212 .
  • the CPU 202 is coupled to the bus bridge 204 .
  • the bus bridge 204 is coupled to the system memory 206 and the CPU 202 , and couples to the peripheral bus 212 .
  • the peripheral bus 212 is the PCI expansion bus, although other types of buses may be used.
  • the host computer system 102 also includes a video capture board (also referred to as an image acquisition board) 214 which is adapted for coupling to the video source 112 .
  • the video capture board 214 is preferably coupled to the peripheral bus 212 .
  • other peripheral devices (216 and 218) may be coupled to the peripheral bus 212, such as audio cards, modems, graphics cards, network cards, etc.
  • the video source 112 supplies the analog or digital video signals to the video capture board 214 .
  • the video capture board 214 transfers digitized video frames to the system memory 206 through peripheral bus 212 and bus bridge 204 .
  • the video capture board 214 acquires the target image and transfers it to system memory 206 .
  • the user of the computer 102 may then select one or more regions of interest (ROI) in the target image which are desired to be searched for regions having color information that matches the color information of a template image.
  • the ROI may be the entire target image or a portion of the target image.
  • the system memory 206 may store a template image.
  • the system memory 206 may store the color characterization information of the template image instead of, or in addition to, the actual template image.
  • the system memory 206 also preferably stores software according to one embodiment of the present invention which operates to characterize the color information (color characterization software) of images, such as the template image and/or one or more acquired or specified target images.
  • the color characterization software in the system memory may operate on the template image to produce the color characterization information.
  • the system memory 206 may also receive and/or store one or more other images, such as selected ROIs in the template image or another image, or acquired target images or target image objects.
  • the system memory 206 also preferably stores software according to one embodiment of the present invention which operates to perform a color match location method (color match location software), as described below.
  • the term “image” may refer to any of various types of images.
  • An image may be a gray-level or color image.
  • An image may also be a complex image, in which pixel values have a real part and an imaginary part.
  • An image may be obtained from any of various sources, including a memory medium.
  • An image may, for example, be obtained from an image file, such as a BMP, TIFF, AIPD, PNG, JPG, or GIF file, or a file formatted according to another image format.
  • An image may also be obtained from other sources, including a hardware device, such as a camera, frame grabber, scanner, etc.
  • the term “image” may also refer to an entire image or to a portion or region (ROI) of an image.
  • the color characterization information of the template image may be pre-calculated and stored in the computer, and the actual template image is then not required to be stored or used for subsequent color match location operations with respective target images.
  • the color characterization software characterizes the colors in the target image and may compare this color information with the pre-computed color information of the template image.
  • the present invention is preferably implemented in one or more software programs which are executable by a processor or CPU.
  • the software program(s) of the present invention are preferably stored in a memory medium of a computer as described above.
  • FIGS. 4, 5A, 5B HSI Color Space
  • characterizing the color information of a template image and/or target image may utilize HSI (hue, saturation, intensity) information.
  • the HSI information of individual pixels of an image may be analyzed, and the pixel-specific results may be compiled in order to characterize the image based on color.
  • the color characterization method divides the color spectrum or color space into categories or “bins” (also called sub-spaces), primarily according to hue and saturation values, and then operates to assign pixels to respective ones of these bins.
  • the total number of pixels (or percentage of pixels) in an image that fall into each category or bin of the color spectrum may then be used as the basis of the color characterization.
  • FIG. 4 illustrates the possible hue, saturation, and intensity values (the color spectrum) as a 3-dimensional space or volume.
  • the color information of a given pixel may be represented as a vector or point within the 3D color space or volume shown in FIG. 4.
  • the vector's location represents the hue, saturation, and intensity of the pixel.
  • Hue represents the color shade of a pixel and is shown as an angle of a radial line in the circle in FIG. 4.
  • FIG. 5A illustrates a cross section of FIG. 4. As shown in FIG. 5A, hue is represented as an angular value ranging from 0-360 degrees.
  • Saturation refers to a color's freedom from mixture or dilution with white. Saturation is represented in FIG. 4 as the radial distance of a line on the circle, i.e., the distance from the center of the circle. Saturation may be more easily seen in the cross section of FIG. 5A. Saturation typically is measured in the range of 0 to 1, with 0 being at the center of the circle and 1 being at the outside perimeter of the circle. Thus, hue and saturation are essentially represented in polar coordinates to describe a point or location on the circle of FIGS. 4 and 5A.
  • Intensity, sometimes referred to as light or luminance, refers to the degree of shade in a pixel and is represented on the vertical scale of FIG. 4, i.e., vector locations above or below the circle.
  • the terms luminance and intensity are used interchangeably throughout this description. Intensity values typically range from 0 to 1, with 0 being pure black and 1 being pure white.
  • the intensity value 0 is represented at the apex of the bottom cone, and the intensity value 1 is represented at the apex of the top cone.
  • the method used to characterize the color information of a template image and the method used to characterize the color information of a target image may be the same.
  • the color space of FIG. 4 may be partitioned into color categories.
  • the color space may be partitioned into any number of categories or bins.
  • the number of categories or bins determines the granularity or resolution of the color characterization. For example, for some applications a large degree of similarity between a template image and a target image region may be desired in order for the target image region to be considered as a match. Thus, a large number of categories or bins may be required in this instance.
  • user input may be received which specifies the desired complexity of the color characterization. In one embodiment, three possible complexity levels may be specified, these being low, medium, and high.
  • the low complexity level comprises 17 possible categories or bins.
  • the hue plane (FIG. 5A) is divided into seven different bins (or pie-shaped wedges) 440 for the seven possible natural colors, and the saturation plane is divided into two regions, thereby creating 14 (7×2) bins.
  • the seven possible natural colors comprise the 7 standard colors of the color spectrum, these being: red, orange, yellow, green, blue, indigo and violet.
  • the two regions of the saturation plane are defined by a radial distance threshold 442 , preferably 0.3 on a scale from 0 to 1.
  • the seven different bins of the hue plane and the two regions or bins of the saturation plane thereby create 14 possible categories or bins in the hue/saturation plane.
  • Three additional color categories are allotted for the pixel being characterized as black, gray, or white, thereby creating a total of 17 possible categories (14+3).
  • FIG. 5B illustrates the areas within HSI color space which may be categorized as either black, gray, or white.
  • the color of a specific pixel may be characterized as black, gray, or white if the saturation value is very low.
  • the black, gray, and white categories are discussed in more detail below.
  • the medium complexity level may comprise 31 possible categories or bins.
  • the hue plane (FIG. 5A) is divided into 14 different color categories 440 and the saturation plane is divided into two regions, thereby creating 28 (14×2) bins.
  • the hue plane is divided into 14 pie-shaped wedges, and the saturation plane is further sub-divided into 2 regions defined by a radial distance threshold 442 , preferably 0.3 on a scale from 0 to 1, thereby creating 28 possible color categories or bins in the hue/saturation plane.
  • Three additional color categories are allotted for the pixel being black, gray, or white, thereby creating a total of 31 possible color categories (28+3).
  • the high complexity level may comprise 59 possible color categories or bins.
  • the hue plane (FIG. 5A) is divided into 28 different bins 440, and the saturation plane is divided into two regions, thereby creating 56 (28×2) bins.
  • the hue plane is divided into 28 pie-shaped wedges, and the saturation plane is further sub-divided into 2 regions defined by a radial distance threshold 442 , preferably 0.3 on a scale from 0 to 1, thereby creating 56 possible color categories or bins in the hue/saturation plane.
  • Three additional color categories are allotted for the pixel being black, gray, or white, thereby creating a total of 59 possible categories (56+3).
  • the saturation categorization, i.e., the location of the radial distance threshold 442, is preferably set to a default value, but may also be adjusted by the user setting the Learn Sat Threshold 604.
  • the saturation threshold typically is only adjusted when color characterization is performed on images with little variance in color saturation. In another embodiment, the number of saturation divisions may be increased, for example, to 3 (or more), or may be decreased to 0 (i.e. colors are not divided with respect to saturation level).
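  • A sketch of the low-complexity categorization in Python (threshold values are illustrative defaults, not the patent's exact numbers); passing n_hue_bins=14 or 28 yields the medium (31-bin) and high (59-bin) complexity schemes:

```python
def hsi_bin_index(h, s, i, n_hue_bins=7, sat_split=0.3, sat_min=0.04):
    """Map an HSI pixel (h in degrees, s and i in [0, 1]) to one of
    n_hue_bins * 2 + 3 categories: each hue wedge is split into an inner
    and an outer saturation region, and the last three indices are
    black, gray, and white for nearly unsaturated pixels."""
    n = n_hue_bins * 2
    if s < sat_min:                           # too desaturated to distinguish hues
        if i < 1.0 / 3.0:
            return n                          # black
        if i > 2.0 / 3.0:
            return n + 2                      # white
        return n + 1                          # gray
    wedge = min(int(h / 360.0 * n_hue_bins), n_hue_bins - 1)
    ring = 0 if s < sat_split else 1          # radial distance threshold 442
    return wedge * 2 + ring
```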
  • FIG. 6 Color Characterization Method
  • FIG. 6 is a flowchart diagram illustrating one embodiment of a method for characterizing color information of a template image and/or a target image.
  • the color characterization method shown in FIG. 6 may be utilized in step 252 of the color match location method shown in FIG. 8 below, to perform a color characterization of a template image.
  • the color characterization method shown in FIG. 6 may also be utilized in step 254 of FIG. 8 to perform color characterization on regions of a target image during a color match location search.
  • FIG. 6 represents one particular embodiment of a color characterization method.
  • Various applications may require different levels of sensitivity with respect to characterizing colors in a template image and/or classifying target image regions as color matches.
  • Various applications may also have different computational efficiency requirements.
  • any of various color characterization methods may be utilized.
  • the color characterization method shown in FIG. 6 may be performed once and the color information for the template image may be stored and used as necessary.
  • the method of FIG. 6 may be performed multiple times for various regions of the image as the target image is searched.
  • the embodiment illustrated in FIG. 6 involves analyzing an image with respect to HSI color information.
  • user input may be received which specifies various color characterization method options.
  • the user input may specify a color sensitivity level to use in analyzing the image, i.e., a desired resolution of color information.
  • the user may select one of three sensitivity levels, these being low, medium, and high.
  • the sensitivity level may determine the number of categories or bins into which to divide the HSI color space. It is noted that the number of color categories may be set to any number or level, as desired. Alternatively, a default color characterization method may be used, and user input may not be used.
  • the image may be converted to HSI format.
  • Images are typically stored or received in RGB (Red, Green, Blue), Redness/Greenness, CMY, or HSI format.
  • the conversion process when necessary, may analyze an image pixel by pixel, applying an algorithm that converts the current color format to the HSI format.
  • alternative embodiments of color characterization methods may utilize other color representation formats, such as RGB or CMY, among others. In these embodiments, for example, the RGB or CMY color spaces may be divided into color categories or bins, and pixels may be assigned to these bins.
  • the HSI color space may be partitioned into categories or bins, such as described above with reference to FIGS. 4 and 5.
  • the number of bins into which to divide the space may be based on the color sensitivity information received in step 260.
  • Step 264 may simply involve storing information that specifies the different bins.
  • the image may be analyzed pixel by pixel, in order to determine the pixel distribution among the HSI bins.
  • FIG. 7 illustrates one embodiment of step 266 in detail.
  • the user may specify one or more colors which should be ignored in performing the pixel distribution. For example, the user may specify that black, gray, white or some combination of these or other HSI colors should be ignored. This may be useful, for example, if the template image and/or the target image have background colors that should be ignored for color matching purposes.
  • pixels may be examined at the time that the HSI bin distribution is performed, so that pixels corresponding to certain bins are ignored. In another embodiment, this consideration may be performed after the pixel distribution is performed. For example, for each bin corresponding to a color that should be ignored, the number or percentage of pixels assigned to that bin may be set to zero after the distribution is performed, and the pixel percentages in the remaining bins may be normalized to sum to 100 percent. This latter embodiment may result in a more efficient color characterization method.
  • each examined pixel is assigned to a single category or bin.
  • pixels may be assigned to multiple bins, e.g., on a weighted basis. For example, if a pixel falls near an “edge” of a bin, with respect to the portion of color space represented by that bin, then a fraction of that pixel's weight may be assigned to a neighboring bin.
  • the determination on how to distribute a pixel among multiple bins may be performed in any of various ways, including through the use of a fuzzy membership function. Fractional distribution of pixels is further discussed below.
  • the color characterization method may also involve determining one or more color categories which are characterized as dominant color categories of the image, as shown in step 268 , wherein the one or more dominant color categories are assigned a relatively larger proportion of image pixels, with respect to other color categories of the color space.
  • the determination of dominant color categories may be performed in any of various ways. For example, in one embodiment the categories may be sorted with respect to pixel allocation percentage, and the category with the highest percentage may then be examined. If this percentage falls at or above a certain ratio value T, which may be a default value or may be specified by a user, then this color category may be considered as a single dominant color category for the image. If this percentage is below the value T, then the color category with the next largest percentage of pixel allocation may be considered as a second dominant color category for the image, etc., until the sum of the percentages of the examined bins is at or above the value T. Thus, there may be multiple dominant color categories for an image. In one embodiment it may be required that the percentage of pixels in the largest category be at least of a certain threshold value in order for the image to have any dominant color categories.
  • the dominant color information is determined only for the template image, i.e., this computation may be omitted when performing a color characterization analysis of a target image region.
  • the dominant color information of a template image may be utilized when comparing the color information of the template image to the color information of a target image, as described below.
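  • The dominant-category selection of step 268 can be sketched as follows (Python; the ratio value T defaults to a placeholder):

```python
def dominant_categories(percentages, ratio_t=0.8):
    """Select dominant color categories: examine bins in decreasing order
    of pixel percentage, accumulating until the examined bins together
    hold at least the ratio T of the image's pixels."""
    order = sorted(range(len(percentages)), key=lambda k: -percentages[k])
    dominant, total = [], 0.0
    for k in order:
        dominant.append(k)
        total += percentages[k]
        if total >= ratio_t:
            break
    return dominant
```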
  • FIG. 7 HSI Bin Pixel Distribution
  • FIG. 7 is a flowchart diagram illustrating one embodiment of step 266 of FIG. 6, in which pixels of an image are assigned to appropriate HSI space bins.
  • the method shown in FIG. 7 may be performed for each pixel of an image or for only a subset of the pixels.
  • the method would typically be performed for each pixel, in order to obtain as much color information for the template image as possible.
  • the color characterization analysis for the template image may only need to be performed once, and may be performed “offline”, i.e., does not need to be performed in real time as a target image is searched for color match regions. Thus, once the color characterization information has been obtained for the template image, it may not be necessary to have the template image in memory for a color match location procedure.
  • for each region of the target image that is searched, it may be desirable to examine only a subset of the region's pixels, since categorizing every pixel of the region into a bin may be computationally expensive, and many regions in the target image may need to be searched.
  • analyzing a subset of pixels in each target image region may be sufficient, e.g., in order to perform a coarse-grained search that identifies candidate regions that can then be analyzed in more detail.
  • the sample pixel subset may be generated using any of various sampling techniques, such as grid-based sampling, random sampling, or other non-uniform sampling.
  • in step 412, the method determines if the intensity value of the pixel is below a certain threshold, which could be specified by the user as some small value close to 0.
  • FIG. 5B illustrates the intensity threshold 446 .
  • the intensity threshold 446 is preferably a decreasing function of the saturation.
  • the intensity threshold 446 may be set by the computer or in some embodiments may be selected by the user.
  • the intensity threshold BlkThreshold is specified as a function of the saturation as shown below:
  • BlkThreshold =
        BlkGrayThreshold for sat < 10,
        (BlkGrayThreshold − 5) · exp[−0.025 · (sat − 10)] + 5 for 10 ≤ sat < 200,
        5 for 200 ≤ sat.
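  • Translated directly into code (assuming saturation on a 0-255 scale, consistent with the formula's breakpoints, and a placeholder BlkGrayThreshold of 85, roughly one third of the intensity range):

```python
import math

def blk_threshold(sat, blk_gray_threshold=85):
    """Intensity threshold below which a pixel is categorized as black:
    a decreasing function of saturation that starts at BlkGrayThreshold,
    is continuous at sat = 10, and levels off at 5 for high saturation."""
    if sat < 10:
        return blk_gray_threshold
    if sat < 200:
        return (blk_gray_threshold - 5) * math.exp(-0.025 * (sat - 10)) + 5
    return 5
```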
  • if a pixel's intensity is smaller than BlkThreshold, then in step 414 the pixel is immediately categorized as black. In this case, no further color learning is performed on the pixel.
  • the threshold comparison performed in step 412 saves computer cycles by not requiring further HSI analysis on a pixel that is black based strictly on its low intensity. If the intensity value of the pixel is above the intensity threshold of step 412 , then operations proceed to step 416 , and further color categorizations are applied.
  • in step 416, the saturation value of the pixel is examined. If the saturation of a pixel is very low, different colors are not distinguishable and the pixel may immediately be categorized as either black, gray, or white. When a pixel's saturation is close to the minimum saturation level, the pixel may be graphically represented near the origin of the circle of FIG. 5B. Step 416 determines if a pixel's saturation is lower than a selected saturation threshold 604 (FIG. 5B), i.e., is very close to 0. In one embodiment, the Saturation Threshold 604 has a default value of 10 on a scale from 0 to 255 (this corresponds to a default value of 0.04 on a scale from 0 to 1). If the saturation level of a pixel is below the saturation threshold, the pixel does not require further saturation analysis or the hue analysis of step 418, so the process advances to step 422.
  • in step 422, a pixel that has a very low saturation value (i.e., below the saturation threshold) is examined based on its intensity value, and may be categorized as black if the intensity falls below BlkGrayThreshold.
  • otherwise, the pixel is examined in step 423 to determine whether the intensity value falls on the upper portion of the intensity plane, i.e., I > WhiteGrayThreshold. If so, then the pixel is categorized as white in step 426. Otherwise, the pixel is categorized as gray in step 427. Values for BlkGrayThreshold and WhiteGrayThreshold may be pre-specified based on the importance of black, gray, and white color in the particular application. In one embodiment, the threshold values may be set to divide the intensity plane into three equal portions, which puts the same weight on black, gray, and white colors. After a pixel is categorized as either black, gray, or white, the method continues to step 428.
  • if the saturation of a pixel is greater than the saturation threshold 604 in step 416, then hue and saturation analysis is performed in step 420. In step 420, the hue and saturation values of the pixel are analyzed, and the pixel is assigned to one of the bins in the hue/saturation plane based on these values.
  • FIG. 5A illustrates the hue/saturation plane, wherein hue is categorized by a color's angular orientation (from 0 to 360 degrees) on the cross sectional plane of FIG. 5A, and saturation is categorized as the color's radial distance on the cross sectional plane of FIG. 5A.
  • Hue characterization may divide the hue plane into, for example, 7, 14, or 28 bins (for low, medium, or high complexity) depending on a selected color sensitivity, such as shown in FIG. 15, and the bins are further split in half by a radial distance value, represented by circle 442 (FIG. 5A), that allows categorization according to saturation within each hue bin. This doubles the total number of bins, or categories, in the hue/saturation plane to 14, 28, or 56, respectively.
  • if the current pixel being analyzed is the last pixel to be analyzed, as determined in step 428, then operation completes. If not, then operation returns to step 412, and steps 412-428 are repeated.
  • the method may calculate color parameters, such as the percentage of pixels in each bin, i.e., the number of pixels in each bin in relation to the total number of pixels examined. These calculations will result in N percentages whose sum is equal to 100%. Percentages are used, rather than raw data, to allow matching of differently shaped, scaled and rotated images. It is noted that other types of color parameters may be generated, e.g., other types of normalized values which are independent of the number of pixels in the image object.
  • the color characterization for the image thus may produce a list or data structure that contains N percentage values or parameters representing the color characterization of the image.
  • a user may specify one or more colors in the image to be ignored.
  • the percentage of pixels in each bin corresponding to an ignored color may be set to zero, and the percentages for the remaining bins may be normalized to result in a total of 100%, or pixels corresponding to these bins may not be assigned to the bins at all, which would automatically result in a zero percentage for these bins.
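  • The renormalization for ignored colors can be sketched as follows (Python):

```python
def ignore_colors(percentages, ignored_bins):
    """Zero out the bins corresponding to user-ignored colors (e.g., a
    background color) and renormalize the remaining bins so the
    percentages again sum to 100."""
    out = list(percentages)
    for k in ignored_bins:
        out[k] = 0.0
    total = sum(out)
    return [100.0 * p / total for p in out] if total > 0 else out
```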
  • FIG. 8 Color Match Location Method
  • FIG. 8 is a flowchart diagram illustrating one embodiment of a method for locating regions of a target image that match a template image, with respect to color characterization.
  • a template image may be received.
  • the template image may be an image of any of various types, including gray-level and color images.
  • the template image may be received or obtained from any of various sources and may be an entire image or may be a portion of an image, e.g., a region of interest specified by a user.
  • a user may select a region of interest (ROI) using a graphical user interface (GUI).
  • a GUI may enable the user to choose from many different shapes of ROIs, such as a rectangle, an oval, or a shape selected freehand.
  • a target image may be received.
  • the target image may also be an image of any of various types, including an image obtained from a memory medium or an image acquired from a hardware device, such as a camera, frame grabber, scanner, etc.
  • the target image may also be received from any other source, including from a graphics software program, from transmission via a network, etc.
  • a target image may also be an entire image or only a portion of an image.
  • multiple template images and/or target images may be received or specified. For example, it may be desirable to search multiple target images for regions having color information matching that of a template image, or it may be desirable to search for target image regions matching any of a plurality of template images.
  • a color characterization analysis may be performed for the template image.
  • this analysis may involve dividing the HSI color space into a number of categories or “bins”. The color information of the template image pixels may then be examined in order to determine the allocation of the pixels across the bins.
  • One particular embodiment of step 252 is described above with reference to FIG. 6. In alternative embodiments, any of various other methods may be used to perform the color characterization analysis.
  • color characterization of the template image may be performed on a different computer system, and in step 250 the method may receive the color characterization information of the template image.
  • the computer system executing the color match location software may only receive or store the color characterization information of the template image, and may not be required to store the template image itself.
  • the target image may be searched in order to locate regions that match the template image with respect to color characterization.
  • This search may utilize the color characterization information of the template image obtained in step 252 and may also involve performing color characterization analyses for various regions of the target image.
  • step 254 may involve performing color characterization analyses for various regions of the target image, and comparing this color characterization of each of these regions with the color characterization information of the template image obtained in step 252 .
  • Step 254 may be performed in any of various ways.
  • the target image may be searched in multiple passes. The first pass may involve a coarse-grained search to efficiently identify a list of candidate areas or regions in the target image. Subsequent passes may then examine the candidate areas more closely in order to determine final matches.
  • One specific embodiment of step 254 is discussed in detail below with respect to FIG. 9.
  • in step 256, color match location or analysis information may be generated.
  • Step 256 may involve displaying information, such as visually indicating the location of the match regions within the target image, and/or displaying information indicating various statistics regarding the color information of the match regions or regarding how closely the regions match the color information of the template image.
  • FIG. 9 Target Image Search
  • FIG. 9 is a flowchart diagram illustrating one embodiment of a method for searching a target image to find regions having color information that match a template image.
  • the target image search method shown in FIG. 9 may be used in step 254 of the color match location method shown in FIG. 8.
  • any of various other search methods may be used, as desired for a particular application.
  • the target image search method shown in FIG. 9 utilizes a coarse-to-fine heuristic, in which candidate color match areas of the target image are identified in a first-pass search, and these candidate areas are then examined in more detail to identify final color match regions.
  • Each region of the target image that is examined may be regarded as a window into the target image.
  • This window may have various sizes.
  • The window size may correspond exactly to the size of the template image, or the window size may be scaled to be larger or smaller than the template size.
  • The window may be moved through the target image in order to sample the image at various regions. The points at which to sample regions may be determined in any of various ways.
  • The window may initially be positioned at the top left corner of the target image and may then be moved through the image at interval steps. For each sample region, the color information of the region may be compared with the color information of the template image, as described below.
  • FIG. 10 illustrates an example of window movement during a first-pass search, in which the window begins at the top left corner of the target image and is moved through the target image using a step size of nine pixels.
  • The window, for example, is first moved downward 9 pixel scan lines, as shown in FIG. 10B.
  • The window is then moved another 9 scan lines downward, as shown in FIG. 10C. The comparisons are repeated until the window reaches the bottom left portion of the target image, as shown in FIG. 10D.
  • The window is then moved back to the top of the target image and over 9 vertical pixel columns to perform another comparison, as shown in FIG. 10E.
  • The window is moved down 9 horizontal scan lines of pixels, as shown in FIG. 10F. This procedure again repeats a plurality of times until the window again reaches the bottom of the target image.
  • The window is then moved back to the top of the target image and across 9 more vertical columns of pixels (not shown) to perform another set of comparisons. This procedure may be performed until the window has been stepped through the entire target image, using a 9 pixel step size.
  • FIGS. 10A-10F are merely an example of stepping the window across the target image; it is noted that the window may be stepped across the target image using any of various step sizes and in any of various manners, e.g., left to right, right to left, top to bottom, bottom to top, or other methodologies. Also, the target image may not necessarily be sampled at regular step intervals. For example, window placement may be chosen using any of various algorithms, or may be chosen randomly, quasi-randomly, etc.
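  • For illustration, the stepping pattern of FIGS. 10A-10F can be expressed as a simple generator over window positions; the function name and the fixed 9-pixel step are assumptions matching the example above.

```python
def window_positions(target_h, target_w, win_h, win_w, step=9):
    """Yield (row, col) of the window's top-left corner as the window
    is stepped down one column of positions, then moved over and
    stepped down again, mirroring the movement in FIGS. 10A-10F."""
    col = 0
    while col + win_w <= target_w:
        row = 0
        while row + win_h <= target_h:
            yield row, col
            row += step
        col += step

# Example: positions for a 30x40 window over a 100x120 target image.
positions = list(window_positions(100, 120, 30, 40, step=9))
print(len(positions), positions[:3])  # 72 [(0, 0), (9, 0), (18, 0)]
```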
  • In step 450 of FIG. 9, user input specifying various search options may be received.
  • The search options may specify various parameter values affecting the degree of granularity used for deciding color matches and/or the efficiency of the target image search process.
  • The user may specify one of three options: “conservative,” “balanced,” or “aggressive,” each of which controls various search parameters, such as described below with reference to FIG. 11.
  • Alternatively, search parameters may be specified individually.
  • In step 452, a first-pass search through the target image may be performed in order to find initial color match candidate areas, i.e., areas that may contain a region having color information that matches the color information of the template image.
  • In step 454, each candidate area identified in step 452 may be examined in more detail.
  • In the first-pass search, various regions of the target image may be sampled at a relatively large step size, in order to efficiently identify areas containing a possible match.
  • For each candidate area, the search window may initially be placed at the position where the window was during the first-pass search when the candidate area was identified. The window may then be moved around this initial position at a reduced step size in order to perform a finer-grained search, so that the best matching region for each candidate area is determined.
  • The new step size may be inversely proportional to how well the initial candidate matched the template image.
  • A “hill-climbing” heuristic may be used, such that if the initial candidate is very close to the template image, smaller steps are taken so that the best match is not stepped over.
  • Various methods for determining how close the color information of a target image region is to the color information of the template image are discussed below.
  • The window may be moved around each candidate area using any of various strategies or algorithms.
  • The distance that the window may be moved away from the original candidate's position is preferably limited, e.g., as a function of the size of the window and/or the step size used in the first-pass search.
  • If the match quality worsens as the window moves in a given direction, searching in that direction may be aborted, in order to avoid unnecessary comparisons.
  • When the color information for a target image region is analyzed, it may be desirable to examine the color information for only a subset of the individual pixels of the region, e.g., in order to search through the target image more quickly.
  • The sub-sampling size for each target image region may be determined by search criteria specified by the user.
  • Step 454 may comprise performing one or more subsequent passes through the candidate list after the first pass.
  • The coarse-to-fine search heuristic may be repeated, possibly only for certain candidates, using successively smaller step sizes, and/or larger sub-sampling sizes, e.g., until the step size is reduced to one pixel and every pixel of the target image region is sampled.
  • The desired number of passes performed and the rate at which the search parameters change between passes may differ according to the accuracy and efficiency requirements of particular applications.
  • Each initial candidate area identified in the first-pass search may be replaced by the region found in step 454 having color information that best matches the color information of the template image (or may not be replaced if no better match is found). Also, it is possible that candidate areas identified during a previous pass are eliminated altogether in a subsequent pass. For example, since the step size may be relatively large during the first-pass search, the match criteria for identifying candidates may be relatively loose, i.e., a target image region may not need to match the template image very closely in order to be considered a candidate match area. As candidate regions are examined more thoroughly in subsequent passes, it may be desirable to require the color information of each candidate to match the template image more strongly in order to remain a candidate.
  • Information regarding an expected number of matches to be found in the target image may be utilized in order to more quickly complete the color match location process.
  • FIG. 15 illustrates a graphical user interface enabling a user to specify an expected number of matches.
  • The method may limit the number of color match candidate regions that are searched to a maximum number based on the expected number of matches. In one embodiment, this maximum number may be calculated with a formula such as:
  • The list of candidate regions identified in the first-pass search through the target image may be sorted with respect to how well the color information of each candidate region matches the color information of the template image, and in a subsequent search pass, the list of candidate regions may be traversed in this sorted order.
  • The maximum number calculated based on the number of expected matches may be used to limit the number of candidate regions that are searched in a subsequent pass. Since the first-pass search may use relatively loose matching criteria, the first-pass search may identify a large number of candidate regions. The method may operate to keep track of the number of candidates remaining after a subsequent pass. If the maximum number is reached, then a traversal of the remaining first-pass candidate regions may be avoided. In one embodiment, however, if the color difference between a given candidate region and the template image is smaller than a certain threshold value, then that candidate region may be traversed regardless of whether or not a maximum number of subsequent-pass candidates has already been reached.
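  • The formula for the maximum number of candidates is not reproduced above, so the sketch below simply takes that maximum as a parameter. It illustrates the sort-then-cap traversal just described, including the exception for very close matches; the function name and the threshold value are hypothetical.

```python
def select_candidates(candidates, max_candidates, close_threshold=0.1):
    """candidates: list of (difference, region) pairs from the first pass.
    Traverse in order of ascending color difference, stopping once
    max_candidates is reached, except that very close matches
    (difference below close_threshold, a hypothetical value) are
    always kept regardless of the cap."""
    kept = []
    for diff, region in sorted(candidates, key=lambda c: c[0]):
        if len(kept) < max_candidates or diff < close_threshold:
            kept.append((diff, region))
    return kept
```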
  • In step 456, each of the candidate regions determined after the one or more passes performed in step 454 may be scored, based on the difference between its color characterization information and the color characterization information for the template image.
  • The color differences may be calculated in any of various ways. Particular embodiments of color difference methods are discussed below. Any of various systems may be used to score the candidate regions. In one embodiment, each region is assigned a score from 0 to 1000, with 1000 being the best possible match and 0 being the worst.
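  • The specification does not spell out the scoring formula here; purely as an assumption, a linear mapping from the [0, 2] color spectrum difference described below onto the 0-1000 scale could look like this:

```python
def match_score(color_difference):
    """Map a color spectrum difference in [0, 2] to a 0-1000 score
    (1000 = identical spectrums).  This linear mapping is a plausible
    example only; the patent does not specify the actual formula."""
    return round((1.0 - color_difference / 2.0) * 1000)

print(match_score(0.0), match_score(2.0))  # 1000 0
```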
  • A final list of color match regions may be generated, based on the scores determined in step 456.
  • The scores may be compared to a threshold value that is used to eliminate regions scoring below a certain level. This threshold value may be a default value or may be specified from the user input received in step 450.
  • FIG. 11—First-Pass Search
  • FIG. 11 is a flowchart diagram illustrating one embodiment of a method for performing the first-pass search illustrated in step 452 of FIG. 9.
  • The first-pass search may involve sampling various regions of the target image, where the regions that are sampled may be determined by a window that slides along the target image according to a particular step size.
  • The method may determine an appropriate step size to use in sliding the window.
  • The step size may at least in part be determined based on user input received in step 450 of FIG. 9. For example, if the user specified aggressive search criteria, then the step size may be relatively large, whereas the step size may be relatively small if the user specified conservative search criteria.
  • The step size may also depend on the size of the template image and/or the target image.
  • The color information for the region may be analyzed, similarly as for the template image. However, as described above, it may not be desirable to examine the color information of every pixel in the region.
  • A sub-sampling size and/or method may be determined, wherein the sub-sampling size specifies the number of pixels to examine for each region.
  • The sub-sampling method may specify the type of sub-sampling, such as random, pseudo-random, or a low discrepancy sequence. In one embodiment, the method may use a low discrepancy sequence to select the subset of pixels.
  • The sub-sampling size and/or method may depend on search criteria specified by the user.
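  • As one concrete possibility, a 2-D Halton sequence (bases 2 and 3) is a common low discrepancy sequence for this kind of pixel sub-sampling; the specification does not mandate this particular sequence, so the sketch below is illustrative only.

```python
def halton(index, base):
    """One dimension of the Halton low-discrepancy sequence."""
    result, f = 0.0, 1.0 / base
    while index > 0:
        result += f * (index % base)
        index //= base
        f /= base
    return result

def subsample_pixels(height, width, count):
    """Pick `count` pixel coordinates within a region using a 2-D
    Halton sequence (bases 2 and 3); the coordinates spread more
    evenly than pseudo-random sampling."""
    return [(int(halton(i, 2) * height), int(halton(i, 3) * width))
            for i in range(1, count + 1)]

print(subsample_pixels(100, 100, 5))
```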
  • Steps 474 through 480 may then be performed for each region of the target image to be sampled.
  • In step 474, a color characterization analysis for the target image region may be performed. This step may utilize the color characterization method described above with reference to FIG. 7, in which the target image pixels (or a selected subset of pixels) are examined individually with respect to their color information and assigned to color space bins.
  • In step 476, a measure of difference (or similarity) between the color spectrum of the target image region and the color spectrum of the template image may be computed by comparing the information obtained in their respective color characterization analyses. This comparison may be performed in any of various ways. In one embodiment, for each color bin from a set of N bins, the pixel percentage values assigned to corresponding bins for the two images may be subtracted from one another, resulting in N difference values.
  • The closer each of the difference values is to zero, the more similarity there is between the template image and the target image region with respect to that color category; i.e., the percentages of pixels in the template image and the target image region that fall into that particular color category are substantially the same.
  • The absolute values of the difference values may then be summed to give a value falling between zero and two, where two represents a maximum measure of difference between the color spectrums and zero represents a maximum measure of similarity, as illustrated in the sketch below.
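  • A minimal sketch of this comparison, assuming each color characterization is a vector of per-bin pixel percentages summing to 1.0:

```python
def color_spectrum_difference(template_vec, region_vec):
    """City-block distance between two color feature vectors whose
    entries are per-bin pixel percentages summing to 1.0.  The result
    lies in [0, 2]: 0 for identical spectrums, 2 for disjoint ones."""
    return sum(abs(t - r) for t, r in zip(template_vec, region_vec))

# Identical spectrums give 0; completely disjoint spectrums give 2.
print(color_spectrum_difference([0.5, 0.5, 0.0], [0.5, 0.5, 0.0]))  # 0.0
print(color_spectrum_difference([1.0, 0.0], [0.0, 1.0]))            # 2.0
```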
  • In another embodiment, each of the difference values may be compared to a threshold value to determine a “score” for each color category.
  • This method may not be the best method for all color matching applications. For example, consider a case where at least one of the seven natural colors of the hue plane is divided into two or more bins, e.g., in response to a user specifying a medium or high sensitivity level. Even if the template image and the target image region have colors that are very similar, it is still possible that pixels from each will be assigned to different bins corresponding to the same natural color in the hue plane. Thus, the results from this example may show very few or no pixels in the same bin, i.e., the results would indicate that the template image and the target image region have very different color spectrums. This may not be the proper result, because the colors in the template image and the target image region are actually very similar but happen to be in different hue categories of the same natural color.
  • Alternative color spectrum techniques may compensate for cases such as the one described above.
  • A portion of the percentages of pixels assigned to each bin may be manipulated, in order to share pixels among or re-distribute pixels to neighboring bins, before calculating the measure of color spectrum difference as described above.
  • FIG. 12 is a flowchart diagram illustrating one embodiment of a method for performing this type of pixel sharing or re-distribution.
  • The level of sharing or distribution may be determined according to a color sensitivity level specified by the user.
  • Each bin shares with zero bins, one neighboring bin on each side, or two neighboring bins on each side, depending on a specified sensitivity level of low, medium, or high, respectively.
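  • A sketch of one possible re-distribution along these lines follows; the 25% share fraction and the circular treatment of hue bins (hue wraps at 360 degrees) are illustrative assumptions, not requirements of the method.

```python
def redistribute(percentages, neighbors=1, share=0.25):
    """Share a fraction of each bin's percentage with `neighbors` bins
    on each side (0, 1, or 2, per the low/medium/high sensitivity
    rule).  The 25% share fraction is an illustrative assumption.
    Hue bins are treated as circular, since hue wraps at 360 degrees."""
    n = len(percentages)
    out = [0.0] * n
    for i, p in enumerate(percentages):
        if neighbors == 0:
            out[i] += p
            continue
        out[i] += p * (1.0 - share)                 # portion kept
        per_neighbor = p * share / (2 * neighbors)  # portion shared
        for d in range(1, neighbors + 1):
            out[(i - d) % n] += per_neighbor
            out[(i + d) % n] += per_neighbor
    return out

print(redistribute([1.0, 0.0, 0.0, 0.0]))  # [0.75, 0.125, 0.0, 0.125]
```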
  • Alternatively, the level of sharing or distribution with neighboring bins may be determined automatically by the computer, e.g., if a certain threshold of pixels of the template image and the target image region fall into respective neighboring bins (as in the example above), then the method may automatically apply a level of sharing or distribution.
  • Thus, the method may automatically detect and compensate for the types of errors described above.
  • The pixel allocation percentages may be re-distributed among neighboring bins.
  • In step 506, the compensated percentages of the template image and the target image region may then be compared.
  • Step 506 may involve subtracting percentages in respective bins of the template image and target image region and summing the results, similarly as described above. This may produce a value representing a measure of difference between the color information of the template image and the color information of the target image region.
  • Alternatively, pixels may be assigned to multiple bins at the time the color characterization analysis is performed, e.g., on a fractional, weighted basis. This increased complexity may result in more accurate color match location results.
  • FIG. 14 is a flowchart diagram illustrating one embodiment of a method using fuzzy membership functions to characterize the color information of the image. The steps shown in FIG. 14 may be performed for each pixel of the image (or region of the image being examined), or possibly for each pixel of a selected subset.
  • In step 900, the pixel may be assigned to a bin.
  • Step 900 may comprise examining color information of the pixel to determine where the pixel lies within the color space and assigning the pixel to a bin corresponding to that portion of the color space.
  • In step 902, a fuzzy membership or other function may be applied, based on where the pixel falls within the bin.
  • The bin corresponds to a portion of the color space, and the color information of the pixel may correspond to a point within that portion. Thus, the pixel may fall within the bin at various locations with respect to the range of color space values corresponding to the bin.
  • The fuzzy membership function may determine a contribution which the pixel should make to one or more neighboring bins. For example, if the pixel falls near the edge of a bin (with respect to the portion of the color space that the bin corresponds to), then the fuzzy membership function may determine that a portion of the weight of the pixel should be contributed to the neighboring bin which the pixel is near. Any of various types of fuzzy membership functions may be applied, and the function may determine a contribution which the pixel should make to any number of bins, wherein the sum of these contributions is 100%. For example, the function may determine a plurality of values summing to 1.0, such as 0.25, 0.50, and 0.25, wherein each value corresponds to a bin.
  • The weight of the pixel may then be distributed across the bin to which the pixel was originally assigned and across the one or more bins to which the pixel contributes, in accordance with the contribution values determined in step 902.
  • The values determined by the function in step 902, such as the above exemplary values of 0.25, 0.50, and 0.25, may each be assigned to a corresponding bin.
  • In some embodiments, step 900 may not need to be performed; instead, the function may determine a plurality of values to assign to a plurality of color space bins based on the color information of the pixel and the location of the pixel within the color space, not necessarily based on the location of the pixel within a bin.
  • FIG. 13A illustrates a triangle fuzzy membership function.
  • In this figure, the 360-degree hue plane is divided into seven bins, which are shown linearly.
  • For a given pixel, the bin that the pixel falls into may be determined, as well as the position within this bin.
  • The triangle fuzzy membership function may then be applied, based on the position within the bin, in order to determine a percentage of the pixel weight which should be assigned to that bin and/or to a neighboring bin. This is represented by the angular lines drawn over the bins.
  • If the pixel falls at the center of a bin, 100% of the pixel weight is assigned to that bin.
  • If the pixel falls nearer to one edge of the bin, a smaller percentage, e.g., 75%, of the pixel weight is assigned to that bin, and the remainder, e.g., 25%, is assigned to the neighboring bin next to that edge, as indicated by the dashed lines.
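  • A sketch of such a triangle membership function over equal-width hue bins follows. The linear falloff from 100% at the bin center to a 50/50 split at a bin edge is chosen so that a pixel three-quarters of the way toward an edge splits 75/25, matching the example above; the equal-width partition is an assumption.

```python
def triangle_membership(hue, num_bins=7):
    """Return {bin_index: weight} for a hue in degrees [0, 360).
    A pixel at a bin's center contributes 100% to that bin; the
    contribution falls off linearly to a 50/50 split at a bin edge.
    Hue bins are treated as circular."""
    width = 360.0 / num_bins
    b = int(hue // width) % num_bins
    t = (hue % width) / width            # position within the bin, [0, 1)
    own = 1.0 - abs(t - 0.5)             # 1.0 at center, 0.5 at an edge
    if own >= 1.0:
        return {b: 1.0}
    neighbor = (b - 1) % num_bins if t < 0.5 else (b + 1) % num_bins
    return {b: own, neighbor: 1.0 - own}

print(triangle_membership(0.5 * (360.0 / 7)))  # bin center -> {0: 1.0}
print(triangle_membership(0.0))                # bin edge -> 50/50 split
```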
  • FIG. 13B illustrates a trapezoidal fuzzy membership function.
  • If the pixel falls near the center of a bin, then 100% of the pixel weight is assigned to that bin. Otherwise, a portion of the pixel weight may be distributed to a neighboring bin, similarly as in FIG. 13A.
  • FIG. 13C illustrates another example of distributing a pixel among multiple bins.
  • In this example, a step fuzzy membership function is applied: bin X is assigned 80% of the pixel weight, bin X-1 is assigned 15% of the weight, and bin X-2 is assigned 5% of the pixel weight.
  • Thus, pixels may be distributed across three bins. Increasing the number of bins over which a pixel is distributed may be especially desirable when the hue space is partitioned into a large number of bins.
  • The fuzzy membership functions shown in FIGS. 13A, 13B, and 13C are exemplary, and any other technique may be used in determining an appropriate pixel distribution.
  • Information indicating one or more dominant color categories may be obtained when performing a color characterization analysis of a template image.
  • In step 478, a measure of difference for the dominant color categories may be computed. This measure of difference may be computed similarly as described above for the color spectrum difference. For example, for each dominant color category determined for the template image, the percentage of template image pixels assigned to the dominant color category may be compared to the percentage of target image region pixels assigned to that color category.
  • In step 480, the difference values determined in steps 476 and 478 may be used to decide whether to add the region to a list of candidate match areas. For example, the color spectrum difference may need to be less than a threshold value in order for the region to be added to the list. It is noted that the color spectrum difference may be tested immediately after its calculation, and further analysis of the sample region, such as step 478, may be aborted if the difference is too great.
  • In addition, the dominant color difference(s) may be considered. Considering the dominant color difference(s) may help to further ensure that the sample region is a potential match, since in various embodiments of the calculation of the color spectrum difference, it is possible to obtain a small difference value even though the occurrence of the dominant color(s) of the template image may be largely reduced in the sample region or may even be missing altogether. Dominant color differences may be considered individually or together. For example, if there are multiple dominant color categories, then the percentage difference for each category may be required to be smaller than a threshold value in order for the region to be added to the candidate list, or the average of the differences for all the categories may be required to be smaller than a threshold value.
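  • A minimal sketch of such a dominant color check, assuming per-category thresholds; the function name and the threshold value are hypothetical, and the average-based variant mentioned above would compare the mean difference instead.

```python
def passes_dominant_check(template_vec, region_vec, dominant_bins,
                          max_diff=0.2):
    """Require the pixel-percentage difference in every dominant color
    category to stay below a threshold (max_diff is a hypothetical
    value) for the region to remain a candidate."""
    return all(abs(template_vec[b] - region_vec[b]) <= max_diff
               for b in dominant_bins)

# Example: dominant category 0 differs by 0.4, so the check fails.
print(passes_dominant_check([0.6, 0.4], [0.2, 0.8], dominant_bins=[0]))
```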
  • FIG. 15—Color Match Location User Interface
  • FIG. 15 illustrates an example of a graphical user interface (GUI) associated with color match location software according to one embodiment of the present invention.
  • A brief description of applicable GUI elements is given below. It is noted that various other embodiments of such a GUI may comprise GUI elements enabling the user to specify variables affecting the color match location operation at a broader or finer level of granularity than the GUI shown in FIG. 15.
  • The GUI of FIG. 15 is associated with an application that is operable to perform match location of regions in a target image based on both color information of a template image and shape or pattern information of the template image. Thus, certain GUI elements pertain to this shape or pattern information.
  • “Image Type” displays the color format of the current target image. Color formats may include RGB, CMY, or HSI, among others.
  • “Learn Mode” specifies the invariant features to learn when setting up a learn color pattern. The following values may be selected: “All” (extracts template information for shift- and rotation-invariant matching); “Shift Information” (default) (extracts information for shift-invariant matching); “Rotation Information” (extracts information for rotation-invariant matching).
  • “Feature Mode” specifies the features to use in the searching stage. The following values may be chosen: “Color” (use color features only in the searching stage); “Shape” (use shape features in the searching stage); and “Color and Shape” (default) (use both color and shape features in the searching stage).
  • “Color Sensitivity” specifies a level of color sensitivity (“low,” “medium,” or “high”). This setting may affect the number of color category divisions to use.
  • “Search Strategy” specifies which of several search algorithms to use, providing a tradeoff between search speed and accuracy.
  • The default option is “Balanced.” If the speed does not meet requirements, the “Aggressive” option may be used; if the accuracy does not meet requirements, the “Conservative” option may be used.
  • “Number of Matches Expected” specifies a number of matching regions the user expects the target image to have, which may be used to increase the efficiency of the color match location process, as described above.
  • “Match Mode” specifies the technique to use when looking for the template pattern in the image. The following values may be chosen: “Shift Invariant” (default) (searches for the template pattern in the image, assuming that it is not rotated more than ±4°); “Rotation Invariant” (searches for the template in the image with no restriction on the rotation of the template). If the “Feature Mode” is set to “Color” only, then rotation-invariant matching can also be achieved by using a square template image in “Shift Invariant” mode.
  • “Minimum Match Score” specifies a threshold value for color matching scores. The data range is between 0 and 1000.
  • The GUI also includes various fields for viewing information for each matching region of the target image, once the search has been performed, such as the location and size of the region, a match score indicating how closely the color information of the region matches the color information of the template image, etc.
  • FIG. 16—Displaying Color Match Regions
  • The locations of the match regions may also be visually indicated in the target image, e.g., by displaying a box around each match region, as shown in FIG. 16.
  • FIG. 17—Displaying Color Characterization Information
  • An application may be operable to display information representing the color characterization of an image.
  • FIG. 17 illustrates one example of such a display.
  • FIG. 17 shows the percentage (vertical scale) of 16 defined colors (horizontal scale) as determined by one embodiment of the color characterization method described herein.
  • The color characterization list or data structure may further be operated upon to create a color characterization represented as a single value.
  • The color characterization may also be represented textually (e.g., by the terms brick red, jet black, mauve, etc.) through the use of a look-up table configured according to the color categorization method of the present invention.
  • The color characterization may also be represented graphically in various ways.
  • The color characterization may be stored along with the image or transmitted to other computer systems for analysis or display.
  • The color characterization may also be used as part of an image compression technique.

Abstract

A system and method for measuring the similarity of multiple-color images and for locating regions of a target image having color information that matches, at least to a degree, the color information of a template image. A color characterization method operates to characterize the colors of an image and to measure the similarity between multiple-color images. For each image pixel, the method determines a color category or bin for the respective pixel based on HSI values of the respective pixel, wherein the color category is one of a plurality of possible color categories in HSI color space. In various embodiments, the weight of the pixel may be fractionally distributed across a plurality of color categories, e.g., as determined by applying fuzzy pixel classification with a fuzzy membership function. The percentage of pixels assigned to each category is then determined. The percentage of pixels in each color category is then used as a color feature vector to represent the color information of the color image. A quantitative measurement of the color similarity between color images is then computed based on the distance between their color feature vectors. Once the color information of a template image has been characterized, a target image may be searched in order to locate regions within the target image having matching color information. In one embodiment, a coarse-to-fine heuristic may be utilized, in which multiple search stages of decreasing granularity are performed. A first-stage search may operate to identify a list of candidate match regions based on the city-block distance of the color feature vector computed using a sub-sampling scheme. These candidate match regions may then be examined in further detail in order to determine final matches.

Description

    FIELD OF THE INVENTION
  • The present invention relates to a method for characterizing colors in an image. The invention also relates to a method for determining a measure of similarity between two color distributions of images or regions of interest using fuzzy pixel classification. The invention also relates to a method for locating regions of a target image that match a template image with respect to color characterization. [0001]
  • Description of the Related Art [0002]
  • Computer-implemented methods for characterizing the color information of an image or determining a measure of similarity between two color images have a wide array of applications in many fields. For example, in machine vision applications, color is a powerful descriptor that often simplifies object identification and information extraction from a scene. Color characterization, location, and comparison are an important part of machine vision and are used in a large class of assembly and packaging inspection applications, e.g., to detect missing, misplaced, or damaged color components, defects on the surfaces of color objects, etc. [0003]
  • In addition to the fields of industrial automation and machine vision, color characterization and color matching methods have important applications in many other fields such as content-based image retrieval (CBIR). In a content-based image retrieval system, a plurality of color images may be indexed. In the indexing step, color information regarding each image may be extracted and stored. A searching step may then be performed, where the stored color information is used to find one or more indexed images that match the color information of a template image. [0004]
  • Image processing and machine vision systems use several different color spaces, including RGB, HSI (or HSL) and CMY. In the RGB space, each color appears in its primary spectral components of red, green and blue. This RGB color space is based on a Cartesian coordinate system. The RGB model is represented by a 3-dimensional cube with red, green, and blue at the edges of each axis. Each point in the cube represents a color, and the coordinates of that point represent the amount of red, green and blue components present in that color. Because the red, green, and blue color components in RGB color space are highly correlated, it is difficult to characterize colors with intensity/luminance independent features. [0005]
  • The Hue, Saturation, Intensity (HSI) or Hue, Saturation, Luminance (HSL) color space was developed to specify color in terms that are easier for humans to quantify. The hue component is color as we normally think of it, such as orange, green, violet, and so on (a rainbow is a way of visualizing the range of hues). Thus, hue represents the dominant color as perceived by an observer. Saturation refers to the amount or richness of color present. Saturation is measured by the amount of white light mixed with a hue. In a pure spectrum, colors are fully saturated. Colors such as pink (red and white) and lavender (purple and white) are less saturated. The intensity or light component refers to the amount of grayness present in the image. [0006]
  • Colors represented in HSI model space may be ideal for machine vision applications for several reasons. First, HSI includes an intensity (luminance) component separated from the color information. Also, the intimate relation between hue and saturation more closely represents how humans perceive color. It may therefore be desirable to characterize colors in HSI space for color measurement and color matching. [0007]
  • HSI is modeled with cylindrical coordinates. One possible model is a double cone model, i.e., two cones placed end to end or an inverted cone below another cone (see FIG. 4). For information on the double cone model, please see “A Simplified Approach to Image Processing”, Randy Crane, Prentice Hall, 1997. The hue is represented as the angle theta, varying from 0 degrees to 360 degrees. Saturation corresponds to the radius or radial distance, varying from 0 to 1. Intensity varies along the z-axis with 0 being black and 1 being white. When S=0, the color is gray scale with intensity I and H is undefined. When S=1, the color is on the boundary of the top cone base and is fully saturated. When I=0, the color is black and therefore H is undefined. [0008]
  • On the assumption that the R, G and B values have been normalized to range from 0 to 1, the following equations may be used to convert from RGB color space to HSI (or HSL) color space: [0009]
  • I=(R+G+B)/3
  • [0010]

$$H = \cos^{-1}\left\{\frac{\frac{1}{2}\left[(R-G)+(R-B)\right]}{\left[(R-G)^{2}+(R-B)(G-B)\right]^{1/2}}\right\}$$

$$S = 1 - \frac{3}{R+G+B}\,\min(R,G,B)$$
  • The Intensity I (or Luminance L) may also be represented by the equation: [0011]
  • L=0.299R+0.587G+0.114B
  • which is a weighted sum of the RGB values. [0012]
  • The equation for H yields values in the interval [0°,180°]. If B/I>G/I then H is greater than 180° and is obtained as H=360°−H. [0013]
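  • The conversion equations above translate directly into code. The sketch below follows them, returning an undefined hue (None) in the degenerate gray-scale and black cases noted earlier; the clamp on the arccosine argument is only a guard against floating-point round-off.

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert normalized RGB (each in [0, 1]) to (H, S, I) using the
    equations above.  H is in degrees; it is undefined (None here)
    when S == 0 or I == 0, per the double-cone model."""
    i = (r + g + b) / 3.0
    if i == 0.0:
        return None, 0.0, 0.0            # black: hue undefined
    s = 1.0 - 3.0 * min(r, g, b) / (r + g + b)
    if s == 0.0:
        return None, 0.0, i              # gray scale: hue undefined
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    h = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
    if b > g:                            # B/I > G/I reduces to B > G
        h = 360.0 - h                    # reflect into (180, 360)
    return h, s, i

print(rgb_to_hsi(1.0, 0.0, 0.0))  # pure red -> (0.0, 1.0, 0.333...)
```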
  • Prior art systems use various techniques to measure and match colors. Those skilled in the art will be familiar with ‘thresholding’ an image. To threshold a color image, a threshold is applied to each of the three planes that make up the image. In RGB mode, to select a particular color, one will need to know the red, green and blue values that make up the color. In RGB mode it is not possible to separate color from intensity. Therefore, a characterization algorithm such as histogram intersection based on RGB space will be intensity sensitive. [0014]
  • A color-indexing scheme based on using histogram intersection in RGB space was proposed by Swain and Ballard (“Color Indexing”, Michael J. Swain, International Journal of Computer Vision, vol. 7:1, pp. 11-32, 1991). In Swain and Ballard's approach, the RGB space is first converted to an opponent-theory-based color space with axes black-white, red-green and blue-yellow. The color space is then divided into bins with the same number of bins in the red-green and blue-yellow axes but with a much coarser quantization in the black-white axis. Color similarity is then computed by a histogram intersection method based on the color distribution in those bins. Let $D_{M,T}$ be the difference of the color information in the model image and the target image; then [0015]

$$D_{M,T} = \mathrm{diff}(H_M, H_T)$$
  • where $H_M$ is the quantized color histogram of the model image, $H_T$ is the quantized color histogram of the target image, and diff is a function defining the similarity measure of the quantized histograms. The similarity measure of color images in the histogram intersection method is defined as [0016]

$$DI_{M,T} = \frac{\sum_{k=1}^{N} \min\left(H_M(k),\, H_T(k)\right)}{\sum_{k=1}^{N} H_M(k)}$$

  • where N is the total number of bins, $H_M(k)$ is the number of pixels from the model image in bin k, and $H_T(k)$ is the number of pixels from the target image in bin k. It has been proved that the histogram intersection method is equivalent to computing the sum of absolute differences, or city-block metric, when the target image has the same size as the model image (M. Swain and D. Ballard, Color Indexing, International Journal of Computer Vision, vol. 7, no. 1, pp. 11-32, 1991). That is, the similarity measure can be defined as [0017]

$$DI_{M,T} = \frac{\sum_{k=1}^{N} \left|H_M(k) - H_T(k)\right|}{\sum_{k=1}^{N} H_M(k)}$$
  • It can be seen from the above similarity measure that the histogram intersection method does not take into account the color similarity between a bin and its neighbors. For example, if the model image has all the pixels located in bin k but the target image has all the pixels located in bin k+1, the similarity computed by the histogram intersection method is 0. When the number of bins is large, this will cause a very similar image to be classified as a completely different image with similarity 0. A more robust color similarity measure that takes the similarity of the neighboring bins into account is desirable. [0018]
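  • The failure mode just described is easy to reproduce with a direct implementation of the histogram intersection measure:

```python
def histogram_intersection(h_model, h_target):
    """Swain-Ballard histogram intersection: sum of bin-wise minima,
    normalized by the model histogram's total pixel count."""
    return (sum(min(m, t) for m, t in zip(h_model, h_target))
            / float(sum(h_model)))

# All pixels shifted by one bin: similarity drops from 1.0 to 0.0,
# even though the underlying colors are nearly identical.
print(histogram_intersection([0, 100, 0], [0, 100, 0]))  # 1.0
print(histogram_intersection([0, 100, 0], [0, 0, 100]))  # 0.0
```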
  • Color constancy, which is the ability to have constant perception of a color over varying lighting conditions, as people do in most circumstances, is important when defining a similarity measure of color images. This is especially true for applications of image retrieval and machine vision. However, Swain and Ballard's histogram intersection method has been shown to be sensitive to lighting changes (J. Hafner, Efficient color histogram indexing for quadratic form distance functions, IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 17, no. 7, 1995). A color characterization with color constancy capability is desirable. [0019]
  • U.S. Pat. No. 5,410,637 (Kern) uses fuzzy logic to establish acceptable pass/fail tolerances for production or inventory samples. The process first stores a series of training image samples which are labeled pass or fail according to visual inspections. The initial value of the tolerance is a super ellipsoid determined by the high/low value of the pass samples in the training set. A classifier template uses the super ellipsoid tolerances and ranks every sample in the training set. The process then employs fuzzy logic to obtain an optimized tolerance which minimizes the sum of ranking error between the classifier template and the visual ranks. The process essentially builds a pass/fail color classifier. This process cannot be used to measure the colors quantitatively in an image or to measure the quantitative color similarity between two objects in an image or in two separated images. [0020]
  • U.S. Pat. No. 5,085,325 (Jones) implements a color sorting system and method. The method creates a lookup table containing a series of 0's (accept) and 1's (reject) based on good and bad sample images. During the sorting process, the pixel value of the input image is used to address the lookup table; the output of the lookup table is either 1 or 0. If the number of rejects (1's) accumulated is larger than a specified number K, the input image is rejected. This color sorting method is based on a pixel-by-pixel comparison. A large memory is required to store the lookup table. Although a special hardware addressing approach can improve the processing speed, the cost of computation is still very high for sorting objects with complex colors. [0021]
  • U.S. Pat. No. 5,751,450 (Robinson) provides a method for measuring the color difference of two digital images as a single ‘distance.’ This ‘distance’ is an average of the color differences of all corresponding pixels of the two images. Similar to the Jones patent described above, the cost of computing the distance is very high. The template image has to be stored in the computer memory for on-line color matching. If the size of the template is not the same as that of the target image, special operations for aligning or resizing the image must be done before the matching process can begin. A further drawback of this approach is that it is impossible to have scale- and rotation-invariant color matching based on the ‘distance’ measure. [0022]
  • U.S. Pat. No. 5,218,555 (Shigeru Komai) discloses a system for judging color difference between a single color and a reference color in CIE Lab color space. The reference color value (L,a,b) is input from a computer keyboard, and the Euclidean distance between the reference color and the inspected color is computed. If the Euclidean distance is smaller than a preset threshold e1, then it is judged that there is no substantial color difference. If the Euclidean distance is larger than a preset threshold e2, then it is judged that there is a substantial color difference. If the Euclidean distance is between e1 and e2, then a fuzzy logic rule is applied to make a decision about the difference. This system takes the human uncertainty of judging color difference into account to achieve a better judgment on the difference of two colors. [0023]
  • In the prior art, color matching based on pixel-by-pixel comparisons is sensitive to changes in image shift, scale, and rotation. The computational cost of pixel-by-pixel comparison is also very high, making it difficult to accomplish in real time. A more efficient color characterization method is desirable. [0024]
  • In the prior art, methods for judging color difference using fuzzy logic only work for judging the difference between two single colors to produce a pass/fail result. A system that can make robust measurement of color difference between objects with multiple colors in color machine vision is desirable. [0025]
  • In the prior art, methods for color similarity measure based on color histogram do not take the similarity between the neighboring bins into account. Each pixel is only classified as belonging to one bin. Therefore, two colors in two close bins are considered to be completely different. A fuzzy pixel classification method based on fuzzy set theory to allow a pixel to belong to multiple bins according to a fuzzy membership function is desirable. The prior art of color matching is also sensitive to light intensity change. A more accurate and intensity-insensitive color characterization and comparison method is desirable. More specifically, it is desirable for machine vision applications to more effectively characterize and measure the color similarity of multiple-color images. [0026]
  • In the prior art, a method for automatically determining color features of a template image and using those features to locate color match regions in a target image has thus far been lacking. For example, some prior art methods require users to manually select or specify color features of the template image to be used in searching, e.g., by choosing a dominant color of the template image. Also, users may be required to manually threshold the template image, and this threshold information may be used in searching. Thus, a method to automatically determine color features of a template image and use these features to perform a color match search is desirable. [0027]
  • SUMMARY OF THE INVENTION
  • In view of the foregoing, an object of the present invention is to provide an improved system and method for effectively and accurately characterizing color for machine vision applications. [0028]
  • Another object of the invention is to provide improved systems and methods for locating regions or objects of a target image having color information that matches, at least to a degree, the color information of a template image. [0029]
  • Another object of the invention is to provide improved systems and methods for effectively and accurately measuring the color similarity of multiple-color images. [0030]
  • Still another object of the invention is to provide a machine vision system for measuring multiple colors, including black and white color, while the color measuring system is intensity independent within a large range of intensity variation. [0031]
  • Still another object of the invention is to provide a machine vision system for measuring multiple colors with different saturation values in an image, while the color measuring system comprises a wide range of intensity variation and is intensity independent. [0032]
  • Still another object of the invention is to provide a machine vision system for color matching that may quantitatively measure the color difference between two images or between two regions of interest in the same image. [0033]
  • Still another object of the invention is to provide a machine vision system for color matching that is not required to calculate the color difference based on pixel-by-pixel comparisons. [0034]
  • Still another object of the invention is to provide a machine vision system for color matching that is intensity independent within a large range of intensity variation. [0035]
  • Still another object of the invention is to provide a machine vision system for color matching that can distinguish colors with different saturation values. [0036]
  • Still another object of the invention is to provide a machine vision system for color matching that compensates for black and white color distribution in images. [0037]
  • A color characterization method is described herein which operates to characterize the colors of an image or region of an image. The image may be obtained in HSI format, or alternatively may be converted from another format to HSI. For example, an image may be acquired in HSI format by National Instruments color image acquisition board PCI-1411. The color characterization divides the HSI space into n color categories (also referred to as subspaces or bins), where n is the number of color categories of interest. The number of different color categories in the color space may be dependent on a desired complexity of the color characterization. [0038]
  • For each image pixel, the method determines a color category for the respective pixel based on values of the respective pixel, i.e., hue, saturation and intensity values, wherein the color category is one of a plurality of possible color categories or bins (or sub-spaces) in the HSI space. The number of pixels assigned to each category is then counted and normalized by the total number of pixels in the selected region of interest (or entire image), i.e., the percentage of pixels in each color category characterizes the colors of the image or ROI. The percentage of pixels in each color category may also be used as a quantitative measurement of the color distribution of the image. [0039]
  • In various embodiments, fuzzy membership or other functions may be applied to determine a desired distribution of pixels among color space bins. For example, pixels may be assigned to multiple bins during the color characterization method, e.g., on a fractional, weighted basis. This increased complexity may result in more accurate color match location results. For each pixel, a fuzzy membership or other function may be applied, based on where the pixel falls within the bin and/or where the pixel falls within the color space, based on color information of the pixel. This function may determine a contribution that the pixel should make to one or more bins. For example, the function may determine a set of values to assign to each of the one or more bins. For example, if the pixel falls near the edge of a bin (with respect to the portion of the color space that the bin corresponds to), then the function may determine that a portion of the weight of the pixel should be contributed to the neighboring bin that the pixel is near. The function may determine a contribution that the pixel should make to any number of bins, wherein the sum of these contributions is 100%. Any of various types of fuzzy membership functions may be applied, including triangle fuzzy membership functions, trapezoid fuzzy membership functions, and step fuzzy membership functions. [0040]
  • Another embodiment of the invention comprises a color match location method that may use the color characterization method described above. Once the color information of a template image has been characterized, a target image may be searched in order to locate regions within the target image having matching color information. In one embodiment, a coarse-to-fine heuristic may be utilized, in which multiple passes of decreasing granularity are performed. A first-pass search may operate to identify a list of candidate match regions. For example, the target image may be stepped across at a step interval, wherein color information of a target image region is characterized at each step, using the color characterization method described above. For each target image region, a measure of difference between the color characterization information of the target image region and the color characterization information of the template image may be calculated. If this difference is smaller than a threshold value, then the target image region may be added to a list of candidate regions. [0041]
  • For each candidate region, a larger area (region) proximal to the candidate region may then be searched, e.g., by stepping through the proximal area using a smaller step size than was used in the first-pass search. At each step, color information of a target image region within the proximal area may be characterized and compared to the template image color information. The target image region within the area proximal to the initial candidate region that best matches the color information of the template image may be considered a second-pass candidate region. The matching criteria used to determine whether a target image region is a second-pass candidate region are preferably stronger than the criteria used in the first-pass search, i.e., the value calculated as the difference between the color information of the target image region and the color information of the template image must be smaller than a smaller threshold value than was used in the first-pass search. [0042]
  • The process described above may be repeated for as many repetitions as desired. With each repetition, the step size used is preferably smaller and the measure of color difference preferably must be smaller in order for a region to be considered a candidate, e.g., until a predetermined number of search passes are performed or until step sizes are as small as possible and/or matching criteria are as strict as possible. Once the final repetition is performed, any target image regions that still remain as candidate matching regions may be considered as final matches. [0043]
  • The color match location method described above may be useful in many applications. For example, the method may be especially useful in applications that do not require an exact location of the template image within the target image to be determined, with sub-pixel accuracy. For example, some applications may need to very quickly determine match locations to a degree of accuracy, but may not require the locations to be determined with the degree of preciseness that may be obtained if pattern information is also used in the matching. This more coarse location determination may be suitable for many applications, e.g., to determine whether all color-coded pills are present in a blister pack. The method may also be especially suitable for applications that do not require the spatial orientation of the matches to be determined. [0044]
  • It is noted that in addition to the method described above which uses the color characterization method to locate regions of a target image that match the color information of a template image, the color characterization method may also be used to determine color similarity of a template image and a target image as a whole. For example, some applications do not require a target image to be searched for color match regions, but may simply require a determination of how closely the color information of the entire target image matches the color information of the template image. [0045]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A better understanding of the present invention can be obtained when the following detailed description of the preferred embodiment is considered in conjunction with the following drawings, in which: [0046]
  • FIG. 1 illustrates a computer system which performs color characterization and/or color matching according to one embodiment of the present invention; [0047]
  • FIG. 2 illustrates an exemplary image acquisition (video capture) system for acquiring images; [0048]
  • FIG. 3 is a high-level block diagram of the image acquisition system according to one embodiment; [0049]
  • FIGS. 4, 5A, and 5B are graphical representations of HSI color space; [0050]
  • FIG. 6 is a flowchart diagram illustrating one embodiment of a method for characterizing color information of a template image and/or a target image; [0051]
  • FIG. 7 is a flowchart diagram illustrating one embodiment of a method for analyzing image pixels in order to determine a pixel distribution among HSI color bins; [0052]
  • FIG. 8 is a flowchart diagram illustrating one embodiment of a method for locating regions of a target image that match a template image, with respect to color characterization; [0053]
  • FIG. 9 is a flowchart diagram illustrating one embodiment of a method for searching a target image to find regions having color information that match a template image; [0054]
  • FIG. 10 illustrates an example of target image window movement during a first-pass search; [0055]
  • FIG. 11 is a flowchart diagram illustrating one embodiment of a method for performing a first pass search through a target image; [0056]
  • FIG. 12 is a flowchart diagram illustrating one embodiment of a method for performing pixel sharing or re-distribution after pixels have been assigned to HSI color bins; [0057]
  • FIGS. 13A, 13B, and 13C illustrate examples using fuzzy membership functions to determine a desired fractional pixel distribution among HSI color bins; [0058]
  • FIG. 14 is a flowchart diagram illustrating one embodiment of a method of using a fuzzy membership function to characterize the color information of the image; [0059]
  • FIG. 15 illustrates an example of a graphical user interface (GUI) associated with color match location software according to one embodiment of the present invention; [0060]
  • FIG. 16 illustrates a target image in which color match locations are visually indicated; and [0061]
  • FIG. 17 illustrates a display of information representing the color characterization of an image. [0062]
  • While the invention is susceptible to various modifications and alternative forms, specific embodiments are shown by way of example in the drawings and are herein described in detail. It should be understood, however, that the drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed; on the contrary, the invention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present invention as defined by the appended claims. [0063]
  • DETAILED DESCRIPTION OF THE FIGURES
  • FIG. 1—Computer System [0064]
  • FIG. 1 illustrates a computer system 102 which may perform color match location according to one embodiment of the present invention. The computer system 102 may comprise one or more processors, a memory medium, display, and an input device or mechanism, such as a keyboard or mouse, and any other components necessary for a computer system. [0065]
  • The computer system 102 may perform a color characterization analysis of a template image and may use information determined in this analysis to locate regions of a target image which match the template image, with respect to color characterization. Images that are to be matched are preferably stored in the computer memory and/or received by the computer from an external device. [0066]
  • The computer system 102 preferably includes one or more software programs operable to perform the color match location. The software programs may be stored in a memory medium of the computer system 102. The term “memory medium” is intended to include various types of memory, including an installation medium, e.g., a CD-ROM or floppy disks 104, a computer system memory such as DRAM, SRAM, EDO RAM, Rambus RAM, etc., or a non-volatile memory such as a magnetic medium, e.g., a hard drive, or optical storage. The memory medium may comprise other types of memory as well, or combinations thereof. In addition, the memory medium may be located in a first computer in which the programs are executed, or may be located in a second different computer which connects to the first computer over a network. In the latter instance, the second computer may provide the program instructions to the first computer for execution. Also, the computer system 102 may take various forms, including a personal computer system, mainframe computer system, workstation, network appliance, Internet appliance, personal digital assistant (PDA), television system or other device. In general, the term “computer system” can be broadly defined to encompass any device having a processor which executes instructions from a memory medium. [0067]
  • The software program(s) may be implemented in any of various ways, including procedure-based techniques, component-based techniques, and/or object-oriented techniques, among others. For example, the software program may be implemented using ActiveX controls, C++ objects, Java objects, Microsoft Foundation Classes (MFC), graphical programming techniques or other technologies or methodologies, as desired. A CPU, such as the host CPU, executing code and data from the memory medium comprises a means for performing color match location according to the methods or flowcharts described below. [0068]
  • FIG. 2—Machine Vision System [0069]
•   FIG. 2 illustrates a machine vision system or image acquisition system, which is an example of one application of the present invention. The color match location techniques described herein may be used in various types of image processing, machine vision or motion control applications. For example, the computer 102 may be embodied in various form factors and/or architectures, e.g., a robot or embedded device, among others. It is also noted that the color match location techniques described herein may be performed in any of various manners, either in software, programmable logic, or hardware, or a combination thereof. [0070]
•   In the machine vision system of FIG. 2, computer system 102 is coupled to a camera 112 and operates to receive one or more images. The computer system 102 may be operable to perform a color characterization analysis to characterize the colors in a template image. In the present application, the term “template image” is used to refer to either an entire image, or a portion of an image, e.g., a region of interest (ROI). The computer system 102 may also be operable to perform a search of a target image to locate target image regions that “match” the color characterization of the template image. As described below, the search may be performed to locate matching regions with any of various degrees of exactness, as appropriate for a particular application. [0071]
  • FIG. 3—Image Acquisition System Block Diagram [0072]
  • FIG. 3 is a high-level block diagram of the image acquisition system of FIG. 2 for acquiring an image for color characterization and/or color matching according to the present invention. It is noted that the block diagram of FIG. 3 is exemplary only, and other computer system architectures may be used as desired. For example, the present invention may be implemented in a “smart camera”, which integrates a sensor, analog to digital (A/D) converter, CPU, and communications devices together in a single unit. The present invention may be embodied in other architectures, devices, or embodiments, as desired. [0073]
•   As shown in FIG. 3, the host computer 102 preferably comprises a CPU 202, a bus bridge 204, system memory 206, and a peripheral bus 212. The CPU 202 is coupled to the bus bridge 204. The bus bridge 204 is coupled to the system memory 206 and the CPU 202, and couples to the peripheral bus 212. In the preferred embodiment, the peripheral bus 212 is the PCI expansion bus, although other types of buses may be used. [0074]
•   In this embodiment, the host computer system 102 also includes a video capture board (also referred to as an image acquisition board) 214 which is adapted for coupling to the video source 112. The video capture board 214 is preferably coupled to the peripheral bus 212. In addition to the video capture board 214, other peripheral devices (216 and 218) may be coupled to the peripheral bus 212, such as audio cards, modems, graphics cards, network cards, etc. [0075]
•   The video source 112 supplies the analog or digital video signals to the video capture board 214. The video capture board 214 transfers digitized video frames to the system memory 206 through peripheral bus 212 and bus bridge 204. In this embodiment, the video capture board 214 acquires the target image and transfers it to system memory 206. The user of the computer 102 may then select one or more regions of interest (ROI) in the target image which are desired to be searched for regions having color information that matches the color information of a template image. The ROI may be the entire target image or a portion of the target image. [0076]
•   The system memory 206 may store a template image. In a color match location application, the system memory 206 may store the color characterization information of the template image instead of, or in addition to, the actual template image. The system memory 206 also preferably stores software according to one embodiment of the present invention which operates to characterize the color information (color characterization software) of images, such as the template image and/or one or more acquired or specified target images. Thus the color characterization software in the system memory may operate on the template image to produce the color characterization information. The system memory 206 may also receive and/or store one or more other images, such as selected ROIs in the template image or another image, or acquired target images or target image objects. The system memory 206 also preferably stores software according to one embodiment of the present invention which operates to perform a color match location method (color match location software), as described below. [0077]
  • The term “image,” as used herein, may refer to any of various types of images. An image may be a gray-level or color image. An image may also be a complex image, in which pixel values have a real part and an imaginary part. An image may be obtained from any of various sources, including a memory medium. An image may, for example, be obtained from an image file, such as a BMP, TIFF, AIPD, PNG, JPG, or GIF file, or a file formatted according to another image format. An image may also be obtained from other sources, including a hardware device, such as a camera, frame grabber, scanner, etc. The term “image” may also refer to an entire image or to a portion or region (ROI) of an image. [0078]
  • It is noted that, in a color match location application, the color characterization information of the template image may be pre-calculated and stored in the computer, and the actual template image is then not required to be stored or used for subsequent color match location operations with respective target images. Thus, when a target image is acquired, the color characterization software characterizes the colors in the target image and may compare this color information with the pre-computed color information of the template image. [0079]
  • The present invention is preferably implemented in one or more software programs which are executable by a processor or CPU. The software program(s) of the present invention are preferably stored in a memory medium of a computer as described above. [0080]
•   FIGS. 4, 5A, 5B—HSI Color Space [0081]
  • In one embodiment, characterizing the color information of a template image and/or target image may utilize HSI (hue, saturation, intensity) information. The HSI information of individual pixels of an image may be analyzed, and the pixel-specific results may be compiled in order to characterize the image based on color. In one embodiment, the color characterization method divides the color spectrum or color space into categories or “bins” (also called sub-spaces), primarily according to hue and saturation values, and then operates to assign pixels to respective ones of these bins. The total number of pixels (or percentage of pixels) in an image that fall into each category or bin of the color spectrum may then be used as the basis of the color characterization. [0082]
  • FIG. 4 illustrates the possible hue, saturation, and intensity values (the color spectrum) as a 3-dimensional space or volume. The color information of a given pixel may be represented as a vector or point within the 3D color space or volume shown in FIG. 4. The vector's location represents the hue, saturation, and intensity of the pixel. [0083]
  • Hue represents the color shade of a pixel and is shown as an angle of a radial line in the circle in FIG. 4. FIG. 5A illustrates a cross section of FIG. 4. As shown in FIG. 5A, hue is represented as an angular value ranging from 0-360 degrees. [0084]
  • Saturation refers to a color's freedom from mixture or dilution with white. Saturation is represented in FIG. 4 as the radial distance of a line on the circle, i.e., the distance from the center of the circle. Saturation may be more easily seen in the cross section of FIG. 5A. Saturation typically is measured in the range of 0 to 1, with 0 being at the center of the circle and 1 being at the outside perimeter of the circle. Thus, hue and saturation are essentially represented in polar coordinates to describe a point or location on the circle of FIGS. 4 and 5A. [0085]
•   Intensity, sometimes referred to as light or luminance, refers to the degree of shade in a pixel and is represented on the vertical scale of FIG. 4, i.e., vector locations above or below the circle. The terms luminance and intensity are used interchangeably throughout this description. Intensity values typically range from 0 to 1, with 0 being pure black and 1 being pure white. The intensity value 0 is represented at the apex of the bottom cone, and the intensity value 1 is represented at the apex of the top cone. [0086]
  • In one embodiment of a color match location method, the method used to characterize the color information of a template image and the method used to characterize the color information of a target image may be the same. [0087]
  • Before color characterization occurs, the color space of FIG. 4 may be partitioned into color categories. The color space may be partitioned into any number of categories or bins. The number of categories or bins determines the granularity or resolution of the color characterization. For example, for some applications a large degree of similarity between a template image and a target image region may be desired in order for the target image region to be considered as a match. Thus, a large number of categories or bins may be required in this instance. In various embodiments, user input may be received which specifies the desired complexity of the color characterization. In one embodiment, three possible complexity levels may be specified, these being low, medium, and high. [0088]
•   In the preferred embodiment, the low complexity level comprises 17 possible categories or bins. In the low complexity level, the hue plane (FIG. 5A) is divided into seven different bins (or pie-shaped wedges) 440 for the seven possible natural colors, and the saturation plane is divided into two regions, thereby creating 14 (7×2) bins. The seven possible natural colors comprise the 7 standard colors of the color spectrum, these being: red, orange, yellow, green, blue, indigo and violet. The two regions of the saturation plane are defined by a radial distance threshold 442, preferably 0.3 on a scale from 0 to 1. The seven different bins of the hue plane and the two regions or bins of the saturation plane thereby create 14 possible categories or bins in the hue/saturation plane. Three additional color categories are allotted for the pixel being characterized as black, gray, or white, thereby creating a total of 17 possible categories (14+3). [0089]
  • FIG. 5B illustrates the areas within HSI color space which may be categorized as either black, gray, or white. In general, the color of a specific pixel may be characterized as black, gray, or white if the saturation value is very low. The black, gray, and white categories are discussed in more detail below. [0090]
•   The medium complexity level may comprise 31 possible categories or bins. In the medium complexity level, the hue plane (FIG. 5A) is divided into 14 different color categories 440 and the saturation plane is divided into two regions, thereby creating 28 (14×2) bins. Thus, in the medium complexity level, the hue plane is divided into 14 pie-shaped wedges, and the saturation plane is further sub-divided into 2 regions defined by a radial distance threshold 442, preferably 0.3 on a scale from 0 to 1, thereby creating 28 possible color categories or bins in the hue/saturation plane. Three additional color categories are allotted for the pixel being black, gray, or white, thereby creating a total of 31 possible color categories (28+3). [0091]
•   The high complexity level may comprise 59 possible color categories or bins. In the high complexity level, the hue plane (FIG. 5A) is divided into 28 different bins 440, and the saturation plane is divided into two regions, thereby creating 56 (28×2) bins. Thus, in the high complexity level, the hue plane is divided into 28 pie-shaped wedges, and the saturation plane is further sub-divided into 2 regions defined by a radial distance threshold 442, preferably 0.3 on a scale from 0 to 1, thereby creating 56 possible color categories or bins in the hue/saturation plane. Three additional color categories are allotted for the pixel being black, gray, or white, thereby creating a total of 59 possible categories (56+3). [0092]
•   The saturation categorization, i.e., the location of the radial distance threshold 442, is preferably set to a default value, but may also be adjusted by the user setting the Learn Sat Threshold 604. The saturation threshold typically is only adjusted when color characterization is performed on images with little variance in color saturation. In another embodiment, the number of saturation divisions may be increased, for example, to 3 (or more), or may be decreased to 0 (i.e., colors are not divided with respect to saturation level). [0093]
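•   As an illustration of the partitioning described above, the following sketch maps a (hue, saturation) pair to one of the C×2 hue/saturation bins, with the three achromatic categories bringing the total to C×2+3. The function name and the simple angular mapping are assumptions for illustration, not a required implementation.

    # Minimal sketch: map hue (degrees) and saturation ([0, 1]) to a bin index.
    # num_hue_bins (C) is 7, 14, or 28 for the low/medium/high complexity levels.
    def hue_sat_bin(hue_deg, sat, num_hue_bins=7, sat_threshold=0.3):
        wedge = int((hue_deg % 360.0) / 360.0 * num_hue_bins)  # pie-shaped wedge 440
        ring = 0 if sat < sat_threshold else 1                 # inside/outside threshold 442
        return wedge * 2 + ring

    # Total category counts for the three complexity levels (C*2 + 3):
    for c in (7, 14, 28):
        print(c, "hue wedges ->", c * 2 + 3, "categories")     # 17, 31, 59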
  • FIG. 6—Color Characterization Method [0094]
•   FIG. 6 is a flowchart diagram illustrating one embodiment of a method for characterizing color information of a template image and/or a target image. In one embodiment, the color characterization method shown in FIG. 6 may be utilized in step 252 of the color match location method shown in FIG. 8 below, to perform a color characterization of a template image. The color characterization method shown in FIG. 6 may also be utilized in step 254 of FIG. 8 to perform color characterization on regions of a target image during a color match location search. [0095]
  • It is noted that FIG. 6 represents one particular embodiment of a color characterization method. Various applications may require different levels of sensitivity with respect to characterizing colors in a template image and/or classifying target image regions as color matches. Various applications may also have different computational efficiency requirements. Thus, in alternative embodiments, any of various color characterization methods may be utilized. [0096]
  • It is noted that, for a template image, the color characterization method shown in FIG. 6 may be performed once and the color information for the template image may be stored and used as necessary. For a target image, the method of FIG. 6 may be performed multiple times for various regions of the image as the target image is searched. [0097]
•   The embodiment illustrated in FIG. 6 involves analyzing an image with respect to HSI color information. As shown in step 260, user input may be received which specifies various color characterization method options. For example, the user input may specify a color sensitivity level to use in analyzing the image, i.e., a desired resolution of color information. In one embodiment, the user may select one of three sensitivity levels, these being low, medium, and high. As described above with reference to FIG. 5A, the sensitivity level may determine the number of categories or bins into which to divide the HSI color space. It is noted that the number of color categories may be set to any number or level, as desired. Alternatively, a default color characterization method may be used, and user input may not be used. [0098]
•   In step 262, the image may be converted to HSI format. Images are typically stored or received in RGB (Red, Green, Blue), Redness/Greenness, CMY, or HSI format. Thus, if an image is not in HSI format when received, it may be automatically converted to HSI format in step 262. The conversion process, when necessary, may analyze an image pixel by pixel, applying an algorithm that converts the current color format to the HSI format. It is noted that alternative embodiments of color characterization methods may utilize other color representation formats, such as RGB or CMY, among others. In these embodiments, for example, the RGB or CMY color spaces may be divided into color categories or bins, and pixels may be assigned to these bins. [0099]
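•   The conversion algorithm itself is not specified here; one common RGB-to-HSI conversion is sketched below purely for illustration (the function name and the arccos-based hue formula are assumptions, not part of the claimed method).

    import math

    # Sketch of one common RGB-to-HSI conversion (r, g, b each in [0, 1]).
    def rgb_to_hsi(r, g, b):
        i = (r + g + b) / 3.0                          # intensity
        s = 0.0 if i == 0 else 1.0 - min(r, g, b) / i  # saturation
        num = 0.5 * ((r - g) + (r - b))
        den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
        if den == 0:
            h = 0.0                                    # hue undefined for gray pixels
        else:
            h = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
            if b > g:
                h = 360.0 - h
        return h, s, i                                 # hue in degrees; s, i in [0, 1]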
•   In step 264, the HSI color space may be partitioned into categories or bins, such as described above with reference to FIGS. 4 and 5. The number of bins into which the space is divided may be based on the color sensitivity information received in step 260. Step 264 may simply involve storing information that specifies the different bins. [0100]
•   In step 266, the image (or ROI) may be analyzed pixel by pixel, in order to determine the pixel distribution among the HSI bins. FIG. 7 illustrates one embodiment of step 266 in detail. In one embodiment, the user may specify one or more colors which should be ignored in performing the pixel distribution. For example, the user may specify that black, gray, white or some combination of these or other HSI colors should be ignored. This may be useful, for example, if the template image and/or the target image have background colors that should be ignored for color matching purposes. [0101]
  • In one embodiment, pixels may be examined at the time that the HSI bin distribution is performed, so that pixels corresponding to certain bins are ignored. In another embodiment, this consideration may be performed after the pixel distribution is performed. For example, for each bin corresponding to a color that should be ignored, the number or percentage of pixels assigned to that bin may be set to zero after the distribution is performed, and the pixel percentages in the remaining bins may be normalized to sum to 100 percent. This latter embodiment may result in a more efficient color characterization method. [0102]
  • In the description above, each examined pixel is assigned to a single category or bin. In alternative embodiments, pixels may be assigned to multiple bins, e.g., on a weighted basis. For example, if a pixel falls near an “edge” of a bin, with respect to the portion of color space represented by that bin, then a fraction of that pixel's weight may be assigned to a neighboring bin. The determination on how to distribute a pixel among multiple bins may be performed in any of various ways, including through the use of a fuzzy membership function. Fractional distribution of pixels is further discussed below. [0103]
•   In one embodiment the color characterization method may also involve determining one or more color categories which are characterized as dominant color categories of the image, as shown in step 268, wherein the one or more dominant color categories are assigned a relatively larger proportion of image pixels, with respect to other color categories of the color space. [0104]
  • The determination of dominant color categories may be performed in any of various ways. For example, in one embodiment the categories may be sorted with respect to pixel allocation percentage, and the category with the highest percentage may then be examined. If this percentage falls at or above a certain ratio value T, which may be a default value or may be specified by a user, then this color category may be considered as a single dominant color category for the image. If this percentage is below the value T, then the color category with the next largest percentage of pixel allocation may be considered as a second dominant color category for the image, etc., until the sum of the percentages of the examined bins is at or above the value T. Thus, there may be multiple dominant color categories for an image. In one embodiment it may be required that the percentage of pixels in the largest category be at least of a certain threshold value in order for the image to have any dominant color categories. [0105]
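•   A minimal sketch of the procedure just described follows: sort the categories by pixel allocation percentage and accumulate the largest ones until the running sum reaches the ratio value T. The default values for T and for the minimum-largest-category threshold are assumptions; the patent allows both to be defaults or user-specified.

    # Sketch: determine dominant color categories from per-bin pixel fractions.
    def dominant_categories(percentages, T=0.5, min_largest=0.2):
        """percentages: per-bin pixel fractions summing to ~1.0.
        Returns the bin indices of the dominant categories, or [] if even
        the largest bin falls below min_largest."""
        order = sorted(range(len(percentages)),
                       key=lambda i: percentages[i], reverse=True)
        if not order or percentages[order[0]] < min_largest:
            return []
        dominant, total = [], 0.0
        for i in order:
            dominant.append(i)
            total += percentages[i]
            if total >= T:
                break
        return dominant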
  • In the preferred embodiment, the dominant color information is determined only for the template image, i.e., this computation may be omitted when performing a color characterization analysis of a target image region. The dominant color information of a template image may be utilized when comparing the color information of the template image to the color information of a target image, as described below. [0106]
  • FIG. 7—HSI Bin Pixel Distribution [0107]
•   FIG. 7 is a flowchart diagram illustrating one embodiment of step 266 of FIG. 6, in which pixels of an image are assigned to appropriate HSI space bins. The method shown in FIG. 7 may be performed for each pixel of an image or for only a subset of the pixels. For the template image, the method would typically be performed for each pixel, in order to obtain as much color information for the template image as possible. The color characterization analysis for the template image may only need to be performed once, and may be performed “offline”, i.e., does not need to be performed in real time as a target image is searched for color match regions. Thus, once the color characterization information has been obtained for the template image, it may not be necessary to have the template image in memory for a color match location procedure. [0108]
  • For each region of the target image that is searched, it may be desirable to examine only a subset of the region's pixels, since categorizing every pixel of the region into a bin may be computationally expensive, and many regions in the target image may need to be searched. In many cases, analyzing a subset of pixels in each target image region may be sufficient, e.g., in order to perform a coarse-grained search that identifies candidate regions that can then be analyzed in more detail. The sample pixel subset may be generated using any of various sampling techniques, such as grid-based sampling, random sampling, or other non-uniform sampling. [0109]
•   In step 412, the method determines if the intensity value of the pixel is below a certain threshold, which could be specified by the user as some small value close to 0. FIG. 5B illustrates the intensity threshold 446. The intensity threshold 446 is preferably a decreasing function of the saturation. The intensity threshold 446 may be set by the computer or in some embodiments may be selected by the user. In one embodiment, on the assumption that hue, saturation and intensity values have been normalized to range from 0 to 255, the intensity threshold BlkThreshold is specified as a function of the saturation as shown below: [0110]

    \mathit{BlkThreshold} = \begin{cases}
      \mathit{BlkGrayThreshold} & \text{for } \mathit{sat} < 10 \\
      (\mathit{BlkGrayThreshold} - 5)\,\exp[-0.025 \times (\mathit{sat} - 10)] + 5 & \text{for } 10 \le \mathit{sat} \le 200 \\
      5 & \text{for } 200 < \mathit{sat}
    \end{cases}
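•   A direct transcription of this threshold function, for illustration; the BlkGrayThreshold value is application-dependent, and the default used here (85, one third of 255) is an assumption drawn from the equal-thirds option discussed below.

    import math

    # BlkThreshold as a function of saturation (sat on a 0-255 scale).
    def blk_threshold(sat, blk_gray_threshold=85):
        if sat < 10:
            return blk_gray_threshold
        if sat <= 200:
            return (blk_gray_threshold - 5) * math.exp(-0.025 * (sat - 10)) + 5
        return 5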
•   If a pixel's intensity is smaller than BlkThreshold, then in step 414 the pixel is immediately categorized as black. In this case, no further color learning is performed on the pixel. The threshold comparison performed in step 412 saves computer cycles by not requiring further HSI analysis on a pixel that is black based strictly on its low intensity. If the intensity value of the pixel is above the intensity threshold of step 412, then operations proceed to step 416, and further color categorizations are applied. [0111]
•   In step 416, the saturation value of the pixel is examined. If the saturation of a pixel is very low, different colors are not distinguishable and the pixel may immediately be categorized as either black, gray, or white. When a pixel's saturation is close to the minimum saturation level, the pixel may be graphically represented near the origin of the circle of FIG. 5B. Step 416 determines if a pixel's saturation is lower than a selected saturation threshold 604 (FIG. 5B), i.e., is very close to 0. In one embodiment, the Saturation Threshold 604 has a default value of 10 on a scale from 0 to 255 (this corresponds to a default value of 0.04 on a scale from 0 to 1). If the saturation level of a pixel is below the saturation threshold, the pixel does not require further saturation analysis or the hue analysis of step 418, so the process advances to step 422. [0112]
•   In steps 422 and 423, a pixel (which has a very low saturation value) is examined based on its intensity value. A pixel that has very low saturation (i.e., is below the saturation threshold) is categorized as either black, gray, or white based on which portion of the intensity plane the pixel resides in. In other words, the hue and saturation analysis of step 420 is not necessary because a pixel with a saturation value less than the saturation threshold is not distinguishable from other pixels with similar saturation values and different hue values. If the pixel is on the lower portion of the intensity plane, i.e., I<=BlkGrayThreshold, the pixel is categorized as black in step 424. Otherwise, the pixel is examined in step 423 to determine whether the intensity value falls on the upper portion of the intensity plane, i.e., I>WhiteGrayThreshold. If so, then the pixel is categorized as white in step 426. Otherwise, the pixel is categorized as gray in step 427. Values for BlkGrayThreshold and WhiteGrayThreshold may be pre-specified based on the importance of black, gray, and white color in the particular application. In one embodiment, the threshold values may be set to divide the intensity plane into three equal portions, which puts the same weight on black, gray, and white colors. After a pixel is categorized as either black, gray, or white, the method continues to step 428. [0113]
•   If the saturation of a pixel is more than the saturation threshold 604 in step 416, then hue and saturation analysis are performed in step 420. In step 420, the hue and saturation values of the pixel are analyzed, and the pixel is assigned to one of the bins in the hue/saturation plane based on these values. [0114]
•   As described above, FIG. 5A illustrates the hue/saturation plane, wherein hue is categorized by a color's angular orientation (from 0 to 360 degrees) on the cross sectional plane of FIG. 5A, and saturation is categorized as the color's radial distance on the cross sectional plane of FIG. 5A. Hue characterization may divide the hue plane into, for example, 7, 14, or 28 bins (for low, medium, or high complexity) depending on a selected color sensitivity, such as shown in FIG. 15, and the bins are further split in half by a radial distance value, represented by circle 442 (FIG. 5A), that allows categorization according to saturation within each hue bin. This doubles the total number of bins, or categories, in the hue/saturation plane to 14, 28, or 56, respectively. [0115]
•   If the current pixel being analyzed is the last pixel to be analyzed as determined in step 428, then operation completes. If not, then operation returns to step 412, and steps 412-428 are repeated. The color categorization process is repeated for at least a subset of the pixels, and possibly every pixel, until all are categorized. As each subsequent pixel is categorized, a running total of the number of pixels assigned to each bin may be stored in memory. Bins and the allocation of pixels to bins may be represented in any of various ways. In the preferred embodiment, the pixels are assigned to N categories or bins, where N=C*2+3 (where C=7, 14, or 28 depending on the selected complexity). The number N of bins or color categories may of course be adjusted by changing one or more of the number of hue divisions and saturation divisions. [0116]
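•   Pulling steps 412-428 together, a minimal sketch of the per-pixel categorization follows (all values on a 0-255 scale). It reuses blk_threshold() from the sketch above; the gray/white thresholds shown divide the intensity plane into three equal portions, which is only one of the options mentioned, and sat_split approximates the 0.3 radial distance threshold on a 0-255 scale.

    # Sketch of the per-pixel categorization of FIG. 7 (h, s, i on a 0-255 scale).
    BLK_GRAY_THRESHOLD = 85      # equal-thirds split of the intensity plane
    WHITE_GRAY_THRESHOLD = 170
    SAT_THRESHOLD = 10           # default Saturation Threshold 604

    def categorize_pixel(h, s, i, num_hue_bins=7, sat_split=77):
        # Steps 412/414: very dark pixels are black; no further analysis needed.
        if i < blk_threshold(s, BLK_GRAY_THRESHOLD):
            return "black"
        # Steps 416/422-427: low-saturation pixels are black, gray, or white.
        if s < SAT_THRESHOLD:
            if i <= BLK_GRAY_THRESHOLD:
                return "black"
            if i > WHITE_GRAY_THRESHOLD:
                return "white"
            return "gray"
        # Step 420: assign one of the C*2 hue/saturation bins.
        wedge = int(h / 256.0 * num_hue_bins) % num_hue_bins
        ring = 0 if s < sat_split else 1   # sat_split ~ 0.3 on a 0-255 scale
        return wedge * 2 + ring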
•   After each pixel has been examined and assigned to one of the N categories, in step 430 the method may calculate color parameters, such as the percentage of pixels in each bin, i.e., the number of pixels in each bin in relation to the total number of pixels examined. These calculations will result in N percentages whose sum is equal to 100%. Percentages are used, rather than raw data, to allow matching of differently shaped, scaled and rotated images. It is noted that other types of color parameters may be generated, e.g., other types of normalized values which are independent of the number of pixels in the image object. The color characterization for the image thus may produce a list or data structure that contains N percentage values or parameters representing the color characterization of the image. [0117]
  • As noted above with reference to FIG. 6, in one embodiment, a user may specify one or more colors in the image to be ignored. In this case, the percentage of pixels in each bin corresponding to an ignored color may be set to zero, and the percentages for the remaining bins may be normalized to result in a total of 100%, or pixels corresponding to these bins may not be assigned to the bins at all, which would automatically result in a zero percentage for these bins. [0118]
  • FIG. 8—Color Match Location Method [0119]
  • FIG. 8 is a flowchart diagram illustrating one embodiment of a method for locating regions of a target image that match a template image, with respect to color characterization. [0120]
•   In step 250, a template image may be received. The template image may be an image of any of various types, including gray-level and color images. The template image may be received or obtained from any of various sources and may be an entire image or may be a portion of an image, e.g., a region of interest specified by a user. For example, a user may select a region of interest (ROI) using a graphical user interface (GUI). In one embodiment, a GUI may enable the user to choose from many different shapes of ROIs, such as a rectangle, an oval, or a shape selected freehand. [0121]
•   In step 251, a target image may be received. Similarly as for the template image, the target image may also be an image of any of various types, including an image obtained from a memory medium or an image acquired from a hardware device, such as a camera, frame grabber, scanner, etc. The target image may also be received from any other source, including from a graphics software program, from transmission via a network, etc. A target image may also be an entire image or only a portion of an image. [0122]
  • It is noted that in alternative embodiments, multiple template images and/or target images may be received or specified. For example, it may be desirable to search multiple target images for regions having color information matching that of a template image, or it may be desirable to search for target image regions matching any of a plurality of template images. [0123]
•   In step 252, a color characterization analysis may be performed for the template image. In one embodiment, this analysis may involve dividing the HSI color space into a number of categories or “bins”. The color information of the template image pixels may then be examined in order to determine the allocation of the pixels across the bins. One particular embodiment of step 252 is described above with reference to FIG. 6. In alternative embodiments, any of various other methods may be used to perform the color characterization analysis. [0124]
•   In one embodiment, color characterization of the template image may be performed on a different computer system, and in step 250 the method may receive the color characterization information of the template image. Thus, the computer system executing the color match location software may only receive or store the color characterization information of the template image, and may not be required to store the template image itself. [0125]
•   In step 254, the target image may be searched in order to locate regions that match the template image with respect to color characterization. This search may utilize the color characterization information of the template image obtained in step 252, and may involve performing color characterization analyses for various regions of the target image and comparing the color characterization of each of these regions with the color characterization information of the template image. Step 254 may be performed in any of various ways. In one embodiment the target image may be searched in multiple passes. The first pass may involve a coarse-grained search to efficiently identify a list of candidate areas or regions in the target image. Subsequent passes may then examine the candidate areas more closely in order to determine final matches. One specific embodiment of step 254 is discussed in detail below with respect to FIG. 9. [0126]
•   In step 256, color match location or analysis information may be generated. Step 256 may involve displaying information, such as visually indicating the location of the match regions within the target image, and/or displaying information indicating various statistics regarding the color information of the match regions or regarding how closely the regions match the color information of the template image. [0127]
  • FIG. 9—Target Image Search [0128]
•   FIG. 9 is a flowchart diagram illustrating one embodiment of a method for searching a target image to find regions having color information that match a template image. In one embodiment, the target image search method shown in FIG. 9 may be used in step 254 of the color match location method shown in FIG. 8. In alternative embodiments, any of various other search methods may be used, as desired for a particular application. The target image search method shown in FIG. 9 utilizes a coarse-to-fine heuristic, in which candidate color match areas of the target image are identified in a first-pass search, and these candidate areas are then examined in more detail to identify final color match regions. [0129]
  • Each region of the target image that is examined may be regarded as a window into the target image. This window may have various sizes. For example, the window size may correspond exactly to the size of the template image, or the window size may be scaled to be larger or smaller than the template size. The window may be moved through the target image in order to sample the image at various regions. The points at which to sample regions may be determined in any of various ways. In one embodiment, the window may initially be positioned at the top, left corner of the target image and may then be moved through the image at interval steps. For each sample region, the color information of the region may be compared with the color information of the template image, as described below. [0130]
•   FIG. 10 illustrates an example of window movement during a first-pass search, in which the window begins at the top, left corner of the target image and is moved through the target image using a step size of nine pixels. After an initial color comparison between the template image and the top, left portion of the target image has been performed in FIG. 10A, the window, for example, is moved downward 9 pixel scan lines as shown in FIG. 10B. After this portion of the target image is compared to the template image, the window is moved another 9 scan lines downward as shown in FIG. 10C. The comparisons are repeated until the window reaches the bottom left portion of the target image, as shown in FIG. 10D. After this comparison, the window, for example, is moved back to the top of the target image and is moved over 9 vertical pixel columns to perform another comparison, as shown in FIG. 10E. After this comparison is performed in FIG. 10E, the window is moved down 9 horizontal scan lines of pixels as shown in FIG. 10F. This procedure again repeats a plurality of times until the window again reaches the bottom of the target image. At this point, the window is moved back to the top of the target image and across 9 more vertical columns of pixels (not shown) to perform another set of comparisons. This procedure may be performed until the window has been stepped through the entire target image, using a 9 pixel step size. [0131]
•   It is noted that FIGS. 10A-10F are merely an example of stepping the window across the target image; the window may be stepped across the target image using any of various step sizes and in any of various manners, e.g., left to right, right to left, top to bottom, bottom to top, or other methodologies. Also, the target image may not necessarily be sampled at regular step intervals. For example, window placement may be chosen using any of various algorithms, or may be chosen randomly, quasi-randomly, etc. [0132]
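•   A sketch of the column-wise scan of FIG. 10, generating top-left window positions at a fixed step size (the generator form is an illustrative choice):

    # Sketch: enumerate window positions for a first-pass scan (step of 9 pixels,
    # moving down each column of positions, then over, as in FIGS. 10A-10F).
    def window_positions(image_w, image_h, window_w, window_h, step=9):
        for x in range(0, image_w - window_w + 1, step):
            for y in range(0, image_h - window_h + 1, step):
                yield x, y   # top-left corner of the region to compare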
•   In step 450 of FIG. 9, user input specifying various search options may be received. For example, the search options may specify various parameter values affecting the degree of granularity used for deciding color matches and/or the efficiency of the target image search process. In one embodiment, the user may specify one of three options: “conservative”, “balanced”, or “aggressive”, which each control various search parameters, such as described below with reference to FIG. 11. In other embodiments search parameters may be specified individually. [0133]
•   In step 452, a first-pass search through the target image may be performed in order to find initial color match candidate areas, i.e., areas that may contain a region having color information that matches the color information of the template image. One embodiment of step 452 is described below with reference to FIG. 11. [0134]
•   In step 454, each candidate area identified in step 452 may be examined in more detail. In the first-pass search, various regions of the target image may be sampled at a relatively large step size, in order to efficiently identify areas containing a possible match. In step 454, for each candidate area, the search window may initially be placed at the position where the window was during the first-pass search when the candidate area was identified. The window may then be moved around this initial position at a reduced step size in order to perform a finer-grained search, so that the best matching region for each candidate area is determined. The new step size may be inversely proportional to how well the initial candidate matched the template image. In other words, a “hill-climbing” heuristic may be used, such that if the initial candidate is very close to the template image, smaller steps are taken so that the best match is not stepped over. Various methods for determining how close the color information of a target image region is to the color information of the template image are discussed below. [0135]
•   During the search performed in step 454, the window may be moved around each candidate area using any of various strategies or algorithms. However, the distance that the window may be moved away from the original candidate's position is preferably limited, e.g., as a function of the size of the window and/or the step size used in the first-pass search. In one embodiment, if it is determined that the degree to which the target image color information matches the template image color information is decreasing as the window moves away from its initial position, then searching in that direction may be aborted, in order to avoid unnecessary comparisons. [0136]
•   As discussed above with reference to FIG. 8, when the color information for a target image region is analyzed, it may be desirable to examine the color information for only a subset of the individual pixels of the region, e.g., in order to search through the target image more quickly. The sub-sampling size for each target image region may be determined by search criteria specified by the user. In step 454, it may be desirable to increase the sub-sampling size used in analyzing the color information for the target image over the sub-sampling size used in the first-pass search, in order to possibly obtain more accurate color characterization information. [0137]
•   In various embodiments, step 454 may comprise performing one or more subsequent passes through the candidate list after the first pass. For example, if desired, the coarse-to-fine search heuristic may be repeated, possibly only for certain candidates, using successively smaller step sizes, and/or larger sub-sampling sizes, e.g., until the step size is reduced to one pixel and every pixel of the target image region is sampled. The desired number of passes performed and the rate at which the search parameters change between passes may differ according to the accuracy and efficiency requirements of particular applications. [0138]
•   Each initial candidate area identified in the first-pass search may be replaced by the region found in step 454 having color information that best matches the color information of the template image (or may not be replaced if no better match is found). Also, it is possible that candidate areas identified during a previous pass are eliminated altogether in a subsequent pass. For example, since the step size may be relatively large during the first-pass search, the match criteria for identifying candidates may be relatively loose, i.e., a target image region may not need to match the template image very closely in order to be considered a candidate match area. As candidate regions are examined more thoroughly in subsequent passes, it may be desirable to require the color information of each candidate to match the template image more strongly in order to remain a candidate. [0139]
  • In one embodiment, information regarding an expected number of matches to be found in the target image may be utilized in order to more quickly complete the color match location process. For example, FIG. 15 illustrates a graphical user interface enabling a user to specify an expected number of matches. In this case, the method may limit the number of color match candidate regions that are searched to a maximum number based on the expected number of matches. In one embodiment, this maximum number may be calculated with a formula such as: [0140]
  • Max=Base+Factor*NumberExpected
  • where “Base” and “Factor” are configurable variables. [0141]
  • The list of candidate regions identified in the first-pass search through the target image may be sorted with respect to how well the color information of each candidate region matches the color information of the template image, and in a subsequent search pass, the list of candidate regions may be traversed in this sorted order. The maximum number calculated based on the number of expected matches may be used to limit the number of candidate regions that are searched in a subsequent pass. Since the first-pass search may use relatively loose matching criteria, the first-pass search may identify a large number of candidate regions. The method may operate to keep track of the number of candidates remaining after a subsequent pass. If the maximum number is reached, then a traversal of the remaining first-pass candidate regions may be avoided. In one embodiment, however, if the color difference between a given candidate region and the template image is smaller than a certain threshold value, then that candidate region may be traversed regardless of whether or not a maximum number of subsequent-pass candidates has already been reached. [0142]
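•   A sketch of this candidate-limiting logic follows; the Base, Factor, and strong-match threshold values are placeholders for the configurable variables described above.

    # Sketch: limit subsequent-pass traversal to Max = Base + Factor * NumberExpected
    # candidates, but always traverse candidates whose color difference is below
    # a strong-match threshold.
    def select_candidates(candidates, number_expected, base=10, factor=3,
                          strong_match=0.1):
        """candidates: (difference, region) pairs sorted by ascending difference."""
        max_count = base + factor * number_expected
        return [cand for idx, cand in enumerate(candidates)
                if idx < max_count or cand[0] < strong_match]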
•   In step 456, each of the candidate regions determined after the one or more passes performed in step 454 may be scored, based on the difference between their color characterization information and the color characterization information for the template image. The color differences may be calculated in any of various ways. Particular embodiments of color difference methods are discussed below. Any of various systems may be used to score the candidate regions. In one embodiment, each region is assigned a score from 0 to 1000, with 1000 being the best possible match and 0 being the worst. [0143]
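•   The mapping from color difference to score is not fixed here; one plausible sketch, assuming a linear rescaling of the 0-to-2 color spectrum difference described below, is:

    # Sketch: map a color spectrum difference (0 = identical spectrums,
    # 2 = maximally different) onto a 0-1000 match score.
    def match_score(difference):
        return round(1000 * (1.0 - difference / 2.0))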
•   In step 458, a final list of color match regions may be generated, based on the scores determined in step 456. For example, the scores may be compared to a threshold value that is used to eliminate regions scoring below a certain level. This threshold value may be a default value or may be specified from the user input received in step 450. [0144]
  • FIG. 11—First-Pass Search [0145]
•   FIG. 11 is a flowchart diagram illustrating one embodiment of a method to perform the first-pass search illustrated in step 452 of FIG. 9. As discussed above, in one embodiment, the first-pass search may involve sampling various regions of the target image, where the regions that are sampled may be determined by a window that slides along the target image according to a particular step size. Thus, in step 470 the method may determine an appropriate step size to use in sliding the window. The step size may at least in part be determined based on user input received in step 450 of FIG. 9. For example, if the user specified aggressive search criteria, then the step size may be relatively large, whereas the step size may be relatively small if the user specified conservative search criteria. In various embodiments, the step size may also depend on the size of the template image and/or the target image. [0146]
•   For each region that is sampled, the color information for the region may be analyzed, similarly as for the template image. However, as described above, it may not be desirable to examine the color information of every pixel in the region. Thus, in step 472, a sub-sampling size and/or method may be determined, wherein the sub-sampling size specifies the number of pixels to examine for each region. The sub-sampling method may specify the type of sub-sampling, such as random, pseudo-random, or a low discrepancy sequence. In one embodiment, the method may use a low discrepancy sequence to select the subset of pixels. Similarly as for the step size, the sub-sampling size and/or method may depend on search criteria specified by the user. [0147]
•   As shown in FIG. 11, steps 474 through 480 may then be performed for each region of the target image to be sampled. [0148]
•   In step 474, a color characterization analysis for the target image region may be performed. This step may utilize the color characterization method described above with reference to FIG. 7, in which the target image pixels (or a selected subset of pixels) are examined individually with respect to their color information and assigned to color space bins. In step 476, a measure of difference (or similarity) between the color spectrum of the target image region and the color spectrum of the template image may be computed by comparing the information obtained in their respective color characterization analyses. This comparison may be performed in any of various ways. In one embodiment, for each color bin from a set of N bins, the pixel percentage values assigned to corresponding bins for the two images may be subtracted from one another, resulting in N difference values. The closer each of the difference values is to zero, the more similarity there is between the template image and the target image region, with respect to that color category; i.e., the percentage of pixels in the template image and the target image region that fall into that particular color category are substantially the same. [0149]
  • The absolute values of the difference values may then be summed to give a value falling between zero and two, where two represents a maximum measure of difference between the color spectrums and zero represents a maximum measure of similarity. Alternatively, each of the difference values may be compared to a threshold value to determine a “score” for each color category. [0150]
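•   As a sketch, the difference measure just described is simply an L1 distance between the two normalized bin distributions:

    # Sketch: color spectrum difference as the sum of absolute per-bin
    # differences between two distributions that each sum to 1.0.
    # The result falls in [0, 2]: 0 = identical, 2 = maximally different.
    def spectrum_difference(template_pcts, target_pcts):
        return sum(abs(t, ) if False else abs(t - r) for t, r in zip(template_pcts, target_pcts))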
  • While the above method is simple to apply and the results are easily understood, this method may not be the best method for all color matching applications. For example, consider a case where at least one of the seven natural colors of the hue plane is divided into two or more bins, e.g., in response to a user specifying a medium or high sensitivity level. Even if the template image and the target image region have colors that are very similar, it is still possible that pixels from each will be assigned to different bins corresponding to the same natural color in the hue plane. Thus, the results from this example may show very few or no pixels in the same bin, i.e., the results would indicate that the template image and the target image region have very different color spectrums. This may not be the proper result because the colors in the template image and the target image region are actually very similar, but happen to be in different hue categories of the same natural color. [0151]
  • Alternative color spectrum techniques may compensate for cases such as described above. In various embodiments, a portion of the percentages of pixels assigned to each bin may be manipulated, in order to share pixels among or re-distribute pixels to neighboring bins, before calculating the measure of color spectrum difference as described above. [0152]
•   FIG. 12 is a flowchart diagram illustrating one embodiment of a method for performing this type of pixel sharing or re-distribution. As shown, in step 502 the level of sharing or distribution may be determined according to a color sensitivity level specified by the user. In one embodiment, each bin shares with zero bins, one neighboring bin on each side, or two neighboring bins on each side, depending on a specified sensitivity level of low, medium, or high, respectively. In another embodiment, the level of sharing or distribution with neighboring bins may be determined automatically by the computer, e.g., if a certain threshold of pixels of the template image and the target image region fall into respective neighboring bins (as in the example above), then the method may automatically apply a level of sharing or distribution. Thus, the method may automatically detect and compensate for the types of errors described above. [0153]
•   In step 504, the pixel allocation percentages may be re-distributed among neighboring bins. Step 504 may be performed in any of various ways. For example, in one embodiment, a respective bin that contains 40% of all pixels may share 10% of its total with the neighboring bins on either side. In other words, 4% (10% of 40%) may be added to the neighboring bins on either side of the respective bin. This would leave 32% in the respective bin (40%−4%−4%=32%). The neighboring bins may then undergo the same sharing process, and a certain percent may be shifted back as well as a certain percent being shifted to another neighboring bin, and so on. Any of various other methods may be used in re-distributing the pixel percentages. These types of adjustments may have an effect similar to adding additional bins, making the results smoother. Hence, these types of adjustments may be referred to as “smoothing operations”. A smoothing operation may be performed for both the template image and the target image region. [0154]
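•   A minimal sketch of such a smoothing operation, assuming the percentage list is ordered so that adjacent entries are neighboring hue bins and that the hue bins wrap around circularly:

    # Sketch: each bin shares a fraction of its total with the neighboring bin
    # on each side (share=0.10 reproduces the 40% -> 32% + 4% + 4% example).
    def smooth(pcts, share=0.10):
        n = len(pcts)
        out = [0.0] * n
        for i, p in enumerate(pcts):
            out[i] += p * (1.0 - 2.0 * share)
            out[(i - 1) % n] += p * share   # neighbor on one side
            out[(i + 1) % n] += p * share   # neighbor on the other side
        return out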
•   In step 506, the compensated percentages of the template image and target image region may then be compared. For example, step 506 may involve subtracting percentages in respective bins of the template image and target image region and summing the results, similarly as described above. This may produce a value representing a measure of difference between the color information of the template image and the color information of the target image region. [0155]
  • It may sometimes be desirable to distribute pixels among multiple bins, based not only on bin percentages, but also on where the pixels fall in the bins, in terms of the portions of color space represented by the bins. For example, as described above with reference to FIG. 7, pixels may be assigned to multiple bins at the time when the color characterization analysis is performed, e.g., on a fractional weighted basis. This increased complexity may result in more accurate color match location results. [0156]
  • Various embodiments may employ fuzzy membership or other functions to determine the desired distribution among multiple bins. FIG. 14 is a flowchart diagram illustrating one embodiment of a method using fuzzy membership functions to characterize the color information of the image. The steps shown in FIG. 14 may be performed for each pixel (possibly of a subset) of the image (or region of the image being examined). [0157]
•   As shown, in step 900, the pixel may be assigned to a bin. For example, as described above, step 900 may comprise examining color information of the pixel to determine where the pixel lies within the color space and assigning the pixel to a bin corresponding to that portion of the color space. [0158]
•   In step 902, a fuzzy membership or other function may be applied, based on where the pixel falls within the bin. As described above, the bin corresponds to a portion of the color space, and the color information of the pixel may correspond to a point within the color space. Thus, the pixel may fall within the bin at various locations, with respect to the range of color space values corresponding to the bin. [0159]
•   The fuzzy membership function may determine a contribution which the pixel should make to one or more neighboring bins. For example, if the pixel falls near the edge of a bin (with respect to the portion of the color space that the bin corresponds to), then the fuzzy membership function may determine that a portion of the weight of the pixel should be contributed to the neighboring bin which the pixel is near. Any of various types of fuzzy membership functions may be applied, and the function may determine a contribution which the pixel should make to any number of bins, wherein the sum of these contributions is 100%. For example, the function may determine a plurality of values summing to 1.0, such as 0.25, 0.50, and 0.25, wherein each value corresponds to a bin. [0160]
•   In step 904, the weight of the pixel may be distributed across the bin to which the pixel was originally assigned and across the one or more bins to which the pixel contributes, in accordance with the contribution values determined in step 902. For example, values determined by the function in step 902, such as the above exemplary values of 0.25, 0.50, and 0.25, may each be assigned to a corresponding bin. [0161]
•   It is noted that FIG. 14 is exemplary, and various alternative embodiments are contemplated, e.g., in which various steps may be combined, omitted, reordered, or altered. For example, step 900 may not need to be performed; instead, the function may determine a plurality of values to assign to a plurality of color space bins, based on the color information of the pixel and the location of the pixel within the color space, and not necessarily based on the location of the pixel within a bin. [0162]
•   FIGS. 13A, 13B, and 13C illustrate examples of how fuzzy membership functions may be utilized. FIG. 13A illustrates a triangle fuzzy membership function. In FIG. 13A, the 360-degree hue plane is divided into seven bins, which are shown linearly. For each pixel, the bin that the pixel falls into may be determined, as well as the position within this bin. The triangle fuzzy membership function may then be applied, based on the position within the bin, in order to determine a percentage of the pixel weight which should be assigned to that bin and/or to a neighboring bin. This is represented by the angular lines drawn over the bins. In this example, if a pixel falls exactly within the center of a bin, then, as shown, 100% of the pixel weight is assigned to that bin. If a pixel falls one fourth away from the edge of the bin, then 75% of the pixel weight is assigned to that bin, and 25% of the pixel weight is assigned to the neighboring bin next to that edge, as indicated by the dashed lines. [0163]
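•   A sketch of the triangle membership function of FIG. 13A (hue in degrees; the wrap-around between the first and last bins is an assumption, since hue is circular):

    # Sketch: triangle fuzzy membership over num_bins hue wedges. A pixel at a
    # bin's center contributes 100% to that bin; a pixel a quarter of a bin
    # width from an edge contributes 75%/25% between that bin and its neighbor.
    def triangle_weights(hue_deg, num_bins=7):
        width = 360.0 / num_bins
        pos = (hue_deg % 360.0) / width
        b = int(pos)              # bin the pixel falls into
        frac = pos - b            # position within the bin, in [0, 1)
        if frac < 0.5:            # nearer the bin's lower edge
            return {b: 0.5 + frac, (b - 1) % num_bins: 0.5 - frac}
        return {b: 1.5 - frac, (b + 1) % num_bins: frac - 0.5}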
  • FIG. 13B illustrates a trapezoidal fuzzy membership function. In this example, if the pixel falls near the center of a bin, then 100% of the pixel weight is assigned to that bin. Otherwise, a portion of the pixel weight may be distributed to a neighboring bin, similarly as in FIG. 13A. [0164]
•   FIG. 13C illustrates another example of distributing a pixel among multiple bins. In the example of FIG. 13C, a step fuzzy membership function is applied. [0165]
•   Consider the bin labeled “Bin X.” If the pixel falls to the left of the center line shown, i.e., in terms of the pixel's hue, then Bin X is assigned 80% of the pixel weight, Bin X-1 is assigned 15% of the weight, and Bin X-2 is assigned 5% of the pixel weight. Thus, in this example, pixels may be distributed across three bins. Increasing the number of bins over which a pixel is distributed may be especially desirable when the hue space is partitioned into a large number of bins. It is noted that the fuzzy membership functions shown in FIGS. 13A, 13B, and 13C are exemplary, and any other technique may be used in determining an appropriate pixel distribution. [0166]
•   As noted above, in one embodiment, information indicating one or more dominant color categories may be obtained when performing a color characterization analysis of a template image. Referring again to FIG. 11, in step 478, a measure of difference for the dominant color categories may be computed. This measure of difference may be computed similarly as described above for the color spectrum difference. For example, for each dominant color category determined for the template image, the percentage of template image pixels assigned to the dominant color category may be compared to the percentage of target image region pixels assigned to that color category. [0167]
  • In [0168] step 480, the difference values determined in steps 476 and 478 may be used to decide whether to add the region to a list of candidate match areas. For example, the color spectrum difference may need to be less than a threshold value in order for the region to be added to the list. It is noted that the color spectrum difference may be tested immediately after its calculation, and further analysis of the sample region, such as step 478, may be aborted if the difference is too great.
  • If the color spectrum difference is sufficiently small, then the dominant color difference(s) may be considered. Considering the dominant color difference(s) may help to further ensure that the sample region is a potential match, since in various embodiments of the color spectrum difference calculation it is possible to obtain a small difference value even when the occurrence of the dominant color(s) of the template image is largely reduced, or missing altogether, in the sample region. Dominant color differences may be considered individually or together. For example, if there are multiple dominant color categories, then the percentage difference for each category may be required to be smaller than a threshold value in order for the region to be added to the candidate list, or the average of the differences for all the categories may be required to be smaller than a threshold value; a sketch of such a two-stage test follows. [0169]
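The two-stage screening of steps 476 through 480 might be summarized by the following sketch; the L1 form of the color spectrum difference, the threshold values, and the histogram representation are assumptions made for illustration, not the claimed computation.

```python
# Illustrative two-stage test: a region is a candidate only if its color
# spectrum is close to the template's AND each dominant color category is
# still well represented in the region.
def is_candidate(template_hist, region_hist, dominant_bins,
                 spectrum_threshold=0.2, dominant_threshold=0.1):
    # Assumed L1 measure of the color spectrum difference.
    spectrum_diff = sum(abs(t - r) for t, r in zip(template_hist, region_hist))
    if spectrum_diff >= spectrum_threshold:
        return False   # abort further analysis of this region
    # Per-category test on the dominant colors.
    return all(abs(template_hist[b] - region_hist[b]) < dominant_threshold
               for b in dominant_bins)
```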
  • FIG. 15—Color Match Location User Interface [0170]
  • FIG. 15 illustrates an example of a graphical user interface (GUI) associated with color match location software according to one embodiment of the present invention. A brief description of applicable GUI elements is given below. It is noted that various other embodiments of such a GUI may comprise GUI elements enabling the user to specify variables affecting the color match location operation at a broader or finer level of granularity than the GUI shown in FIG. 15. [0171]
  • The GUI of FIG. 15 is associated with an application that is operable to perform match location of regions in a target image based on both color information of a template image and shape or pattern information of the template image. Thus, certain GUI elements pertain to this shape or pattern information. [0172]
  • “Image Type” displays the color format of the current target image. Color formats may include RGB, CMY, or HSI, among others. [0173]
  • “Learn Mode” specifies the invariant features to learn when setting up a learn color pattern. The following values may be selected: “All” (extracts template information for shift and rotation invariant matching); “Shift Information” (Default) (extracts information for shift invariant matching); “Rotation Information” (extracts information for rotation invariant matching). [0174]
  • “Ignore Black and White” enables the user to specify that pixels classified as black and/or white are ignored in the match location operation. [0175]
  • “Feature Mode” specifies the features to use in the searching stage. The following values may be chosen: “Color” (use color features only in the searching stage); “Shape” (use shape features in the searching stage); and “Color and Shape” (Default) (use both color and shape features in the searching stage). [0176]
  • “Color Sensitivity” specifies a level of color sensitivity (“low”, “medium”, or “high”). This setting may affect the number of color category divisions to use. [0177]
  • “Search Strategy” specifies the search algorithm to use, achieving a tradeoff between search speed and accuracy. The default option is “Balanced”. If the speed does not meet requirements, the “Aggressive” option may be used; if the accuracy does not meet requirements, the “Conservative” option may be used. [0178]
  • “Number of Matches Expected” specifies a number of matching regions the user expects the target image to have, which may be used to increase the efficiency of the color match location process, as described above. [0179]
  • “Match Mode” specifies the technique to use when looking for the template pattern in the image. The following values may be chosen: “Shift Invariant” (default) (searches for the template pattern in the image, assuming that it is not rotated more than ±4°); “Rotation Invariant” (searches for the template in the image with no restriction on the rotation of the template). If the “Feature Mode” is set to “Color” only, then rotation invariant matching can also be achieved by using a square template image in “Shift Invariant” mode. [0180]
  • “Minimum match score” specifies a threshold value for color matching scores. The data range is between 0 and 1000. [0181]
  • As shown, the GUI also includes various fields for viewing information for each matching region of the target image once the search has been performed, such as the location and size of the region, a match score indicating how closely the color information of the region matches the color information of the template image, etc. [0182]
  • FIG. 16—Displaying Color Match Regions [0183]
  • In addition to displaying various statistics regarding each match region found, as shown in FIG. 15, the locations of the match regions may also be visually indicated in the target image, e.g., by displaying a box around each match region, as shown in FIG. 16. [0184]
  • FIG. 17—Displaying Color Characterization Information [0185]
  • In one embodiment, an application may be operable to display information representing the color characterization of an image. FIG. 17 illustrates one example of such a display. FIG. 17 shows the percentage (vertical scale) of 16 defined colors (horizontal scale) as determined by one embodiment of the color characterization method described herein. [0186]
  • The color characterization list or data structure may further be operated upon to create a color characterization represented as a single value. The color characterization may also be represented textually (e.g., by the terms brick red, jet black, mauve, etc.) through the use of a look-up table configured according to the color categorization method of the present invention. The color characterization may also be represented graphically in various ways. The color characterization may be stored along with the image or transmitted to other computer systems for analysis or display. The color characterization may also be used as part of an image compression technique. [0187]
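As one illustration of the textual representation mentioned above, the dominant color category might be mapped to a name through a small look-up table; the names and bin layout here are invented for the example and do not reflect the actual categorization method.

```python
# Hypothetical look-up: map the dominant color category to a textual name.
COLOR_NAMES = {0: "brick red", 1: "orange", 2: "yellow", 3: "green",
               4: "cyan", 5: "blue", 6: "mauve"}

def characterize_as_text(histogram):
    dominant_bin = max(range(len(histogram)), key=lambda i: histogram[i])
    return COLOR_NAMES.get(dominant_bin, "unknown")
```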
  • Although the embodiments above have been described in considerable detail, numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications. [0188]

Claims (45)

1. A computer-implemented method for characterizing colors of an image, wherein the image comprises a plurality of pixels, the method comprising:
for each respective pixel of at least a subset of pixels of the image, assigning values to one or more color categories based on color information of the pixel;
wherein, for each of one or more first pixels, said assigning comprises assigning values to a plurality of the color categories based on color information of the pixel; and
determining information regarding the total values of pixels assigned to each of the color categories, wherein said information characterizes colors of the image.
2. The method of claim 1,
wherein, for each of the one or more first pixels, said assigning comprises assigning a percentage of the pixel to each of the plurality of color categories.
3. The method of claim 2,
wherein, for each of the one or more first pixels, the sum of the percentages assigned to each of the plurality of color categories is 100 percent.
4. The method of claim 1,
wherein each of the one or more color categories corresponds to a portion of a color space.
5. The method of claim 4,
wherein, for each respective pixel of the at least a subset of pixels, said assigning values to the one or more color categories based on color information of the pixel comprises:
determining a location of the pixel within the color space;
applying a function based on the location of the pixel within the color space to determine the values assigned to the one or more color categories.
6. The method of claim 5,
wherein said determining the location of the pixel within the color space comprises examining color information of the pixel.
7. The method of claim 6,
wherein the color space is the Hue, Saturation, Intensity (HSI) color space;
wherein said examining color information of the pixel comprises examining HSI information of the pixel.
8. The method of claim 5,
wherein the function is a fuzzy membership function.
9. The method of claim 8,
wherein the fuzzy membership function is one of:
a triangle fuzzy membership function;
a trapezoidal fuzzy membership function; and
a step fuzzy membership function.
10. The method of claim 1, further comprising:
selecting the subset of pixels of the image, wherein the subset of pixels characterize the image, wherein the subset is selected using one or more of a random selection, a grid-based selection, or a Low Discrepancy sequence selection.
11. The method of claim 1,
wherein said assigning values to one or more color categories based on color information of each pixel comprises creating a data structure having values representing the total values of pixels assigned to each of the color categories;
wherein said determining information regarding the total values of pixels assigned to each of the color categories comprises determining the values of the data structure.
12. A computer-implemented method for characterizing colors of an image, wherein the image comprises a plurality of pixels, the method comprising:
for each respective pixel of at least a subset of pixels of the image, determining contributions of the pixel to one or more color categories;
wherein, for each of one or more first pixels, said determining comprises determining contributions of the pixel to a plurality of the color categories; and
determining information regarding the total contributions of pixels to each of the color categories, wherein said information characterizes colors of the image.
13. The method of claim 12,
wherein, for each of the one or more first pixels, said determining comprises determining a percentage of the pixel that is contributed to each of the plurality of color categories.
14. The method of claim 13,
wherein, for each of the one or more first pixels, the sum of the percentages contributed to each of the plurality of color categories is 100 percent.
15. The method of claim 12,
wherein each of the one or more color categories corresponds to a portion of a color space.
16. The method of claim 15,
wherein, for each respective pixel of the at least a subset of pixels, said determining the contributions of the pixel to the one or more color categories comprises:
determining a location of the pixel within the color space;
applying a function based on the location of the pixel within the color space to determine the contributions of the pixel to the one or more color categories.
17. The method of claim 16,
wherein said determining the location of the pixel within the color space comprises examining color information of the pixel.
18. The method of claim 17,
wherein the color space is the Hue, Saturation, Intensity (HSI) color space;
wherein said examining color information of the pixel comprises examining HSI information of the pixel.
19. The method of claim 16,
wherein the function is a fuzzy membership function.
20. The method of claim 19,
wherein the fuzzy membership function is one of:
a triangle fuzzy membership function;
a trapezoidal fuzzy membership function; and
a step fuzzy membership function.
21. The method of claim 12, further comprising:
selecting the subset of pixels of the image, wherein the subset of pixels characterize the image, wherein the subset is selected using one or more of a random selection, a grid-based selection, or a Low Discrepancy sequence selection.
22. The method of claim 12,
wherein said determining contributions of each pixel to one or more color categories comprises creating a data structure having values representing the total contributions of pixels to each of the color categories;
wherein said determining information regarding the total contributions of pixels to each of the color categories comprises determining the values of the data structure.
23. A computer-implemented method for characterizing the color information of an image, wherein the image comprises a plurality of pixels, wherein a color space representing possible colors of the pixels is divided into a plurality of bins, the method comprising:
for each of at least a subset of pixels of the image:
examining color information of the pixel to determine a bin corresponding to the color information of the pixel;
applying a function based on a location of the pixel within the bin to determine a contribution of the pixel to one or more neighboring bins;
assigning values to the bin and the one or more neighboring bins based on the determined contributions of the pixel to the one or more neighboring bins;
wherein the total assigned values across the bins of the color space characterize the color information of the image.
24. The method of claim 23,
wherein the color space is the Hue, Saturation, Intensity (HSI) color space;
wherein said examining color information of the pixel comprises examining HSI color information of the pixel.
25. The method of claim 23,
wherein the function is a fuzzy membership function.
26. The method of claim 23,
wherein a first pixel falls within a first bin;
wherein the location of the first pixel within the first bin is a first distance away from a second bin, with respect to color space difference, wherein the second bin neighbors the first bin, with respect to the portion of color space to which the first and second bins correspond;
wherein a second pixel falls within the first bin;
wherein the location of the second pixel within the first bin is a second distance away from the second bin, with respect to color space difference;
wherein the second distance is less than the first distance;
wherein applying the function based on the location of the first pixel within the first bin results in determining a contribution of the first pixel to the second bin;
wherein applying the function based on the location of the second pixel within the first bin results in determining a contribution of the second pixel to the second bin;
wherein the contribution of the second pixel to the second bin is greater than the contribution of the first pixel to the second bin.
27. A computer-implemented method for determining a similarity of colors between a template image and a target image, wherein the template image and the target image each comprise a plurality of pixels, the method comprising:
determining color information of the template image, wherein said color information comprises information regarding assigned values of at least a subset of template image pixels to each of a plurality of color categories;
determining color information of the target image, wherein said color information comprises information regarding assigned values of at least a subset of target image pixels to each of the plurality of color categories;
determining a similarity of colors between the template image and the target image, based on the color information of the template image and the color information of the target image;
wherein, for one or more template image pixels or one or more target image pixels, a value is assigned to more than one color category.
28. The method of claim 27,
wherein for each of the one or more template image pixels or one or more target image pixels for which a value is assigned to more than one color category, a percentage of the pixel is assigned to each of the more than one color categories, wherein the sum of the percentages is 100 percent.
29. The method of claim 27,
wherein each of the color categories corresponds to a portion of a color space.
30. The method of claim 29, further comprising:
for each of the at least a subset of template image pixels, determining a location of the pixel within the color space, based on color information of the pixel; and
for each of the at least a subset of target image pixels, determining a location of the pixel within the color space, based on color information of the pixel;
wherein said determining color information of the template image comprises:
for each of the at least a subset of template image pixels, assigning values to one or more color categories based on the location of the pixel within the color space;
wherein said determining color information of the target image comprises:
for each of the at least a subset of target image pixels, assigning values to one or more color categories based on the location of the pixel within the color space.
31. A computer-implemented method for characterizing colors in an image, wherein the image comprises a plurality of pixels, the method comprising:
for each respective pixel of at least a subset of the pixels, assigning the respective pixel to one or more color categories from a plurality of possible color categories, based on color information of the respective pixel;
determining information regarding the distribution of pixels across each of the color categories;
determining information regarding one or more dominant color categories, based on the information regarding the distribution of pixels across each of the color categories, wherein the one or more dominant color categories are assigned a relatively larger proportion of pixels, with respect to other color categories;
wherein the information regarding the distribution of pixels across each of the color categories and the information regarding the one or more dominant color categories characterizes colors in the image.
32. A system for characterizing colors of an image, wherein the image comprises a plurality of pixels, the system comprising:
a processor;
a memory medium coupled to the processor, wherein the memory medium stores color characterization software;
wherein the processor is operable to execute the color characterization software to:
for each respective pixel of at least a subset of pixels of the image, assign values to one or more color categories based on color information of the pixel;
wherein, for each of one or more first pixels, said assigning comprises assigning values to a plurality of the color categories based on color information of the pixel;
wherein the processor is operable to determine information regarding the total values of pixels assigned to each of the color categories, wherein said information characterizes colors of the image.
33. The system of claim 32,
wherein, for each of the one or more first pixels, said assigning comprises assigning a percentage of the pixel to each of the plurality of color categories.
34. The system of claim 32,
wherein each of the one or more color categories corresponds to a portion of a color space.
35. The system of claim 34,
wherein, for each respective pixel of the at least a subset of pixels, in assigning values to the one or more color categories based on color information of the pixel, the processor is operable to:
determine a location of the pixel within the color space;
apply a function based on the location of the pixel within the color space to determine the values assigned to the one or more color categories.
36. The system of claim 35,
wherein the function is a fuzzy membership function.
37. The system of claim 32, wherein the processor is further operable to:
select the subset of pixels of the image, wherein the subset of pixels characterize the image, wherein the processor is operable to select the subset using one or more of a random selection, a grid-based selection, or a Low Discrepancy sequence selection.
38. The system of claim 32,
wherein, in performing said assigning values to one or more color categories based on color information of each pixel, the processor is operable to create a data structure having values representing the total values of pixels assigned to each of the color categories;
wherein said determining information regarding the total values of pixels assigned to each of the color categories comprises determining the values of the data structure.
39. A memory medium comprising program instructions operable to:
for each respective pixel of at least a subset of pixels of an image, assign values to one or more color categories based on color information of the pixel;
wherein, for each of one or more first pixels, said assigning comprises assigning values to a plurality of the color categories based on color information of the pixel; and
determine information regarding the total values of pixels assigned to each of the color categories, wherein said information characterizes colors of the image.
40. The memory medium of claim 39,
wherein, for each of the one or more first pixels, said assigning comprises assigning a percentage of the pixel to each of the plurality of color categories.
41. The memory medium of claim 39,
wherein each of the one or more color categories corresponds to a portion of a color space.
42. The memory medium of claim 41,
wherein, for each respective pixel of the at least a subset of pixels, said assigning values to the one or more color categories based on color information of the pixel comprises:
determining a location of the pixel within the color space;
applying a function based on the location of the pixel within the color space to determine the values assigned to the one or more color categories.
43. The memory medium of claim 42,
wherein the function is a fuzzy membership function.
44. The memory medium of claim 39, further comprising program instructions operable to:
select the subset of pixels of the image, wherein the subset of pixels characterize the image, wherein the subset is selected using one or more of a random selection, a grid-based selection, or a Low Discrepancy sequence selection.
45. The memory medium of claim 39,
wherein said assigning values to one or more color categories based on color information of each pixel comprises creating a data structure having values representing the total values of pixels assigned to each of the color categories;
wherein said determining information regarding the total values of pixels assigned to each of the color categories comprises determining the values of the data structure.
US09/737,531 1999-08-17 2000-12-13 System and method for color characterization using fuzzy pixel classification with application in color matching and color match location Expired - Lifetime US7046842B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/737,531 US7046842B2 (en) 1999-08-17 2000-12-13 System and method for color characterization using fuzzy pixel classification with application in color matching and color match location

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/375,453 US6757428B1 (en) 1999-08-17 1999-08-17 System and method for color characterization with applications in color measurement and color matching
US09/737,531 US7046842B2 (en) 1999-08-17 2000-12-13 System and method for color characterization using fuzzy pixel classification with application in color matching and color match location

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/375,453 Continuation-In-Part US6757428B1 (en) 1999-08-17 1999-08-17 System and method for color characterization with applications in color measurement and color matching

Publications (3)

Publication Number Publication Date
US20020102018A1 true US20020102018A1 (en) 2002-08-01
US20040228526A9 US20040228526A9 (en) 2004-11-18
US7046842B2 US7046842B2 (en) 2006-05-16

Family

ID=46277179

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/737,531 Expired - Lifetime US7046842B2 (en) 1999-08-17 2000-12-13 System and method for color characterization using fuzzy pixel classification with application in color matching and color match location

Country Status (1)

Country Link
US (1) US7046842B2 (en)

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020085752A1 (en) * 2000-12-28 2002-07-04 Manabu Ohga Image processing apparatus and method
US20030083850A1 (en) * 2001-10-26 2003-05-01 Schmidt Darren R. Locating regions in a target image using color matching, luminance pattern matching and hue plane pattern matching
US20030080974A1 (en) * 2001-10-13 2003-05-01 Grosvenor David Arthur Display of static digital images
EP1574991A1 (en) * 2002-12-11 2005-09-14 Seiko Epson Corporation Similar image extraction device, similar image extraction method, and similar image extraction program
US20050285947A1 (en) * 2004-06-21 2005-12-29 Grindstaff Gene A Real-time stabilization
US20060056736A1 (en) * 2004-09-10 2006-03-16 Xerox Corporation Simulated high resolution using binary sub-sampling
US20060115127A1 (en) * 2004-11-29 2006-06-01 Dainippon Screen Mfg. Co., Ltd. Print inspection apparatus
US20060153447A1 (en) * 2002-12-05 2006-07-13 Seiko Epson Corporation Characteristic region extraction device, characteristic region extraction method, and characteristic region extraction program
US20070296987A1 (en) * 2006-06-23 2007-12-27 Realtek Semiconductor Corp. Apparatus and method for color adjustment
EP2046064A1 (en) * 2006-10-05 2009-04-08 Panasonic Corporation Light emitting display device
US20090254217A1 (en) * 2008-04-02 2009-10-08 Irobot Corporation Robotics Systems
US20090274351A1 (en) * 2008-05-02 2009-11-05 Olympus Corporation Image processing apparatus and computer program product
US20090285464A1 (en) * 2008-05-13 2009-11-19 Canon Kabushiki Kaisha Image processing apparatus and image processing method
WO2010043618A1 (en) * 2008-10-14 2010-04-22 Sicpa Holding Sa Method and system for item identification
US20110149069A1 (en) * 2009-12-22 2011-06-23 Canon Kabushiki Kaisha Image processing apparatus and control method thereof
US20120089867A1 (en) * 2010-10-06 2012-04-12 International Business Machines Corporation Redundant array of independent disk (raid) storage recovery
US20120320078A1 (en) * 2007-09-07 2012-12-20 Texas Instruments Incorporated Image intensity-based color sequence reallocation for sequential color image display
US20130129215A1 (en) * 2011-11-09 2013-05-23 Canon Kabushiki Kaisha Method and system for describing image region based on color histogram
US20140023282A1 (en) * 2012-07-18 2014-01-23 Nvidia Corporation System, method, and computer program product for generating a subset of a low discrepancy sequence
US20140253578A1 (en) * 2011-01-31 2014-09-11 Marvell World Trade Ltd. Systems and Methods for Performing Color Adjustment of Pixels on a Color Display
US8948452B2 (en) 2009-12-22 2015-02-03 Canon Kabushiki Kaisha Image processing apparatus and control method thereof
CN104978740A (en) * 2015-05-10 2015-10-14 刘畅 Component automatic measurement method based on image color feature
CN105184778A (en) * 2015-08-25 2015-12-23 广州视源电子科技股份有限公司 Detection method and apparatus
WO2017083189A1 (en) * 2015-11-13 2017-05-18 Microsoft Technology Licensing, Llc Image analysis based color suggestions
US20170180725A1 (en) * 2013-09-11 2017-06-22 Color Match, LLC Color measurement and calibration
CN109460710A (en) * 2018-10-12 2019-03-12 北京中科慧眼科技有限公司 A kind of color classification recognition methods, device and automated driving system
CN109726730A (en) * 2017-10-27 2019-05-07 财团法人工业技术研究院 Automatic optics inspection image classification method, system and computer-readable medium
US10282075B2 (en) 2013-06-24 2019-05-07 Microsoft Technology Licensing, Llc Automatic presentation of slide design suggestions
US10469807B2 (en) 2013-09-11 2019-11-05 Color Match, LLC Color measurement and calibration
CN110648305A (en) * 2018-06-08 2020-01-03 财团法人工业技术研究院 Industrial image detection method, system and computer readable recording medium
US10528547B2 (en) 2015-11-13 2020-01-07 Microsoft Technology Licensing, Llc Transferring files
US10534748B2 (en) 2015-11-13 2020-01-14 Microsoft Technology Licensing, Llc Content file suggestions
CN110865862A (en) * 2019-11-13 2020-03-06 北京字节跳动网络技术有限公司 Page background setting method and device and electronic equipment
CN110991465A (en) * 2019-11-15 2020-04-10 泰康保险集团股份有限公司 Object identification method and device, computing equipment and storage medium
CN111242836A (en) * 2018-11-29 2020-06-05 阿里巴巴集团控股有限公司 Method, device and equipment for generating target image and advertising image
CN112561487A (en) * 2020-12-21 2021-03-26 广联达科技股份有限公司 Method, device and equipment for calculating construction progress and readable storage medium
CN112884778A (en) * 2021-01-08 2021-06-01 宁波智能装备研究院有限公司 Robust machine vision target recognition and segmentation method and system
CN113837181A (en) * 2021-09-24 2021-12-24 深圳集智数字科技有限公司 Screening method and device, computer equipment and computer readable storage medium
CN114170522A (en) * 2022-02-14 2022-03-11 北京中科慧眼科技有限公司 Color classification identification method and system based on chromatographic similarity measurement
US11341759B2 (en) * 2020-03-31 2022-05-24 Capital One Services, Llc Image classification using color profiles
CN114648594A (en) * 2022-05-19 2022-06-21 南通恒强家纺有限公司 Textile color detection method and system based on image recognition
US11403723B2 (en) * 2019-12-23 2022-08-02 Shopify Inc Methods and systems for detecting errors in kit assembly
US20220319051A1 (en) * 2021-04-01 2022-10-06 Hub Promotional Group dba HPG Modifying Promotional Material Using Logo Images

Families Citing this family (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6961736B1 (en) * 2002-05-31 2005-11-01 Adobe Systems Incorporated Compact color feature vector representation
US7321623B2 (en) 2002-10-01 2008-01-22 Avocent Corporation Video compression system
US20060126718A1 (en) * 2002-10-01 2006-06-15 Avocent Corporation Video compression encoder
US8346497B2 (en) * 2003-03-26 2013-01-01 Semiconductor Energy Laboratory Co., Ltd. Method for testing semiconductor film, semiconductor device and manufacturing method thereof
US9560371B2 (en) 2003-07-30 2017-01-31 Avocent Corporation Video compression system
US20070036371A1 (en) * 2003-09-08 2007-02-15 Koninklijke Philips Electronics N.V. Method and apparatus for indexing and searching graphic elements
US7113880B1 (en) * 2004-02-04 2006-09-26 American Megatrends, Inc. Video testing via pixel comparison to known image
US7457461B2 (en) 2004-06-25 2008-11-25 Avocent Corporation Video compression noise immunity
US7545540B2 (en) * 2005-05-27 2009-06-09 Xerox Corporation Method and system for processing color image data
US20070070365A1 (en) * 2005-09-26 2007-03-29 Honeywell International Inc. Content-based image retrieval based on color difference and gradient information
US7555570B2 (en) 2006-02-17 2009-06-30 Avocent Huntsville Corporation Device and method for configuring a target device
US8718147B2 (en) 2006-02-17 2014-05-06 Avocent Huntsville Corporation Video compression algorithm
US7949186B2 (en) * 2006-03-15 2011-05-24 Massachusetts Institute Of Technology Pyramid match kernel and related techniques
US20070242065A1 (en) * 2006-04-18 2007-10-18 O'flynn Brian M Target acquisition training system and method
MY149291A (en) * 2006-04-28 2013-08-30 Avocent Corp Dvc delta commands
US7996173B2 (en) 2006-07-31 2011-08-09 Visualant, Inc. Method, apparatus, and article to facilitate distributed evaluation of objects using electromagnetic energy
JP2009545746A (en) 2006-07-31 2009-12-24 ヴィジュアラント,インコーポレイテッド System and method for evaluating objects using electromagnetic energy
US8081304B2 (en) 2006-07-31 2011-12-20 Visualant, Inc. Method, apparatus, and article to facilitate evaluation of objects using electromagnetic energy
US8560402B2 (en) * 2006-08-11 2013-10-15 Etsy, Inc. System and method of shopping by color
US20080144143A1 (en) * 2006-10-31 2008-06-19 Solechnik Nickolai D Method for color characterization and related systems
BRPI0800754A2 (en) 2008-03-25 2020-09-24 Sicpa Holding S.A. PRODUCTION CONTROL SYSTEM INTEGRATED BY IMAGE PROCESSING AND AUTOMATED CODING
KR100889026B1 (en) * 2008-07-22 2009-03-17 김정태 Searching system using image
US8004576B2 (en) * 2008-10-31 2011-08-23 Digimarc Corporation Histogram methods and systems for object recognition
US20100166303A1 (en) * 2008-12-31 2010-07-01 Ali Rahimi Object recognition using global similarity-based classifier
US8949252B2 (en) * 2010-03-29 2015-02-03 Ebay Inc. Product category optimization for image similarity searching of image-based listings in a network-based publication system
US9792638B2 (en) 2010-03-29 2017-10-17 Ebay Inc. Using silhouette images to reduce product selection error in an e-commerce environment
US9405773B2 (en) * 2010-03-29 2016-08-02 Ebay Inc. Searching for more products like a specified product
US8861844B2 (en) 2010-03-29 2014-10-14 Ebay Inc. Pre-computing digests for image similarity searching of image-based listings in a network-based publication system
US8332419B1 (en) 2010-05-13 2012-12-11 A9.com Content collection search with robust content matching
US8320671B1 (en) * 2010-06-11 2012-11-27 Imad Zoghlami Method for ranking image similarity and system for use therewith
US8412594B2 (en) 2010-08-28 2013-04-02 Ebay Inc. Multilevel silhouettes in an online shopping environment
JP5113929B1 (en) * 2011-06-24 2013-01-09 楽天株式会社 Image providing apparatus, image processing method, image processing program, and recording medium
US8890886B2 (en) 2011-09-02 2014-11-18 Microsoft Corporation User interface with color themes based on input image data
US9135497B2 (en) 2012-01-27 2015-09-15 National Instruments Corporation Identifying randomly distributed microparticles in images to sequence a polynucleotide
WO2013119824A1 (en) 2012-02-10 2013-08-15 Visualant, Inc. Systems, methods and articles related to machine-readable indicia and symbols
US9316581B2 (en) 2013-02-04 2016-04-19 Visualant, Inc. Method, apparatus, and article to facilitate evaluation of substances using electromagnetic energy
US9041920B2 (en) 2013-02-21 2015-05-26 Visualant, Inc. Device for evaluation of fluids using electromagnetic energy
WO2014165003A1 (en) 2013-03-12 2014-10-09 Visualant, Inc. Systems and methods for fluid analysis using electromagnetic energy
US9977994B2 (en) 2016-06-30 2018-05-22 Apple Inc. Configurable histogram-of-oriented gradients (HOG) processor
CN110603567B (en) * 2017-05-30 2023-07-28 富士胶片富山化学株式会社 Dispensing inspection assisting device and dispensing inspection assisting method
US10740930B2 (en) 2018-11-07 2020-08-11 Love Good Color LLC Systems and methods for color selection and auditing

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5085325A (en) * 1988-03-08 1992-02-04 Simco/Ramic Corporation Color sorting system and method
US5410637A (en) * 1992-06-18 1995-04-25 Color And Appearance Technology, Inc. Color tolerancing system employing fuzzy logic
US5548697A (en) * 1994-12-30 1996-08-20 Panasonic Technologies, Inc. Non-linear color corrector having a neural network and using fuzzy membership values to correct color and a method thereof
US5652881A (en) * 1993-11-24 1997-07-29 Hitachi, Ltd. Still picture search/retrieval method carried out on the basis of color information and system for carrying out the same
US5751450A (en) * 1996-05-22 1998-05-12 Medar, Inc. Method and system for measuring color difference
US5799105A (en) * 1992-03-06 1998-08-25 Agri-Tech, Inc. Method for calibrating a color sorting apparatus
US5828777A (en) * 1992-02-28 1998-10-27 Canon Kabushiki Kaisha Image processing method and apparatus for preventing coping of specified documents
US6229921B1 (en) * 1999-01-06 2001-05-08 National Instruments Corporation Pattern matching system and method with improved template image sampling using low discrepancy sequences
US6272239B1 (en) * 1997-12-30 2001-08-07 Stmicroelectronics S.R.L. Digital image color correction device and method employing fuzzy logic
US6282317B1 (en) * 1998-12-31 2001-08-28 Eastman Kodak Company Method for automatic determination of main subjects in photographic images
US6370270B1 (en) * 1999-01-06 2002-04-09 National Instruments Corporation System and method for sampling and/or placing objects using low discrepancy sequences
US6385337B1 (en) * 1998-12-21 2002-05-07 Xerox Corporation Method of selecting colors for pixels within blocks for block truncation encoding
US6516100B1 (en) * 1998-10-29 2003-02-04 Sharp Laboratories Of America, Inc. Method for image characterization using color and texture statistics with embedded spatial information
US6625308B1 (en) * 1999-09-10 2003-09-23 Intel Corporation Fuzzy distinction based thresholding technique for image segmentation
US6757428B1 (en) * 1999-08-17 2004-06-29 National Instruments Corporation System and method for color characterization with applications in color measurement and color matching

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE69435214D1 (en) 1993-12-10 2009-08-06 Ricoh Kk A method of image recognition and extraction and recognition of a specified image from an image input signal
JP3178305B2 (en) 1995-06-29 2001-06-18 オムロン株式会社 Image processing method and apparatus, copier, scanner and printer equipped with the same
WO1999023600A1 (en) 1997-11-04 1999-05-14 The Trustees Of Columbia University In The City Of New York Video signal face region detection

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5085325A (en) * 1988-03-08 1992-02-04 Simco/Ramic Corporation Color sorting system and method
US5828777A (en) * 1992-02-28 1998-10-27 Canon Kabushiki Kaisha Image processing method and apparatus for preventing coping of specified documents
US5799105A (en) * 1992-03-06 1998-08-25 Agri-Tech, Inc. Method for calibrating a color sorting apparatus
US5410637A (en) * 1992-06-18 1995-04-25 Color And Appearance Technology, Inc. Color tolerancing system employing fuzzy logic
US5652881A (en) * 1993-11-24 1997-07-29 Hitachi, Ltd. Still picture search/retrieval method carried out on the basis of color information and system for carrying out the same
US5548697A (en) * 1994-12-30 1996-08-20 Panasonic Technologies, Inc. Non-linear color corrector having a neural network and using fuzzy membership values to correct color and a method thereof
US5751450A (en) * 1996-05-22 1998-05-12 Medar, Inc. Method and system for measuring color difference
US6272239B1 (en) * 1997-12-30 2001-08-07 Stmicroelectronics S.R.L. Digital image color correction device and method employing fuzzy logic
US6516100B1 (en) * 1998-10-29 2003-02-04 Sharp Laboratories Of America, Inc. Method for image characterization using color and texture statistics with embedded spatial information
US6385337B1 (en) * 1998-12-21 2002-05-07 Xerox Corporation Method of selecting colors for pixels within blocks for block truncation encoding
US6282317B1 (en) * 1998-12-31 2001-08-28 Eastman Kodak Company Method for automatic determination of main subjects in photographic images
US6229921B1 (en) * 1999-01-06 2001-05-08 National Instruments Corporation Pattern matching system and method with improved template image sampling using low discrepancy sequences
US6370270B1 (en) * 1999-01-06 2002-04-09 National Instruments Corporation System and method for sampling and/or placing objects using low discrepancy sequences
US6757428B1 (en) * 1999-08-17 2004-06-29 National Instruments Corporation System and method for color characterization with applications in color measurement and color matching
US6625308B1 (en) * 1999-09-10 2003-09-23 Intel Corporation Fuzzy distinction based thresholding technique for image segmentation

Cited By (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7072507B2 (en) * 2000-12-28 2006-07-04 Canon Kabushiki Kaisha Image processing apparatus and method
US20020085752A1 (en) * 2000-12-28 2002-07-04 Manabu Ohga Image processing apparatus and method
US20030080974A1 (en) * 2001-10-13 2003-05-01 Grosvenor David Arthur Display of static digital images
US6847379B2 (en) * 2001-10-13 2005-01-25 Hewlett-Packard Development Company, L.P. Display of static digital images
US20030083850A1 (en) * 2001-10-26 2003-05-01 Schmidt Darren R. Locating regions in a target image using color matching, luminance pattern matching and hue plane pattern matching
US6944331B2 (en) * 2001-10-26 2005-09-13 National Instruments Corporation Locating regions in a target image using color matching, luminance pattern matching and hue plane pattern matching
US20060153447A1 (en) * 2002-12-05 2006-07-13 Seiko Epson Corporation Characteristic region extraction device, characteristic region extraction method, and characteristic region extraction program
US20060115148A1 (en) * 2002-12-11 2006-06-01 Makoto Ouchi Similar image extraction device, similar image extraction method, and similar image extraction program
EP1574991A4 (en) * 2002-12-11 2006-09-27 Seiko Epson Corp Similar image extraction device, similar image extraction method, and similar image extraction program
EP1574991A1 (en) * 2002-12-11 2005-09-14 Seiko Epson Corporation Similar image extraction device, similar image extraction method, and similar image extraction program
US20050285947A1 (en) * 2004-06-21 2005-12-29 Grindstaff Gene A Real-time stabilization
US20060056736A1 (en) * 2004-09-10 2006-03-16 Xerox Corporation Simulated high resolution using binary sub-sampling
US7545997B2 (en) * 2004-09-10 2009-06-09 Xerox Corporation Simulated high resolution using binary sub-sampling
US20060115127A1 (en) * 2004-11-29 2006-06-01 Dainippon Screen Mfg. Co., Ltd. Print inspection apparatus
US7777925B2 (en) 2006-06-23 2010-08-17 Realtek Semiconductor Corp. Apparatus and method for color adjustment
US20070296987A1 (en) * 2006-06-23 2007-12-27 Realtek Semiconductor Corp. Apparatus and method for color adjustment
EP2046064A1 (en) * 2006-10-05 2009-04-08 Panasonic Corporation Light emitting display device
EP2046064A4 (en) * 2006-10-05 2009-10-21 Panasonic Corp Light emitting display device
US20090225213A1 (en) * 2006-10-05 2009-09-10 Matsushita Electric Industrial Co., Ltd. Luminescent display device
US20120320078A1 (en) * 2007-09-07 2012-12-20 Texas Instruments Incorporated Image intensity-based color sequence reallocation for sequential color image display
US20090254217A1 (en) * 2008-04-02 2009-10-08 Irobot Corporation Robotics Systems
US8452448B2 (en) * 2008-04-02 2013-05-28 Irobot Corporation Robotics systems
US20090274351A1 (en) * 2008-05-02 2009-11-05 Olympus Corporation Image processing apparatus and computer program product
US8160331B2 (en) * 2008-05-02 2012-04-17 Olympus Corporation Image processing apparatus and computer program product
US20090285464A1 (en) * 2008-05-13 2009-11-19 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US8774476B2 (en) * 2008-05-13 2014-07-08 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US8411918B2 (en) * 2008-05-13 2013-04-02 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US20130182923A1 (en) * 2008-05-13 2013-07-18 Canon Kabushiki Kaisha Image processing apparatus and image processing method
EA019627B1 (en) * 2008-10-14 2014-05-30 Сикпа Холдинг Са Method and system for item identification
CN102246186A (en) * 2008-10-14 2011-11-16 西柏控股股份有限公司 Method and system for item identification
WO2010043618A1 (en) * 2008-10-14 2010-04-22 Sicpa Holding Sa Method and system for item identification
US9064187B2 (en) 2008-10-14 2015-06-23 Sicpa Holding Sa Method and system for item identification
US20110149069A1 (en) * 2009-12-22 2011-06-23 Canon Kabushiki Kaisha Image processing apparatus and control method thereof
US8948452B2 (en) 2009-12-22 2015-02-03 Canon Kabushiki Kaisha Image processing apparatus and control method thereof
US20120089867A1 (en) * 2010-10-06 2012-04-12 International Business Machines Corporation Redundant array of independent disk (raid) storage recovery
US10229023B2 (en) 2010-10-06 2019-03-12 International Business Machines Corporation Recovery of storage device in a redundant array of independent disk (RAID) or RAID-like array
US9619353B2 (en) * 2010-10-06 2017-04-11 International Business Machines Corporation Redundant array of independent disk (RAID) storage recovery
US20140253578A1 (en) * 2011-01-31 2014-09-11 Marvell World Trade Ltd. Systems and Methods for Performing Color Adjustment of Pixels on a Color Display
US9349345B2 (en) * 2011-01-31 2016-05-24 Marvell World Trade Ltd. Systems and methods for performing color adjustment of pixels on a color display
US9275447B2 (en) * 2011-11-09 2016-03-01 Canon Kabushiki Kaisha Method and system for describing image region based on color histogram
US20130129215A1 (en) * 2011-11-09 2013-05-23 Canon Kabushiki Kaisha Method and system for describing image region based on color histogram
US9202139B2 (en) * 2012-07-18 2015-12-01 Nvidia Corporation System, method, and computer program product for generating a subset of a low discrepancy sequence
US20140023282A1 (en) * 2012-07-18 2014-01-23 Nvidia Corporation System, method, and computer program product for generating a subset of a low discrepancy sequence
US10282075B2 (en) 2013-06-24 2019-05-07 Microsoft Technology Licensing, Llc Automatic presentation of slide design suggestions
US11010034B2 (en) 2013-06-24 2021-05-18 Microsoft Technology Licensing, Llc Automatic presentation of slide design suggestions
US20170180725A1 (en) * 2013-09-11 2017-06-22 Color Match, LLC Color measurement and calibration
US10469807B2 (en) 2013-09-11 2019-11-05 Color Match, LLC Color measurement and calibration
CN104978740A (en) * 2015-05-10 2015-10-14 刘畅 Component automatic measurement method based on image color feature
WO2017032311A1 (en) * 2015-08-25 2017-03-02 广州视源电子科技股份有限公司 Detection method and apparatus
CN105184778A (en) * 2015-08-25 2015-12-23 广州视源电子科技股份有限公司 Detection method and apparatus
US10528547B2 (en) 2015-11-13 2020-01-07 Microsoft Technology Licensing, Llc Transferring files
US9824291B2 (en) 2015-11-13 2017-11-21 Microsoft Technology Licensing, Llc Image analysis based color suggestions
US10534748B2 (en) 2015-11-13 2020-01-14 Microsoft Technology Licensing, Llc Content file suggestions
WO2017083189A1 (en) * 2015-11-13 2017-05-18 Microsoft Technology Licensing, Llc Image analysis based color suggestions
CN109726730A (en) * 2017-10-27 2019-05-07 财团法人工业技术研究院 Automatic optics inspection image classification method, system and computer-readable medium
US11315231B2 (en) 2018-06-08 2022-04-26 Industrial Technology Research Institute Industrial image inspection method and system and computer readable recording medium
CN110648305A (en) * 2018-06-08 2020-01-03 财团法人工业技术研究院 Industrial image detection method, system and computer readable recording medium
CN109460710A (en) * 2018-10-12 2019-03-12 北京中科慧眼科技有限公司 A kind of color classification recognition methods, device and automated driving system
CN111242836A (en) * 2018-11-29 2020-06-05 阿里巴巴集团控股有限公司 Method, device and equipment for generating target image and advertising image
CN110865862A (en) * 2019-11-13 2020-03-06 北京字节跳动网络技术有限公司 Page background setting method and device and electronic equipment
CN110991465A (en) * 2019-11-15 2020-04-10 泰康保险集团股份有限公司 Object identification method and device, computing equipment and storage medium
US11403723B2 (en) * 2019-12-23 2022-08-02 Shopify Inc Methods and systems for detecting errors in kit assembly
US11341759B2 (en) * 2020-03-31 2022-05-24 Capital One Services, Llc Image classification using color profiles
CN112561487A (en) * 2020-12-21 2021-03-26 广联达科技股份有限公司 Method, device and equipment for calculating construction progress and readable storage medium
CN112884778A (en) * 2021-01-08 2021-06-01 宁波智能装备研究院有限公司 Robust machine vision target recognition and segmentation method and system
US20220319051A1 (en) * 2021-04-01 2022-10-06 Hub Promotional Group dba HPG Modifying Promotional Material Using Logo Images
CN113837181A (en) * 2021-09-24 2021-12-24 深圳集智数字科技有限公司 Screening method and device, computer equipment and computer readable storage medium
CN114170522A (en) * 2022-02-14 2022-03-11 北京中科慧眼科技有限公司 Color classification identification method and system based on chromatographic similarity measurement
CN114648594A (en) * 2022-05-19 2022-06-21 南通恒强家纺有限公司 Textile color detection method and system based on image recognition

Also Published As

Publication number Publication date
US20040228526A9 (en) 2004-11-18
US7046842B2 (en) 2006-05-16

Similar Documents

Publication Publication Date Title
US7046842B2 (en) System and method for color characterization using fuzzy pixel classification with application in color matching and color match location
US6963425B1 (en) System and method for locating color and pattern match regions in a target image
US6757428B1 (en) System and method for color characterization with applications in color measurement and color matching
US6944331B2 (en) Locating regions in a target image using color matching, luminance pattern matching and hue plane pattern matching
US7039229B2 (en) Locating regions in a target image using color match, luminance pattern match and hill-climbing techniques
US6456899B1 (en) Context-based automated defect classification system using multiple morphological masks
US6622135B1 (en) Method for detecting and classifying anomalies using artificial neural networks
US7136524B1 (en) Robust perceptual color identification
US6226399B1 (en) Method and system for identifying an image feature and method and system for determining an optimal color space for use therein
JP2009187581A (en) Information search and retrieval system
JP3581149B2 (en) Method and apparatus for identifying an object using a regular sequence of boundary pixel parameters
CN114648594B (en) Textile color detection method and system based on image recognition
KR20080021181A (en) Video data processing method and system thereof
JP7412556B2 (en) Method and apparatus for identifying effect pigments in target coatings
US7403636B2 (en) Method and apparatus for processing an image
CN110826571B (en) Image traversal algorithm for rapid image identification and feature matching
EP1218851B1 (en) System and method for locating color and pattern match regions in a target image
Lezoray et al. Segmentation of cytological images using color and mathematical morphology
JPH0793561A (en) Edge and contour extractor
Islami Implementation of HSV-based Thresholding Method for Iris Detection
JP2840347B2 (en) Board mounting inspection equipment
WO1999017250A1 (en) Image comparing system
US7876964B2 (en) Method for associating a digital image with a class of a classification system
Posokhov et al. Method and algorithms for cascade classification of sulfur print images of billet transverse templates
Panetta et al. Techniques for detection and classification of edges in color images

Legal Events

Date Code Title Description
AS Assignment

Owner name: NATIONAL INSTRUMENTS CORPORATION, TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIN, SIMING;NAIR, DINESH;SCHMIDT, DARREN;REEL/FRAME:011422/0286

Effective date: 20001122

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553)

Year of fee payment: 12

AS Assignment

Owner name: WELLS FARGO BANK, NATIONAL ASSOCIATION, NORTH CAROLINA

Free format text: SECURITY INTEREST;ASSIGNORS:NATIONAL INSTRUMENTS CORPORATION;PHASE MATRIX, INC.;REEL/FRAME:052935/0001

Effective date: 20200612

AS Assignment

Owner name: WELLS FARGO BANK, NATIONAL ASSOCIATION, NORTH CAROLINA

Free format text: SECURITY INTEREST;ASSIGNOR:NATIONAL INSTRUMENTS CORPORATION;REEL/FRAME:057280/0028

Effective date: 20210618

AS Assignment

Owner name: NATIONAL INSTRUMENTS CORPORATION, TEXAS

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 057280/0028);ASSIGNOR:WELLS FARGO BANK, NATIONAL ASSOCIATION, AS ADMINISTRATIVE AGENT;REEL/FRAME:065231/0466

Effective date: 20231011

Owner name: PHASE MATRIX, INC., CALIFORNIA

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 052935/0001);ASSIGNOR:WELLS FARGO BANK, NATIONAL ASSOCIATION, AS ADMINISTRATIVE AGENT;REEL/FRAME:065653/0463

Effective date: 20231011

Owner name: NATIONAL INSTRUMENTS CORPORATION, TEXAS

Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 052935/0001);ASSIGNOR:WELLS FARGO BANK, NATIONAL ASSOCIATION, AS ADMINISTRATIVE AGENT;REEL/FRAME:065653/0463

Effective date: 20231011