US20050169553A1 - Image sharpening by variable contrast stretching - Google Patents

Image sharpening by variable contrast stretching

Info

Publication number
US20050169553A1
US20050169553A1 (Application No. US11/006,999)
Authority
US
United States
Prior art keywords
range
pixel
contrast
dynamic range
slope
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/006,999
Inventor
Ron Maurer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US11/006,999 priority Critical patent/US20050169553A1/en
Publication of US20050169553A1 publication Critical patent/US20050169553A1/en
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration by the use of local operators
    • G06T5/70
    • G06T5/94

Abstract

Image sharpening is performed by applying variable contrast stretching to pixels of interest in a digital image. For each pixel of interest, the amount of contrast stretching is a function of minimum and maximum intensity values in a local pixel neighborhood.

Description

    BACKGROUND OF THE INVENTION
  • The present invention relates to digital images. More specifically, the present invention relates to image sharpening.
  • Image sharpening is performed to improve the appearance of digital images and particularly the legibility of documents. One well-known image sharpening technique is unsharp masking.
  • However, traditional unsharp masking enhances perceptible noise in sharpened images. For instance, the unsharp masking enhances low-amplitude noise in scanned images and captured photographs. In addition, traditional unsharp masking does not avoid overshoot at edges. Overshoot can be especially troublesome if a digital image is interpolated. The overshoot is spatially spread by the interpolation and appears as an artifact in the interpolated image.
  • Modifications to traditional unsharp masking have been made to reduce the perceptible noise and avoid the overshoot. However, the modifications do not address imperceptible noise, which reduces the compressibility of the sharpened images. Moreover, modified unsharp masking techniques that avoid overshoot are usually slower to perform.
  • Another image sharpening technique, known from morphological filtering theory, is toggle mapping. Toggle mapping is usually effective for sharpening text-based images containing edges between black text and white background. However, toggle mapping is not as effective for sharpening image regions that are not purely black and white. In such regions, the toggle mapping tends to oversharpen textures and natural (e.g., photographed) features. The toggle mapping also tends to enhance noise, although not as much as traditional unsharp masking. Moreover, the toggle mapping tends to produce jagged edges in text in low-resolution images (in low resolution images, the text looks better when smoothed by anti-aliasing).
  • SUMMARY OF THE INVENTION
  • According to one aspect of the present invention, a digital image is sharpened by clipping those pixel intensity values outside of a variable range; and mapping those pixel intensity values within the variable range. This technique improves image quality without enhancing low-amplitude noise in an image, and it avoids overshoot.
  • Other aspects and advantages of the present invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the present invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is an illustration of variable contrast stretching;
  • FIG. 2 is an illustration of a neighborhood of pixels for the variable contrast stretching;
  • FIG. 3 is an illustration of a method of sharpening a digital image by variable contrast stretching; and
  • FIG. 4 is an illustration of a hardware implementation of variable contrast stretching.
  • DETAILED DESCRIPTION OF THE INVENTION
  • As shown in the drawings for purposes of illustration, the present invention is embodied in a method and apparatus for sharpening an image by variable contrast stretching. The variable contrast stretching reduces spatial scale of large gray-level transitions, resulting in a considerable sharpening in features that are computer-generated (e.g., text, CAD drawings). The variable contrast stretching only mildly reduces the spatial scale of milder gray-level transitions, resulting in a milder sharpening of features that are “natural” (e.g., photographed features). Thus, the variable contrast stretching applies strong sharpening to computer-generated features and mild sharpening to edges in natural features. The variable contrast stretching improves the appearance and legibility of compound documents containing both natural and computer-generated features. However, the spatial scale for weak gray-level transitions is essentially unchanged.
  • The variable contrast stretching avoids overshoot. Thus, overshoot-related artifacts do not appear as the result of interpolation of digital images that have been sharpened by variable contrast stretching.
  • The variable contrast stretching does not enhance low-amplitude noise. In fact, it can slightly reduce low-amplitude noise. Because the variable contrast stretching does not increase low-amplitude noise and it avoids overshoot, compressibility of the sharpened image is not reduced relative to the original image. Consequently, a digital image may be sharpened only once, prior to compression, thus avoiding the need to sharpen the image each time after decompression.
  • The variable contrast stretching can work in raster-scan mode with an upper bound on the number of operations per scan line. This makes it suitable for real-time operations.
  • Reference is made to FIG. 1, which illustrates variable contrast stretching for a digital image. The digital image is made up of a plurality of pixels, each pixel being represented by an intensity value. A point-wise contrast stretching operation g(I) is performed on each pixel of interest as follows:
    $$g(I) = \begin{cases} m, & I - A \le -W \\ A + \dfrac{D}{2W}\,(I - A), & |I - A| < W \\ M, & I - A \ge W \end{cases}$$
  • The pixel of interest is modified with respect to a neighborhood of pixels. The gray-value or intensity of the pixel of interest is denoted by the letter I. Maximum gray-value of the neighborhood is denoted by the uppercase letter M, and minimum gray-value of the neighborhood is denoted by the lowercase letter m. The local dynamic range of the neighborhood, denoted by the letter D, is the difference between the maximum and minimum values of the neighborhood (that is, D=M−m).
  • A “contrast range” has a width of 2W. The contrast range is centered about the middle (A) of the dynamic range. A=(M+m)/2. Thus, the contrast range has a starting point at A−W and an ending point at A+W.
  • If the intensity value of the pixel of interest is outside of the contrast range, the intensity value is clipped to either m or M. If the intensity value of the pixel of interest lies within the contrast range, the amount by which the local contrast is changed is determined by the slope of a line segment 10 within the contrast range.
  • The slope of the line segment 10 is a function of the dynamic range. The slope, denoted by S(D), is related to the contrast range and the dynamic range as follows: $S(D) = \dfrac{D}{2W}$.
  • In general, the slope complies with the following: the slope approaches unity as the dynamic range approaches 0 (i.e., S(D)→1 as D→0); the slope is greater than unity when the dynamic range is greater than zero (i.e., S(D)>1 when D≠0), and the slope is a non-decreasing function of the dynamic range. As the dynamic range increases, the slope becomes larger and the sharpening increases.
  • Thus the slope is a function of the dynamic range and the contrast range of a given pixel neighborhood. Because a neighborhood is determined for each pixel, the dynamic range, the contrast range and the slope are variable on a pixel-by-pixel basis.
  • There are many different ways of expressing the slope of the line segment 10. For example, the slope may be expressed as follows:
    $$S(D) = \frac{D}{2W} = 1 + \frac{D}{R}$$
    where the constant R is a single global parameter that corresponds to the dynamic scale for sharpening. This also defines the contrast width (2W) as a function of the dynamic range (D), as derived briefly below:
    $$W = \frac{1}{2}\left(R^{-1} + D^{-1}\right)^{-1}$$
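For readers tracing the algebra, the expression for W follows in one rearrangement from the slope definition; the steps below are elementary algebra, not additional disclosure from the patent:

    $$\frac{D}{2W} = 1 + \frac{D}{R} \;\Longrightarrow\; 2W = \frac{D}{1 + D/R} = \frac{DR}{R + D} \;\Longrightarrow\; W = \frac{DR}{2(R + D)} = \frac{1}{2}\bigl(R^{-1} + D^{-1}\bigr)^{-1}.$$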
  • Thus the variable contrast stretching operation within the variable contrast range may be expressed as follows:
    $$g(I) = I + \frac{D}{R}\,(I - A) \qquad \text{for } |I - A| < W$$
    If D>>R, the mapping becomes equivalent to toggle mapping, whereby edges are oversharpened. Proper selection of the constant R prevents such a problem. For neighborhoods having small dynamic ranges, D<<R and 1 + D/R ≈ 1. Therefore, no effective change in contrast will occur for D<<R.
  • The constant R may be limited to powers of two for computational efficiency. Since the quantity 1 + D/R involves a division by the constant R, limiting the constant R to a power of two allows the division to be performed simply by bit-shifting. Thus R = 2^L, where integer L > 0. As the constant R decreases, the sharpening effect increases since the contrast region is smaller and the slope of the contrast stretching becomes larger.
  • For pixel intensity values that are represented by 8-bit words, the preferred value of R is between 64 and 512. That is, 6≦L≦9. More generally, if the dynamic range of the entire image is normalized to cover the complete dynamic range of the capturing device (e.g., scanner), the preferred value of R is between one-quarter of the dynamic range and twice the dynamic range.
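The point-wise operation is compact enough to sketch in code. The Python below is only an illustration of the mapping described above; the function name stretch_pixel and the parameter L are this sketch's own choices, with R fixed to a power of two as suggested in the preceding paragraphs.

```python
def stretch_pixel(I, m, M, L=8):
    """Map one pixel intensity I, given its neighborhood minimum m and maximum M.

    R = 2**L is the global sharpening-scale parameter; with 8-bit data the text
    above suggests R between 64 and 512, i.e. 6 <= L <= 9.
    """
    R = 1 << L                      # R = 2^L (with integer data, dividing by R reduces to a shift)
    D = M - m                       # local dynamic range
    A = (M + m) / 2.0               # middle of the dynamic range
    W = (D * R) / (2.0 * (R + D)) if D > 0 else 0.0   # half-width from D/(2W) = 1 + D/R
    d = I - A
    if d <= -W:
        return m                    # clip below the contrast range
    if d >= W:
        return M                    # clip above the contrast range
    return I + (D * d) / R          # linear stretch inside the contrast range


# Example neighborhood from the FIG. 2 discussion: m = 5, M = 250 (D = 245), R = 256
print(stretch_pixel(100, 5, 250, L=8))   # pulls this mid-tone pixel toward the minimum
```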
  • Reference is now made to FIG. 2, which illustrates an example of a neighborhood of pixels. The neighborhood, delineated by a window indicated in dashed lines, includes a 3×3 array of pixels. The pixel of interest, marked by an “X,” lies at the center of the neighborhood. Thus, the function g(I) is applied to the center pixel (marked by the “X”) and the dynamic range is determined by the minimum and maximum intensity values of the pixels lying within the window. If, for example, the pixel intensity values are represented by 8-bit words, the lowest intensity value of the pixels in the neighborhood is m=5, and the highest intensity value of the pixels in the neighborhood is M=250, then the dynamic range is D=245 for that neighborhood.
  • Reference is now made to FIG. 3, which shows a general method of applying the sharpening filter to a digital image. A digital image is accessed (block 102). The digital image may be accessed from a digital image file, the digital image may be received one or more lines at a time and processed in real time, etc.
  • For each pixel of interest in the digital image ( blocks 104, 114, 116), a neighborhood of pixels is determined (block 106), a dynamic range and contrast range of the neighborhood are determined (block 108), and the contrast stretching operation g(I) is applied to the pixel of interest (block 110). Each filtered pixel is stored in a new file (block 112). Pixels lying at the boundaries of the digital image will have partial neighborhoods. These boundary pixels may be processed with respect to their partial neighborhoods, or the filtering may be ignored and the boundary pixels may be stored without modification.
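As a whole-image illustration of this flow, the sketch below vectorizes the same mapping over a 2-D grayscale array. It relies on SciPy's sliding minimum/maximum filters for the neighborhood step and replicates edge pixels at the image boundary, which is one of several possible boundary treatments and is not prescribed by the text; the name variable_contrast_stretch is this example's own.

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def variable_contrast_stretch(image, L=8, size=3):
    """Apply the variable contrast stretch to every pixel of a 2-D uint8 image."""
    I = image.astype(np.float64)
    R = float(1 << L)                                   # global sharpening scale, R = 2^L
    M = maximum_filter(I, size=size, mode='nearest')    # neighborhood maximum
    m = minimum_filter(I, size=size, mode='nearest')    # neighborhood minimum
    D = M - m                                           # local dynamic range
    A = 0.5 * (M + m)                                   # middle of the dynamic range
    W = D * R / (2.0 * (R + D))                         # contrast half-width (W = 0 where D = 0)
    d = I - A
    stretched = I + D * d / R                           # linear mapping inside the contrast range
    out = np.where(d <= -W, m, np.where(d >= W, M, stretched))
    return np.clip(np.rint(out), 0, 255).astype(np.uint8)

# sharpened = variable_contrast_stretch(gray)           # gray: 2-D uint8 array
```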
  • Not all neighborhoods will have the same dynamic range (unless all pixels in the image have the same pixel intensity value). Thus the dynamic range is variable. The contrast region, slope and mapping are all variable too.
  • After the sharpening filter has been applied to the digital image, the sharpened image may be compressed and stored (block 118). Because the variable contrast stretching does not enhance noise and can even reduce low-amplitude noise, the variable contrast stretching does not reduce compressibility of the sharpened image.
  • The compressed image may be accessed and decompressed at a later time. Since the compressed image is already sharpened, sharpening does not have to be performed on the decompressed image. Thus, sharpening on the image is performed only once; it is not performed each time the image is decompressed.
  • Reference is now made to FIG. 4. The variable contrast stretching may be implemented in hardware, software or a combination of the two. In general, an image capture device 212 provides lines of a digital image to a processor 214. The processor 214 may store all of the lines of the digital image in memory 216 for sharpening at a later time, or it may sharpen the digital image in real time. The sharpened image may also be stored in the memory 216. When sharpening the digital image, the processor 214 performs the filtering steps described in FIG. 3.
  • The amount of image blurring in the capture device 212 determines how much sharpening is needed. The constant R may be set by subjective quality tests. Different values for the constant R may be tested, and the value producing the most desirable image may be used.
  • In a software implementation of the variable contrast stretching, the memory 216 stores a program that, when executed, instructs the processor 214 to perform the filtering steps described in FIG. 3. The processor 214 and memory 216 may be part of a personal computer or workstation, they may be embedded in an image capture device, etc.
  • A non-recursive algorithm may be used to determine the minimum and maximum values of pixel neighborhoods. The non-recursive algorithm may be used for all types of neighborhoods. The size of the memory needed to determine the minimum and maximum values is equal to the size of the neighborhood.
  • A fast recursive algorithm may be used to determine the minimum and maximum values of full square neighborhoods (e.g., 3×3 neighborhoods, 5×5 neighborhoods). Only twelve comparisons are needed to determine the maximum and minimum values, regardless of window size. If this recursive algorithm is used, the size of memory needed to buffer the lines is twice the size of the lateral kernel window.
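The text does not spell out the recursive min/max algorithm it refers to, so the following is only one standard way to obtain sliding minima (and, symmetrically, maxima) at a small constant cost per pixel: a monotonic-deque running minimum applied along rows and then along columns. It is offered as an assumption for illustration, not as the twelve-comparison scheme mentioned above.

```python
from collections import deque

def sliding_min_1d(values, k):
    """Running minimum over a window of k samples, amortized O(1) comparisons per sample."""
    out, q = [], deque()                 # q holds indices of candidate minima, values increasing
    for i, v in enumerate(values):
        while q and values[q[-1]] >= v:
            q.pop()                      # drop candidates that can never be the minimum again
        q.append(i)
        if q[0] <= i - k:
            q.popleft()                  # retire candidates that have left the window
        if i >= k - 1:
            out.append(values[q[0]])
    return out

# A k x k window minimum can be built separably: run sliding_min_1d along each row,
# then along each column of the row results; maxima work the same way with ">=" reversed.
print(sliding_min_1d([5, 3, 8, 7, 2, 9], 3))   # -> [3, 3, 2, 2]
```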
  • The variable contrast stretching can be performed in real time in raster-scan mode because there is a relatively small upper bound on the number of operations per scan line. Recursive and non-recursive algorithms may be used for real time operation.
  • Although FIG. 2 shows a 3×3 square-shaped neighborhood, the variable contrast stretching is not limited to such a neighborhood. The neighborhood is not limited to any particular size. The number of pixels is not limited to nine. Although a fixed number of pixels in the neighborhood is preferred for all pixels of interest, the size of the neighborhood may be changed dynamically to accommodate a particular class of image region (e.g., text, graphics, natural features).
  • The neighborhood is not limited to any particular geometry, although square windows are preferred for performance reasons. For example, the shape of the neighborhood may be diamond-shaped. The minimum/maximum calculations may include the center pixel (that is, a full neighborhood may be used) or the minimum/maximum calculations may exclude the center pixel of interest (that is, a hollow neighborhood may be used). A hollow neighborhood might be preferred for images containing speckle-type noise.
  • The variable contrast stretching may be “bootstrapped” onto another filtering method that determines minimum and maximum pixel intensity values for neighborhoods of pixels. This would allow the variable contrast stretching to be performed at virtually no additional overhead, since the main computational effort goes to computing the minimum and maximum values. For example, the variable contrast stretching may be bootstrapped onto a “despeckling” method that determines minimum and maximum pixel intensity values for neighborhoods in which the center pixel (the pixel of interest) has been excluded. The despeckling method is performed to clean speckle noise composed of isolated light pixels on a dark background and vice-versa. Thus both despeckling and variable contrast stretching may be performed, with virtually no additional overhead for performing the variable contrast stretching.
  • The variable contrast stretching is not limited to linear mapping within the contrast range. Although linear mapping is preferred, non-linear mapping within the contrast range may be performed.
  • Variable contrast stretching is not limited to documents including both text and natural images. It may be applied to images of any type, including images containing only text and images containing only natural features. If the variable contrast stretching is applied to images that are entirely computer-generated (which images contain no low-amplitude noise), artifacts at sharp edges will not be produced.
  • Although the variable contrast stretching has been described in connection with grayscale values, it is not so limited. The variable contrast stretching may be performed on multiple color planes. For example, the variable contrast stretching may be performed on images in RGB color space as follows. The color image is transformed to YCbCr color space. Sharpening is applied only to the luminance channel (Y) and the sharpened result is combined with unsharpened chrominance information, and the sharpened image is transformed from YCbCr color space back to RGB color space. Consequently, color fringes at the edges are not enhanced. These artifacts might occur if all three color planes are sharpened separately.
  • The variable contrast stretching works well with a simple approximation for transforming the pixels from RGB color space to YCbCr color space and back to RGB color space. For example, the following transformations may be used:
    $$\begin{cases} Y = \tfrac{1}{4}(R + 2G + B) \\ C_1 = R - G \\ C_2 = B - G \end{cases} \qquad\qquad \begin{cases} G = Y - \tfrac{1}{4}(C_1 + C_2) \\ R = G + C_1 \\ B = G + C_2 \end{cases}$$
    These transformations are fast to compute, since all multiplications and divisions are performed by bit-shifting. Because the luminance channel is changed during sharpening, the transformation back to RGB color space might result in a color component that is outside of its allowable range. Such a component may be clipped to its maximum or minimum allowable value (e.g., 0 or 255 for an 8-bit word).
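Since the text leaves the color path open-ended, here is a minimal sketch of sharpening only the luma plane with the shift-friendly transform above; sharpen_luma_only and the sharpen callback are illustrative names, and the callback could be any grayscale sharpener (for example, the variable_contrast_stretch sketch shown earlier).

```python
import numpy as np

def sharpen_luma_only(rgb, sharpen):
    """Sharpen an RGB uint8 image by moving to the approximate luma/chroma
    representation above, sharpening only the luma plane, transforming back, and clipping."""
    R, G, B = (rgb[..., i].astype(np.int32) for i in range(3))
    Y  = (R + 2 * G + B) >> 2            # Y = (R + 2G + B)/4, the division done as a shift
    C1 = R - G
    C2 = B - G
    Ys = sharpen(Y.astype(np.uint8)).astype(np.int32)   # sharpen only the luma channel
    Gs = Ys - ((C1 + C2) >> 2)           # G = Y - (C1 + C2)/4
    Rs = Gs + C1                         # R = G + C1
    Bs = Gs + C2                         # B = G + C2
    out = np.stack([Rs, Gs, Bs], axis=-1)
    return np.clip(out, 0, 255).astype(np.uint8)        # clip components pushed out of range

# sharpened_rgb = sharpen_luma_only(rgb_image, variable_contrast_stretch)
```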
  • The present invention is not limited to the specific embodiments described and illustrated above. Instead, the invention is construed according to the claims that follow.

Claims (29)

1. A method of processing pixel intensity values of a digital image, the method comprising:
clipping those pixel intensity values outside of a variable range; and
mapping those pixel intensity values within the variable range.
2. The method of claim 1, wherein the variable range for each pixel is a function of dynamic range of a local pixel neighborhood, whereby the variable range is determined on a pixel-by-pixel basis.
3. The method of claim 2, wherein the mapping is performed according to a slope that complies with the following: the slope approaches unity as the dynamic range approaches zero, the slope is greater than unity when the dynamic range is greater than zero, and the slope is a non-decreasing function of the dynamic range.
4. The method of claim 3, wherein a contrast stretching operation is performed on each pixel of interest as follows:
$$g(I) = \begin{cases} m, & I - A \le -W \\ A + \dfrac{D}{2W}(I - A), & |I - A| < W \\ M, & I - A \ge W \end{cases}$$
where m represents the minimum value of the neighborhood, M represents the maximum value of the neighborhood, D represents the dynamic range, D/(2W) represents the slope, I represents pixel intensity value, g(I) represents the contrast stretching operation, A represents the middle of the dynamic range, and 2W represents width of the contrast range, the contrast range being centered about the middle of the dynamic range, the contrast range being a function of the dynamic range.
5. The method of claim 4, wherein D/(2W)=1+D/R, where R corresponds to dynamic scale for sharpening, whereby
$$g(I) = I + \frac{D}{R}(I - A) \quad \text{for } |I - A| < W.$$
6. The method of claim 5, wherein R has a value that is constant for all pixels in the image.
7. The method of claim 5, wherein a capturing device is used to provide the digital image; and wherein R is between one-quarter and twice a range that is normalized to cover the complete dynamic range of the capturing device.
8. The method of claim 1, wherein the digital image is a color image, and wherein a luminance channel of the image is sharpened by clipping those pixel intensity values outside of the variable range; and mapping those pixel intensity values within the variable range; and wherein the sharpened luminance channel is combined with chrominance information of the image.
9. The method of claim 8, wherein the digital image is provided in RGB color space, and wherein the method further comprises using an approximation to convert the digital image from RGB color space to YCbCr color space prior to sharpening.
10. A method of sharpening a digital image, the digital image including a plurality of pixels of interest, for each pixel of interest the method comprising:
determining a dynamic range for a pixel neighborhood; and
performing contrast stretching according to the corresponding dynamic range;
whereby the contrast stretching is performed on a pixel-by-pixel basis.
11. The method of claim 10, wherein the contrast stretching is performed on each pixel of interest by clipping a pixel intensity value lying outside a corresponding contrast range and mapping a pixel intensity value lying within the corresponding contrast range, the contrast range being a function of the dynamic range.
12. The method of claim 11, wherein the mapping is performed according to a slope that complies with the following: the slope approaches unity as the dynamic range approaches zero, the slope is greater than unity when the dynamic range is greater than zero, and the slope is a non-decreasing function of the dynamic range.
13. The method of claim 12, wherein a contrast stretching operation is performed on each pixel of interest as follows:
$$g(I) = \begin{cases} m, & I - A \le -W \\ A + \dfrac{D}{2W}(I - A), & |I - A| < W \\ M, & I - A \ge W \end{cases}$$
where m represents the minimum value of the neighborhood, M represents the maximum value of the neighborhood, D represents the dynamic range, D/(2W) represents the slope, I represents pixel intensity value, g(I) represents the contrast stretching operation, A represents the middle of the dynamic range, and 2W represents width of the contrast range, the contrast range being centered about the middle of the dynamic range, the contrast range being a function of the dynamic range.
14. The method of claim 13, wherein D/(2W)=1+D/R, where R corresponds to dynamic scale for sharpening, whereby
$$g(I) = I + \frac{D}{R}(I - A) \quad \text{for } |I - A| < W.$$
15. The method of claim 14, wherein a capturing device is used to provide the digital image; and wherein the value of R is between one-quarter and twice a range that is normalized to cover the complete dynamic range of the capturing device.
16. A method of sharpening a digital image, the digital image including a plurality of pixels of interest, for each pixel of interest the method comprising performing the following contrast stretching operation on each pixel of interest as follows:
$$g(I) = \begin{cases} m, & I - A \le -W \\ A + \dfrac{D}{2W}(I - A), & |I - A| < W \\ M, & I - A \ge W \end{cases}$$
where m represents the minimum value of the neighborhood, M represents the maximum value of the neighborhood, D represents the dynamic range, D/(2W) represents the slope, I represents pixel intensity value, g(I) represents the contrast stretching operation, A represents the middle of the dynamic range, and 2W represents width of the contrast range, the contrast range being centered about the middle of the dynamic range, the contrast range being a function of the dynamic range.
17. Apparatus for processing pixels of interest in a digital image, the apparatus comprising a processor for determining dynamic ranges of pixel neighborhoods for the pixels of interest, and performing contrast stretching on each pixel of interest according to the dynamic range of the corresponding pixel neighborhood, whereby the contrast stretching is performed on a pixel-by-pixel basis.
18. The apparatus of claim 17, wherein the processor performs the contrast stretching on each pixel of interest by clipping a pixel intensity value lying outside a corresponding contrast range and mapping a pixel intensity value lying within the corresponding contrast range, the contrast range being a function of the dynamic range.
19. The apparatus of claim 18, wherein the mapping is performed according to a slope that complies with the following: the slope approaches unity as the dynamic range approaches zero, the slope is greater than unity when the dynamic range is greater than zero, and the slope is a non-decreasing function of the dynamic range.
20. The apparatus of claim 19, wherein a contrast stretching operation is performed on each pixel of interest as follows:
$$g(I) = \begin{cases} m, & I - A \le -W \\ A + \dfrac{D}{2W}(I - A), & |I - A| < W \\ M, & I - A \ge W \end{cases}$$
where m represents the minimum value of the neighborhood, M represents the maximum value of the neighborhood, D represents the dynamic range, D/(2W) represents the slope, I represents pixel intensity value, g(I) represents the contrast stretching operation, A represents the middle of the dynamic range, and 2W represents width of the contrast range, the contrast range being centered about the middle of the dynamic range, the contrast range being a function of the dynamic range.
21. The apparatus of claim 20, wherein D/(2W)=1+D/R, where R corresponds to dynamic scale for sharpening, whereby
$$g(I) = I + \frac{D}{R}(I - A) \quad \text{for } |I - A| < W.$$
22. The apparatus of claim 21, wherein an image capture device is used to provide the digital image; and wherein the value of R is between one-quarter and twice a range that is normalized to cover the complete dynamic range of the capturing device.
23. Apparatus for sharpening a digital image, the apparatus comprising a processor for determining a contrast range for each pixel of interest in the digital image, clipping intensity value of a pixel of interest if the intensity value lies outside of a contrast range; and mapping the pixel intensity value if the pixel intensity value lies within the contrast range; whereby the contrast range is determined on a pixel-by-pixel basis.
24. The apparatus of claim 23, wherein the contrast range for each pixel is a function of dynamic range of a local pixel neighborhood, whereby the processor determines the contrast range on a pixel-by-pixel basis.
25. The apparatus of claim 24, wherein the mapping is performed according to a slope that complies with the following: the slope approaches unity as the dynamic range approaches zero, the slope is greater than unity when the dynamic range is greater than zero, and the slope is a non-decreasing function of the dynamic range.
26. The apparatus of claim 25, wherein a contrast stretching operation is performed on each pixel of interest as follows:
$$g(I) = \begin{cases} m, & I - A \le -W \\ A + \dfrac{D}{2W}(I - A), & |I - A| < W \\ M, & I - A \ge W \end{cases}$$
where m represents the minimum value of the neighborhood, M represents the maximum value of the neighborhood, D represents the dynamic range, D/(2W) represents the slope, I represents pixel intensity value, g(I) represents the contrast stretching operation, A represents the middle of the dynamic range, and 2W represents width of the contrast range, the contrast range being centered about the middle of the dynamic range, the contrast range being a function of the dynamic range.
27. The apparatus of claim 26, wherein D/(2W)=1+D/R, where R corresponds to dynamic scale for sharpening, whereby
$$g(I) = I + \frac{D}{R}(I - A) \quad \text{for } |I - A| < W.$$
28. The apparatus of claim 27, wherein an image capture device is used to provide the digital image; and wherein R is between one-quarter and twice a range that is normalized to cover the complete dynamic range of the capturing device.
29. An article of manufacture for a processor, the article comprising:
computer memory; and
an image sharpening program stored in the memory, the program, when executed, causing the processor to process pixels of interest, each pixel of interest being processed by clipping its intensity value if its intensity value lies outside of a variable contrast range, and mapping its intensity value if its intensity value lies within the variable contrast range.
US11/006,999 2000-09-29 2004-12-08 Image sharpening by variable contrast stretching Abandoned US20050169553A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/006,999 US20050169553A1 (en) 2000-09-29 2004-12-08 Image sharpening by variable contrast stretching

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/676,011 US6915024B1 (en) 2000-09-29 2000-09-29 Image sharpening by variable contrast mapping
US11/006,999 US20050169553A1 (en) 2000-09-29 2004-12-08 Image sharpening by variable contrast stretching

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/676,011 Continuation US6915024B1 (en) 2000-09-29 2000-09-29 Image sharpening by variable contrast mapping

Publications (1)

Publication Number Publication Date
US20050169553A1 true US20050169553A1 (en) 2005-08-04

Family

ID=24712845

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/676,011 Expired - Fee Related US6915024B1 (en) 2000-09-29 2000-09-29 Image sharpening by variable contrast mapping
US11/006,999 Abandoned US20050169553A1 (en) 2000-09-29 2004-12-08 Image sharpening by variable contrast stretching

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/676,011 Expired - Fee Related US6915024B1 (en) 2000-09-29 2000-09-29 Image sharpening by variable contrast mapping

Country Status (5)

Country Link
US (2) US6915024B1 (en)
EP (1) EP1323132B1 (en)
JP (1) JP4063665B2 (en)
AU (1) AU2002210787A1 (en)
WO (1) WO2002027657A2 (en)

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070071318A1 (en) * 2003-09-11 2007-03-29 Haruo Yamashita Visual processing device, visual processing method, visual processing program, and semiconductor device
US20070104388A1 (en) * 2005-11-09 2007-05-10 Intel Corporation Enhancing contrast of video data while preserving sharpness
US20070109447A1 (en) * 2003-09-11 2007-05-17 Haruo Yamashita Visual processing device, visual processing method, visual processing program, and semiconductor device
US20070188623A1 (en) * 2003-09-11 2007-08-16 Haruo Yamashita Visual processing device, visual processing method, visual processing program, intergrated circuit, display device, image-capturing device, and portable information terminal
US20090102967A1 (en) * 2005-01-14 2009-04-23 Martin Weston Image Processing
US7899265B1 (en) * 2006-05-02 2011-03-01 Sylvia Tatevosian Rostami Generating an image by averaging the colors of text with its background
US20110080518A1 (en) * 2009-10-01 2011-04-07 Mstar Semiconductor, Inc. Image Processing Method and Image Processing Apparatus
US20120007947A1 (en) * 2010-07-07 2012-01-12 At&T Intellectual Property I, L.P. Apparatus and method for distributing three dimensional media content
US8438502B2 (en) 2010-08-25 2013-05-07 At&T Intellectual Property I, L.P. Apparatus for controlling three-dimensional images
US8587635B2 (en) 2011-07-15 2013-11-19 At&T Intellectual Property I, L.P. Apparatus and method for providing media services with telepresence
US8593574B2 (en) 2010-06-30 2013-11-26 At&T Intellectual Property I, L.P. Apparatus and method for providing dimensional media content based on detected display capability
US8640182B2 (en) 2010-06-30 2014-01-28 At&T Intellectual Property I, L.P. Method for detecting a viewing apparatus
US8918831B2 (en) 2010-07-06 2014-12-23 At&T Intellectual Property I, Lp Method and apparatus for managing a presentation of media content
US8947497B2 (en) 2011-06-24 2015-02-03 At&T Intellectual Property I, Lp Apparatus and method for managing telepresence sessions
US8947511B2 (en) 2010-10-01 2015-02-03 At&T Intellectual Property I, L.P. Apparatus and method for presenting three-dimensional media content
US8994716B2 (en) 2010-08-02 2015-03-31 At&T Intellectual Property I, Lp Apparatus and method for providing media content
US9030536B2 (en) 2010-06-04 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for presenting media content
US9032470B2 (en) 2010-07-20 2015-05-12 At&T Intellectual Property I, Lp Apparatus for adapting a presentation of media content according to a position of a viewing apparatus
US9030522B2 (en) 2011-06-24 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for providing media content
US9232274B2 (en) 2010-07-20 2016-01-05 At&T Intellectual Property I, L.P. Apparatus for adapting a presentation of media content to a requesting device
US9445046B2 (en) 2011-06-24 2016-09-13 At&T Intellectual Property I, L.P. Apparatus and method for presenting media content with telepresence
US9560406B2 (en) 2010-07-20 2017-01-31 At&T Intellectual Property I, L.P. Method and apparatus for adapting a presentation of media content
US9602766B2 (en) 2011-06-24 2017-03-21 At&T Intellectual Property I, L.P. Apparatus and method for presenting three dimensional objects with telepresence
US9787974B2 (en) 2010-06-30 2017-10-10 At&T Intellectual Property I, L.P. Method and apparatus for delivering media content
US10380725B2 (en) * 2015-11-26 2019-08-13 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
US11109005B2 (en) * 2019-04-18 2021-08-31 Christie Digital Systems Usa, Inc. Device, system and method for enhancing one or more of high contrast regions and text regions in projected images
EP4060601A1 (en) * 2021-03-19 2022-09-21 Acer Medical Inc. Image pre-processing for a fundoscopic image

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10202163A1 (en) * 2002-01-22 2003-07-31 Bosch Gmbh Robert Process and device for image processing and night vision system for motor vehicles
JP4042563B2 (en) * 2002-12-27 2008-02-06 セイコーエプソン株式会社 Image noise reduction
US8615142B2 (en) * 2003-02-28 2013-12-24 Hewlett-Packard Development Company, L.P. Variable contrast mapping of digital images
US7116446B2 (en) * 2003-02-28 2006-10-03 Hewlett-Packard Development Company, L.P. Restoration and enhancement of scanned document images
US7194142B2 (en) * 2003-02-28 2007-03-20 Hewlett-Packard Development Company, L.P. Selective thickening of dark features by biased sharpening filters
DE60318024T2 (en) * 2003-06-11 2008-09-11 Agfa Healthcare Nv Method and user interface for changing at least contrast or intensity of the pixels of a processed image
JP2005141477A (en) * 2003-11-06 2005-06-02 Noritsu Koki Co Ltd Image sharpening process and image processor implementing this process
JP4533737B2 (en) * 2004-02-03 2010-09-01 株式会社島精機製作所 Yarn image creating device, thread image creating method, and thread image creating program
JP4577621B2 (en) 2004-09-01 2010-11-10 日本電気株式会社 Image correction processing system and image correction processing method
IL165852A (en) * 2004-12-19 2010-12-30 Rafael Advanced Defense Sys System and method for image display enhancement
WO2006082542A1 (en) * 2005-02-07 2006-08-10 Koninklijke Philips Electronics N.V. Clipping
US8131108B2 (en) * 2005-04-22 2012-03-06 Broadcom Corporation Method and system for dynamic contrast stretch
WO2006123290A2 (en) * 2005-05-18 2006-11-23 Koninklijke Philips Electronics N.V. Image processor comprising a contrast enhancer
TWI315961B (en) * 2006-03-16 2009-10-11 Quanta Comp Inc Method and apparatus for adjusting contrast of image
EP1840831A1 (en) * 2006-03-31 2007-10-03 Sony Deutschland Gmbh Adaptive histogram equalization for images with strong local contrast
US7835588B2 (en) * 2006-11-29 2010-11-16 Nokia Corporation Contrast optimization of images
JP4894595B2 (en) * 2007-04-13 2012-03-14 ソニー株式会社 Image processing apparatus and method, and program
US8355595B2 (en) * 2007-05-15 2013-01-15 Xerox Corporation Contrast enhancement methods and apparatuses
US7792357B2 (en) * 2007-05-30 2010-09-07 Microsoft Corporation Chromatic aberration correction
US7809208B2 (en) * 2007-05-30 2010-10-05 Microsoft Corporation Image sharpening with halo suppression
US8411983B2 (en) * 2007-08-31 2013-04-02 Ati Technologies Ulc Method and apparatus for producing a contrast enhanced image
JP5349790B2 (en) * 2007-11-16 2013-11-20 キヤノン株式会社 Image processing apparatus, image processing method, and program
EP2192545B1 (en) 2008-11-27 2014-01-08 Agfa Healthcare Method of changing at least one of density and contrast of an image
EP2309448A1 (en) * 2009-09-22 2011-04-13 Nxp B.V. Local image contrast enhancement
US8781248B2 (en) * 2010-01-28 2014-07-15 Stmicroelectronics Asia Pacific Pte. Ltd. Image details preservation and enhancement
US8675957B2 (en) * 2010-11-18 2014-03-18 Ebay, Inc. Image quality assessment to merchandise an item
WO2012173733A1 (en) * 2011-06-15 2012-12-20 Marvell World Trade Ltd. Modified bicubic interpolation
US9721292B2 (en) 2012-12-21 2017-08-01 Ebay Inc. System and method for image quality scoring
US11144758B2 (en) 2018-11-15 2021-10-12 Geox Gis Innovations Ltd. System and method for object detection and classification in aerial imagery
KR102575126B1 (en) * 2018-12-26 2023-09-05 주식회사 엘엑스세미콘 Image precessing device and method thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5042077A (en) * 1987-10-02 1991-08-20 General Electric Company Method of highlighting subtle contrast in graphical images
US5361308A (en) * 1992-01-10 1994-11-01 General Motors Corporation 3-D measurement of cutting tool wear
US6097853A (en) * 1996-09-11 2000-08-01 Da Vinci Systems, Inc. User definable windows for selecting image processing regions
US20030020830A1 (en) * 1996-10-15 2003-01-30 Sani Mehdi H. Video signal converter for converting non-interlaced to composite video

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5012333A (en) 1989-01-05 1991-04-30 Eastman Kodak Company Interactive dynamic range adjustment system for printing digital images
EP0547826A1 (en) * 1991-12-18 1993-06-23 Raytheon Company B-adaptive ADPCM image data compressor
US5524070A (en) * 1992-10-07 1996-06-04 The Research Foundation Of State University Of New York Local adaptive contrast enhancement
EP0793836A1 (en) * 1994-11-23 1997-09-10 Imation Corp. System and method for adaptive interpolation of image data
AU4594796A (en) * 1994-11-25 1996-06-19 Yuriy Alexandrov System and method for diagnosis of living tissue diseases
US5982926A (en) * 1995-01-17 1999-11-09 At & T Ipm Corp. Real-time image enhancement techniques
US5966134A (en) * 1996-06-28 1999-10-12 Softimage Simulating cel animation and shading
US5900732A (en) * 1996-11-04 1999-05-04 Mayo Foundation For Medical Education And Research Automatic windowing method for MR images
US6311297B1 (en) * 1997-10-23 2001-10-30 Sony Corporation Apparatus and method for mapping an image to blocks to provide for robust error recovery in a lossy transmission environment
JP3902894B2 (en) 1999-10-15 2007-04-11 理想科学工業株式会社 Image processing apparatus and image processing method
US6717698B1 (en) 2000-02-02 2004-04-06 Eastman Kodak Company Tone scale processing based on image modulation activity
JP4013498B2 (en) * 2001-07-17 2007-11-28 セイコーエプソン株式会社 Pattern drawing apparatus and pattern drawing body manufacturing method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5042077A (en) * 1987-10-02 1991-08-20 General Electric Company Method of highlighting subtle contrast in graphical images
US5361308A (en) * 1992-01-10 1994-11-01 General Motors Corporation 3-D measurement of cutting tool wear
US6097853A (en) * 1996-09-11 2000-08-01 Da Vinci Systems, Inc. User definable windows for selecting image processing regions
US20030020830A1 (en) * 1996-10-15 2003-01-30 Sani Mehdi H. Video signal converter for converting non-interlaced to composite video

Cited By (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7860339B2 (en) 2003-09-11 2010-12-28 Panasonic Corporation Visual processing device, visual processing method, visual processing program, intergrated circuit, display device, image-capturing device, and portable information terminal
US20070071318A1 (en) * 2003-09-11 2007-03-29 Haruo Yamashita Visual processing device, visual processing method, visual processing program, and semiconductor device
US20070109447A1 (en) * 2003-09-11 2007-05-17 Haruo Yamashita Visual processing device, visual processing method, visual processing program, and semiconductor device
US20070188623A1 (en) * 2003-09-11 2007-08-16 Haruo Yamashita Visual processing device, visual processing method, visual processing program, intergrated circuit, display device, image-capturing device, and portable information terminal
US20080107360A1 (en) * 2003-09-11 2008-05-08 Haruo Yamashita Visual processing device, visual processing method, visual processing program, integrated circuit, display device, image-capturing device, and portable information terminal
US8165417B2 (en) 2003-09-11 2012-04-24 Panasonic Corporation Visual processing device, visual processing method, visual processing program, integrated circuit, display device, image-capturing device, and portable information terminal
US7783126B2 (en) 2003-09-11 2010-08-24 Panasonic Corporation Visual processing device, visual processing method, visual processing program, and semiconductor device
US7945115B2 (en) 2003-09-11 2011-05-17 Panasonic Corporation Visual processing device, visual processing method, visual processing program, and semiconductor device
US20100309216A1 (en) * 2003-09-11 2010-12-09 Haruo Yamashita Visual processing device, visual processing method, visual processing program, and semiconductor device
US8421916B2 (en) * 2005-01-14 2013-04-16 Snell Limited Image processing
US20090102967A1 (en) * 2005-01-14 2009-04-23 Martin Weston Image Processing
US7817873B2 (en) 2005-11-09 2010-10-19 Intel Corporation Enhancing contrast of video data while preserving sharpness
US20070104388A1 (en) * 2005-11-09 2007-05-10 Intel Corporation Enhancing contrast of video data while preserving sharpness
US7899265B1 (en) * 2006-05-02 2011-03-01 Sylvia Tatevosian Rostami Generating an image by averaging the colors of text with its background
US20110080518A1 (en) * 2009-10-01 2011-04-07 Mstar Semiconductor, Inc. Image Processing Method and Image Processing Apparatus
US8743288B2 (en) * 2009-10-01 2014-06-03 Mstar Semiconductor, Inc. Image processing method and image processing apparatus
US9380294B2 (en) 2010-06-04 2016-06-28 At&T Intellectual Property I, Lp Apparatus and method for presenting media content
US9774845B2 (en) 2010-06-04 2017-09-26 At&T Intellectual Property I, L.P. Apparatus and method for presenting media content
US9030536B2 (en) 2010-06-04 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for presenting media content
US10567742B2 (en) 2010-06-04 2020-02-18 At&T Intellectual Property I, L.P. Apparatus and method for presenting media content
US8593574B2 (en) 2010-06-30 2013-11-26 At&T Intellectual Property I, L.P. Apparatus and method for providing dimensional media content based on detected display capability
US8640182B2 (en) 2010-06-30 2014-01-28 At&T Intellectual Property I, L.P. Method for detecting a viewing apparatus
US9787974B2 (en) 2010-06-30 2017-10-10 At&T Intellectual Property I, L.P. Method and apparatus for delivering media content
US8918831B2 (en) 2010-07-06 2014-12-23 At&T Intellectual Property I, Lp Method and apparatus for managing a presentation of media content
US9781469B2 (en) 2010-07-06 2017-10-03 At&T Intellectual Property I, Lp Method and apparatus for managing a presentation of media content
US20120007947A1 (en) * 2010-07-07 2012-01-12 At&T Intellectual Property I, L.P. Apparatus and method for distributing three dimensional media content
US10237533B2 (en) 2010-07-07 2019-03-19 At&T Intellectual Property I, L.P. Apparatus and method for distributing three dimensional media content
US11290701B2 (en) 2010-07-07 2022-03-29 At&T Intellectual Property I, L.P. Apparatus and method for distributing three dimensional media content
US9049426B2 (en) * 2010-07-07 2015-06-02 At&T Intellectual Property I, Lp Apparatus and method for distributing three dimensional media content
US9830680B2 (en) 2010-07-20 2017-11-28 At&T Intellectual Property I, L.P. Apparatus for adapting a presentation of media content according to a position of a viewing apparatus
US10070196B2 (en) 2010-07-20 2018-09-04 At&T Intellectual Property I, L.P. Apparatus for adapting a presentation of media content to a requesting device
US9032470B2 (en) 2010-07-20 2015-05-12 At&T Intellectual Property I, Lp Apparatus for adapting a presentation of media content according to a position of a viewing apparatus
US9232274B2 (en) 2010-07-20 2016-01-05 At&T Intellectual Property I, L.P. Apparatus for adapting a presentation of media content to a requesting device
US10602233B2 (en) 2010-07-20 2020-03-24 At&T Intellectual Property I, L.P. Apparatus for adapting a presentation of media content to a requesting device
US9560406B2 (en) 2010-07-20 2017-01-31 At&T Intellectual Property I, L.P. Method and apparatus for adapting a presentation of media content
US9668004B2 (en) 2010-07-20 2017-05-30 At&T Intellectual Property I, L.P. Apparatus for adapting a presentation of media content to a requesting device
US10489883B2 (en) 2010-07-20 2019-11-26 At&T Intellectual Property I, L.P. Apparatus for adapting a presentation of media content according to a position of a viewing apparatus
US9247228B2 (en) 2010-08-02 2016-01-26 At&T Intellectual Property I, Lp Apparatus and method for providing media content
US8994716B2 (en) 2010-08-02 2015-03-31 At&T Intellectual Property I, Lp Apparatus and method for providing media content
US9086778B2 (en) 2010-08-25 2015-07-21 At&T Intellectual Property I, Lp Apparatus for controlling three-dimensional images
US9700794B2 (en) 2010-08-25 2017-07-11 At&T Intellectual Property I, L.P. Apparatus for controlling three-dimensional images
US8438502B2 (en) 2010-08-25 2013-05-07 At&T Intellectual Property I, L.P. Apparatus for controlling three-dimensional images
US9352231B2 (en) 2010-08-25 2016-05-31 At&T Intellectual Property I, Lp Apparatus for controlling three-dimensional images
US8947511B2 (en) 2010-10-01 2015-02-03 At&T Intellectual Property I, L.P. Apparatus and method for presenting three-dimensional media content
US9270973B2 (en) 2011-06-24 2016-02-23 At&T Intellectual Property I, Lp Apparatus and method for providing media content
US10484646B2 (en) 2011-06-24 2019-11-19 At&T Intellectual Property I, L.P. Apparatus and method for presenting three dimensional objects with telepresence
US9407872B2 (en) 2011-06-24 2016-08-02 At&T Intellectual Property I, Lp Apparatus and method for managing telepresence sessions
US10033964B2 (en) 2011-06-24 2018-07-24 At&T Intellectual Property I, L.P. Apparatus and method for presenting three dimensional objects with telepresence
US9160968B2 (en) 2011-06-24 2015-10-13 At&T Intellectual Property I, Lp Apparatus and method for managing telepresence sessions
US9030522B2 (en) 2011-06-24 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for providing media content
US9736457B2 (en) 2011-06-24 2017-08-15 At&T Intellectual Property I, L.P. Apparatus and method for providing media content
US8947497B2 (en) 2011-06-24 2015-02-03 At&T Intellectual Property I, Lp Apparatus and method for managing telepresence sessions
US9602766B2 (en) 2011-06-24 2017-03-21 At&T Intellectual Property I, L.P. Apparatus and method for presenting three dimensional objects with telepresence
US9681098B2 (en) 2011-06-24 2017-06-13 At&T Intellectual Property I, L.P. Apparatus and method for managing telepresence sessions
US10200651B2 (en) 2011-06-24 2019-02-05 At&T Intellectual Property I, L.P. Apparatus and method for presenting media content with telepresence
US9445046B2 (en) 2011-06-24 2016-09-13 At&T Intellectual Property I, L.P. Apparatus and method for presenting media content with telepresence
US10200669B2 (en) 2011-06-24 2019-02-05 At&T Intellectual Property I, L.P. Apparatus and method for providing media content
US9414017B2 (en) 2011-07-15 2016-08-09 At&T Intellectual Property I, Lp Apparatus and method for providing media services with telepresence
US8587635B2 (en) 2011-07-15 2013-11-19 At&T Intellectual Property I, L.P. Apparatus and method for providing media services with telepresence
US9807344B2 (en) 2011-07-15 2017-10-31 At&T Intellectual Property I, L.P. Apparatus and method for providing media services with telepresence
US9167205B2 (en) 2011-07-15 2015-10-20 At&T Intellectual Property I, Lp Apparatus and method for providing media services with telepresence
US10380725B2 (en) * 2015-11-26 2019-08-13 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
US11109005B2 (en) * 2019-04-18 2021-08-31 Christie Digital Systems Usa, Inc. Device, system and method for enhancing one or more of high contrast regions and text regions in projected images
US11954824B2 (en) 2021-03-19 2024-04-09 Acer Medical Inc. Image pre-processing method and image processing apparatus for fundoscopic image
EP4060601A1 (en) * 2021-03-19 2022-09-21 Acer Medical Inc. Image pre-processing for a fundoscopic image

Also Published As

Publication number Publication date
US6915024B1 (en) 2005-07-05
EP1323132B1 (en) 2012-12-26
WO2002027657A2 (en) 2002-04-04
JP4063665B2 (en) 2008-03-19
AU2002210787A1 (en) 2002-04-08
WO2002027657A3 (en) 2003-03-13
JP2004510268A (en) 2004-04-02
EP1323132A2 (en) 2003-07-02

Similar Documents

Publication Publication Date Title
US6915024B1 (en) Image sharpening by variable contrast mapping
US6731821B1 (en) Method for enhancing compressibility and visual quality of scanned document images
US7181086B2 (en) Multiresolution method of spatially filtering a digital image
US7433514B2 (en) Tone mapping of high dynamic range images
US7155069B2 (en) Image processing apparatus, image processing method, and image processing program
US7020332B2 (en) Method and apparatus for enhancing a digital image by applying an inverse histogram-based pixel mapping function to pixels of the digital image
US7061492B2 (en) Text improvement
EP2124190A1 (en) Image processing to enhance image sharpness
US20030189579A1 (en) Adaptive enlarging and/or sharpening of a digital image
JPH08251432A (en) Real time picture enhancing technique
US6721458B1 (en) Artifact reduction using adaptive nonlinear filters
JP2001118062A (en) Automatic dynamic range compressing method
JP2019016117A (en) Image adjusting device, local contrast quantity calculating device, method, and program
JPH05273951A (en) Method and device for emphasizing digital image for electronic display
US5832123A (en) Method and apparatus for producing an enhanced two-grayscale image
US20040170339A1 (en) Selective thickening of dark features by biased sharpening filters
WO2007014014A2 (en) Adaptive contrast control systems and methods
EP2124189A1 (en) Image processing to enhance image sharpness
KR101730886B1 (en) Processing method for infrared image
JP2006519447A (en) Variable contrast mapping of digital images
JP2001086368A (en) Image processor
JP3750164B2 (en) Image processing method and image processing apparatus
JP2002077622A (en) Image processing apparatus and recording medium
CN113129246A (en) Document picture processing method and device and electronic equipment
JP2006065883A (en) Image processing method and image processing device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION