US4834530A - Shape measuring apparatus - Google Patents

Shape measuring apparatus

Info

Publication number
US4834530A
US4834530A
Authority
US
United States
Prior art keywords
pattern
measured
measuring apparatus
shape measuring
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US07/039,455
Inventor
Shunji Murai
Fumio Othomo
Hitoshi Otani
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Topcon Corp
Tokyo Kogaku Kikai KK
Original Assignee
Tokyo Kogaku Kikai KK
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tokyo Kogaku Kikai KK filed Critical Tokyo Kogaku Kikai KK
Assigned to TOKYO KOGAKU KIKAI KABUSHIKI KAISHA, 75-1, HASUNUMA-CHO, ITABASHI-KU, TOKYO, A CORP. OF JAPAN reassignment TOKYO KOGAKU KIKAI KABUSHIKI KAISHA, 75-1, HASUNUMA-CHO, ITABASHI-KU, TOKYO, A CORP. OF JAPAN ASSIGNMENT OF ASSIGNORS INTEREST. Assignors: MURAI, SHUNJI, OHTOMO, FUMIO, OTANI, HITOSHI
Application granted
Publication of US4834530A
Assigned to KABUSHIKI KAISHA TOPCON, 1-GO 75-BAN HASUNUMA-CHO, ITABASHI-KU, TOKYO-TO reassignment KABUSHIKI KAISHA TOPCON, 1-GO 75-BAN HASUNUMA-CHO, ITABASHI-KU, TOKYO-TO CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: TAMIO NISHIWAKI

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2545Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
    • G01B11/2536Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object using several gratings with variable grating pitch, projected on the object with the same angle of incidence

Abstract

A shape measuring apparatus is disclosed. It comprises a pattern portion having a pattern in which pattern elements are arranged in the pitch width direction according to a predetermined rule so that a plurality of N codes can be distinguished with respect to one another; a projecting portion having a moving portion for moving the pattern in the pitch width direction according to a rule and a projection optical system for projecting the pattern onto an object to be measured; a detecting portion for detecting a first detection data and a second detection data by measuring a surface information of the object to be measured on which the pattern is projected from two different directions every time the pattern is moved by the moving portion; an extracting portion for extracting a first position data and a second position data corresponding to a point position of the object to be measured from the first and second detection data respectively; and a calculating portion for calculating a coordinate of the point position corresponding to the first position data of the object to be measured from the first position data and the second position data.

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
This invention relates to a shape measuring apparatus. More particularly, it relates to a shape measuring apparatus of the type in which a predetermined pattern is projected onto an object to be measured so that each point of the object to be measured is particularized, data based on the projection pattern are detected from the object to be measured on which the pattern is projected, and the detection data are processed, thereby obtaining the shape of the object to be measured by measuring the coordinates at the respective point positions on the object by means of non-contact measurement.
2. Description of the Related Art
Heretofore, there has been known a non-contact type shape measuring apparatus in which a predetermined pattern is projected to an object to be measured and this pattern is detected from two directions to decide a corresponding point between the respective detection outputs and, based on the foregoing, the shape of the object to be measured is measured.
In recent years, since high accuracy and high resolution are required of such non-contact type shape measuring apparatus, projection pattern images are projected onto the object to be measured using plural kinds of patterns and, every time each pattern is projected, projection data are detected so as to bring the respective data points into correspondence with one another, thereby measuring the shape of the object to be measured highly accurately.
3. Problems to be Solved by the Invention
However, in order to perform a high accuracy measurement of an object to be measured in the above-mentioned non-contact type shape measuring apparatus, plural kinds of patterns are separately required. In addition, in order to project the plural kinds of patterns to the object to be measured and to correspond the data with the respective projected patterns, the relative positions of the plural kinds of patterns must be strictly decided. Thus, the apparatus is required to be made large in size and complicated adjusting means are required.
SUMMARY OF THE INVENTION
1. Object of the Invention
The present invention was accomplished in view of the problems involved in the conventional shape measuring apparatus. It is therefore a general object of the present invention to provide a shape measuring apparatus, in which plural kinds of different projection pattern images can be obtained by moving only one pattern, thereby to measure the shape of an object to be measured highly accurately and easily.
2. Means for Solving the Problems
In order to solve the above-mentioned problems, there is essentially provided a shape measuring apparatus comprising a pattern portion having a pattern in which pattern elements are arranged in the pitch width direction according to a predetermined rule so that a plurality of N codes can be distinguished with respect to one another; a projecting means having a moving means for moving the pattern in the pitch width direction according to a rule and a projection optical system for projecting the pattern onto an object to be measured; a detecting means for detecting a first detection data and a second detection data by measuring a surface information of the object to be measured on which the pattern is projected from two different directions every time the pattern is moved by the moving means; an extracting means for extracting a first position data and a second position data corresponding to a point position of the object to be measured from the first and second detection data respectively; and a calculating portion for calculating a coordinate of the point position corresponding to the first position data in accordance with the first position data and the second position data.
3. Description of the Function of the Present Invention
According to the present invention, plural kinds of different projection pattern images can be obtained merely by moving only one pattern in the pitch width direction without preparation of plural kinds of patterns.
Other objects, advantages and features of the present invention will become readily apparent to those skilled in the art from the following detailed description of the preferred embodiment.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic view of an important portion of a shape measuring apparatus according to the present invention;
FIG. 2 is a schematic view of one example of a register for preparing an M series pattern according to the present invention;
FIG. 3 is a table of binary to decimal numerals showing the contents of a register for obtaining the M series according to the present invention;
FIG. 4 is a partly enlarged view of a pattern plate having an M series pattern according to the present invention;
FIG. 5 is an illustration for explaining the structure of a shift register for obtaining another M series pattern;
FIG. 6 is a flow chart for explaining the function of a shape measuring apparatus according to the present invention;
FIG. 7 is a schematic view of a detection output for explaining the function of a shape measuring apparatus according to the present invention;
FIGS. 8(a), 8(b) and 8(c) are illustrations for explaining tables which are used when data are processed using a shape measuring apparatus according to the present invention;
FIG. 9 is an illustration for explaining the calculation for obtaining the coordinate of a point position of an object to be measured using a shape measuring apparatus according to the present invention;
FIG. 10 is a schematic view for showing one example of projection pattern images formed on an object to be measured;
FIG. 11 is a schematic view for showing one example of a peak address processing according to the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
One preferred embodiment of the present invention will be described with reference to the accompanying drawings.
In FIG. 1, 1 denotes an object to be measured, and 10 denotes a projecting means. The projecting means 10 is provided with a motor M1. Furthermore, the projecting means or portion 10 is provided with a pattern plate 12, a light source 16 and a projection lens 18 as a pattern portion. The pattern plate 12 is formed with a pattern 14. The pattern 14 is projected as a pattern image to the object 1 to be measured by the light source 16 and projection lens 18. The light source 16 and projection lens 18 constitute a projection optical system for projecting the pattern 14 to the object 1 to be measured as the pattern image. The motor M1 acts as a moving portion for linearly moving the pattern plate 12 in the pitch width direction of the pattern 14.
The pattern 14 is shown and explained as an M series pattern in this embodiment. The M series pattern is the longest series pattern which can be made based on binary numbers. In general, an M series of the nth order (n is an integer) has pattern elements of (2^n −1) bits, and (2^n −1) kinds of different codes are formed by adjacent n-digit binary numbers. This M series can easily be made by using a shift register. For example, if n=8 is given, an 8th-order M series can be formed by means of feedback using a shift register D comprising 8 bits D1 through D8 as shown in FIG. 2. In this embodiment, D1 corresponds to the 2^0 digit, D2 to the 2^1 digit, D3 to the 2^2 digit, D4 to the 2^3 digit, D5 to the 2^4 digit, D6 to the 2^5 digit, D7 to the 2^6 digit, and D8 to the 2^7 digit.
The constitution of this shift register D is D8 +D6 +D5 +D4 +I, wherein the symbol + is an adding mark which feeds back the contents of D8, D6, D5 and D4 to D1. Symbol I represents an input to the register component element D1 of the 2^0 digit. When the sum of the contents "1" of the register component elements D8, D6, D5 and D4 is an odd number, "1" is inputted into the register component element D1, whereas when the sum of the contents "1" of the register component elements D8, D6, D5 and D4 is an even number or "0", "0" is inputted into the register component element D1. As an initial value in the shift register D, the content of the register component element D1 is set to "1" and the contents of the register component elements D2 through D8 are set to "0".
That is, the content of the initial value of the shift register D is "00000001", which is set by the first shifting operation. The shift register D renews its content to "00000010" by the second shifting operation, and the content becomes "00001000" by the fourth shifting operation. Since the content of D4 is "1" at the fifth shifting operation, "1" is fed back to D1 and the content of the shift register D becomes "00010001". Since the content of D5 is "1" at the sixth shifting operation, "1" is fed back to D1 in the same procedure as described, and the content of the shift register D becomes "00100011". In the same procedure, the content becomes "01000111" at the seventh shifting operation. Since the contents of D8, D6, D5 and D4 are all "0" at the eighth shifting operation, D1 becomes "0" and the content of the shift register becomes "10001110". At the ninth shifting operation, since the sum of the contents "1" of D8 and D4 is an even number, D1 becomes "0" and the content of the shift register becomes "00011100". When the shifting operation is repeated 255 times, the table shown in FIG. 3 is obtained. When the contents of the shift register D are converted to decimal numbers, the numbers from 1 to 255 each appear once, as shown in the B through D columns. In this embodiment, the M series corresponds to the numeral series of 255 pieces which appear in the register component elements D8 through D1 of the shift register during the first to 255th shifting operations. The contents of column D8 of the table of FIG. 3 are read vertically down the rows. When the eight numbers which appear during the first to eighth shifting operations are regarded as one binary code, it can be seen that they correspond to the decimal number "1" of columns B through D at the first shifting operation.
In the same procedure, when the eight numbers which appear during the second to ninth shifting operations are regarded as one binary code, it can be seen that they correspond to the decimal number "2" of columns B through D at the second shifting operation. Likewise, the eight numbers which appear during the third to tenth shifting operations, regarded as one binary code, correspond to the decimal number "4" at the third shifting operation. In this way, the contents of the register component element D8, read vertically, are made to correspond to the 255 decimal numbers. In this embodiment, after the eight numbers appearing during the 248th to 255th shifting operations have been made to correspond to the 248th shifting operation, no further corresponding numbers are left; therefore, the contents after the 255th shifting operation are assumed to be "0" beforehand. This assumption is correct since the contents of the register component elements D7 through D1 are "0" at the 255th shifting operation. That is, the content of the shift register D at the 256th shifting operation becomes identical to that of the first shifting operation again, and the contents of the table of FIG. 3 repeat.
In this way, when the decimal numbers corresponding to the 255 sets of eight numbers which appear through this shifting operation are examined, no decimal number is produced twice.
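The shifting operation described above is straightforward to simulate. The following sketch (ours, not part of the original disclosure; the function name and output format are our own) reproduces the register contents written as D8...D1 and checks that all 255 states, and hence all 255 eight-bit codes read from column D8, are distinct:

```python
def m_series_states(count=255):
    """Simulate the 8-bit shift register D with feedback D8+D6+D5+D4 (mod 2).

    state[0] is D1 (the feedback input side), state[7] is D8.
    Returns the register contents at each of `count` shifting
    operations, written as strings D8...D1 as in the description.
    """
    state = [1, 0, 0, 0, 0, 0, 0, 0]  # initial value "00000001" (D1 = 1)
    states = []
    for _ in range(count):
        states.append("".join(str(b) for b in reversed(state)))
        # feedback: parity of the contents of D8, D6, D5 and D4
        fb = (state[7] + state[5] + state[4] + state[3]) % 2
        state = [fb] + state[:7]  # shift D1 -> D2, ..., D7 -> D8
    return states

states = m_series_states()
# The 255 register contents are all distinct, so their decimal values
# cover 1..255 exactly once, as in the table of FIG. 3.
assert len(set(states)) == 255
assert states[0] == "00000001" and states[4] == "00010001"
```

Reading the D8 column as a cyclic bit string, every window of eight consecutive bits is likewise distinct, which is what lets each slit position be identified by its local code.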
Accordingly, the content "0" of the M series pattern appearing at D8 corresponds to light interception and the content "1" corresponds to light transmission, and a slit 20 is formed on the pattern plate 12 as shown in FIG. 4. More specifically, taking the pitch width as P, the pattern 14 is formed in such a manner that the slit 20 is formed at a place corresponding to the content "1" and no slit 20 is formed at a place corresponding to the content "0". In this embodiment, the pitch width direction is the direction crossing the slit 20. Further, the slit width of the slit 20 is small enough with respect to the pitch width P. In the above description, an 8th-order M series pattern is described. However, other M series can be used as well. Regarding the patterns from the 4th to the 10th order, a shift register can be prepared based on the table of FIG. 5. In the above description, the content "0" corresponds to shade and the content "1" corresponds to transmission. If desired, the content "0" may correspond to a slit type filter which permits a blue color light to transmit and the content "1" may correspond to a slit type filter which permits a red color light to transmit.
In FIG. 1, a first detecting means 30 and a second detecting means 32 are disposed at both sides of the M series pattern direction formed on the pattern plate 12 of the projecting portion 10. The first detecting means or portion 30 and the second detecting means or portion 32 include objective lenses 34, 36 respectively. Above the objective lenses 34 and 36, a first linear image sensor 38 and a second linear image sensor 40 are disposed in such a manner as to correspond to the respective objective lenses 34 and 36. The image of the object 1 to be measured on which the M series pattern 14 is projected is formed on the linear image sensors 38 and 40 by the respective objective lenses 34 and 36.
42 denotes a housing. The housing 42 contains therein the projecting portion 10, the first detecting portion 30 and the second detecting portion 32. The housing 42 is movable in the Y direction of FIG. 1 by a motor M2. The first linear image sensor 38 and second linear image sensor 40 are scanned by clock signals φ1, φ1', φ2 and φ2' emitted from a timing pulse generator 52. Due to this scanning, the first linear image sensor 38 and second linear image sensor 40 read out the data of each picture element, which are outputted as detection data to a first A/D converter 54 and a second A/D converter 56.
The first A/D converter 54 and second A/D converter 56 start the A/D conversion in response to signals emitted from the timing signal generator 52. When this conversion procedure is finished, the first A/D converter 54 and second A/D converter 56 output signals showing the completion of the A/D conversion to the timing signal generator 52 and, at the same time, output the A/D converted detection data of the first and second linear image sensors 38 and 40 to a control calculating means or portion 50. The control calculating portion 50 gives and takes timing signals to and from the timing signal generator 52 and outputs control signals to the motors M1 and M2. In addition, the control calculating portion 50 receives detection data from the first and second A/D converters 54 and 56 for a predetermined processing, writes and reads data in and out of a memory 58, outputs display signals to a displayer 60 to display the measuring results, and receives various commands from an operating portion 62. This control calculating portion 50 acts as an extracting means and a calculating means as will be described hereinafter.
Next, the function of a shape measuring apparatus according to the present invention will be described with reference to the flow chart of FIG. 6, the signal illustration of FIG. 7, and FIGS. 8 through 11.
When the measuring procedure is started, values necessary for the measurement are set as initial values in Step S1. That is, in Step S1, a Y-direction reset is performed, the housing 42 is moved to the original point on the Y-axis by means of control of the motor M2, and the number l of moving times in the Y-direction is set as l=0. The number l of moving times in the Y-direction is an integer. This number l of moving times is adjustable with respect to the object 1 to be measured. The total number of moving times is set as lF. In Step S2, the completion of the Y-axis direction operation is judged. If the scanning in the Y direction is completed, it goes to Step S13, and, if not, it goes to Step S3. In this embodiment, since l=0 is given, l≠lF and it moves to Step S3.
In Step S3, a procedure is performed to obtain eight kinds of first detection data D1l (m, n) and second detection data D2l (m, n) by displacing the M series pattern seven times (m=8), one pitch width each time, at a predetermined Y coordinate, in order to obtain a code showing the point position of the object 1 to be measured based on the detection data of the first and second linear image sensors 38 and 40. Reference symbol n denotes a picture element number of the first and second linear image sensors 38 and 40. In this embodiment, the number of picture elements of each of the first and second linear image sensors 38 and 40 is 2048. Accordingly, the total pieces of the first detection data D1l (m, n) and second detection data D2l (m, n) are 8×2048 pieces respectively.
First, in Step S3, the M series pattern scanning is reset, the pattern plate 12 is moved to a predetermined position by means of control of the motor M1 and the number m of the scanning times of the M series pattern is set as m=1. In this Step S3, a projection pattern image is formed on the object 1 to be measured as indicated by symbolic numeral X1 of FIG. 10. In this FIG. 10, the solid straight line shows a slit image, while the broken line corresponds to an image at a place where no slit is formed. This projection pattern image is formed on the picture element surfaces of the first and second linear image sensors 38 and 40 by the objective lenses 34 and 36 of the first and second detecting portions 30 and 32 respectively. And, in Step S4, the first linear image sensor 38 and the second linear image sensor 40 are scanned. Due to the foregoing, the detection outputs of the whole picture elements of the first and second linear image sensors 38 and 40 are taken in the first and second A/D converters in succession and the detection outputs are subjected to the A/D conversion one after another. Thereafter, the detection outputs are stored in a memory 58 as the first detection data D1l (m, n) and second detection data D2l (m, n), respectively, through the control calculating portion.
When all of the detection outputs of the first and second linear image sensors 38 and 40 have been stored in the memory 58, it goes to Step S5. In Step S5, it is judged whether the number m of scanning times of the M series pattern is m=8 or not. If the number m of scanning times is m=8, it goes to Step S7 and if not, it goes to Step S6. In Step S6, a move scanning in the pitch width direction of the M series pattern is performed. That is, the motor M1 displaces the M series pattern by one pitch by means of the control calculating portion 50, the number m of scanning times of the M series pattern is increased by one, and it goes to Step S4. Due to the foregoing, the object 1 to be measured is formed with the projection pattern images which are indicated by symbolic numerals X1 through X8 in FIG. 10. The projection pattern images X1 through X8 are each displaced by one pitch with respect to the preceding and following projection pattern images as indicated by arrows. The detection data D1l (m, n) and D2l (m, n) corresponding to the projection pattern images X1 through X8 are stored in the memory 58 every time the move scanning is performed.
Steps S7 through S10, which will be described hereinafter, are for extracting the first and second position data corresponding to a predetermined position of the object 1 to be measured from the first and second detection data D1l (m, n) and D2l (m, n). First, in Step S7, a smoothing processing is performed. The term "smoothing processing" as used herein means a processing for extracting the signal components by correcting for the contrast components contained in the detection data. That is, the object 1 itself has contrast, and the detection output outputted from the first and second linear image sensors 38 and 40 based on the image of the object 1 on which the pattern is not yet projected becomes something like the one shown in FIG. 7(a). When the M series pattern is projected, the first and second linear image sensors 38 and 40 obtain such a detection output as shown in FIG. 7(b).
In Step S7, a smoothing signal for cutback, shown by the one-dot chain line of FIG. 7(c), is generated in order to remove the detection output based on the contrast component and extract only the signal component based on the slit image. The smoothing signal for cutback shown by the one-dot chain line is set slightly higher than the signal level of the contrast shown in FIG. 7(a). This smoothing signal is generated from only the low-frequency component obtained by passing the detection output shown in FIG. 7(b) through a filter. In this way, the detection data D1l (m, n) and D2l (m, n) corresponding to the slit image are obtained. That is, when the 255×8 pieces of the respective detection data D1l (m, n) and D2l (m, n) obtained by moving the M series pattern seven times are reviewed, they become something like those shown in FIGS. 7(d) through 7(k). The dot "." used herein shows a signal level at a certain picture element. In Step S8, a peak address detection processing is performed in which the address of a peak detection data is obtained from the first and second detection data D1l (m, n) and D2l (m, n). The slit image, which is formed on the first and second linear image sensors 38 and 40 after passing through the slit and being reflected on the object 1 to be measured, is imaged on a plurality of picture elements and becomes a detection output as shown in the enlarged view of FIG. 11. Therefore, it is necessary to perform this peak address detection processing.
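The smoothing cutback of Step S7 can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the moving-average filter and the margin factor are our own choices standing in for the unspecified low-pass filter.

```python
def extract_slit_signal(data, window=15, margin=1.05):
    """Remove the contrast component from one scan line of detection data.

    A moving average stands in for the low-pass filter that yields the
    smoothing signal of FIG. 7(c); `margin` sets the cutback level
    slightly above the contrast level.  Samples are kept only where
    they exceed that level.
    """
    n = len(data)
    out = []
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        smooth = sum(data[lo:hi]) / (hi - lo)  # low-frequency component
        out.append(max(0.0, data[i] - margin * smooth))
    return out
```

On a flat contrast level carrying one narrow slit peak, only the peak survives the cutback; everywhere else the output is zero.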
The peak address N1 corresponding to the picture elements of the first and second linear image sensors 38 and 40 is obtained for each peak from the following formula with respect to the first detection data:

N1 = [Σ(n=nA to nB) n×D1l (m, n)] / [Σ(n=nA to nB) D1l (m, n)]   (1)

wherein nA is the first address of the peak and nB is its last address. The peak address N2 is obtained in the same procedure with respect to the second detection data D2l (m, n). Up to 255 peak addresses N1 and N2 can be obtained for each set of detection data according to formula (1), since the peaks overlap one another as shown by the solid line in FIG. 7(l). The peak addresses N1 and N2 so obtained correspond to the centers of the elements of the M series pattern.
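Reading formula (1) as a center-of-gravity over the peak's picture elements (our interpretation, consistent with the statement that the result corresponds to the center of a pattern element), the peak address detection can be sketched as:

```python
def peak_address(data, n_a, n_b):
    """Weighted-mean (center-of-gravity) address of one peak.

    `data[n]` is the detection level of picture element n; the peak
    spans addresses n_a..n_b inclusive.  Returns a sub-pixel address.
    """
    num = sum(n * data[n] for n in range(n_a, n_b + 1))
    den = sum(data[n] for n in range(n_a, n_b + 1))
    return num / den
```

A slit image spread symmetrically over elements 11 to 13 yields the central address 12; an asymmetric spread shifts the result toward the heavier side, which is how sub-pixel accuracy is obtained from a slit imaged on several picture elements.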
Then, it goes to Step S9. In Step S9, the data code is decided. That is, data codes of decimal numbers showing the point positions corresponding to the peak addresses N1 and N2 obtained in Step S8 are decided. First, the first and second detection data D1l (m, n) and D2l (m, n) are compared with the respective smoothing signals. If the first and second detection data D1l (m, n) and D2l (m, n) are larger than the smoothing signals, the contents are set as "1", whereas if they are smaller than the smoothing signals, the contents are set as "0". Such binary detection data are represented by D1l '(m, n) and D2l '(m, n). On the other hand, if the integer in the vicinity of the peak address N obtained in Step S8 is represented by N*, the data code AD1l (N*) corresponding to each peak address with respect to the first detection data D1l (m, n) is obtained from the following binary-to-decimal conversion formula:

AD1l (N*) = Σ(m=1 to 8) D1l '(m, N*)×2^(m−1)   (2)
In this way, the respective first and second data codes AD1l (N*) and AD2l (N*) are obtained with respect to the first and second detection data D1l (m, n) and D2l (m, n). These correspond to the point positions on the object to be measured and are equivalent to point position data. Then, it goes to Step S10. In Step S10, the corresponding points are searched. For this purpose, a table as shown in FIG. 8(a) is prepared beforehand in which the decimal numbers produced by the M series pattern are made to correspond to address numbers. Then, the first and second data codes AD1l (N*) and AD2l (N*) are replaced with the corresponding address numbers as shown in FIGS. 8(b) and 8(c), and the corresponding point positions are searched among the resulting address numbers. That is, the peak addresses N1 and N2 of the first and second data codes having the same address number are set as a pair (N1, N2). The fact that the first and second data codes have the same address number means that they correspond to the same point position of the object 1 to be measured. In FIG. 8(a), the same address number is shown for the detection numbers 4 and 5. This indicates a measuring error. When the same address number appears twice in this way, the point position cannot be decided; therefore, such detection data should not be used.
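Steps S9 and S10 can be sketched as follows; the function names and the miniature address table standing in for FIG. 8(a) are ours, and the frames are 0-indexed so that frame m contributes 2**m (the 1-indexed formula (2) contributes 2**(m−1)):

```python
def data_code(binary_frames, n_star):
    """Read the binarized frames D'(m, n) at picture element n_star as
    one binary code (formula (2)); binary_frames[m][n] is 0 or 1."""
    return sum(binary_frames[m][n_star] << m for m in range(len(binary_frames)))

def match_points(codes1, codes2, address_table):
    """Step S10: pair peak addresses of the two sensors whose data codes
    map to the same address number.  An address number appearing twice
    on one side is a measuring error and is discarded.

    codes1/codes2 map a peak address N to its data code.
    """
    def usable(codes):
        by_addr = {}
        for peak, code in codes.items():
            by_addr.setdefault(address_table[code], []).append(peak)
        return {a: ps[0] for a, ps in by_addr.items() if len(ps) == 1}
    side1, side2 = usable(codes1), usable(codes2)
    return sorted((side1[a], side2[a]) for a in side1 if a in side2)
```

With eight frames whose bits at a given element read 1,0,0,0,1,0,0,0 the code is 1 + 16 = 17; matched pairs (N1, N2) then feed the coordinate calculation of Step S11.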
Then, it goes to Step S11. In Step S11, the coordinate is calculated. The coordinate of the point position of the object 1 to be measured is obtained by calculation from the peak address N1 of the first data code and the peak address N2 of the second data code which have the same address number.
The distances Xs1 and Xs2 from one end of the first and second linear image sensors 38 and 40 to the positions of the peak addresses N1 and N2 are given by Xs1 = α×N1 and Xs2 = α×N2, wherein α is the distance between the picture elements. Therefore, under the arrangement shown in FIG. 9, the coordinates X, Y and Z of the point P on the object 1 to be measured are obtained geometrically from the following formulas:

X = L×(Xs1 −L1) / {(Xs1 −L1)−(Xs2 −L2)}
Y = β×l
Z = L×f / {(Xs1 −L1)−(Xs2 −L2)}   (3)

wherein β is the moving pitch distance of the motor M2, L1 and L2 are the distances from the left ends of the first and second linear image sensors 38 and 40 to the centers O1 and O2 of the objective lenses 34 and 36, L is the distance between the centers O1 and O2 of the objective lenses 34 and 36, and f is the focal distance of the objective lenses 34 and 36. The X, Y and Z coordinates are taken with the center O1 of the objective lens as the origin. When the coordinate calculation is finished in Step S11, the motor M2 moves the housing 42 by the pitch β and the number l of moving times in the Y direction is increased by one by means of control of the control calculating portion 50 in Step S12. Then, it goes to Step S2. In Step S2, it goes to Step S3 to continue the measurement until the number l of moving times in the Y direction becomes the predetermined number lF of times as described in the foregoing. When the number l of moving times in the Y direction reaches the predetermined number lF, it goes to Step S13. In Step S13, the coordinates of the object 1 to be measured so far obtained are put in order and written in the memory 58. When the writing procedure is finished, it goes to Step S14. In Step S14, the measuring results are outputted to a printer, CRT, etc. By this, the measurement is completely finished. It is noted that a highly accurate measurement can be performed by shifting the initial setting of the M series pattern by n pitch.
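Assuming the coordinate calculation of Step S11 reduces to the ordinary stereo triangulation implied by the geometry of FIG. 9 (an assumption on our part, since the printed equations are not reproduced in this text), it can be sketched as:

```python
def triangulate(n1, n2, alpha, l1, l2, baseline, f, beta, ell):
    """Coordinates (X, Y, Z) of point P from matched peak addresses N1, N2.

    alpha: picture element pitch; l1, l2: distances from the sensor ends
    to the optical axes of O1 and O2; baseline: distance L between O1
    and O2; f: focal distance; beta, ell: Y moving pitch and move count.
    The origin is the center O1 of the first objective lens.
    """
    x1 = alpha * n1 - l1          # sensor position relative to axis of O1
    x2 = alpha * n2 - l2          # sensor position relative to axis of O2
    d = x1 - x2                   # stereo disparity
    z = baseline * f / d
    x = baseline * x1 / d         # equals z * x1 / f
    y = beta * ell                # Y advances by pitch beta per move
    return x, y, z
```

A round-trip check is a useful sanity test: projecting a known point through both lens centers and then triangulating the resulting sensor positions recovers the point.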
As described in the foregoing, a shape measuring apparatus according to the present invention comprises: a pattern portion having a pattern in which pattern elements are arranged in the pitch width direction according to a predetermined rule so that a plurality of N codes can be distinguished from one another; a projecting portion having a moving portion for moving the pattern in the pitch width direction according to a rule and a projection optical system for projecting the pattern onto an object to be measured; a detecting portion for detecting first detection data and second detection data by measuring surface information of the object to be measured, on which the pattern is projected, from two different directions every time the pattern is moved by the moving portion; an extracting portion for extracting first position data and second position data corresponding to a point position of the object to be measured from the first and second detection data, respectively; and a calculating portion for calculating a coordinate of the point position corresponding to the first position data from the first position data and the second position data. Accordingly, plural kinds of different projection pattern images can be obtained merely by moving a single pattern in the pitch width direction, without preparing plural kinds of patterns. Thus, the shape of an object to be measured can be detected highly accurately and easily.
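The "M series" pattern above is a maximal-length sequence, whose defining property is that every window of N consecutive elements is distinct, so each window acts as one of the distinguishable codes. The sketch below generates such a sequence with a Fibonacci linear-feedback shift register; the patent does not specify a generator polynomial, so the tap positions here are an illustrative choice, not the patent's.

```python
def m_sequence(taps, nbits):
    """Generate one period (2**nbits - 1 elements) of a maximal-length
    sequence via a Fibonacci LFSR.

    taps : 1-indexed feedback tap positions of a primitive polynomial
           over GF(2); [3, 4] (x^4 + x^3 + 1) is used below as an example.
    """
    state = [1] * nbits               # any nonzero seed works
    seq = []
    for _ in range(2 ** nbits - 1):
        seq.append(state[-1])         # output the last register bit
        fb = 0
        for t in taps:                # XOR the tapped bits
            fb ^= state[t - 1]
        state = [fb] + state[:-1]     # shift right, insert feedback
    return seq
```

Every cyclic window of nbits consecutive bits of the result is unique, which is what lets a single projected pattern encode position: observing any N adjacent pattern elements identifies where in the pattern they lie.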
In this disclosure, there is shown and described only the preferred embodiment of the present invention, but it is to be understood that the present invention is not limited to this. Instead, the present invention is capable of changes and modifications within the scope of the inventive concept as expressed herein.

Claims (7)

What is claimed is:
1. A shape measuring apparatus, comprising:
means for forming an M series pattern in which pattern elements are arranged in a pitch width direction so that a plurality of N number of codes can be distinguished with respect to one another;
means for moving said M series pattern in the pitch width direction according to a rule;
projecting optical means for projecting said M series pattern onto the surface of an object to be measured to obtain a plurality of projection pattern images;
detecting means for obtaining a first detection data and a second detection data each time said M series pattern is moved by said moving means, said first and second detection data being determined by projecting said M series pattern onto the object from two different directions to obtain surface information of the object being measured;
extracting means for extracting first positional data and second positional data, including a three dimensional positional designation of each point on the surface of the object to be measured, in accordance with said first detection data and said second detection data; and
calculating means for calculating a three dimensional position corresponding to said first positional data of each point on the surface of the object to be measured in accordance with said first positional data and said second positional data.
2. The shape measuring apparatus of claim 1, wherein said detecting means includes at least a pair of image sensors spaced apart from one another in the pitch width direction of said M series pattern, and an imaging optical system for forming an image of the object to be measured on said pair of image sensors, the output of one of said pair of image sensors corresponding to said first positional data, and the output of the other of said pair of image sensors corresponding to said second positional data.
3. The shape measuring apparatus of claim 2, wherein each of said pair of image sensors includes a plurality of picture elements forming a display to image said first positional data and said second positional data.
4. The shape measuring apparatus of claim 1, wherein said pattern forming means includes a light transmitting means and a light shading means, said M series pattern being formed by an alternating arrangement of said light transmitting means and said light shading means.
5. The shape measuring apparatus of claim 1, wherein said pattern forming means includes light transmitting means for alternatingly transmitting different colors of light, said M series pattern being formed by an alternating arrangement of the different colors of light.
6. The shape measuring apparatus of claim 1, wherein said pattern includes a slit type blue color filter for permitting a blue color light to transmit therethrough, and a slit type red color filter for permitting a red color light to transmit therethrough, said M series pattern being formed by an alternating arrangement of said blue color filter and said red color filter.
7. The shape measuring apparatus of claims 1, 2, 3, 4, 5, or 6, wherein said M series pattern is displaced from its initial setting by every 1/n pitch in the pitch width direction.
US07/039,455 1986-04-18 1987-04-17 Shape measuring apparatus Expired - Lifetime US4834530A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP61089514A JPH0726828B2 (en) 1986-04-18 1986-04-18 Shape measuring device
JP61-89514 1986-04-18

Publications (1)

Publication Number Publication Date
US4834530A true US4834530A (en) 1989-05-30

Family

ID=13972895

Family Applications (1)

Application Number Title Priority Date Filing Date
US07/039,455 Expired - Lifetime US4834530A (en) 1986-04-18 1987-04-17 Shape measuring apparatus

Country Status (2)

Country Link
US (1) US4834530A (en)
JP (1) JPH0726828B2 (en)


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0820232B2 (en) * 1990-06-26 1996-03-04 富士通株式会社 Three-dimensional measuring device
DE69833813T2 (en) 1997-05-22 2006-11-30 Kabushiki Kaisha Topcon Device for correspondence search in a pair of pictures
SE517445C2 (en) * 1999-10-01 2002-06-04 Anoto Ab Position determination on a surface provided with a position coding pattern
US7835017B2 (en) 2004-12-22 2010-11-16 Asml Netherlands B.V. Lithographic apparatus, method of exposing a substrate, method of measurement, device manufacturing method, and device manufactured thereby
JP4874657B2 (en) * 2006-01-18 2012-02-15 ローランドディー.ジー.株式会社 Three-dimensional shape measuring method and apparatus
FR2938330A1 (en) * 2008-11-07 2010-05-14 Michelin Soc Tech EVALUATION OF THE SURFACE SURFACE OF A PNEUMATIC BY STEREOVISION ACTIVE
CN104634276B (en) * 2015-02-12 2018-08-07 上海图漾信息科技有限公司 Three-dimension measuring system, capture apparatus and method, depth computing method and equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3866052A (en) * 1973-11-02 1975-02-11 Dynell Elec Methods for generating signals defining three-dimensional object surfaces
US4185918A (en) * 1975-08-27 1980-01-29 Solid Photography Inc. Arrangement for sensing the characteristics of a surface and determining the position of points thereon
US4259589A (en) * 1979-07-20 1981-03-31 Solid Photography, Inc. Generation of contiguous data files of three-dimensional information
US4634278A (en) * 1984-02-06 1987-01-06 Robotic Vision Systems, Inc. Method of three-dimensional measurement with few projected patterns

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6049474A (en) * 1983-08-29 1985-03-18 Matsushita Electric Ind Co Ltd Object detecting method
JPS60152903A (en) * 1984-01-21 1985-08-12 Kosuke Sato Position measuring method

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1993008448A1 (en) * 1991-10-15 1993-04-29 Electro-Optical Information Systems High-speed 3-d surface measurement surface inspection and reverse-cad system
US5175601A (en) * 1991-10-15 1992-12-29 Electro-Optical Information Systems High-speed 3-D surface measurement surface inspection and reverse-CAD system
US20020012460A1 (en) * 2000-06-26 2002-01-31 Kabushiki Kaisha Topcon Stereo image measuring device
US6810142B2 (en) * 2000-06-26 2004-10-26 Kabushiki Kaisha Topcon Stereo image measuring device
WO2003044460A1 (en) * 2001-11-21 2003-05-30 Mapvision Oy Ltd Method for determining corresponding points in three-dimensional measurement
US20050012056A1 (en) * 2001-11-21 2005-01-20 Esa Leikas Method for determining corresponding points in three-dimensional measurement
US7046377B2 (en) 2001-11-21 2006-05-16 Mapvision Oy Ltd. Method for determining corresponding points in three-dimensional measurement
US7136171B2 (en) * 2001-12-19 2006-11-14 General Electric Company Method for the extraction of image features caused by structure light using template information
US20030113020A1 (en) * 2001-12-19 2003-06-19 General Electric Company Method for the extraction of image features caused by structure light using template information
US20050134816A1 (en) * 2003-12-22 2005-06-23 Asml Netherlands B.V. Lithographic apparatus, method of exposing a substrate, method of measurement, device manufacturing method, and device manufactured thereby
US20050274909A1 (en) * 2004-06-10 2005-12-15 Asml Netherlands B.V. Level sensor for lithographic apparatus
US7265364B2 (en) * 2004-06-10 2007-09-04 Asml Netherlands B.V. Level sensor for lithographic apparatus
CN101558283B (en) * 2006-10-16 2012-01-11 弗兰霍菲尔运输应用研究公司 Device and method for the contactless detection of a three-dimensional contour
WO2014000738A3 (en) * 2012-06-29 2014-03-27 Inb Vision Ag Method for capturing images of a preferably structured surface of an object and device for image capture
CN104583713A (en) * 2012-06-29 2015-04-29 Inb视觉股份公司 Method for capturing images of a preferably structured surface of an object and device for image capture
US10869020B2 (en) 2012-06-29 2020-12-15 Inb Vision Ag Method for capturing images of a preferably structured surface of an object and device for image capture
US9709387B2 (en) 2012-11-21 2017-07-18 Mitsubishi Electric Corporation Image generation device for acquiring distances of objects present in image space
WO2014114663A1 (en) * 2013-01-23 2014-07-31 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Optical device and method for determining spatial coordinates of surfaces of macroscopic objects by triangulating two line-scan cameras

Also Published As

Publication number Publication date
JPS6324116A (en) 1988-02-01
JPH0726828B2 (en) 1995-03-29

Legal Events

Date Code Title Description
AS Assignment

Owner name: TOKYO KOGAKU KIKAI KABUSHIKI KAISHA, 75-1, HASUNUM

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST.;ASSIGNORS:MURAI, SHUNJI;OHTOMO, FUMIO;OTANI, HITOSHI;REEL/FRAME:004721/0798

Effective date: 19870414

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: KABUSHIKI KAISHA TOPCON, 1-GO 75-BAN HASUNUMA-CHO,

Free format text: CHANGE OF NAME;ASSIGNOR:TAMIO NISHIWAKI;REEL/FRAME:005133/0732

Effective date: 19890724

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12