WO1996023626A1 - Edge detection in automatic sewing apparatus - Google Patents


Info

Publication number
WO1996023626A1
WO1996023626A1 (PCT/GB1996/000180)
Authority
WO
WIPO (PCT)
Prior art keywords
edge
camera
real
sewing
pattern
Prior art date
Application number
PCT/GB1996/000180
Other languages
French (fr)
Inventor
Frederick Mark Hudman
Alan John Crispin
Maja Pokric
Boris Pokric
David Creyke Reedman
Original Assignee
British United Shoe Machinery Limited
Usm Espana, S.L.
Priority date
Filing date
Publication date
Priority claimed from GBGB9501888.3A external-priority patent/GB9501888D0/en
Priority claimed from GBGB9509350.6A external-priority patent/GB9509350D0/en
Application filed by British United Shoe Machinery Limited, Usm Espana, S.L. filed Critical British United Shoe Machinery Limited
Publication of WO1996023626A1 publication Critical patent/WO1996023626A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23QDETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q35/00Control systems or devices for copying directly from a pattern or a master model; Devices for use in copying manually
    • B23Q35/04Control systems or devices for copying directly from a pattern or a master model; Devices for use in copying manually using a feeler or the like travelling along the outline of the pattern, model or drawing; Feelers, patterns, or models therefor
    • B23Q35/08Means for transforming movement of the feeler or the like into feed movement of tool or work
    • B23Q35/12Means for transforming movement of the feeler or the like into feed movement of tool or work involving electrical means
    • B23Q35/127Means for transforming movement of the feeler or the like into feed movement of tool or work involving electrical means using non-mechanical sensing
    • B23Q35/128Sensing by using optical means

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Sewing Machines And Sewing (AREA)

Abstract

An edge detector arrangement is described suitable for automatic sewing apparatus, in which a light pattern is incident upon an edge (14) between layers (11, 12) of material. A camera (17) is arranged to view the edge (14) in order to detect a discontinuity (33; 42) in at least one line of the light pattern, this discontinuity being indicative of edge position in each segment of the edge (14) viewed by the camera (17). The real edge location data from sequential segments of the edge (14) are consolidated as a real edge data signal set, which may be used by appropriate microprocessor means to adapt a stitch path stored in an automatic sewing apparatus by comparison with an expected or ideal edge path location set. Edge detection may be conducted whilst the automatic sewing apparatus is in operation or as a pre-scanning stage prior to sewing. The image from the camera is grabbed by a frame grabber element (22) and may be processed to enhance discontinuity position definition.

Description

EDGE DETECTION IN AUTOMATIC SEWING APPARATUS
The present invention relates to edge detection and more particularly but not exclusively to edge detection in automatic sewing apparatus where a sewing pattern can be referenced with regard to the edge.
There are a wide variety of instances where detection of a raised edge in a surface may be necessary in order to perform subsequent manufacturing steps. An example of such edge detection is in automatic sewing machines for the shoe industry. It will be understood that several shoe styles and constructions include stitching to secure shoe elements together particularly upper components. Typically the components are laid one on top of the other within a pallet. Thus, there is a height variation edge between layers and this edge is typically used as a reference for constructional and decorative stitch patterns.
It will be understood that edge position is consequently highly important if accurate and consistent stitching is to be achieved. Unfortunately it is impossible to provide absolute accuracy in loading components due to problems of vibration displacement, lack of operative skill and other inherent limitations with regard to placing components in accurate location with regard to each other. Furthermore, the actual components may be misplaced or badly cut adding to problems of accuracy.
In view of the above it will be understood that in some circumstances it may be convenient to provide actual edge detection in association with an automatic sewing apparatus. This edge detection may be performed either by pre-scanning the whole of the surface to be sewn prior to sewing in order to determine edge location or by dynamically determining the edge position just prior to sewing.
Unfortunately in most situations where an automatic sewing apparatus is used there is little or no contrast between the components lying on top of each other and so it is difficult to determine the edge by known visual techniques. More recently attempts have been made to overcome the lack of contrast between components by utilising image shadows to provide a more distinct image of the edge. These shadow techniques involve arranging suitable illumination and a camera with regard to an edge such that the illumination element creates a shadow as viewed by the camera. Thus, irrespective of the contrast between the component layers the edge always creates a shadow when illuminated. This shadow can be readily seen by the camera and using appropriate data processing the edge can be determined.
Use of shadow techniques has disadvantages in that external lighting may affect results, and lifting between components due to warping or other factors may lead to spurious effects. Furthermore, it may be necessary to provide several illumination lamps in order to ensure that a shadow is created for a camera viewing the surface.
Automatic sewing apparatus are increasingly replacing more traditional manual sewing machines particularly where there is a requirement for quite intricate decorative pattern stitching for example in so called "cowboy boots".
Conventional automatic sewing apparatus comprises a sewing head mounted above a suitable X-Y table to enable a workpiece mounted in a pallet to be moved relative to the head. Each stitch motion of the sewing head and each feed motion of the X-Y table are preprogrammed as a stitch path data set and repeated sequentially in order to provide the necessary stitching of components held within the pallet. As indicated above, these conventional pallet systems are highly dependent on accurate location of the components within the pallet. If such location is not achieved the stitched end product will be rejected. For the most part raised edges between components are used as references for the stitch pattern and in any event will exaggerate mislocation of the stitch pattern as viewed by an observer, leading to rejection. Thus, the automatic sewing apparatus includes a predicted or ideal edge in the form of an ideal edge data set.
The stitch path data set is stored in appropriate storage means in terms of signals, normally of a digital type which are fed to the sewing head and the X-Y table as necessary. It will be appreciated in the more recent automatic sewing systems incorporating actual edge detection that this stored stitch data set is adapted in response to the actual edge detected in the pallet rather than simply repeated for each and every component assembly presented in a pallet. The adaptation is most conveniently performed upon the whole stitch path data set rather than segments in order that there are no short term corrections and thus zig-zag stitching. If the whole stitch path data set is amended gradual adjustment can be made for edge displacement from the ideal. Examples of prior automatic sewing apparatus are shown in EP0309069 and GB2237412.
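The whole-path adaptation described above can be pictured with a minimal sketch. The function and point format below are hypothetical illustrations (not taken from the patent): it assumes the ideal and real edges are sampled at corresponding indices and shifts every stitch point by the local edge displacement, so the entire stored path is corrected smoothly rather than in short zig-zag segments.

```python
def adapt_stitch_path(stitch_path, ideal_edge, real_edge):
    """Shift each stitch point by the local displacement of the real
    edge from the ideal edge.  Hypothetical data layout: lists of
    (x, y) tuples sampled at corresponding indices."""
    corrected = []
    for (sx, sy), (ix, iy), (rx, ry) in zip(stitch_path, ideal_edge, real_edge):
        dx, dy = rx - ix, ry - iy   # local edge displacement from ideal
        corrected.append((sx + dx, sy + dy))
    return corrected
```

Because every point of the path receives its own local correction, a gradual drift in edge position produces a gradual drift in the stitch path, which is the behaviour the text describes.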
In EP0309069 a real-time sewing apparatus is disclosed in which a camera is arranged upon a mechanical rotation device in order to ensure the camera remains substantially perpendicular to an edge in order to maintain a good view of that edge. Thus, the apparatus can modify the sewing position during sewing operation to compensate for any deviation in edge position.
From the above it will be appreciated that, by alteration of stitch length and X-Y table feed parameters, accommodation can be made for displacement of the actual or real raised edge location between components in a pallet as compared to the predicted or ideal, i.e. the edge location necessary for consistency within the stored stitch path data set. It is an objective of the present invention to provide an edge detection arrangement for automatic sewing apparatus.
In accordance with the present invention there is provided an edge detector for a surface having a height variation edge, the detector including a laser light source arranged to cast a light pattern constituted by one or more lines towards the edge and wherein at least one of said lines of the pattern extends across the edge, a camera is arranged at an inclined position relative to the edge such that the image viewed by the camera includes a discontinuity in one of the lines of the pattern due to the height of the edge, the discontinuity being indicative of edge position, the camera being coupled to processor means whereby sequential elements of the surface are viewed and so a real edge data signal set is generated representing a true edge location on the surface.
Preferably, the laser light source and camera are arranged upon a suitably rotatable assembly under control of the microprocessor, the pattern and camera remain substantially perpendicular with respect to each other and at least one line of the pattern can be kept substantially normal to the edge as sequential segments of the surface are viewed by the camera. Preferably, the processor means includes an image data element arranged to enhance image data from the camera using an appropriate data manipulation algorithm.
Preferably, the pattern is a simple straight line or a composite of several crossed straight lines, ensuring that at least one line as viewed by the camera has a discontinuity.
Preferably, the whole surface is viewed before the microprocessor determines the real edge data signal set.
Preferably, the microprocessor is arranged to make preliminary determination of real edge location from successive sequential segments of the surface, such preliminary determination being utilised by the microprocessor to control the camera and light source to ensure sequential segments on the surface as viewed by the camera are centred about the detected edge.
Further in accordance with the present invention there is provided an automatic sewing apparatus including an edge detector as defined above, said real edge data signals being compared in comparison means with ideal edge location signals, said comparison means generating correction data signals for a stitch path assignment element dependent upon variation between said real edge location data signals and said ideal edge location signals, said stitch path assignment element using said correction data signals to adapt a stored stitch path data signal set to a real stitch path data signal set which is used to control relative movement between the surface and the sewing head and to determine the actual stitch positions made by the sewing head.
Preferably, the edge detector is accommodated in a common housing with the sewing apparatus.
Preferably, the edge detector is arranged to determine the real edge of the surface whilst the automatic sewing apparatus is sewing that surface.
Alternatively, the edge detector may be arranged to determine the whole real edge of the surface prior to the automatic sewing apparatus sewing that surface. Preferably, the automatic sewing apparatus includes rejection means such that if the said surface includes a real edge which varies from the ideal edge by a greater than predetermined extent, it is rejected.
In accordance with a further embodiment of the present invention there is provided a method of determining an edge in a layered surface using an edge detector including a laser light source and a camera appropriately mounted, the method including the steps of projecting a light pattern from said laser light source upon the edge in order that the image seen by the camera includes a discontinuity due to the edge distorting said pattern, processing said image in order to determine said real edge position by appropriate calculation from fixed parameters such as the camera position, and repeating the previous steps as required for further segments of the edge in order to determine, by appropriate plotting, the real position of a desired length of said edge within said layered surface.
It is also inherent in the present invention that there is provided a method of determining or digitising a whole component using an edge detector wherein the edge detector is arranged to traverse the whole edge of a component to determine an edge profile data set of that whole edge digitally representative of the component shape.
An embodiment of the present invention will now be described by way of example only with reference to the accompanying drawings in which:
Fig. 1 is a schematic representation illustrating a desired stitch path position relative to a reference edge such as that between two shoe upper element flats;
Fig. 2 is a schematic illustration of an automatic sewing apparatus incorporating an edge detector arrangement;
Fig. 3 is a stylised illustration of the edge detector arrangement; and,
Fig. 4 illustrates an alternate edge detector arrangement; and,
Fig. 5 illustrates in plan schematic view the present edge detector used to determine or digitise a whole component.
Consider Fig. 1, showing a base layer 11 with an upper layer 12 lying above it. There is a height variation edge 14 between the layers 11, 12 and this edge 14 provides a reference for a stitch path 1. It is correct location of the stitch path 1 with respect to the edge 14 that is the objective of the edge detector in accordance with the present invention.
As indicated previously, due to misalignment, misformation or simply poor location within a work station or pallet, the edge 14 may be displaced from an expected or ideal location within the pallet or, more importantly, with respect to the X-Y table/sewing head assembly. Thus, the pre-stored sewing path, as represented by a series of coded signals in a stitch path data set, will provide an erroneous location of the stitch path 1 relative to the real edge 14. The edge detector in accordance with the present invention determines the real edge position with a degree of accuracy and provides, through a microprocessor element, an adjusted stitch path data set corrected to give proper orientation with regard to the real edge 14. However, stitch path correction can be ignored if desired.

Fig. 2 is a schematic illustration of an edge detector in accordance with the present invention incorporated within an automatic sewing apparatus. A laser light source 2 presents a light pattern 3 towards the contoured surface formed by layers 11, 12. This pattern 3 comprises at least one line which impinges upon the real edge 14. A camera 17 views the surface comprising layers 11, 12, and in particular the real edge 14, and provides image data to a microprocessor 21 including several processing elements.
The camera 17 views sequential segments of the surface comprising layers 11, 12 and in particular the real edge 14 in order to determine real edge position. A segment is shown as a dotted foot print 29. It will be appreciated that the spacing of sampled segments will determine how accurately the actual or real edge is detected. Furthermore the microprocessor 21 may be arranged to ensure greater edge sampling at points of significant deviation in the edge 14 as compared to more consistent straight lengths.
The camera 17 provides an image, i.e. footprint 29, to the microprocessor 21 in the form of pixels showing light intensity at specific locations in the image. A frame grabber element 22 of the microprocessor 21 grabs consecutive images from the camera 17 representative of sequential segments of the edge 14. The frame grabber 22 converts these images, in the form of pixel intensity levels, into a digital map of the image. This digital map is passed to an edge determination element 23 which analyses the digital map in order to determine the real edge location within the image. Typically, the frame grabber 22 is a Field Programmable Gate Array (FPGA). At the same time a Digital Signal Processor (DSP) processes the previous image and sends edge points to the transputer module built around the processor 21. Concurrently the transputer assigns the current position of the X/Y table to each frame and switches the appropriate laser diode. As the transputer is equipped with four serial links, all communication with the 'outer world' is done via parallel-to-serial and serial-to-parallel converters. The edge determination element 23 views the digital map and compares light intensity in the image presented. Once the actual edge location is determined it is ready for comparison with the expected, ideal or predicted edge location in an edge comparison element 24. Ideally a substantial part, or preferably all, of the real edge 14 is determined before comparison is made in order to ensure localised discrepancies in the edge 14 do not create spurious results, i.e. zig-zag stitches in the stitch path 1 due to local irregularities in the edge 14. Once comparison between the real or actual edge 14 and the predicted ideal edge has been made, the comparison element 24 generates a correction data set which is passed to a data path adjustment element 25.
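The analysis of the digital map can be sketched in simplified form. The function below is an illustrative stand-in for the edge determination element, assuming a hypothetical image layout in which the laser line appears as the brightest pixel in each row of the pixel-intensity map: it tracks the line's column per row and reports the first row at which the line jumps sideways, which is where the discontinuity, and hence the edge, lies.

```python
def find_discontinuity(image, jump_threshold=3):
    """Locate the laser line in each row as the brightest pixel, then
    report the first row where the line's column jumps sideways by more
    than `jump_threshold` pixels -- the discontinuity marking the edge.
    `image` is a list of rows of integer intensities (hypothetical
    layout, not the patent's actual data format)."""
    cols = [row.index(max(row)) for row in image]
    for r in range(1, len(cols)):
        if abs(cols[r] - cols[r - 1]) > jump_threshold:
            # edge lies between rows r-1 and r; return both line columns
            return r, cols[r - 1], cols[r]
    return None  # no discontinuity: line may be parallel to the edge
```

A real implementation would first threshold and thin the line, as the description later explains, but the jump-detection idea is the same.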
In the data path adjustment element 25 the stored stitch path data set is adjusted to accommodate the real edge position and a real stitch path is generated which is supplied to an automatic sewing apparatus X-Y table 26 and to the sewing head (not shown, but which would be above the edge 14 in use) of that apparatus in order that the layers 11, 12 can be sewn as required.

Fig. 3 illustrates the edge detector arrangement in accordance with the present invention in schematic form. The laser projects the pattern 3 towards the surface comprising layers 11, 12. The pattern 3 is arranged to be cast upon the edge 14. The camera 17 is positioned at an inclined angle A relative to the contoured surface comprising layers 11, 12. Due to this inclined angle A the camera has an image as shown in frame 31 including a discontinuity 33. The discontinuity 33 is due to the height of the edge 14, created by the overlap of layers 11, 12, distorting the camera 17's view of the pattern 3. This discontinuity 33 is indicative of edge 14 position. Furthermore, as a laser line is used to generate the pattern 3 the image is sharp and well defined, allowing accurate determination of pixel intensity by the frame grabber 22 and thus definition of the edge 14 position.
It will be appreciated that the inclination angle A of the camera 17 is highly important in determining the scope and width of the discontinuity 33. It has been found that an inclination angle A of 45 degrees provides accurate results. However, accuracy varies with angle and an inclination of 60 degrees may be more appropriate in other situations. The camera position must be fixed, at least during detection, so that it provides a fixed reference from which to determine edge position.
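The relationship between the inclination angle and the discontinuity can be illustrated with a small triangulation sketch. The geometry below is an assumed, simplified model (not stated in the patent): the laser is projected from directly above and the camera's line of sight makes angle A with the surface plane, so a step of height h appears laterally shifted by h / tan(A) in the view; at 45 degrees the shift equals the height, which is consistent with 45 degrees giving a conveniently measurable discontinuity.

```python
import math

def edge_height(shift_mm, camera_angle_deg):
    """Recover step height from the lateral shift of the laser line.

    Assumed geometry (illustrative, not from the patent text): laser
    projected vertically, camera line of sight at `camera_angle_deg`
    to the surface plane, so height = shift * tan(angle)."""
    return shift_mm * math.tan(math.radians(camera_angle_deg))
```

At A = 45 degrees a 2 mm shift corresponds to a 2 mm step; at steeper angles the same shift implies a taller step, which is one way accuracy "varies with angle" as the text notes.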
It will be understood that the edge detector arrangement in accordance with the present invention is in effect utilising the height variation in the surface constituted by layers 11, 12 at the edge 14 to provide the discontinuity 33. Thus, it is not only misplacement in the X and Y axes of the edge 14 in comparison with the expected ideal edge position that is determined, but also any uplift of the layer 12 relative to the layer 11 due to distortion or other factors, and so the edge detector arrangement in accordance with the present invention determines edge variation in the Z axis as well. Such multi-directional edge detection allows greater accuracy in stitch path allocation by the stitch path adjustment element 25.
It will be understood that the sharpest discontinuity definition in the pattern 3 is provided when the line of pattern 3 is substantially perpendicular to the edge 14. However, in practice it will be understood that the edge 14 is curved through the surface constituted by layers 11, 12 and so it would be advantageous to ensure the pattern 3 remains substantially normal, i.e. perpendicular to the edge 14 by mounting the laser and/or the camera in suitable rotatable assemblies to ensure such configuration is maintained.
An alternative to light source and camera rotation is illustrated in Fig. 4, where the light pattern is constituted by crossed lines 72, 73. It will be appreciated that with such crossed lines 72, 73 a discontinuity 42 will be present in most configurations of the light source 15 and the camera 17. The camera 17 views an image frame 41 including one continuous line 73 and the line 72 having a discontinuity 42. In these circumstances the line 73 may be ignored and the discontinuity 42 utilised to determine edge location. It will also be appreciated that the crossed line pattern of lines 72, 73 could be provided in a multitude of manners, but one approach, as illustrated in Fig. 4, is to have two lasers 70, 71 arranged to provide lines 72, 73 in an appropriate crossed relationship.

As provision of crossed lines and light source/camera rotation significantly adds to both cost and operational complexity, an alternative approach may be used. For example, when the pattern 3 in Fig. 3 approaches the situation where it is parallel with the edge 14, the sequential segments viewed by the camera 17 can be arranged at relatively small spacing so that the point at which the discontinuity 33 disappears is determined as the edge 14 position. Alternatively, the camera 17 can be positioned at right angles to the edge 14, so that when the pattern disappears due to the edge obscuring the image it is accepted that this is the zenith of the edge position for edge determination purposes. Practically, it is important that at least some of the line of the pattern 3 remains in view on either side of the obscured section, such that projection from those segments can be utilised to determine edge position. This approach would not work if the whole line of pattern 3 were hidden by the edge 14. Alternatively, the microprocessor 21 could simply extrapolate from previous edge determination data to in effect 'fill in' missing edge location data.

Returning to Fig. 4, it will be appreciated that a multitude of angles between lines 72, 73 could be used, including possibly a simple perpendicular cross; however, it has been found most convenient to provide that the angle between lines 72, 73 is about 20 degrees. Furthermore, although two lasers 70, 71 are illustrated in the laser light source 15, it would be possible to provide lines 72, 73 using a suitable optical arrangement from a single laser source 15. In addition, in order to reduce processing complexity it may be appropriate to switch off one of the lines 72, 73 for general edge detection and only activate the second line when necessary, as determined by results in the microprocessor 21.

As indicated previously, it is the height difference between layers 11, 12 at the edge 14 which is utilised to determine actual edge position. The pattern 3 lies upon the layers 11, 12 but not the edge 14 face, particularly if the camera 17 is aligned with the edge 14. It will be appreciated that it is a discontinuity in the pattern 3 line which provides edge detection data at each observed segment of the edge 14. An alternative is to process the edge data received from the frame grabber 22 and, by appropriate manipulation with a polynomial function such as a spline, cubic interpolation or regression algorithm, to determine discontinuities as first or second derivative maxima, which will effectively present spikes in the processed results, these spikes representing the edge. Subsequently, edge comparison and stitch path manipulation are as for simple image extraction.
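The derivative-maxima idea can be shown with a discrete sketch. The function below is an illustrative simplification (not the patent's algorithm): given the laser-line column position per image row, it takes absolute second differences, a discrete stand-in for the second derivative, and flags positions where the value spikes above a threshold, those spikes bracketing the edge.

```python
def spike_positions(line_cols, threshold=4):
    """Flag discontinuities as maxima of the absolute second
    difference of the laser-line column positions.  `line_cols` is a
    hypothetical per-row list of line columns; a sideways jump in the
    line produces a pair of large second-difference values."""
    d2 = [abs(line_cols[i - 1] - 2 * line_cols[i] + line_cols[i + 1])
          for i in range(1, len(line_cols) - 1)]
    return [i + 1 for i, v in enumerate(d2) if v > threshold]
```

A smoother such as a spline or regression fit would normally precede the differencing to suppress pixel noise, as the text suggests; the spike-at-the-jump behaviour is the same.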
In the present invention, the camera 17 position is fixed and thus analysis of the image allows edge determination by in effect a triangulation technique in respect of the image discontinuity. Position of the camera allows determination of position of the discontinuity representative of the real edge in this image, successive sequential segment images allow the real edge to be plotted and compared to the predicted or ideal edge.
In an automatic apparatus the pallets as indicated previously are moved by an X-Y table and stitch points accurately located from either a sewing path defined using an X,Y co-ordinate look-up table or X,Y co-ordinates generated as required from stored and scanned edge X-Y co-ordinates.
It will be appreciated that the reference and/or actual material edge can be modified or optionally an old data set retrieved and adjusted as necessary.
It has been found that suitable apparatus for the edge detection arrangement in accordance with the present invention is as follows: an Oscar OS-25 CCD area scan camera, a DIPIX P360F frame grabber and a computer including an INTEL 80486 DX2 microprocessor. The P360F offers four analogue and one configurable 8 or 16 bit digital output. The illumination laser diodes have an output power of 3 milliwatts (class 3A) and are equipped with line-generating lenses.
As can be seen in the Figures, the illumination pattern 3 is broken by the edge 14 or transition region between two overlapping elements. The photosensitive surface of the camera comprises a rectangular area divided into pixels. Each pixel corresponds to an equal area of the scanned image. Each pixel senses analogue information which corresponds to the light intensity absorbed by its photocell. These analogue signals are transmitted to the microprocessor 21, usually through a serial connection. Each analogue signal is digitised, usually in 8-bit format giving 256 grey levels, and stored in the frame grabber 22. This information is then consolidated and processed by the microprocessor 21 in order to enhance and determine edge position. As indicated previously there are varying levels of accuracy and tolerance with regard to edge detection.
As an alternative a Dalsa CA-D9-2048 camera may be used. In addition, irrespective of the camera used, it may be necessary to provide for the prevention of spurious light during scan mode.
It will be appreciated there is a so-called integration time, i.e. the time for the photo sensor to respond for each pixel and that the camera is scanning the image quite rapidly. Thus it will be understood that adjustment in the processing regime must be made to ensure appropriate edge detection. Such adjustment is made by determination of a suitable algorithm to scale response signals.
As each pixel is represented by 8 bits (256 grey levels), the observed laser line will be in the brightest region of the image histogram. Using that knowledge it is possible to separate the projected laser line from the background, a process known as thresholding. In order to successfully separate the laser line, the threshold value needs to be calculated for each particular combination of leather materials before the actual inspection process. A specialised algorithm has been developed for this purpose, based on optimisation of the two main laser line parameters: the thickness of the laser line and the number of empty vertical lines within it. During the inspection process the positions of the material edges in the field of view of the camera are continuously extracted from the discontinuity in the laser line until the X/Y table reaches its final position. Prior knowledge of the laser line orientation and the camera angle has to be incorporated into the edge position calculation. After thresholding, the extracted laser line is cleaned and thinned in order to obtain a single-pixel-thin edge profile. The position and orientation of the edge and the thickness of the material are calculated and the information passed to the transputer processor 21. It is very important to process each frame in as short a time as possible because the speed of the X/Y table (and therefore the whole inspection time) is directly dependent on it. Currently the processing time for a single frame is in the range of 2.2 to 18 ms.
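The thresholding step can be sketched minimally. The function below is an illustrative stand-in, not the specialised calibration algorithm the text describes: where the real apparatus tunes the threshold per material combination, this sketch simply places it a fixed fraction of the way up the observed intensity range and returns a binary mask of "laser line" pixels.

```python
def threshold_line(image, fraction=0.9):
    """Separate the laser line from the background by keeping only the
    brightest pixels.  `fraction` places the threshold a fixed way up
    the observed intensity range -- a crude stand-in for the per-material
    calibration described in the text."""
    flat = [p for row in image for p in row]
    lo, hi = min(flat), max(flat)
    t = lo + fraction * (hi - lo)
    return [[1 if p >= t else 0 for p in row] for row in image]
```

Cleaning and thinning the resulting mask to a single-pixel profile would follow as separate passes.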
As indicated previously there are two effective approaches to pre-scanning a pallet loaded with elements. The first approach of scanning the edge substantially alone, i.e. following the edge in a raster pattern is found to be most effective for relatively short, i.e. below 6 metre, edge lengths. However, with edge lengths in excess of 6 metres it is most convenient to scan the whole pallet in strips.
In the present invention a high frame rate camera is configured to react to different laser intensity levels. Such reaction can be achieved by for example one of the following techniques:-
(1) Automatic aperture control; or
    (2) multiple images of the illumination pattern at the edge between materials of different reflectivity, each image captured with an integration time selected according to reflectivity (though this is less appropriate for shoe components of similar reflectivity); or
(3) single image analysis with an average integration time for material reflectivity values; or
(4) close control of laser illumination intensity.
As indicated previously, the object to be sewn is pre-scanned and a stitch path generated which is usually parallel to the actual edge. This stitch path can be generated by a range of geometric methods such as:-
(1) Linearly interpolating between edge points, then projecting normal lines to the linear segments at each edge point. The distance from the actual edge to the projected stitch line is determined on each normal. Parallel stitch line points can then be generated using the gradient of each edge segment. For this method, high spatial resolution is required.
(2) Cubic interpolation between three edge points. Initially, a first derivative tangent is found such that lines normal to tangents can be determined. Using an appropriate technique the stitch to edge distance on each line normal to the tangent results in a parallel stitch path. This method is most appropriate for low spatial resolution.
(3) Determine the gradient of the bisector of the angle between three edge points and then processing to calculate parallel line points.
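Method (1) above can be sketched compactly. The function below is an illustrative simplification under assumed inputs (a list of (x, y) edge points): for each linear edge segment it computes the unit normal and offsets the segment's start point by the required stitch-to-edge distance, yielding a path parallel to the edge.

```python
import math

def offset_path(edge_points, distance):
    """Generate a stitch path parallel to the detected edge by
    projecting along each edge segment's unit normal (a sketch of
    geometric method (1); `edge_points` is a hypothetical list of
    (x, y) tuples in table coordinates)."""
    path = []
    for (x0, y0), (x1, y1) in zip(edge_points, edge_points[1:]):
        dx, dy = x1 - x0, y1 - y0
        seg = math.hypot(dx, dy)           # segment length
        nx, ny = -dy / seg, dx / seg       # unit normal to the segment
        path.append((x0 + nx * distance, y0 + ny * distance))
    return path
```

As the text notes, this linear method needs densely sampled edge points; with sparse points the cubic-interpolation method (2) gives a smoother parallel path.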
Further elements and embodiments of the present invention will be understood by those skilled in the art. Specific apparatus for use within the edge detection system will be readily chosen by those skilled in the art to achieve the required performance. As the present invention depends on a discontinuity rather than typical reflectivity, it will be understood that the reflectance or colour of the surface is not of paramount importance. Once the thickness of all layers of material to be sewn is known it is possible to determine sewing speed, i.e. the combination of stitches per second with spacing of stitches, with even greater accuracy. Such sewing speed is highly dependent upon the total thickness of the material through which the needle must pass. The present edge detection arrangement, as indicated previously, can determine edge depth from the image discontinuity, thus the appropriate sewing speed for the sewing apparatus can be determined/set more accurately. The stitch path, as indicated previously, is stored as a program of sequential steps and this program is adjusted to vary sewing speed.
Referring to Fig. 5, which illustrates in schematic plan how an edge detection arrangement 101 in accordance with the present invention can be utilised to determine or digitise a whole component 102 rather than merely an edge. The arrangement 101 is as previously described in that a camera 103 is arranged to view a laser pattern 3 which includes a discontinuity due to a height variation contour, i.e. the edge between the component 102 and its support surface. The arrangement 101 also includes a processor element 104, including a frame grabber etc., to determine the position of the edge.
The arrangement 101, and more importantly the pattern 3, are moved around the whole component 102 such that the whole edge of the component 102 is determined. Thus, by appropriate correlation between edge position and arrangement position, the shape of the component 102 can be digitised, i.e. represented in a digital format. This digital representation of the component 102 can then be passed to a CAD system 105 for manipulation and possible representation upon a V.D.U. 106.
The CAD system 105 can use the digital representation to enable manipulation into a design structure and possibly allow evaluation of how the component 102 will perform when bent etc. It will be understood that the processor 104 can also provide details to the CAD system 105 of the thickness of component 102 as represented by edge height. Alternatively, once the whole component 102 has been digitised, it will be understood that in an automatic sewing apparatus the processor 104 may be arranged to generate a completely new stitch path data set rather than adapt an existing one. This completely new stitch path could be arranged to remain a consistent distance from the edge of component 102 over its whole surface, or the processor 104 can be arranged to place a decorative stitch path design at the most advantageous position within the component 102.
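As an illustrative sketch of locating the pattern discontinuity on which the whole arrangement rests, the following assumes the camera image has already been reduced to a per-column laser stripe position. The threshold value and data layout are assumptions rather than details from the specification; the stripe-extraction step itself is not shown.

```python
def find_discontinuity(line_positions, jump_threshold=3.0):
    """Locate the laser-line break in a list of per-column stripe positions.

    line_positions[i] is the image row at which the laser stripe is seen in
    column i.  The edge shifts the stripe, so the largest column-to-column
    jump at or above the threshold marks the edge position; with an inclined
    camera the jump size relates to edge height through the viewing geometry.
    """
    best_col, best_jump = None, 0.0
    for col in range(1, len(line_positions)):
        jump = abs(line_positions[col] - line_positions[col - 1])
        if jump >= jump_threshold and jump > best_jump:
            best_col, best_jump = col, jump
    return best_col, best_jump
```

For a stripe that sits at row 10 over the support surface and row 18 over the component, the function reports the column where the jump occurs and the 8-pixel displacement; correlating that column with the known arrangement position as it traverses the component gives the digitised shape described above.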

Claims:
1. An edge detector arrangement for a surface having a contour or height variation edge, the detector arrangement including a laser light source arranged to cast a light pattern including one or more lines wherein, in use, at least one of the lines extends across the edge, a camera arranged at an inclined position relative to the edge such that the image viewed by the camera, in use, includes a discontinuity in the pattern due to the contour or edge distorting the camera's view of the pattern, the discontinuity being indicative of edge position, the camera being coupled to processor means whereby the camera is arranged to view sequential segments of the surface and the processor generates a real edge data signal set representative of true edge location in the surface from image data signals captured by the camera.
2. An edge detector as claimed in Claim 1 wherein the light source and camera are arranged upon a suitable rotatable assembly in order that under control of the processor means the image of the camera is kept substantially aligned with the edge and the pattern is kept substantially perpendicular to the edge as sequential segments of the surface are viewed.
3. An edge detector as claimed in Claim 1 or 2 wherein the processor means includes an image data element arranged to manipulate image data signals from the camera using an appropriate data manipulation algorithm to distinguish the discontinuity within the image data signals.
4. An edge detector as claimed in Claim 1 or 2 or 3 wherein the light source casts the pattern as a simple straight line or as a composite of several crossed straight lines, to ensure at least one line as viewed by the camera has a discontinuity in the camera image.
5. An edge detector as claimed in any preceding claim wherein the camera and processor are arranged to view the whole surface before the processor determines the real edge data signal set.
6. An edge detector as claimed in any preceding claim wherein the processor is arranged to make preliminary determination of real edge location from successive sequential segments of the surface, such preliminary determination being utilised by the processor to control the camera and light source to ensure viewed sequential segments of the surface are centred about the detected real edge.
7. An automatic sewing apparatus including an edge detector as claimed in Claim 1 and any claim dependent thereon wherein the processor includes comparison means to compare said real edge data signal set with an ideal edge location data signal set, said comparison means generating correction data signals for a stitch path assignment element dependent upon variation between said real edge location data signal set and said ideal edge location data signal set, said stitch path assignment element using said correction data signals to adapt a stored stitch path data signal set to a real stitch path data signal set, and said real stitch path data signal set being used by said processor means to control, in the sewing apparatus, relative movement between the surface and the sewing head and so determine a sewing path.
8. An apparatus as claimed in Claim 7 wherein the edge detector is accommodated within a common housing with the sewing apparatus.
9. An apparatus as claimed in Claim 7 or Claim 8 wherein the edge detector is arranged to determine the real edge of the surface whilst the automatic sewing apparatus is sewing that surface.
10. An apparatus as claimed in Claim 6 or Claim 7 wherein the edge detector is arranged to determine the real edge of the surface prior to the automatic sewing apparatus sewing that surface.
11. An apparatus as claimed in any of Claims 7, 8, 9 or 10 wherein the apparatus includes rejection means arranged to reject surfaces for sewing if said surface includes a real edge which varies from the ideal edge by greater than a predetermined extent.
12. An apparatus as claimed in any of claims 7, 8, 9, 10 or 11 wherein the apparatus includes an X-Y table coupled to the processor means for movement of the surface in order to facilitate edge detection by moving the surface as appropriate under control of the processor means to bring segments of the edge into view of the camera.
13. An apparatus as claimed in claim 12 wherein the X-Y table is arranged to be controlled by the processor means to move the surface substantially in accordance with the stored stitch path data set in order to determine the real edge position.
14. An apparatus as claimed in any of Claims 7 to 13 wherein the processor is arranged whereby the real stitch path data set is adjusted dependent upon material thickness, i.e. edge height, to regulate sewing speed.
15. A method of determining an edge in a layered surface using an edge detector including a laser light source and a camera appropriately mounted, the method including the steps of:-
(1) projecting a light pattern from said laser light source upon the edge in order that the image seen by the camera includes a discontinuity due to the edge distorting said pattern,
(2) processing said image in order to determine the real edge position by appropriate calculation from fixed parameters such as the camera position,
(3) repeating steps (1) and (2) as required for further segments of the edge in order to determine by appropriate plotting the real position of a desired length of said edge within said layered surface.
16. A method of determining or digitising a whole component using an edge detector as claimed in Claim 1 and any claim dependent thereon wherein the edge detector is arranged to traverse the whole edge of a component to determine an edge profile data set of that whole edge digitally representative of the component shape, said edge profile being held as a digital data set to allow representation and/or manipulation by an appropriate processor means in order to adapt or create a stitch path for sewing apparatus.
PCT/GB1996/000180 1995-01-31 1996-01-29 Edge detection in automatic sewing apparatus WO1996023626A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
GB9501888.3 1995-01-31
GBGB9501888.3A GB9501888D0 (en) 1995-01-31 1995-01-31 Edge definition for automatic sewing apparatus
GBGB9509350.6A GB9509350D0 (en) 1995-05-09 1995-05-09 Edge detection in automatic sewing apparatus
GB9509350.6 1995-05-09
GBGB9520232.1A GB9520232D0 (en) 1995-05-09 1995-10-04 Edge detection in automatic sewing apparatus
GB9520232.1 1995-10-04

Publications (1)

Publication Number Publication Date
WO1996023626A1 true WO1996023626A1 (en) 1996-08-08

Family

ID=27267570

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1996/000180 WO1996023626A1 (en) 1995-01-31 1996-01-29 Edge detection in automatic sewing apparatus

Country Status (2)

Country Link
IL (1) IL116567A0 (en)
WO (1) WO1996023626A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3719971A1 (en) * 1986-06-19 1987-12-23 Mitsubishi Electric Corp NUMERIC CONTROL DEVICE
EP0309069A2 (en) * 1987-09-25 1989-03-29 Yaacov Sadeh Computerized sewing apparatus
DE3730709A1 (en) * 1987-09-12 1989-03-30 Man Technologie Gmbh Sensor-controlled controller for controlling the position of a tool relative to a workpiece
US4907169A (en) * 1987-09-30 1990-03-06 International Technical Associates Adaptive tracking vision and guidance system
GB2237412A (en) * 1989-08-30 1991-05-01 Orisol Original Solutions Ltd Edge detection; automatic sewing
GB2240193A (en) * 1989-12-28 1991-07-24 Beta Eng & Dev Ltd Sewing apparatus with correctable sewing path
WO1993023820A1 (en) * 1992-05-18 1993-11-25 Sensor Adaptive Machines, Inc. Further methods and apparatus for control of lathes and other machine tools
JPH06149774A (en) * 1992-11-10 1994-05-31 Fuji Xerox Co Ltd Simulation device based upon petri net

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
C. UMEAGUKWU ET AL: "Investigation of an Array Technique for Robotic Seam Tracking of Weld Joints", IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, vol. 38, no. 3, NEW YORK, pages 223 - 229, XP000234795 *
PATENT ABSTRACTS OF JAPAN vol. 10, no. 207 (M - 500)<2263> 19 July 1986 (1986-07-19) *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0938955A1 (en) * 1997-03-28 1999-09-01 Fanuc Ltd Work line searching method and robot/sensor system having work line searching function
EP0938955A4 (en) * 1997-03-28 2004-03-10 Fanuc Ltd Work line searching method and robot/sensor system having work line searching function
US6899043B2 (en) * 2002-12-12 2005-05-31 Texpa Maschinenbau Gmbh & Co. Kg Method and device for cutting and folding a fabric section
US10240271B2 (en) 2016-03-08 2019-03-26 Toyota Motor Engineering & Manufacturing North America, Inc. Sewing apparatus
US10563330B2 (en) 2016-06-08 2020-02-18 One Sciences, Inc. Methods and systems for stitching along a predetermined path
US11346030B2 (en) 2016-06-08 2022-05-31 One Sciences, Inc. Methods and systems for stitching along a predetermined path

Also Published As

Publication number Publication date
IL116567A0 (en) 1996-03-31

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): BR CN JP KR US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase