WO2002018893A1 - Optical tactile sensor - Google Patents
- Publication number: WO2002018893A1
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01L—MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
- G01L1/00—Measuring force or stress, in general
- G01L1/24—Measuring force or stress, in general by measuring variations of optical properties of material when it is stressed, e.g. by photoelastic stress analysis using infrared, visible light, ultraviolet
- G01L1/247—Measuring force or stress, in general by measuring variations of optical properties of material when it is stressed, e.g. by photoelastic stress analysis using infrared, visible light, ultraviolet using distributed sensing elements, e.g. microcapsules
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
Definitions
- the present invention relates to an optical tactile sensor, and more particularly to a tactile sensor suited for use in a robot hand.
- the sensor body is formed from a transparent elastic body with markers embedded inside it whose optical behavior changes when force is applied. The state of the contact surface is then estimated by capturing that optical behavior with an imaging system such as a CCD device.
- An advantage of this approach is that, thanks to recent advances in imaging devices, a high-density tactile sensor can be constructed at much lower cost than a mechanical one.
- however, the aforementioned problem of insufficient information has not been solved by this approach.
- Conventional optical tactile sensors also extract only one type of information (one-dimensional information).
- the present invention has been made to solve the problems of conventional tactile sensors by introducing color, that is, multi-channel sensing using the optical spectrum, into an optical tactile sensor. It is an object of the present invention to provide a tactile sensor that can obtain information with multiple degrees of freedom at various points on a surface. Disclosure of the invention
- the present invention has been devised to solve such problems, and is an optical tactile sensor including a tactile unit and an imaging means, wherein the tactile unit includes a transparent elastic body and a plurality of marker groups embedded in it.
- Each marker group is composed of a number of colored markers, and the markers constituting different marker groups have different colors from each other,
- the behavior of the colored marker when an object comes into contact with the elastic body is photographed by the imaging means.
- By photographing the behavior of the colored markers, at least one of displacement, distortion, and inclination of the colored markers when the object comes into contact with the elastic body is observed,
- and information on the force acting on the elastic body is detected.
- a plurality of types of information can be individually collected by a simple method called “color coding”, and a plurality of types of optical tactile information can be obtained simultaneously.
- force vectors can be estimated and reconstructed by collecting, through "color classification", at least as many independent observations as there are unknowns and stably solving the inverse problem.
- the colored markers are imaged and the images are processed by an imaging means, in one preferred example a CCD camera. For example, the image at the time of contact with the object is compared with the image of the state before contact (the state where no external force is applied to the transparent elastic body), and the amount of movement of each marker is detected.
- Alternatively, the markers are embedded in the transparent elastic body in an arrangement such that they cannot normally be recognized (when no external force is applied), and become recognizable through the displacement, deformation, and inclination caused by the strain around each marker position when the object comes into contact; information is then detected from the appearance of the colored markers.
- the imaging means is disposed on the side of the transparent elastic body opposite to the side in contact with the object. Further, when there are a plurality of colored markers having different colors, it is desirable to select a given colored marker and capture it individually to facilitate processing after photographing. The selection of a colored marker is performed, for example, by using a color filter.
- a plurality of marker groups are embedded in the transparent elastic body, each marker group is composed of a large number of markers, the markers constituting different marker groups have colors that differ from group to group, and the groups have different spatial arrangements. Examples of different spatial arrangements include a plurality of marker groups stacked within the thickness of the elastic body, or a plurality of marker groups arranged to cross each other. In this way, the obtained image is already in a state where some information processing has been performed (for example, it already provides some kind of two-dimensional information).
- the shape of the colored marker is not particularly limited, but preferable examples include spherical, cylindrical, columnar, strip-like, and planar shapes. Preferred embodiments relating to these shapes will be described in detail in the section on the embodiments. BRIEF DESCRIPTION OF THE FIGURES
- FIG. 1 is a principle diagram of a sensor device according to the present invention
- FIG. 2 is a principle diagram of a sensor according to the first embodiment
- FIG. 3 is a manufacturing process of the sensor according to the first embodiment
- FIG. 4 is a diagram illustrating the principle of the sensor according to the second embodiment
- FIG. 5 is a diagram illustrating an example of an image obtained at the time of contact in the sensor according to the second embodiment
- FIG. 6 is a diagram showing a manufacturing process of the sensor according to the second embodiment
- FIG. 7 is a principle diagram of the sensor according to the third embodiment
- FIG. 8 is a diagram showing the manufacturing process of the sensor according to the third embodiment
- FIG. 9 relates to the fifth embodiment.
- FIG. 10 is a diagram showing the distribution of force vectors generated between the tactile sensor and the contact object.
- Fig. 11 is a diagram showing an optical tactile sensor using marker movement measurement
- Fig. 12 is a diagram showing the reconstructed force vector distribution when one point is pushed vertically
- Fig. 13 is a diagram showing the force vector distribution when one point is pushed horizontally.
- FIG. 14 is a diagram showing the reconstructed force vector distribution when two points are pressed
- FIG. 15 is a side view of the staircase method according to the fourth embodiment
- FIG. 16 shows images obtained by observation from the upper surface of the elastic body shown in FIG. 15, where (a) shows the steady state, (b) shows a case where a horizontal force is applied to the contact surface, and (c) shows a case where a vertical force is applied to the contact surface
- FIG. 17 shows a pyramid-shaped bottom surface; the three sets of faces oriented in the same directions are colored red, green, and blue, respectively
- FIG. 18 is a diagram showing coloring of each surface according to inclination
- FIG. 19 is a diagram showing a case where scattered light due to a white surface is observed using a directional light source.
- FIG. 20 shows a sensor combining a stepped optical tactile sensor with a light-guiding film
- FIG. 21 is a diagram showing how the light path, and hence the observed image, changes with the applied force and the contact between the step surface and the light-guiding film.
- FIG. 1 is a principle diagram of an optical tactile sensor device according to the present invention.
- the sensor device includes a transparent elastic body 1 made of a translucent elastic member, and a colored marker 2 is embedded in the transparent elastic body 1.
- the transparent elastic body 1 and the colored marker 2 constitute a tactile part.
- the colored marker 2 provided inside the transparent elastic body 1 is configured to be displaced or distorted.
- the sensor device further includes a camera 4 and a light source 5 as imaging means.
- the camera 4 is disposed on the side of the transparent elastic body 1 opposite to the side that the object 3 contacts, and the displacement and distortion of the marker 2 are captured by the camera 4.
- the light source 5 may be a light guide using a waveguide (optical fiber).
- the transparent elastic body 1 is preferably made of silicone rubber, but it may be formed from another elastic member such as other rubbers or elastomers.
- the marker is preferably formed of an elastic member, and more preferably of the same material as the transparent elastic body 1,
- or at least of an elastic member having the same elastic constant as the elastic body.
- the material of the marker is not particularly limited as long as the marker is sufficiently small that it does not hinder the deformation of the elastic body. Further, a portion of the elastic body itself may constitute a marker.
- in the present invention, a large number of optical markers are distributed in the transparent elastic body 1, and the markers are displaced, distorted, or tilted by the deformation of the elastic body 1 that occurs when an object contacts it;
- the camera captures images of these markers and detects information on the contacting object from the displacement and strain produced inside the elastic body by the contact.
- the camera as the photographing means is a digital camera, that is, a camera that outputs image data as electric signals, and in one preferable example, a CCD camera.
- the imaging means according to the present invention is not limited to a CCD camera, and may be, for example, a digital camera using a C-MOS image sensor.
- identifying the colored markers by color is one of the most important mechanisms for enhancing the intelligence of the sensor of the present invention, and it is desirable that a color filter be mounted on the image sensor. Even if the image sensor has no color filter (in which case only the light intensity, that is, a black-and-white image, is captured), equivalent sensing is possible by switching light sources built into the sensor whose colors match the reflection spectra of the markers.
- an image taken while a given light source is lit captures only the light reflected from the corresponding marker group.
- in this case, instead of relying on the RGB filters of the camera, the image sensor captures only light intensity, and Red, Green, and Blue light sources are prepared. When the Red light source is lit, its light is reflected only by the red markers; the other two marker groups absorb it, so the camera captures only the red markers. Performing the same for Green and Blue in a time-sharing manner yields information equivalent to that from a color camera.
- the three colors Red, Blue, and Green referred to here are merely examples. In practice the color spectrum is continuous, and as long as each marker's "color" (more precisely, its reflected-light spectrum) is independent of the others, there are in theory countless choices.
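The independence condition can be made concrete with a minimal linear-unmixing sketch (an illustration, not part of the patent): if the reflected-light spectra of the marker groups are known from calibration and linearly independent, the contribution of each group can be recovered by least squares, so the marker colors need not be pure primaries. All matrix values below are illustrative assumptions.

```python
import numpy as np

def unmix(observed, spectra):
    """Recover per-marker-group contributions from one measured spectrum.

    `spectra` is a (k, n) matrix whose columns are the reflected-light
    spectra of the n marker groups sampled at k wavelengths (assumed
    known from calibration).  As long as the columns are linearly
    independent, the contribution of each group is recovered by least
    squares, even when the spectra overlap.
    """
    coeffs, *_ = np.linalg.lstsq(spectra, observed, rcond=None)
    return coeffs
```

With two overlapping (but independent) spectra, the true mixing coefficients are recovered exactly from a noiseless observation.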
- a plurality of different colors are assigned to markers to give a captured image information having multidimensional or multiple degrees of freedom.
- displacement and shear strain near the marker's location are converted into image information.
- the present invention is characterized by using a colored marker, but the present invention has two broadly divided embodiments.
- One is the so-called image processing method, in which the marker itself involves little contrivance and the obtained image is processed on a PC to extract the strain information of the elastic body; the first embodiment corresponds to this.
- The other is the so-called intelligent material method, in which the marker itself has intelligence and the obtained image already constitutes physical information; the second, third, fourth, and fifth embodiments described later correspond to this.
- spherical markers are arranged in the depth direction.
- color-coded spherical microscopic pieces are used as markers, and they are distributed into several layers by color. For example, if they are divided into RGB (red, green, blue), it is easy to separate the layers by using the color filters of the camera.
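As a rough sketch of this color-filter separation, assuming the three layers are colored in near-pure primaries, the camera's RGB channels can be thresholded to give one binary marker map per layer. The threshold value is an illustrative assumption, not a value from the patent.

```python
import numpy as np

def split_marker_layers(rgb, threshold=0.5):
    """Split an RGB image of the sensor into three binary marker maps.

    Each marker layer is assumed to be colored in a pure primary, so the
    camera's color channels act as the 'color filter' described above.
    `rgb` is an (H, W, 3) float array in [0, 1].
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    red_layer   = (r > threshold) & (g < threshold) & (b < threshold)
    green_layer = (g > threshold) & (r < threshold) & (b < threshold)
    blue_layer  = (b > threshold) & (r < threshold) & (g < threshold)
    return red_layer, green_layer, blue_layer
```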
- a marker group 2A composed of red fine spherical markers is buried at the shallowest part from the surface that the object 3 contacts (the side farthest from the camera); a marker group 2B composed of green fine spherical markers is buried deeper than the layer containing 2A; and a marker group 2C composed of blue fine spherical markers is buried deeper still (closer to the camera).
- although the camera is not shown in FIG. 2, it is disposed so as to face the surface opposite to the surface that the object 3 contacts.
- the spherical markers forming each layer do not overlap each other in the layer direction (vertical direction in the figure). This state is desirably obtained by, for example, randomly distributing the spherical markers forming each layer at a certain density.
- if the markers are randomly located, each image has very small spatial autocorrelation (a binary, so-called white-noise image).
- the pixel movement amount at each point can then be obtained. Specifically, a correlation calculation is performed between the image taken before contact and the image after contact, and the movement vector at each point is detected. For this to work, it is desirable that a marker has a size of about 1 × 1 to 10 × 10 camera pixels.
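The correlation calculation described above can be sketched as a simple exhaustive block-matching search over a small neighborhood. Patch and search sizes are illustrative assumptions, not values from the patent.

```python
import numpy as np

def motion_vector(ref, cur, y, x, patch=8, search=4):
    """Estimate the displacement of the marker pattern around (y, x)
    by exhaustive block matching (a discrete correlation search).

    `ref` is the image before contact, `cur` the image after contact,
    both 2-D float arrays.  Returns the (dy, dx) offset minimizing the
    sum of squared differences between the reference patch and the
    shifted window in the contact image.
    """
    tpl = ref[y:y + patch, x:x + patch]
    best, best_d = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = cur[y + dy:y + dy + patch, x + dx:x + dx + patch]
            if win.shape != tpl.shape:
                continue  # window fell off the image edge
            score = np.sum((win - tpl) ** 2)
            if best is None or score < best:
                best, best_d = score, (dy, dx)
    return best_d
```

Repeating this over a grid of patch centers yields the plane distribution of movement vectors used later for force reconstruction.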
- the marker density in each layer be smaller in the upper layer (on the side farther away from the surface in contact with the object), that is, in the portion closer to the camera.
- the upper marker tends to obscure the lower marker, which is not desirable for image processing.
- in addition, the spatial frequency of the displacement is lower in the upper layer because it is farther from the contact surface, so high-density markers are not required for the calculation there.
- the camera should preferably be a pinhole camera or a camera with a large depth of focus that is in focus at all depths; if a camera with a lens with a small depth of focus is used, it is preferable to focus on the bottom layer.
- this is because the spatial frequency of the upper layers is low, so high resolution is not required to capture them, and even if the camera is focused on the lowermost layer, the defocusing of the higher layers does not materially affect the image processing.
- a number of colored spheres (markers) are embedded in the elastic body, and the movement of each point is measured by imaging with a CCD camera. Since the movement measured at this time is a horizontal movement, it has two components, x and y. Therefore, information of two degrees of freedom is obtained from each marker. Eventually, the plane distribution of the moving vector can be obtained.
- since the force vector has three components at each point, sensing only two components for each point on the surface as described above is not enough to reconstruct it. However, if another layer of markers is prepared as shown in Fig. 11, another set of movement vector distributions can be obtained.
- because the marker groups are color-coded, each group can be separated from the captured image and its movement vectors can be calculated independently.
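The inverse problem mentioned earlier (more independent displacement observations than force unknowns) can be sketched as regularized least squares. In practice the response matrix A relating surface forces to marker displacements would come from elasticity theory or calibration; here it is treated as a given linear model, and all values are illustrative.

```python
import numpy as np

def reconstruct_forces(A, u, lam=1e-3):
    """Solve the linear inverse problem u = A f for the force vector f.

    A:   (m, n) response matrix mapping surface force components to
         observed marker displacements (assumed known).
    u:   (m,) stacked marker displacements from two or more color-coded
         marker layers, with m >= n (more observations than unknowns).
    lam: Tikhonov regularization weight, giving the stable solution
         f = (A^T A + lam I)^-1 A^T u.
    """
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ u)
```

With negligible regularization and noiseless data, the known force vector is recovered to numerical precision, which is the sense in which the observation set "stably solves" the inverse problem.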
- the markers may be, for example, fine particles forming a white-noise pattern or spheres up to several millimeters in diameter. In this prototype, markers with a diameter of about 1 mm are used, which can cause the problem that a lower marker is shielded by an upper marker. To avoid this, the markers of each layer are arranged at positions that do not overlap one another.
- Figures 12 to 14 show the force vectors reconstructed as a result of the experiment.
- Fig. 12 shows the case where one center point of the contact surface is pushed vertically
- Fig. 13 shows the case where the same point is pushed horizontally. From these two results, it can be seen that the force applied to at least one point can be reconstructed as a vector.
- Fig. 14 shows the case where the contact surface is pressed at two points. From this result, it can be seen that the distribution of the surface contact force is reconstructed.
- This method is simple in principle and easy to manufacture, but since the image itself does not directly constitute the tactile information, the horizontal movement vectors must be computed from the image.
- the marker is desirably made of the same elastic material as the main body with a pigment added, in order to eliminate any influence on the deformation of the sensor main body.
- the marker may be made of a different material having the same elastic characteristic.
- any material may be used as long as the size of the marker is sufficiently small that the influence on the deformation of the sensor body is negligible.
- the shape of the marker is spherical, and the thickness of the layer formed at this time is desirably substantially equal to the diameter of the marker. This ensures that markers of the same color are at the same depth.
- each marker layer 10A (a unit elastic body in which markers are distributed) and each transparent layer 10B (a unit elastic body without markers) are laminated alternately.
- the uncured transparent layer itself can serve as an adhesive, or the layers may be separately bonded with a transparent adhesive that has little effect on the elastic body.
- in the figure, three marker layers 10A are shown; as described above, the marker distribution density increases from the upper layer toward the lower layer (that is, from the side closer to the camera toward the side farther from the camera).
- the size of the markers is governed by the resolution required for the application. Considering as an application example the case where the sensor is used as a tactile sensor of a robot hand, the diameter of the spherical markers is about 0.1 mm to 0.2 mm.
- in the second embodiment, the marker is an ultrafine cylinder or ultrafine prism having a minute cross section.
- the marker is vertically embedded in the thickness of the transparent elastic body.
- the marker extends along an imaginary line connecting the object in contact with the elastic body and the camera.
- the elastic body 1 has marker groups each formed by juxtaposing a large number of markers at a predetermined depth, and the marker groups are arranged in three tiers at different depths.
- a marker group 20A composed of red ultrafine cylindrical markers is located at the shallowest part from the surface that the object 3 contacts; a marker group 20B composed of green ultrafine cylindrical markers is buried deeper than the layer containing 20A; and a marker group 20C composed of blue ultrafine cylindrical markers is buried deeper still.
- although the camera is not shown in FIG. 4, it is disposed so as to face the surface opposite to the surface that the object 3 contacts.
- the three tiers of marker groups are color-coded differently from each other; in the illustrated example they are colored blue, green, and red, but the colors of the markers are not limited to these, as long as the camera can distinguish them. It is desirable that the markers constituting the marker groups 20A, 20B, and 20C buried in the respective layers do not overlap one another (in the vertical direction) between layers. Since each marker constituting a marker group has a minute cross section, normally nothing is visible from the camera arranged on the upper side in the figure. When shear strain occurs at a marker position due to contact with an object, the marker tilts in proportion to the strain, and viewed from above the transparent elastic body suddenly appears colored.
- since the marker groups are color-coded in the depth direction, the color that appears corresponds to the shear strain at that depth.
- as a result, a rainbow-like pattern (Fig. 5) is observed around the contact point, indicating how the shear strain changes with depth.
- the vertical component and the horizontal component of the stress acting on the contact surface can be separately detected.
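Assuming each marker color's visible intensity grows in proportion to the tilt of its cylinders, and hence to the local shear strain at that depth, the depth-resolved readout reduces to a per-channel calibration. The gains below are hypothetical calibration constants, not values from the patent.

```python
import numpy as np

def shear_profile(rgb_intensity, gain=(1.0, 1.0, 1.0)):
    """Convert the observed intensity of each marker color at one image
    point into shear-strain estimates at the three marker depths.

    The cylindrical markers are invisible when upright and become
    visible in proportion to their tilt, which is proportional to the
    local shear strain; so intensity divided by a per-layer gain
    approximates the strain at that layer's depth.
    """
    I = np.asarray(rgb_intensity, dtype=float)
    g = np.asarray(gain, dtype=float)
    return I / g   # strain at the R-, G-, and B-layer depths
```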
- FIG. 6 shows an example of the sensor manufacturing process. First, the undiluted solution of the colored marker material is placed in a container with many tiny holes at the bottom and extruded before it hardens, to form cylinders with a high aspect ratio.
- the diameter of the cross section of the ultrafine marker is, in one example, 0.1 mm to 0.5 mm, and the length is, in one example, about 10 to 100 times the diameter.
- These are added to the uncured transparent elastic body solution. Preferably these processes are integrated, with the colored markers extruded directly into the transparent elastic body stock solution. After solidification, the block is sliced to an appropriate thickness. This is manufactured for each marker color, and the sensor is constructed by laminating the slices in multiple layers.
- a third embodiment according to the present invention will be described with reference to FIG. 7.
- in the third embodiment, a very thin strip (for example, about 0.001 mm thick) is used as the marker.
- another color-coded marker group is arranged at an angle different from that of the marker group.
- in a preferred example, two marker groups are used: a marker group 200A composed of a plurality of juxtaposed red thin strips and a marker group 200B composed of a plurality of juxtaposed blue thin strips, arranged so that the strips of one group are orthogonal to those of the other; however, the spatial arrangement of the marker groups is not limited to this.
- the front and back of the strip constituting the marker may be formed in different colors.
- the strip marker is embedded in the thickness of the rectangular parallelepiped transparent elastic body 1 having a predetermined thickness.
- the strip marker extends perpendicular to the surface of the elastic body 1 that contacts the object, and the camera is disposed so as to face the surface opposite to the surface that the object 3 contacts.
- the strip markers are so thin that normally nothing is visible when viewed from above (from the camera side). When shear strain is generated at a marker position by contact with an object, the marker tilts in proportion to the shear strain, and the transparent elastic body suddenly appears colored, as in the preceding embodiment.
- the generated color already contains the direction component information of the distortion.
- a portion strained in one direction appears red, while a portion strained in the direction 90 degrees away appears blue.
- where the two occur together, the colors mix and appear neutral; looking at the R and B values of the camera's RGB output gives the x and y components of the shear strain.
- in the gripping task (holding an object without dropping it), which is one of the important basic movements of a robot hand, the friction force can be estimated by observing in which direction the shear strain acts on the contact surface, so applications in that direction can be expected.
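A sketch of that friction-direction estimate, taking at face value the statement above that the camera's R and B outputs correspond to the x and y shear components. The gains are hypothetical calibration factors, and averaging over the image is an illustrative simplification.

```python
import numpy as np

def shear_from_strips(rgb, r_gain=1.0, b_gain=1.0):
    """Estimate shear-strain magnitude and direction from a strip-marker
    image, as needed for friction estimation in a gripping task.

    Red strips tilt with shear along x and blue strips along the
    orthogonal y, so the mean R and B channel values (divided by
    calibration gains) give the two shear components; magnitude and
    direction follow from the resulting 2-D vector.
    """
    ex = float(np.mean(rgb[..., 0])) / r_gain   # x component from red strips
    ey = float(np.mean(rgb[..., 2])) / b_gain   # y component from blue strips
    return np.hypot(ex, ey), np.arctan2(ey, ex)
```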
- the structure shown in FIG. 7 may be laminated in multiple layers to form a sensor.
- FIG. 8 shows an example of a method for manufacturing a sensor according to the third embodiment. (1) A transparent elastic body and a colored elastic body are laminated.
- in one example, the thickness of the transparent elastic body is 1 mm and the thickness of the colored layer is 0.01 mm; for practical use such as a robot hand, a transparent elastic body of about 0.1 mm and a colored layer of about 0.01 mm are desirable.
- the laminate in (1) is cut in a direction perpendicular to the laminating direction. It is desirable that the thickness of the cut is the same as the thickness of the transparent elastic body described above.
- Each of the cut slices is bonded with an elastic body colored differently from the one in (1). Since some silicone rubbers have self-adhesive properties, they can be easily bonded by using such rubbers, but a separate adhesive layer may be used.
- the laminate in (3) is cut out along a plane orthogonal to the two orthogonal layers (the two elastic bodies of different colors).
- the thickness to be cut out depends on the application and the hardness of the elastic body, but is considered to be about 1 to 20 times the interval between the laminated colored layers. It is also conceivable to cut out at an angle, as in (4') of the manufacturing process (4). In this case, when observed from directly above during use as shown in FIG. 7, color is observed even at rest because the colored layers are inclined from the beginning. That is, an offset is provided and the zero point (the state in which the colored layers are vertical and no color appears) is avoided, so that it is not necessary to color the front and back of the colored layers differently.
- the fourth embodiment is an improvement of the third embodiment.
- The biggest problem with the method described above is the complexity of its fabrication process.
- The so-called staircase method described below is a simplified method with the same sensing capability.
- Prepare a step-like interface as shown in Fig. 15 (the elastic-body surface forms the markers). Because of the step shape, the interface surfaces can be divided into two groups, the surfaces within each group facing the same direction. Paint each group a single color (here, Red and Blue). An image taken from above then looks like Figure 16. If the width of each band is sufficiently smaller than one pixel of the image sensor, the image is observed as a single color that is a mixture of the two band colors. When the sensor body makes contact, the slope of each band changes, which is observed as a color change.
- When a horizontal force is applied to the contact surface (Fig. 16 (b)), each band rotates at its own location, so that in the image taken from above one surface group appears to shrink while the other spreads; that is, the change is observed as a change in the ratio of the two colors. When a vertical force is applied to the contact surface (Fig. 16 (c)), the slope of every band changes equally, so the ratio of the two colors does not change, but the overall brightness of the image does. In other words, the "difference" and the "sum" of the luminances of the two observed colors (red and blue) change essentially with the horizontal and vertical force components, respectively. A single observation therefore contains both the horizontal and vertical components of the force vector at that point.
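As a toy illustration of this read-out (a sketch with assumed geometry, not taken from the patent: unit face length, faces at ±30° from vertical, small rigid rotations), the following model shows that a shear rotation changes the difference of the two apparent luminances while vertical flattening changes their sum:

```python
import math

# Toy geometric model of the staircase read-out (assumed geometry):
# red faces are inclined +theta and blue faces -theta from the vertical,
# and a face seen from above contributes light proportional to its
# horizontal projection, i.e. sin(angle from vertical) for unit length.

def observed_mix(theta_deg, shear_deg=0.0, flatten_deg=0.0):
    """Apparent (red, blue) luminances.

    shear_deg   -- shear tilts one group further and the other back,
                   because the two groups face opposite directions
    flatten_deg -- vertical compression flattens both groups equally
    """
    t = math.radians(theta_deg + flatten_deg)
    d = math.radians(shear_deg)
    red = math.sin(t + d)
    blue = math.sin(t - d)
    return red, blue

r0, b0 = observed_mix(30)                  # at rest: equal mixture
rs, bs = observed_mix(30, shear_deg=5)     # shear: ratio changes, sum nearly constant
rv, bv = observed_mix(30, flatten_deg=5)   # vertical: ratio constant, sum grows
```

To first order, red − blue tracks the horizontal force component and red + blue tracks the vertical one, which is exactly the "difference" and "sum" observation above.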
- Each surface is created as a white scattering surface; when the device is used as a sensor, light of each color is applied from the direction perpendicular to the corresponding surface group and the scattered, reflected light is observed.
- As shown in Figure 19, the silicone stock solution need only be colored white when the staircase surface is created, which makes the manufacturing process extremely simple; during use as a sensor, adjusting the intensity of each light source reduces crosstalk between the color channels and allows more accurate sensing.
- The essence of the staircase method is that the rotational movement at each point is captured well. Vertical contraction, by contrast, corresponds only to the average inclination of the surfaces, that is, to the sum of the luminances of the colors, so sensitivity to vertical contraction may be low compared with that to rotational movement. However, sensitivity to vertical displacement can be increased by the following minor modification.
- The light guide is a film made of the same material as the sensor body, or of a harder transparent elastic body, with a thickness of about 0.5 mm to about 1 mm.
- A step-shaped transparent elastic body as described above is placed against the light guide, with color filters that transmit only Red or only Blue provided on the respective step-surface groups.
- This filter can be made in the same process as the case of the colored surface described above.
- This sensor system contacts the object through the light guide. When there is no contact, the step surfaces do not touch the light guide and the camera image remains dark. When the apexes of the staircase begin to touch the light guide as a result of contact with the object, the white light filling the light guide is imaged as colored light through the color filters on the staircase surfaces.
- Figure 21 illustrates this situation.
- the contact state between the step surface and the light guide changes depending on the direction of the force vector distribution on the light guide surface.
- When a vertical force is applied, the apexes of the staircase deform symmetrically (Fig. 21, right), so an image in which Red and Blue are mixed in equal proportions is obtained.
- In this case, the luminance of the image indicates the vertical (normal) force.
- When a force close to horizontal is applied, the staircase deforms in the horizontal direction, causing asymmetry in the contact between the light guide and the two staircase surface groups. This asymmetry, that is, the ratio between Red and Blue, represents the horizontal force.
- In terms of sensing, using the difference and the sum of the luminances of the color channels is the same as described above, but this configuration has the following two advantages.
- the dynamic range of the pixel itself can be fully utilized.
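Either variant thus reduces, per pixel, to the same difference-and-sum decoding. A minimal sketch follows; the gains `k_h`, `k_v` and the rest-state sum are hypothetical calibration constants, not values from the patent:

```python
def decode_force(red, blue, k_h=1.0, k_v=1.0, sum_at_rest=1.0):
    """Estimate (horizontal, vertical) force components at one pixel
    from the observed red and blue luminances.

    k_h, k_v and sum_at_rest are hypothetical calibration constants:
    per-channel gains and the red+blue luminance under no load.
    """
    horizontal = k_h * (red - blue)                 # color-ratio / asymmetry term
    vertical = k_v * ((red + blue) - sum_at_rest)   # overall-brightness term
    return horizontal, vertical
```

In practice such constants would be fitted by applying known forces during a calibration step.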
- Although FIG. 9 shows an example in which two layers are superimposed, the number of layers is not limited to two.
- the sensor is configured by laminating rectangular parallelepiped unit elastic bodies 10 having a predetermined thickness.
- An object comes into contact with the lower surface of the lower unit elastic body 10 from below, and an image is taken with a camera from above.
- A plurality of perfectly circular planar markers 20 are provided on the upper surface of the lower unit elastic body 10.
- Each circular marker 20 is divided into three equal sectors 20A, 20B, and 20C radiating from the center of the circle, which are painted red, green, and blue, respectively.
- The shape of the marker is not limited to a circle, and the number of colors is not limited to three; two colors, or four or more colors, may be used.
- In one example, the diameter of the planar marker can be about 1 mm to 2 mm.
- a black concealment marker 6 of the same size as the circular marker 20 is provided on the upper surface of the unit elastic body 10 in the upper layer.
- The upper and lower unit elastic bodies 10 are laminated and bonded so that the concealment markers 6 completely overlap the circular markers 20.
- Normally the lower colored marker 20 cannot be seen because it is hidden by the upper concealment marker 6, but when shear strain occurs, the positions of the concealment marker 6 and the colored marker 20 shift relative to each other and color becomes visible.
- Since the marker is painted in the three colors RGB, the direction of the strain can be determined from the color that appears.
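A small numerical sketch of this occlusion scheme (assumed geometry, not from the patent: unit-radius discs, R/G/B sector centers at 90°, 210°, and 330°) shows how the exposed color mix encodes the shear direction:

```python
import math

# A black disc of radius 1 hides a disc of the same radius split into
# three 120-degree sectors (R, G, B).  A relative shear displacement of
# the concealment disc exposes a crescent of the colored disc, and the
# mix of exposed colors encodes the displacement direction.

SECTOR_CENTERS = {"R": 90.0, "G": 210.0, "B": 330.0}  # assumed layout (degrees)

def exposed_mix(dx, dy, n=400):
    """Grid-sample the colored disc and return the fraction of the
    exposed (unhidden) area belonging to each color sector, for a
    concealment disc displaced by (dx, dy)."""
    counts = {"R": 0, "G": 0, "B": 0}
    step = 2.0 / n
    for i in range(n):
        for j in range(n):
            x = -1.0 + (i + 0.5) * step
            y = -1.0 + (j + 0.5) * step
            if x * x + y * y > 1.0:
                continue                       # outside the colored disc
            if (x - dx) ** 2 + (y - dy) ** 2 <= 1.0:
                continue                       # still hidden by the black disc
            ang = math.degrees(math.atan2(y, x)) % 360.0
            for name, center in SECTOR_CENTERS.items():
                if (ang - (center - 60.0)) % 360.0 < 120.0:
                    counts[name] += 1
                    break
    total = sum(counts.values()) or 1
    return {k: v / total for k, v in counts.items()}

# Shifting the concealment disc downward exposes the top crescent,
# so the red sector (centered at 90 degrees) dominates the visible color.
mix = exposed_mix(0.0, -0.2)
```

Note that the exposed crescent appears on the side opposite to the concealment disc's displacement, so the decoded direction must be negated to recover the relative shear.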
- The present invention can be widely applied to tactile sensors, and is suitably used as a tactile sensor for a robot hand.
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE60135861T DE60135861D1 (de) | 2000-08-31 | 2001-08-30 | Optischer tastsensor |
EP01961195A EP1321753B1 (en) | 2000-08-31 | 2001-08-30 | Optical tactile sensor |
AU2001282549A AU2001282549A1 (en) | 2000-08-31 | 2001-08-30 | Optical tactile sensor |
KR1020037002118A KR100846305B1 (ko) | 2000-08-31 | 2001-08-30 | 광학식 촉각센서 |
US10/344,821 US6909084B2 (en) | 2000-08-31 | 2001-08-30 | Optical tactile sensor having a transparent elastic tactile portion |
CA2419252A CA2419252C (en) | 2000-08-31 | 2001-08-30 | Optical tactile sensor |
JP2002523568A JP4100615B2 (ja) | 2000-08-31 | 2001-08-30 | 光学式触覚センサ |
HK03108540A HK1056602A1 (en) | 2000-08-31 | 2003-11-22 | Optical tactile sensor |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2000264407 | 2000-08-31 | ||
JP2000-264407 | 2000-08-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2002018893A1 true WO2002018893A1 (fr) | 2002-03-07 |
Family
ID=18751829
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2001/007462 WO2002018893A1 (fr) | 2000-08-31 | 2001-08-30 | Détecteur tactile optique |
Country Status (12)
Country | Link |
---|---|
US (1) | US6909084B2 (ja) |
EP (1) | EP1321753B1 (ja) |
JP (1) | JP4100615B2 (ja) |
KR (1) | KR100846305B1 (ja) |
CN (1) | CN1264003C (ja) |
AT (1) | ATE408809T1 (ja) |
AU (1) | AU2001282549A1 (ja) |
CA (1) | CA2419252C (ja) |
DE (1) | DE60135861D1 (ja) |
HK (1) | HK1056602A1 (ja) |
RU (1) | RU2263885C2 (ja) |
WO (1) | WO2002018893A1 (ja) |
Families Citing this family (65)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10106699C2 (de) * | 2001-02-14 | 2003-11-27 | Leica Microsystems | Berührungssensor und Vorrichtung zum Schutz eines hervorstehenden Bauteils |
US7505811B2 (en) | 2001-11-19 | 2009-03-17 | Dune Medical Devices Ltd. | Method and apparatus for examining tissue for predefined target cells, particularly cancerous cells, and a probe useful in such method and apparatus |
US20070255169A1 (en) * | 2001-11-19 | 2007-11-01 | Dune Medical Devices Ltd. | Clean margin assessment tool |
WO2006103665A2 (en) * | 2005-03-29 | 2006-10-05 | Dune Medical Devices Ltd. | Electromagnetic sensors for tissue characterization |
US7809425B2 (en) * | 2003-07-24 | 2010-10-05 | Dune Medical Devices Ltd. | Method and apparatus for examining a substance, particularly tissue, to characterize its type |
US8721565B2 (en) * | 2005-08-04 | 2014-05-13 | Dune Medical Devices Ltd. | Device for forming an effective sensor-to-tissue contact |
US8116845B2 (en) * | 2005-08-04 | 2012-02-14 | Dune Medical Devices Ltd. | Tissue-characterization probe with effective sensor-to-tissue contact |
US8032211B2 (en) * | 2002-01-04 | 2011-10-04 | Dune Medical Devices Ltd. | Probes, systems, and methods for examining tissue according to the dielectric properties thereof |
US8019411B2 (en) * | 2002-01-04 | 2011-09-13 | Dune Medical Devices Ltd. | Probes, systems, and methods for examining tissue according to the dielectric properties thereof |
WO2005089065A2 (en) * | 2004-03-23 | 2005-09-29 | Dune Medical Devices Ltd. | Clean margin assessment tool |
US7149973B2 (en) * | 2003-11-05 | 2006-12-12 | Sonic Foundry, Inc. | Rich media event production system and method including the capturing, indexing, and synchronizing of RGB-based graphic content |
US7904145B2 (en) | 2004-03-23 | 2011-03-08 | Dune Medical Devices Ltd. | Clean margin assessment tool |
US9750425B2 (en) | 2004-03-23 | 2017-09-05 | Dune Medical Devices Ltd. | Graphical user interfaces (GUI), methods and apparatus for data presentation |
CN1920477B (zh) * | 2005-08-23 | 2011-04-06 | 阮刚 | 接触面形变的传感方法和装置 |
US8147423B2 (en) * | 2007-03-01 | 2012-04-03 | Dune Medical Devices, Ltd. | Tissue-characterization system and method |
SE531527C2 (sv) * | 2007-10-01 | 2009-05-12 | Bioresonator Ab | Förfarande vid och en anordning för opåverkad materialundersökning |
JP5449336B2 (ja) | 2008-06-19 | 2014-03-19 | マサチューセッツ インスティテュート オブ テクノロジー | 弾性撮像を使用する接触センサ |
DE102008037861A1 (de) * | 2008-08-15 | 2010-03-18 | Siemens Aktiengesellschaft | Optischer Tastsensor |
JP2010224665A (ja) * | 2009-03-19 | 2010-10-07 | Sony Corp | 光触覚変換システム、及び触覚フィードバックの提供方法 |
JP5549203B2 (ja) * | 2009-12-01 | 2014-07-16 | セイコーエプソン株式会社 | 光学式位置検出装置、ハンド装置およびタッチパネル |
WO2011068718A1 (en) | 2009-12-03 | 2011-06-09 | The Procter & Gamble Company | Method for assessment of force properties generated by the fiber tip |
US8823639B2 (en) * | 2011-05-27 | 2014-09-02 | Disney Enterprises, Inc. | Elastomeric input device |
KR20140041890A (ko) | 2011-07-28 | 2014-04-04 | 메사추세츠 인스티튜트 오브 테크놀로지 | 고해상도 표면 측정 시스템 및 방법 |
US8994694B2 (en) | 2011-11-30 | 2015-03-31 | Blackberry Limited | Optical interference based user input device |
EP2600229A1 (en) * | 2011-11-30 | 2013-06-05 | Research In Motion Limited | Optical interference based user input device |
FR2989829B1 (fr) * | 2012-04-20 | 2014-04-11 | Commissariat Energie Atomique | Capteur tactile photosensible |
KR101382287B1 (ko) * | 2012-08-22 | 2014-04-08 | 현대자동차(주) | 적외선을 이용한 터치스크린 및 그 터치스크린의 터치 인식장치 및 방법 |
KR101371736B1 (ko) * | 2012-08-22 | 2014-03-07 | 현대자동차(주) | 터치스크린의 터치 인식방법 |
TWI482953B (zh) * | 2012-12-28 | 2015-05-01 | Univ Nat Chiao Tung | 壓力與剪力量測裝置及方法 |
WO2014138716A2 (en) | 2013-03-08 | 2014-09-12 | Gelsight, Inc. | Continuous contact-based three-dimensional measurement |
US9245916B2 (en) | 2013-07-09 | 2016-01-26 | Rememdia LC | Optical positioning sensor |
DE102013017007B4 (de) * | 2013-10-14 | 2015-09-10 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Roboter mit einem Endmanipulatorarm mit Endeffektor sowie Verfahren zur Bestimmung eines Kraft- und Drehmomenteintrages auf einen Endeffektor eines Roboters |
LU92408B1 (en) * | 2014-03-21 | 2015-09-22 | Olivier Raulot | User gesture recognition |
FR3034053B1 (fr) * | 2015-03-26 | 2017-03-17 | Continental Automotive France | Systeme de palette tactile mobile ou deformable formant une interface homme-machine adapte sur un volant de vehicule |
US9557164B2 (en) * | 2015-04-15 | 2017-01-31 | General Electric Company | Data acquisition devices, systems and method for analyzing strain sensors and monitoring turbine component strain |
US9851196B2 (en) * | 2015-04-17 | 2017-12-26 | Rememdia LC | Sensor |
WO2017010824A1 (ko) * | 2015-07-14 | 2017-01-19 | 한국생산기술연구원 | 헤드서포트 및 헤드서포트 시스템 |
FR3040090B1 (fr) * | 2015-08-13 | 2019-06-14 | Museum National D'histoire Naturelle | Capteur de force |
CN105318994B (zh) * | 2015-11-30 | 2018-05-15 | 华南理工大学 | 一种基于图像识别的力测量装置 |
CN106092382B (zh) * | 2016-07-20 | 2018-09-11 | 山东大学 | 一种基于弹性体三维形变的触觉传感器及检测方法 |
CN108161994B (zh) * | 2017-12-20 | 2020-07-10 | 清华大学 | 一种多模态触觉感知装置 |
KR102115437B1 (ko) | 2018-02-07 | 2020-05-28 | 한국기계연구원 | 광학식 측정 장치 |
US11226195B1 (en) * | 2018-06-27 | 2022-01-18 | United States Of America Represented By The Secretary Of The Air Force | Method and system for measuring strain in a 3D printed part |
CN109015763A (zh) * | 2018-08-30 | 2018-12-18 | 清华大学 | 一种基于感温变色油墨材料的多模态触觉感知装置 |
CN113227762A (zh) * | 2018-09-06 | 2021-08-06 | 胶视公司 | 复古图形传感器 |
US10562190B1 (en) * | 2018-11-12 | 2020-02-18 | National Central University | Tactile sensor applied to a humanoid robots |
GB2579846A (en) * | 2018-12-18 | 2020-07-08 | Univ Bristol | Improvements in or relating to tactile sensing |
GB201907744D0 (en) * | 2019-05-31 | 2019-07-17 | The Shadow Robot Company Ltd | Tactile sensor |
CN112097675A (zh) | 2019-06-17 | 2020-12-18 | 香港科技大学 | 触觉传感器 |
JP7345902B2 (ja) * | 2019-07-04 | 2023-09-19 | 株式会社FingerVision | 触覚センサ、触覚センサシステム及びプログラム |
CN110849516B (zh) * | 2019-09-09 | 2021-07-02 | 南京邮电大学 | 一种光电式柔性触觉传感器及其制作方法 |
WO2021076697A1 (en) * | 2019-10-15 | 2021-04-22 | Massachusetts Institute Of Technology | Retrographic sensors with compact illumination |
WO2021081084A1 (en) * | 2019-10-21 | 2021-04-29 | The Regents Of The University Of California | Multi-directional high-resolution optical tactile sensors |
US11772262B2 (en) * | 2019-10-25 | 2023-10-03 | Dexterity, Inc. | Detecting slippage from robotic grasp |
US11607816B2 (en) | 2019-10-25 | 2023-03-21 | Dexterity, Inc. | Detecting robot grasp of very thin object or feature |
CN111805562B (zh) * | 2020-06-05 | 2023-03-10 | 清华大学 | 触觉传感器及机器人 |
CN112485140B (zh) * | 2020-11-06 | 2022-02-18 | 浙江大学 | 一种集成于柔性手指的水果硬度传感器 |
US20240060837A1 (en) * | 2020-12-15 | 2024-02-22 | Massachusetts Institute Of Technology | Retrographic sensors with fluorescent illumination |
KR20230109303A (ko) * | 2022-01-13 | 2023-07-20 | 삼성전자주식회사 | 광학식 촉각 센서 |
US20230251149A1 (en) * | 2022-02-04 | 2023-08-10 | Massachusetts Institute Of Technology | Flexible optical tactile sensor |
CN114659678A (zh) * | 2022-04-12 | 2022-06-24 | 深圳市松果体机器人科技有限公司 | 面形柔性触觉传感器 |
CN114659460A (zh) * | 2022-04-12 | 2022-06-24 | 深圳市松果体机器人科技有限公司 | 一种采集按摩信号的装置 |
CN114681282A (zh) * | 2022-04-12 | 2022-07-01 | 深圳市松果体机器人科技有限公司 | 一种手持的按摩传感设备 |
CN114659679A (zh) * | 2022-04-12 | 2022-06-24 | 深圳市松果体机器人科技有限公司 | 柔性触觉传感器 |
CN114910199B (zh) * | 2022-05-09 | 2023-08-18 | 北京纳米能源与系统研究所 | 一种触觉传感器、制备方法及信息采集方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS61205831A (ja) * | 1985-03-08 | 1986-09-12 | Nippon Telegr & Teleph Corp <Ntt> | マトリツクス触覚センサ |
JPS62115308A (ja) * | 1985-11-14 | 1987-05-27 | Univ Osaka | 実時間歪み分布測定方法および装置 |
JPH02198306A (ja) * | 1989-01-28 | 1990-08-06 | Shimizu Corp | コンクリート部材の面内変位計測方法 |
JPH03135704A (ja) * | 1989-10-20 | 1991-06-10 | Central Glass Co Ltd | 板状体の歪検査方法 |
JP2000227371A (ja) * | 1999-02-05 | 2000-08-15 | Masahiko Matsubara | 面圧力分布検出装置 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4668861A (en) | 1984-12-12 | 1987-05-26 | The Regents Of The University Of California | Tactile sensor employing a light conducting element and a resiliently deformable sheet |
US4599908A (en) | 1985-03-18 | 1986-07-15 | Sheridan Thomas B | Opto-mechanical touch sensor |
US5311779A (en) * | 1992-01-03 | 1994-05-17 | Inabagomu Co., Ltd. | Pressure-sensitive sensor |
JPH09257609A (ja) * | 1996-03-18 | 1997-10-03 | Mitsubishi Cable Ind Ltd | 光ファイバ式触覚センサ |
US6089516A (en) * | 1998-12-11 | 2000-07-18 | Yates; Paul M. | Decorative cushion providing wide lateral movement support |
- 2001
- 2001-08-30 CN CNB018147364A patent/CN1264003C/zh not_active Expired - Fee Related
- 2001-08-30 AT AT01961195T patent/ATE408809T1/de not_active IP Right Cessation
- 2001-08-30 KR KR1020037002118A patent/KR100846305B1/ko not_active IP Right Cessation
- 2001-08-30 JP JP2002523568A patent/JP4100615B2/ja not_active Expired - Lifetime
- 2001-08-30 CA CA2419252A patent/CA2419252C/en not_active Expired - Fee Related
- 2001-08-30 RU RU2003108731/28A patent/RU2263885C2/ru not_active IP Right Cessation
- 2001-08-30 EP EP01961195A patent/EP1321753B1/en not_active Expired - Lifetime
- 2001-08-30 AU AU2001282549A patent/AU2001282549A1/en not_active Abandoned
- 2001-08-30 DE DE60135861T patent/DE60135861D1/de not_active Expired - Lifetime
- 2001-08-30 US US10/344,821 patent/US6909084B2/en not_active Expired - Fee Related
- 2001-08-30 WO PCT/JP2001/007462 patent/WO2002018893A1/ja active IP Right Grant
- 2003
- 2003-11-22 HK HK03108540A patent/HK1056602A1/xx not_active IP Right Cessation
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007518966A (ja) * | 2003-09-16 | 2007-07-12 | 株式会社東京大学Tlo | 光学式触覚センサ及び該センサを用いた力ベクトル分布再構成法 |
WO2005029027A1 (ja) * | 2003-09-16 | 2005-03-31 | Toudai Tlo, Ltd. | 光学式触覚センサを用いた力ベクトル再構成法 |
US7420155B2 (en) | 2003-09-16 | 2008-09-02 | Toudai Tlo, Ltd. | Optical tactile sensor and method of reconstructing force vector distribution using the sensor |
US7460964B2 (en) | 2003-09-16 | 2008-12-02 | Toudai Tlo, Ltd. | Force vector reconstruction method using optical tactile sensor |
US7707001B2 (en) | 2004-03-09 | 2010-04-27 | Nagoya Industrial Science Research Institute | Control of object operating force, object gripping force and robot hands |
WO2005085785A1 (ja) * | 2004-03-09 | 2005-09-15 | Nagoya Industrial Science Research Institute | 光学式触覚センサ、センシング方法、センシングシステム、物体操作力制御方法、物体操作力制御装置、物体把持力制御装置、ロボットハンド |
JP2005257343A (ja) * | 2004-03-09 | 2005-09-22 | Nagoya Industrial Science Research Inst | 光学式触覚センサ、光学式触覚センサを利用したセンシング方法、センシングシステム、物体操作力制御方法、物体操作力制御装置、物体把持力制御装置及びロボットハンド |
JP4621827B2 (ja) * | 2004-03-09 | 2011-01-26 | 財団法人名古屋産業科学研究所 | 光学式触覚センサ、光学式触覚センサを利用したセンシング方法、センシングシステム、物体操作力制御方法、物体操作力制御装置、物体把持力制御装置及びロボットハンド |
WO2005124305A1 (ja) * | 2004-06-16 | 2005-12-29 | Toudai Tlo, Ltd. | 光学式触覚センサ |
US7659502B2 (en) | 2004-06-16 | 2010-02-09 | Toudai Tlo, Ltd. | Optical tactile sensor |
JP2007147443A (ja) * | 2005-11-28 | 2007-06-14 | Nitta Ind Corp | 光学式触覚センサ |
JP2008008746A (ja) * | 2006-06-29 | 2008-01-17 | Univ Of Tokyo | 反射像を用いた触覚センサ |
JP2014017000A (ja) * | 2008-01-11 | 2014-01-30 | O-Net Wavetouch Limited | 接触感応装置 |
JP2009297347A (ja) * | 2008-06-16 | 2009-12-24 | Juki Corp | ボタン認識機構およびボタン供給装置 |
US8629987B2 (en) | 2009-12-01 | 2014-01-14 | Seiko Epson Corporation | Optical-type position detecting device, hand apparatus, and touch panel |
JP2018088254A (ja) * | 2016-03-29 | 2018-06-07 | 株式会社齋藤創造研究所 | 入力装置および画像表示システム |
JP2019526089A (ja) * | 2016-05-13 | 2019-09-12 | センソブライト・インダストリーズ・リミテッド・ライアビリティ・カンパニーSensobright Industries, Llc | マルチファンクションセンシングシステム |
JP2018088240A (ja) * | 2016-11-24 | 2018-06-07 | 株式会社齋藤創造研究所 | 入力装置および画像表示システム |
JP2021180019A (ja) * | 2016-11-24 | 2021-11-18 | 株式会社齋藤創造研究所 | 入力装置および画像表示システム |
JP7209310B2 (ja) | 2016-11-24 | 2023-01-20 | 株式会社齋藤創造研究所 | 入力装置および画像表示システム |
WO2018235214A1 (ja) * | 2017-06-21 | 2018-12-27 | 株式会社齋藤創造研究所 | マニピュレーターおよびロボット |
JPWO2018235214A1 (ja) * | 2017-06-21 | 2019-07-04 | 株式会社齋藤創造研究所 | マニピュレーターおよびロボット |
CN114402183A (zh) * | 2020-08-17 | 2022-04-26 | 汇顶科技(香港)有限公司 | 触觉传感器 |
CN114402183B (zh) * | 2020-08-17 | 2024-03-08 | 汇顶科技(香港)有限公司 | 触觉传感器 |
Also Published As
Publication number | Publication date |
---|---|
KR20030040398A (ko) | 2003-05-22 |
JP4100615B2 (ja) | 2008-06-11 |
CA2419252C (en) | 2011-03-29 |
KR100846305B1 (ko) | 2008-07-16 |
EP1321753B1 (en) | 2008-09-17 |
CN1264003C (zh) | 2006-07-12 |
US6909084B2 (en) | 2005-06-21 |
HK1056602A1 (en) | 2004-02-20 |
US20030178556A1 (en) | 2003-09-25 |
EP1321753A4 (en) | 2006-05-17 |
AU2001282549A1 (en) | 2002-03-13 |
DE60135861D1 (de) | 2008-10-30 |
CA2419252A1 (en) | 2003-02-11 |
RU2263885C2 (ru) | 2005-11-10 |
ATE408809T1 (de) | 2008-10-15 |
CN1571920A (zh) | 2005-01-26 |
EP1321753A1 (en) | 2003-06-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2002018893A1 (fr) | Détecteur tactile optique | |
KR101926406B1 (ko) | 터치 스크린용 위치 검출 시스템 및 이에 사용하는 프리즘 필름 | |
RU2354943C2 (ru) | Способ восстановления вектора силы с использованием оптического тактильного датчика | |
Lin et al. | Sensing the frictional state of a robotic skin via subtractive color mixing | |
US8928625B2 (en) | Retroreflector for use in touch screen applications and position sensing systems | |
RU2371686C2 (ru) | Оптический тактильный датчик | |
US7945311B2 (en) | Retroreflective marker-tracking systems | |
WO2020165171A1 (en) | Optical tactile sensor | |
KR101660606B1 (ko) | 광학 측정 센서 | |
CN112097675A (zh) | 触觉传感器 | |
JP2021536579A (ja) | レトログラフィックセンサー | |
CN112219153A (zh) | 包括具有两种不同折射率的颜色分离器的图像传感器 | |
CN217035642U (zh) | 图像采集设备 | |
RU2782368C2 (ru) | Ретрографические датчики | |
KR20200090296A (ko) | 구조적 깊이 카메라 시스템에서 인코딩 장치 및 방법 | |
CN207114070U (zh) | 压力分布传感器 | |
CN1908579A (zh) | 测量接触面形变的传感方法和装置 | |
US9442296B2 (en) | Device and method for determining object distances | |
KR20210138451A (ko) | 카메라 장치 | |
KR20180106276A (ko) | 동작 인식 장치 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2419252 Country of ref document: CA |
WWE | Wipo information: entry into national phase |
Ref document number: 1020037002118 Country of ref document: KR |
WWE | Wipo information: entry into national phase |
Ref document number: 10344821 Country of ref document: US |
WWE | Wipo information: entry into national phase |
Ref document number: 018147364 Country of ref document: CN |
WWE | Wipo information: entry into national phase |
Ref document number: 2001961195 Country of ref document: EP |
ENP | Entry into the national phase |
Country of ref document: RU Kind code of ref document: A Format of ref document f/p: F Ref document number: 2003108731 Country of ref document: RU Kind code of ref document: A Format of ref document f/p: F |
WWP | Wipo information: published in national office |
Ref document number: 1020037002118 Country of ref document: KR |
WWP | Wipo information: published in national office |
Ref document number: 2001961195 Country of ref document: EP |
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
WWG | Wipo information: grant in national office |
Ref document number: 2001961195 Country of ref document: EP |