Method for monitoring linear dimensions of three-dimensional objects
10648789 · 2020-05-12
Inventors
- Andrei V. Klimov (Moscow, RU)
- Aleksandr Georgievich Lomakin (Moscow, RU)
- Sergey V. Sukhovey (Moscow, RU)
- Gleb A. Gusev (Moscow, RU)
- Artem L. Yukhin (Moscow, RU)
CPC classification
- G01B11/2545 (PHYSICS)
- G01B11/25 (PHYSICS)
Abstract
A method of 3D measurement is performed using a first camera and a second camera located at different distances from the projector. The method includes projecting a known projection pattern that includes at least two non-crossing lines to form a first band and a second band on a surface of an object. The method includes recording first and second images of the object using the first and second cameras, respectively. The method includes determining a first longitudinal coordinate of a first point within the first band and a first vertical coordinate of the first point within the first band; determining a second longitudinal coordinate of the first point within the first band; and determining a second vertical coordinate of the first point within the first band. The method includes determining a final vertical coordinate of the first point by comparing the first longitudinal coordinate to the second longitudinal coordinate.
Claims
1. A method comprising: at a 3D measurement system comprising a projector, a computer, a first camera and a second camera, wherein the first camera and the second camera are located at different distances from the projector: projecting a known projection pattern that includes at least two non-crossing lines to form a first band and a second band on a surface of an object; while projecting the known projection pattern onto the surface of the object: recording, using the first camera, a first image of the surface of the object; and recording, using the second camera, a second image of the surface of the object; using the first image acquired by the first camera, determining, by the computer, a first longitudinal coordinate of a first point within the first band and a first vertical coordinate of the first point within the first band, wherein the first vertical coordinate is along a first coordinate axis that is parallel to an optical axis of the projector and the first longitudinal coordinate is along a second coordinate axis that is perpendicular to the optical axis of the projector; and using the second image acquired by the second camera, determining a second longitudinal coordinate of the first point within the first band and determining a second vertical coordinate of the first point within the first band, wherein the second vertical coordinate is along the first coordinate axis that is parallel to the optical axis of the projector and the second longitudinal coordinate is along the second coordinate axis that is perpendicular to the optical axis of the projector; and determining a final vertical coordinate of the first point by comparing the first longitudinal coordinate to the second longitudinal coordinate, wherein: the final vertical coordinate is determined by adjusting the first vertical coordinate based on a comparison between the first longitudinal coordinate and the second longitudinal coordinate; and the final vertical coordinate is along the first 
coordinate axis.
2. The method of claim 1, further comprising, identifying the second band, comprising: using the first image acquired by the first camera, determining a third longitudinal coordinate of a second point within the second band and a third vertical coordinate of the second point within the second band, wherein: the third longitudinal coordinate is along the second coordinate axis and the third vertical coordinate is along the first coordinate axis; and determining the final vertical coordinate includes comparing the third longitudinal coordinate of the second point within the second band with the first longitudinal coordinate of the first point within the first band and comparing the third vertical coordinate of the second point within the second band with the first vertical coordinate of the first point within the first band.
3. The method of claim 1, wherein using the second image acquired by the second camera to adjust the determined first vertical coordinate comprises determining a distortion of the first band due to a curvature of the surface of the object.
4. The method of claim 1, wherein determining the first longitudinal coordinate comprises: at the projector, projecting the known projection pattern with a minimum brightness level and projecting the known projection pattern with a maximum brightness level; creating a sinusoidal representation of pixel values created from projecting the known projection pattern with the minimum and the maximum brightness levels; and determining the first longitudinal coordinate as the maximum pixel value of the sinusoidal representation of pixel values.
5. The method of claim 1, wherein a distance between the first camera and the projector is less than a distance between the second camera and the projector.
6. The method of claim 1, wherein an angle formed between a center beam of the first camera and a center beam of the projector is less than an angle formed between a center beam of the second camera and the center beam of the projector.
7. The method of claim 1, further comprising, shifting the projected first band and second band along a longitudinal axis that is perpendicular to the optical axis of the projector, and capturing, by the second camera, a plurality of additional images, each image captured corresponding to a shift in the first band and the second band.
8. The method of claim 1, wherein the first point within the first band is a point of ambiguity.
9. The method of claim 8, further comprising determining that the first point belongs to the first band.
10. The method of claim 9, wherein determining that the first point belongs to the first band comprises determining a focal depth of the first camera and a focal depth of the second camera.
11. The method of claim 1, wherein the angle formed between a center beam of the first camera and a center beam of the projector is within a minimum threshold angle.
12. The method of claim 11, wherein the angle within the minimum threshold angle is formed using a semitransparent mirror in a path of the center beam of the first camera and a path of the center beam of the projector.
13. The method of claim 11, wherein the angle formed between the center beam of the first camera and the center beam of the projector is less than or equal to the arctangent of the ratio between the distance between the first band and the second band and a focal depth of a lens of the first camera.
14. The method of claim 1, wherein the first camera is located on a first side of the projector and the second camera is located on a second side of the projector, the second side of the projector opposite the first side of the projector.
15. The method of claim 1, further comprising a third camera to record a third image, wherein the final vertical coordinate is determined based on comparing the first longitudinal coordinate to a third longitudinal coordinate of the first point within the first band in the third image, wherein the third longitudinal coordinate is along the second coordinate axis.
Description
DRAWING FIGURES
PREFERABLE EMBODIMENT OF THE INVENTION
(13) The distance L between the camera and the projection unit is called the base. The base can be chosen as follows.
(14) L = s·tan α, where α is the triangulation angle between the central beams of the projection unit and the camera, and s is the distance from the projection unit to the intersection point of the central beams of the projection unit and the camera (m).
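As an illustration, the base relationship L = s·tan α can be evaluated directly. The function name and the numeric values below are illustrative, not taken from the patent:

```python
import math

def baseline_length(s: float, alpha_deg: float) -> float:
    """Base L between the camera and the projection unit: L = s * tan(alpha).

    s         -- distance from the projection unit to the intersection
                 point of the central beams (m)
    alpha_deg -- triangulation angle between the central beams (degrees)
    """
    return s * math.tan(math.radians(alpha_deg))

# Example: beams intersect 0.5 m away, triangulation angle of 20 degrees.
L = baseline_length(0.5, 20.0)
print(round(L, 4))  # 0.182 (m)
```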
(15) In the simplest case, projection unit 1 projects one horizontal band 3 which coincides with the central beam of the projection unit, as shown in the drawings.
(16) Then this band is usually used to scan the surface along the Y axis, as shown in the drawings.
(17) If camera 2 sees only one band projected by projection unit 1 per frame, then to obtain such measurements this band would have to be shifted by the smallest distance possible and as many images as possible would have to be received from camera 2. This invariably requires a lot of time. A common affordable camera 2 has a frame rate of 25 fps and a resolution of 1 MP, i.e. 1,000 pixels along the Y coordinate axis and 1,000 pixels along the X coordinate axis. The band gives 1,000 pixels along the X coordinate axis, i.e. 1,000 measurements. To obtain the same number of measurements along both axes, the band has to be projected 1,000 times, shifted by one pixel along the Y coordinate axis each time, and 1,000 frames received from camera 2 for this purpose, which takes 40 seconds. If the number of images is to be decreased and more measurements obtained from one camera 2 image, then in accordance with the method at least two bands should be projected, as shown in the drawings.
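The frame-count arithmetic above can be sketched in a few lines. The values are taken directly from the text (25 fps, 1,000 pixels along Y); the function name is illustrative:

```python
# Illustrative sketch of the scan-time trade-off described in the text.
FPS = 25          # camera frame rate (frames per second)
PIXELS_Y = 1000   # measurement positions wanted along the Y axis

def scan_time_seconds(n_bands: int) -> float:
    """Time to cover all Y positions when n_bands are projected per frame,
    shifting the pattern by one pixel between frames."""
    frames_needed = PIXELS_Y // n_bands
    return frames_needed / FPS

print(scan_time_seconds(1))   # 40.0 -- one band, as in the text
print(scan_time_seconds(10))  # 4.0  -- ten bands per frame
```

Projecting more bands per frame divides the number of required frames, which is exactly why the method projects several non-crossing lines at once.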
(18) The ambiguity must be resolved when several bands are projected. For this purpose the following terms are introduced: T, the interval between the bands, and Tz, the measured volume, usually defined by the focal depth of the lenses used in the projection unit and camera 2. Focal depth Tz is the distance along the Z axis within which a sufficiently contrasting image of the projected bands can be observed, i.e. within which it can be seen where each band starts and finishes. Focal depth Tz can be taken as the reference value of the camera lens.
(19) Focal depth Tz of the camera lens for each specific case can be determined, for instance, as follows: Tz = 2DC/(f/s)²
(20) where: D is the camera lens aperture (m²), C is the camera pixel size (m), f is the camera lens focal distance (m), s is the distance from the projection unit to the intersection point of the central beams of the projection unit and the camera (m).
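A minimal sketch of the focal-depth formula above; the parameter names D, C, f, s follow the definitions in the text, while the numeric values are purely illustrative:

```python
def focal_depth(D: float, C: float, f: float, s: float) -> float:
    """Focal depth Tz = 2*D*C / (f/s)**2, per the formula in the text.

    D -- camera lens aperture (m^2)
    C -- camera pixel size (m)
    f -- camera lens focal distance (m)
    s -- distance from the projection unit to the beam intersection (m)
    """
    return 2 * D * C / (f / s) ** 2

# Illustrative (assumed) values: 16 mm lens, 5 um pixels, 0.5 m working distance.
Tz = focal_depth(D=0.01, C=5e-6, f=0.016, s=0.5)
print(Tz)  # about 9.77e-05 m, i.e. roughly 0.1 mm in this toy example
```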
(21) In the image from camera 2 a projected band usually has a width of (takes up) several pixels of the CCD array of camera 2. Because the bands can be defocused by the lens, or because the object may scatter light on reflection, the bands have no clearly defined Y coordinate.
(22) The subpixel determination algorithm is used to determine the Y coordinate. It consists of the following:
(23) Projection unit 1 projects the image of parallel bands, as shown in the drawings.
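For illustration, subpixel band-center estimation can be sketched with a parabolic fit around the brightest pixel of a band's intensity profile. Note this is a common stand-in technique, not the patent's exact algorithm (which builds a sinusoidal representation from minimum- and maximum-brightness projections); all names below are assumptions:

```python
def subpixel_center(profile):
    """Estimate a band's center along Y with subpixel precision.

    profile -- list of pixel intensities across the band (one column).
    Fits a parabola through the brightest pixel and its two neighbours;
    the parabola's vertex gives the fractional peak position.
    """
    i = max(range(len(profile)), key=profile.__getitem__)
    if i == 0 or i == len(profile) - 1:
        return float(i)  # peak at the edge: no neighbours to fit
    a, b, c = profile[i - 1], profile[i], profile[i + 1]
    denom = a - 2 * b + c
    if denom == 0:
        return float(i)  # flat top: fall back to the pixel index
    return i + 0.5 * (a - c) / denom

# A symmetric profile peaks exactly on pixel 3:
print(subpixel_center([0, 1, 4, 9, 4, 1, 0]))  # 3.0
```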
(24) The following options are available for resolving the ambiguity when several lines are projected simultaneously:
(25) A conclusion can be made based on the drawings.
(27) It can be understood from these drawings that the relationship tan α = T/Tz holds between angle α, interval T and measurement area Tz, and that the relationship tan α = Y/Z holds between Y, Z and angle α.
(28) It is obvious that the greater the angle α, the larger the shift of the band Y observed in the image from camera 2 (the band being projected as line 19 in the camera image), which enables the Z coordinate to be determined with greater accuracy, i.e. the system has greater sensitivity to measurements along the Z axis. However, the greater the angle, the smaller the domain of determinacy Tz. This is obvious if the Tz values in the drawings are compared.
(29) With the minimum value of the triangulation angle the camera clearly perceives the projected line and longitudinal coordinate Y, but the perception accuracy of vertical coordinate Z is at its minimum. With the greatest value of the band triangulation angle the bands in the image begin merging, and it is difficult to determine longitudinal coordinate Y, but the perception accuracy of vertical coordinate Z is at its maximum. This stipulates the use of at least two cameras installed at different triangulation angles.
(30) The device is shown in the drawings.
(31) To solve this problem, it is suggested to use semitransparent mirror 34 or a prism in the path of the beams of camera 22 and the projection system, which makes it possible to space the camera and the projection unit further apart.
(32) The second solution for placing cameras as close to the projection unit as possible:
(33) Place cameras 22 and 23 on the right and left of projection unit 1.
(34) The third method is shown in the drawings.
(35) Generally, the method for 3D measurement of an object with structured backlighting is implemented as follows. Using projection unit 1, a predetermined image with at least two non-crossing lines along one of its longitudinal axes is projected onto the controlled object. The light of projection unit 1 reflected from the object is recorded with at least two cameras located at different distances from the projection unit thus forming different triangulation angles between the central beam of the projection unit and the central beams of the cameras. In the image from the first camera 2 the longitudinal coordinates of the line centers are determined as the brightest pixels.
(36) Then each line projected by projection unit 1 and formed by the reflected light received by each camera is identified by comparing the coordinates of the lines perceived by the cameras. For this purpose the triangulation angle α₁ between the central beam of projection unit 1 and the central beam of the first camera 22, placed at the minimum distance from projection unit 1 and at the minimum angle α₁, is chosen and set equal to the arctangent of the ratio of the distance between the projected bands to the focal depth Tz of this camera lens.
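The angle-selection rule above, α₁ = arctan(T/Tz), can be sketched directly. The numeric values are illustrative, not from the patent:

```python
import math

def max_unambiguous_angle(T: float, Tz: float) -> float:
    """First-camera triangulation angle (radians) that keeps band
    identification unambiguous: alpha1 = arctan(T / Tz).

    T  -- interval between the projected bands (m)
    Tz -- focal depth / measured volume along the Z axis (m)
    """
    return math.atan(T / Tz)

# Illustrative values: 2 mm band interval, 40 mm focal depth.
alpha1 = max_unambiguous_angle(0.002, 0.040)
print(round(math.degrees(alpha1), 2))  # 2.86 (degrees)
```

A smaller angle than this keeps each band's possible Y positions within one interval T, so bands cannot be confused with their neighbours.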
(37) Such conditions imposed on the relative position of projection unit 1 and camera 22 provide for the maximum unambiguity in identifying each projected band. Interval T is shown in the drawings.
(38) The longitudinal coordinates of the line centers are determined in the image from the first camera, and the vertical coordinates are determined as the quotient of longitudinal coordinate Y1 divided by the tangent of the triangulation angle between the central beam of the projection unit and the central beam of the first camera.
(39) Using the line center search algorithm (the subpixel determination algorithm) and the relationship Z = Y1/tan α₁ (where Y1 is the coordinate in the image from the first camera), the Z coordinates of all the projected bands are calculated with a certain error, which mainly depends on the triangulation angle α₁, the number of pixels in the CCD array of the camera, and the pixel noise of the selected camera.
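The coarse depth estimate from the first camera is a one-line computation; the helper name and the sample numbers below are illustrative:

```python
import math

def z_from_first_camera(y1: float, alpha1: float) -> float:
    """Coarse vertical coordinate from the first camera: Z = Y1 / tan(alpha1).

    y1     -- longitudinal coordinate of the line center in the first image (m)
    alpha1 -- triangulation angle of the first camera (radians)
    """
    return y1 / math.tan(alpha1)

# Illustrative: a 1 mm shift observed at a 5-degree triangulation angle.
z = z_from_first_camera(0.001, math.radians(5))
print(round(z, 5))  # 0.01143 (m)
```

Because α₁ is small, a given pixel error in Y1 translates into a large error in Z, which is why this first estimate is only coarse and is refined by the cameras at larger angles.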
(40) The line image width error (starting with the second camera) shall not exceed T/cos α₂.
(41) To adjust the vertical coordinate Z, its value obtained with the second camera, located at a greater triangulation angle α₂ than that of the first camera, is used. For this purpose the position of the same lines is identified in the second camera image as the lines closest to the longitudinal coordinates calculated as the product of the above vertical coordinate Z, determined using the first camera, and the tangent of the second camera triangulation angle. Thus, to adjust the Z coordinate of the projected bands, the second camera 23, located at a greater triangulation angle α₂ to the projection unit (α₂ > α₁), is used. Bands 20 and 21 projected by projection unit 1 appear in the image from the second camera 23 as 26 and 27. For clarity, bands 26 and 27 are represented with a slight shift, whereas in fact they merge in the image from the second camera and are hard to identify. But if the Z coordinate obtained earlier from the formula Z = Y1/tan α₁ for band 20 is projected according to the formula Y2 = Z·tan α₂ onto the image from camera 23, noise curve 28 becomes visible, which helps identify the position of band 20 in the image from camera 23. The same procedure shall be followed for each band to differentiate it from the others. The center of each line has to be re-determined with adjustment based on the image from camera 23, and the new, more accurate Z coordinate calculated. Angle α₂ is chosen so that the error does not exceed T/cos α₂.
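The adjustment step just described can be sketched as follows. This is a simplified sketch of the procedure in the text, not the patent's exact implementation; the function and variable names are assumptions:

```python
import math

def refine_z(z_coarse: float, alpha2: float, line_centers_cam2) -> float:
    """One adjustment step using the second camera:

    1) project the coarse Z into the second image: Y2 = Z * tan(alpha2);
    2) take the detected line center closest to that prediction;
    3) recompute Z from the matched center: Z = Y2_detected / tan(alpha2).

    z_coarse         -- Z estimate from the first camera (m)
    alpha2           -- second camera triangulation angle (radians), alpha2 > alpha1
    line_centers_cam2 -- detected line-center Y coordinates in the second image (m)
    """
    y2_pred = z_coarse * math.tan(alpha2)
    y2_detected = min(line_centers_cam2, key=lambda y: abs(y - y2_pred))
    return y2_detected / math.tan(alpha2)

# Illustrative: coarse Z = 10 mm, second camera at 20 degrees,
# three candidate line centers detected in its image.
z_new = refine_z(0.010, math.radians(20), [0.00360, 0.00380, 0.00420])
print(round(z_new, 6))  # 0.009891 (m)
```

The prediction resolves which of the merged lines is band 20; the larger angle α₂ then gives a more sensitive, and hence more accurate, Z value.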
(42) Then, similarly to the described procedure for determining coordinates using the first camera, the second camera is used to determine the adjusted values of the longitudinal and vertical coordinates for these lines.
(43) The vertical coordinate value obtained using the third, fourth and subsequent cameras is used for further adjustment of the vertical coordinate. For further adjustment of the Z coordinates of the projected bands, additional cameras with larger triangulation angles can be used to achieve the required accuracy of determining the bands' Z coordinates. Each subsequent camera with a larger triangulation angle shall meet the conditions provided above for cameras with a small triangulation angle. In some cases, at least two cameras are located on different sides of the projection unit, but the images and triangulation angles of all cameras have to be located on one side of the central beam of the projection unit, which can be ensured using a semitransparent mirror positioned across the central beams of the projection unit and, preferably, of the first camera, as shown in the drawings.
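The cascade over successively larger triangulation angles can be sketched as a simple loop. This is a hypothetical sketch under the assumption that each camera contributes a list of detected line centers; names and data structures are illustrative:

```python
import math

def refine_with_cameras(z_initial: float, cameras) -> float:
    """Cascade the single-step adjustment over all extra cameras,
    sorted by increasing triangulation angle.

    z_initial -- coarse Z from the first (smallest-angle) camera (m)
    cameras   -- iterable of (alpha, line_centers) pairs, one per camera:
                 alpha in radians, line_centers in metres.
    """
    z = z_initial
    for alpha, centers in sorted(cameras, key=lambda cam: cam[0]):
        y_pred = z * math.tan(alpha)                      # predict position
        y_det = min(centers, key=lambda y: abs(y - y_pred))  # match nearest line
        z = y_det / math.tan(alpha)                       # recompute Z
    return z

# Illustrative: two refining cameras at 20 and 40 degrees.
z_final = refine_with_cameras(
    0.010,
    [(math.radians(20), [0.0036, 0.0042]),
     (math.radians(40), [0.0083, 0.0100])],
)
```

Each pass uses the previous, coarser Z only to disambiguate which line is which; the final accuracy comes from the largest-angle camera.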
(44) Coordinates are measured and determined using a computer processor, and a 3D image is output to the computer display.
(45) The technical result consists in simplifying and fully automating the process of controlling the linear dimensions of three-dimensional objects, reducing the duration of the measurement process, and nearly completely eliminating errors caused by mechanical oscillations of the equipment (projection unit and cameras) relative to the measurement object, since the projection unit and the cameras are implemented as a portable tool in a single housing.
INDUSTRIAL APPLICABILITY
(46) This invention is implemented with general-purpose equipment widely used in the industry.