SYSTEM FOR DETERMINING THREE-DIMENSIONAL IMAGES
20210215476 · 2021-07-15
CPC classification: G01B11/2545 · G06T7/521 · G01B11/2513 (all under PHYSICS)
Abstract
The invention concerns a method of determination of a three-dimensional image of an object (Board), including the projection of a plurality of first images onto the object, each first projected image including first light patterns spaced apart by a first period; the acquisition, for each first projected image, of a first two-dimensional image of the object; the projection of a plurality of second images onto the object, each second projected image including second light patterns spaced apart by a second period different from the first period; the acquisition, for each second projected image, of a second two-dimensional image of the object; and the detection of a translucent area of the object by comparison of first signals obtained from the first images and of second signals obtained from the second images, and, for the translucent area, the determination of the height of each point of the translucent area based on the first and second signals.
Claims
1. A method of determining a 3D image of an object, comprising: the projection by at least one projector of a plurality of first images onto the object, each first projected image comprising first light patterns spaced apart by a first period; the acquisition, for each first projected image, of at least one first two-dimensional image of the object by at least one image sensor; the projection by said at least one projector of a plurality of second images onto the object, each second projected image comprising second light patterns spaced apart by a second period different from the first period; the acquisition, for each second projected image, of at least one second two-dimensional image of the object by said at least one image sensor; and the determination of a first height, or of first intermediate data from which the first height can be determined, for each point of the object based on the first two-dimensional images, the determination of a second height, or of second intermediate data from which the second height can be determined, for each point of the object based on the second two-dimensional images, the detection of at least one translucent area of the object by comparison of the first and second heights or of first and second intermediate data and, for each point of the translucent area, the determination of a third height for said point based on the first and second heights for said point and on the first and second periods or of third intermediate data from which the third height is determined for said point based on the first and second intermediate data for said point and on the first and second periods.
2. The method according to claim 1, further comprising: the projection by said at least one projector of a plurality of third images onto the object, each third projected image comprising third light patterns spaced apart by a third period different from the first period and different from the second period; the acquisition, for each third projected image, of at least one third two-dimensional image of the object by said at least one image sensor; and the determination, for the translucent area, of the height of each point of the translucent area based on the first and second signals and on third signals obtained from the third images.
3. The method according to claim 1, wherein the first patterns are periodic along a given direction, with a period equal to the first period in the range from 1 mm to 15 mm.
4. The method according to claim 3, wherein the first light patterns comprise first light fringes.
5. The method according to claim 3, wherein the second patterns are periodic along the given direction, with a period equal to the second period in the range from 1 mm to 15 mm.
6. The method according to claim 5, wherein the second light patterns comprise second light fringes.
7. The method according to claim 6, wherein the first fringes are straight and parallel and wherein the second fringes are straight and parallel.
8. The method according to claim 1, wherein the first patterns are not periodic, the first period corresponding to the average interval between the first patterns.
9. The method according to claim 1, wherein the first light patterns are phase-shifted from a first projected image to the next one and wherein the second light patterns are phase-shifted from a second projected image to the next one.
10. A system for determining three-dimensional images of an object, comprising: at least one projector configured to project a plurality of first images onto the object, each first projected image comprising first light patterns spaced apart by a first period, and a plurality of second images onto the object, each second projected image comprising second light patterns spaced apart by a second period different from the first period; at least one image sensor configured to acquire, for each first projected image, at least one first two-dimensional image of the object and, for each second projected image, at least one second two-dimensional image of the object; and a unit (16) configured to determine a first height, or first intermediate data from which the first height can be determined, for each point of the object based on the first two-dimensional images, to determine a second height, or second intermediate data from which the second height can be determined, for each point of the object based on the second two-dimensional images, to detect at least one translucent area of the object by comparison of the first and second heights or of the first and second intermediate data and, for each point of the translucent area, to determine a third height for said point based on the first and second heights for said point and on the first and second periods or third intermediate data from which the third height is determined for said point based on the first and second intermediate data for said point and on the first and second periods.
11. The system according to claim 10, comprising a unit for supplying digital images and wherein the projector is capable of projecting said plurality of images onto the object, each of said images being formed by the projector based on one of said digital images.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] The foregoing and other features and advantages will be discussed in detail in the following non-limiting description of specific embodiments in connection with the accompanying drawings.
DETAILED DESCRIPTION OF THE PRESENT EMBODIMENTS
[0033] For clarity, the same elements have been designated with the same reference numerals in the various drawings and, further, the various drawings are not to scale. Unless otherwise specified, the expressions "about", "approximately", and "substantially" mean to within 10%, preferably to within 5%. Further, only those elements which are useful to the understanding of the present description have been shown and will be described.
[0034] In the following description, embodiments will be described in the case of the optical inspection of electronic circuits. However, these embodiments may apply to the determination of three-dimensional images of all types of objects, particularly for the optical inspection of mechanical parts. Let (OX) and (OY) be two perpendicular directions. As an example, direction (OX) is horizontal.
[0036] Electronic circuit Board is placed on a conveyor 12, for example, a planar conveyor. Conveyor 12 is capable of displacing circuit Board parallel to direction (OY). As an example, conveyor 12 may comprise an assembly of straps and of rollers driven by a rotating electric motor 14. As a variation, conveyor 12 may comprise a linear motor displacing a carriage supporting electronic circuit Board. Circuit Board for example corresponds to a rectangular card having a length and a width varying from 50 mm to 550 mm.
[0037] Optical inspection installation 10 comprises a system 15 for determining a 3D image of electronic circuit Board. According to an embodiment, system 15 is capable of determining a 3D image of circuit Board by projection of images, for example, fringes, onto the circuit to be inspected. System 15 may comprise an image projection device P comprising at least one projector, a single projector P being shown in
[0038] System 15 further comprises an image acquisition device C comprising at least one camera, for example, a digital camera. As an example, two cameras C are shown in
[0039] The means for controlling conveyor 12, camera C, and projector P of previously-described optical inspection installation 10 are within the abilities of those skilled in the art and are not described in further detail. As a variant, the displacement direction of circuit Board may be a horizontal direction perpendicular to the direction (OY) shown in
[0040] System 15 is capable of determining a 3D image of circuit Board. A 3D image of circuit Board corresponds to a cloud of points, for example, of several million points, of at least a portion of the external surface of circuit Board, where each point of the surface is located by its coordinates (x, y, z) determined with respect to a three-dimensional space reference system R.sub.REF (OX, OY, OZ). In the following description, plane (OX, OY) is called reference plane Pl.sub.REF. The z coordinate of a point of the surface of the object then corresponds to the height of the point measured with respect to reference plane Pl.sub.REF. As an example, reference plane Pl.sub.REF corresponds to the plane containing the upper surface or the lower surface of the printed circuit. Plane Pl.sub.REF may be horizontal. Preferably, direction (OZ) is perpendicular to plane (OX, OY), that is, perpendicular to the upper or lower surface of the printed circuit.
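The point-cloud representation described above can be sketched in code: a height map z(x, y) sampled on a regular grid of plane Pl.sub.REF becomes an array of (x, y, z) points in reference frame R.sub.REF. This is a minimal illustration with assumed grid spacings dx and dy, not part of the patent's implementation.

```python
import numpy as np

def height_map_to_point_cloud(z, dx=1.0, dy=1.0):
    """Convert a 2D height map z[j, i] (heights measured from plane Pl_REF)
    into an (N, 3) array of (x, y, z) points in reference frame R_REF.
    dx, dy: grid spacing along (OX) and (OY) (assumed parameters)."""
    nj, ni = z.shape
    # Column index maps to x along (OX), row index to y along (OY).
    x, y = np.meshgrid(np.arange(ni) * dx, np.arange(nj) * dy)
    return np.column_stack([x.ravel(), y.ravel(), z.ravel()])

# A flat 2x3 sample grid with 0.5 mm spacing yields 6 points at height 0.
cloud = height_map_to_point_cloud(np.zeros((2, 3)), dx=0.5, dy=0.5)
```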
[0044] The inventors have shown the existence of a dependency relationship between the error E which occurs during the determination of the points of the 3D image belonging to a translucent portion of the circuit and the period T of the images projected onto the circuit for the determination of the 3D image.
[0048] The inventors have carried out many tests and have shown that for the translucent materials used in electronics and microelectronics, there is a relation close to proportionality between bias E and period T of the light patterns projected onto the object to be inspected.
[0049] Further, the inventors have shown through many tests that a relation close to proportionality between bias E and period T of the light patterns projected onto the object to be inspected is obtained whatever the type of periodic patterns used.
[0051] According to an embodiment, each image projected for the determination of a 3D image comprises periodic patterns along a preferred direction. In particular, when patterns correspond to periodic fringes, the period of the patterns corresponds to the distance between two successive fringes. In the examples shown in
[0052] Further, the inventors have shown by many tests that a relation close to proportionality between bias E and period T of the light patterns projected onto the object to be inspected is also obtained, even when the projected light patterns do not have a periodic character but comprise spaced apart light patterns, the average space between adjacent light patterns, possibly along a preferred direction, then corresponding to the previously-described period T.
[0055] At step 30, first images are projected onto the object to be inspected, each first image comprising light patterns having a first period T1. Period T1 may be in the range from 1 mm to 15 mm. In the present embodiment of a method of determining a 3D image, at step 30, a plurality of first images are successively projected onto circuit Board. The first images differ from one another by an offset of the patterns along a preferred direction. As an example, for the image 24 shown in
[0056] According to an embodiment, processing unit 16 comprises a unit for determining a digital image and projector P is capable of projecting an image obtained from the digital image. According to an embodiment, projector P is of the type comprising a lamp emitting a beam which is directed towards an optical motor. The optical motor modulates the beam, according to the digital image, to form an image which is projected onto circuit Board. The optical motor may comprise an active area. As an example, the optical motor may comprise an array of liquid crystal (LCD) shutters operating by transmission, the light beam crossing the shutter array. As a variant, the optical motor may implement DLP (digital light processing) technology, which relies on the use of a device comprising an array of adjustable micro-mirrors, the light beam reflecting on the mirrors. As a variant, the optical motor may implement LCoS (liquid crystal on silicon) technology, which relies on the use of a liquid crystal device, the light beam reflecting on the device. According to another variant, the optical motor may implement GLV (grating light valve) technology, which relies on the use of a dynamically adjustable diffraction grating based on reflecting ribbons. According to another embodiment, projector P may implement at least one laser beam which is modulated according to the digital image, the image being obtained by raster scanning of the modulated laser beam.
[0057] Advantageously, when projector P is capable of projecting an image obtained from a digital image, the projected images may be simply obtained by modifying the digital image which controls projector P.
[0058] At step 32, second images are projected onto the object to be inspected, each second image comprising the same type of light patterns as the first images but with a second period T2 different from first period T1. Period T2 may be in the range from 1 mm to 15 mm. In the present embodiment of a method of determining a 3D image, at step 32, a plurality of second images with the patterns having the second period are successively projected onto circuit Board. The second images differ from one another by an offset of the patterns having the second period along a preferred direction. A 2D image is acquired during the projection of each new second image with light patterns onto circuit Board.
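Steps 30 and 32 can be illustrated with a sketch that generates the successive phase-shifted sinusoidal fringe images for a given period T. The function name and resolution parameters are assumptions, and a real projector would map millimeters to pixels through its calibration rather than through the simple linear mapping used here.

```python
import numpy as np

def fringe_images(width_mm, height_px, width_px, period_mm, n_images):
    """Generate n_images sinusoidal fringe patterns with the given spatial
    period (in mm on the projection plane), phase-shifted by 2*pi/n_images
    from one image to the next, as grayscale arrays with values in [0, 1]."""
    x_mm = np.linspace(0.0, width_mm, width_px)   # abscissa of each pixel column
    images = []
    for d in range(n_images):
        phase = 2.0 * np.pi * x_mm / period_mm + 2.0 * np.pi * d / n_images
        row = 0.5 + 0.5 * np.cos(phase)           # one row of the fringe profile
        images.append(np.tile(row, (height_px, 1)))  # fringes parallel to (OY)
    return images

# First-image series: period T1 = 5 mm, four phase-shifted patterns.
imgs = fringe_images(width_mm=100.0, height_px=4, width_px=200,
                     period_mm=5.0, n_images=4)
```

The same call with a different `period_mm` produces the second-image series of step 32.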
[0059] Generally, the larger the period T, the greater the reconstruction depth, that is, the size of the height interval over which the 3D image may be determined by the method. Thereby, at least one of periods T1 and T2 is selected to obtain the desired reconstruction depth.
[0060] Step 32 may be repeated once or more than once with different periods.
[0061] At step 34, processing unit 16 determines a corrected 3D image of circuit Board.
[0062] According to an embodiment, processing unit 16 determines a first 3D image from the images acquired at step 30 and a second 3D image from the images acquired at step 32. Processing unit 16 then compares the first and second 3D images, for example, by determining, for each point of the 3D image, the difference between the height Z1 of the first 3D image and the height Z2 of the second 3D image. For the opaque portions of circuit Board, the difference between heights Z1 and Z2 is substantially null, for example, smaller than a given threshold. For the translucent portions of circuit Board, the difference between heights Z1 and Z2 is not null, for example, greater than a given threshold. Processing unit 16 thus determines the translucent portions of circuit Board. For each point of the translucent portions, processing unit 16 may determine the real height Z, for example, by extrapolation, from heights Z1 and Z2 and periods T1 and T2, considering that the relation between the height and the period is substantially linear.
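The detection and extrapolation just described can be sketched as follows, using the near-proportionality between bias E and period T stated in paragraph [0048]: writing Z1 = Z + k·T1 and Z2 = Z + k·T2, the true height Z is recovered by extrapolating the measurements to period T = 0. The function name and the opacity threshold are illustrative, not the patent's.

```python
def corrected_height(z1, z2, t1, t2, threshold=0.05):
    """Given heights z1, z2 measured for one point with projected periods
    t1, t2 (t1 != t2), return (height, is_translucent).
    If |z1 - z2| is below threshold the point is treated as opaque and z1
    is kept; otherwise the point is flagged translucent and the true height
    is extrapolated to T = 0, assuming the bias E is proportional to the
    projected period T (z1 = z + k*t1, z2 = z + k*t2)."""
    if abs(z1 - z2) < threshold:
        return z1, False                # opaque point: both measurements agree
    k = (z2 - z1) / (t2 - t1)           # slope of the bias, E = k*T
    return z1 - k * t1, True            # translucent point: extrapolated height

# Example: true height 1.0, bias slope k = 0.1 -> z1 = 1.4 (T1 = 4 mm),
# z2 = 1.8 (T2 = 8 mm); the extrapolation recovers 1.0.
z, translucent = corrected_height(1.4, 1.8, 4.0, 8.0)
```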
[0063] Another embodiment of determination of a corrected 3D image will now be described. In this embodiment, the determination of the presence of translucent portions is carried out before the end of the method of determination of the first and second 3D images, which would normally be obtained with the first images and the second images, based on first intermediate data used for the determination of the first 3D image and on second intermediate data used for the determination of the second 3D image. As an example, the difference between the first intermediate data and the second intermediate data is determined. For the opaque portions of circuit Board, the difference between the first and second intermediate data is substantially null, for example, smaller than a given threshold. For the translucent portions of circuit Board, the difference between the first and second intermediate data is not null, for example, greater than a given threshold. Processing unit 16 thus determines the translucent portions of circuit Board. A determination of intermediate data corrected for the translucent portions is then performed and a 3D image corrected for the translucent portions is directly determined from the corrected intermediate data.
[0064] A more detailed embodiment will now be described for a specific example of a method of determining a 3D image.
[0065] Each point Q.sub.i of the scene has a corresponding point .sup.Cq.sub.i in the image plane of camera C and a corresponding point .sup.Pq.sub.i in the image plane of projector P. A reference frame R.sub.C(O.sub.C, X, Y, Z) associated with camera C is considered, where O.sub.C is the optical center of camera C, direction Z is parallel to the optical axis of camera C, and directions X and Y are perpendicular to each other and perpendicular to direction Z. In reference frame R.sub.C, to simplify the following description, it can approximately be considered that point .sup.Cq.sub.i has coordinates (.sup.Cu.sub.i, .sup.Cv.sub.i, f.sub.C), where f.sub.C is the focal distance of camera C. A reference frame R.sub.P(O.sub.P, X, Y, Z) associated with projector P is considered, where O.sub.P is the optical center of projector P, direction Z is parallel to the optical axis of projector P, and directions X and Y are perpendicular to each other and perpendicular to direction Z. In reference frame R.sub.P, to simplify the following description, it can be approximately considered that point .sup.Pq.sub.i has coordinates (.sup.Pu.sub.i, .sup.Pv.sub.i, f.sub.P), where f.sub.P is the focal distance of projector P.
[0066] Generally, calling P.sub.P the projection matrix of projector P and P.sub.C the projection matrix of camera C, one has the following equation system (1) for each point Q.sub.i, noted in homogeneous coordinates:
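The body of equation system (1) is absent from this text. In the standard homogeneous-coordinate formulation implied by the surrounding definitions, it would take the following general form, where s.sub.C and s.sub.P are scale factors (a reconstruction, not necessarily the patent's exact notation):

```latex
\begin{cases}
s_C \; {}^{C}\!q_i = P_C \, Q_i \\[2pt]
s_P \; {}^{P}\!q_i = P_P \, Q_i
\end{cases}
\tag{1}
```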
[0067] Each point Q.sub.i corresponds to the intersection of a line D.sub.C associated with camera C and of a line D.sub.P associated with projector P.
[0068] Each point .sup.Pq.sub.i of the image projected by projector P is associated with a phase φ.sub.i(z.sub.i). Light intensity I.sup.C(.sup.Cq.sub.i(z.sub.i)), measured by the pixel at point .sup.Cq.sub.i of the image acquired by the camera and corresponding to point Q.sub.i, follows relation (2) hereafter:
I.sup.C(.sup.Cq.sub.i(z.sub.i))=A(z.sub.i)+B(z.sub.i)cos φ.sub.i(z.sub.i)  (2)
[0069] where A(z.sub.i) is the light intensity of the background at point Q.sub.i of the image and B(z.sub.i) represents the amplitude between the minimum and maximum intensities at point Q.sub.i of the projected image.
[0070] According to an example, projector P successively projects N different images onto the circuit, where N is an integer greater than 1, preferably greater than or equal to 4, for example, equal to 8.
[0071] A 2π/N phase shift is applied for each new first or second projected image with respect to the previous first or second projected image. Light intensity I.sub.d.sup.C(.sup.Cq.sub.i(z.sub.i)), measured by the pixel at point .sup.Cq.sub.i for the d-th image acquired by the camera corresponding to point Q.sub.i, follows relation (3) hereafter:
I.sub.d.sup.C(.sup.Cq.sub.i(z.sub.i))=A(z.sub.i)+B(z.sub.i)cos(φ.sub.i(z.sub.i)+2πd/N)  (3)
where d is an integer which varies from 0 to N−1.
[0072] Vector .sub.i.sup.C(z.sub.i) is defined according to relation (4) hereafter:
[0073] It is a linear equation system. It can be demonstrated that phase φ.sub.i(z.sub.i) is given by relation (5) hereafter:
[0074] According to the previously-described embodiment where intermediate data are used for the determination of the translucent portions, phase φ.sub.i(z.sub.i) may correspond to the intermediate data used.
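Relations (3) to (5) describe the classical N-step phase-shifting computation; since their bodies are incomplete in this text, the sketch below implements the textbook least-squares solution of the linear system mentioned in paragraph [0073]. Function and variable names are illustrative.

```python
import math

def retrieve_phase(intensities):
    """Recover phase phi_i from N intensity samples
    I_d = A + B*cos(phi_i + 2*pi*d/N), d = 0..N-1 (cf. relation (3)),
    using the standard least-squares phase-shifting formula."""
    n = len(intensities)
    s = sum(I * math.sin(2.0 * math.pi * d / n) for d, I in enumerate(intensities))
    c = sum(I * math.cos(2.0 * math.pi * d / n) for d, I in enumerate(intensities))
    return math.atan2(-s, c)  # phase of the fringe signal, modulo 2*pi

# Round trip: synthesize N samples with a known phase, then recover it.
phi_true, A, B, N = 1.0, 0.5, 0.3, 8
samples = [A + B * math.cos(phi_true + 2.0 * math.pi * d / N) for d in range(N)]
recovered = retrieve_phase(samples)
```

Note that A and B drop out of the result, which is why the method is robust to background intensity and fringe contrast.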
[0075] A literal expression of height z.sub.i can generally be obtained.
[0076] An example of expression of height z.sub.i will be described in a specific configuration where projector P and camera C are of telecentric type and where the following conditions are fulfilled:
the optical axes of projector P and of camera C are coplanar;
the projected images are of the type shown in
lines D.sub.P are perpendicular to plane Pl.sub.REF and lines D.sub.C form an angle with plane Pl.sub.REF.
[0077] In this configuration, equation system (1) may then be simplified according to the following equation system (6):
[0078] considering that point Q.sub.iREF of coordinates (x.sub.iREF, y.sub.iREF, 0) is the point of reference plane Pl.sub.REF associated with point .sup.Cq.sub.i of camera C.
[0079] In the image plane of projector P, abscissa .sup.Pu.sub.i of point .sup.Pq.sub.i follows, for example, relation (7) hereafter:
.sup.Pu.sub.i=aφ.sub.i(z.sub.i)+b  (7)
[0080] where a and b are real numbers, a being equal to p.sub.1/2π, with p.sub.1 corresponding to the pitch of sinusoidal fringes 25.
[0081] Based on relations (6) and (7), the following relation (8) is obtained:
[0082] where φ.sub.1(Q.sub.iREF) is equal to the phase at point Q.sub.iREF of reference plane Pl.sub.REF, that is, to the phase in the absence of circuit Board.
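The bodies of relations (6) and (8) are absent from this text. Under the stated telecentric configuration, with θ denoting the (unnamed in this text) angle between lines D.sub.C and plane Pl.sub.REF, combining (6) and (7) would yield a height expression of the following general form (a reconstruction under these assumptions, not the patent's exact relation):

```latex
\varphi_i(z_i) - \varphi_1(Q_{i\mathrm{REF}})
  = \frac{2\pi}{p_1}\,\frac{z_i}{\tan\theta}
\;\;\Longrightarrow\;\;
z_i = \frac{p_1 \tan\theta}{2\pi}
      \left(\varphi_i(z_i) - \varphi_1(Q_{i\mathrm{REF}})\right)
```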
[0083] According to the previously-described embodiment where the 3D images are used for the determination of the translucent portions, height z.sub.i may be used.
[0084] Specific embodiments have been described. Various alterations and modifications will occur to those skilled in the art. In particular, although an embodiment has been described where the determination of the 3D image is performed from an algorithm using the camera and the projector, it should be clear that the 3D image determination method may be implemented by a triangulation method using at least two cameras.