DEPTH DATA MEASURING HEAD, MEASURING APPARATUS, AND MEASURING METHOD
20250327661 · 2025-10-23
Inventors
CPC classification
G01B11/2545
PHYSICS
G01B11/245
PHYSICS
International classification
Abstract
Disclosed are a depth data measuring head, a measuring device and a measuring method. The measuring head comprises: a projection device for projecting linear light to an imaging area; and an image sensor comprising N memory cell sets, each memory cell set being exposed with an exposure switch cycle t.sub.e that is phase-shifted by 2π/N relative to the others. The projection device completes a single pattern scan within a scanning cycle, which includes multiple repeating subcycles; in each subcycle, a projection cycle t.sub.p includes N waveform projection areas with a width of 2π/N whose light intensity is encoded based on the imaging pattern. Therefore, when a pattern scan is completed within the scanning cycle, each of the N memory cell sets of the image sensor images a different striped-light pattern, and there is a phase shift of 2π/N between the N striped-light patterns. The present invention can acquire multiple images from a single linear light scan, greatly improving the synthesis speed of the depth map, and is suitable for capturing a moving target object.
Claims
1. A depth data measuring head, comprising: a projection device for projecting linear light that moves in a first direction to an imaging area, wherein the length of the linear light is in a second direction perpendicular to the first direction; an image sensor including N memory cell sets, each memory cell set being exposed with an exposure switch cycle t.sub.e that is phase-shifted by 2π/N relative to the others, where N is an integer greater than 1, wherein the projection device completes a single pattern scan within a scanning cycle, which includes multiple repeating subcycles; in each subcycle, the linear light changes brightness with a projection cycle t.sub.p, which has the same duration as the exposure switch cycle t.sub.e, the projection cycle t.sub.p includes N waveform projection areas, each with a width of 2π/N, where the light intensity in each waveform projection area is encoded such that, when a single pattern scan is completed within the scanning cycle, the N memory cell sets in the image sensor each capture a different striped-light pattern, and the N striped-light patterns form a set of N-step phase-shifted patterns, each phase-shifted by 2π/N relative to the others.
2. The depth data measuring head according to claim 1, wherein the projection cycle t.sub.p of the linear light is synchronized with the exposure switch cycle t.sub.e of the first memory cell set of the N memory cell sets.
3. The depth data measuring head according to claim 1, wherein N=2.sup.n, and n is an integer greater than or equal to 1.
4. The depth data measuring head according to claim 1, wherein each waveform projection area corresponds to a projected rectangular wave with a width of 2π/N or a light intensity of 0, and the light intensity of each waveform projection area is determined based on the light intensity distribution of the set of N-step phase-shifted patterns corresponding to the subphase.
5. The depth data measuring head according to claim 4, wherein the set of N-step phase-shifted patterns is a sine wave four-step phase-shifted pattern, and the light intensity value of each waveform projection area in each projection cycle t.sub.p is obtained based on the exposure of N memory cell sets corresponding to the waveform projection area, wherein the light intensity value is not less than zero.
6. The depth data measuring head according to claim 1, wherein the dwell time t.sub.c corresponding to the linear light scanning over each column of pixels is not less than the quotient of the scanning cycle divided by the number of columns C, and the dwell time t.sub.c is more than 10 times the projection cycle t.sub.p.
7. The depth data measuring head according to claim 5, wherein in each subphase T.sub.i, the linear light is projected m times with the projection cycle t.sub.p, and the duration of each subphase T.sub.i is greater than the dwell time t.sub.c.
8. The depth data measuring head according to claim 1, wherein each pixel in the image sensor includes N memory cells, and each of the N memory cells of a pixel belongs to one of the N memory cell sets.
9. The depth data measuring head according to claim 8, wherein the N memory cells included in each pixel are N charge memory cells, and the exposure positions of the N charge memory cells corresponding to one pixel are the same, but the exposure cycles are phase-shifted by 2π/N from each other, and when a single pattern scan is completed within the scanning cycle, the set of N-step phase-shifted patterns is obtained from the N sets of charge memory cells, and the set of N-step phase-shifted patterns is used to generate one depth map of the imaging area.
10. The depth data measuring head according to claim 1, wherein the image sensor includes N sets of uniformly distributed pixels on the imaging surface, each pixel having one corresponding memory cell, and when a single pattern scan is completed within the scanning cycle, the set of N-step phase-shifted patterns is obtained from the N memory cell sets corresponding to the N pixel sets, and the set of N-step phase-shifted patterns is used to generate one depth map of the imaging area.
11. The depth data measuring head according to claim 1, wherein the projection device completes a single pattern scan in a first scanning cycle such that the N memory cell sets in the image sensor each capture a different striped-light pattern, and the N striped-light patterns form a set of Gray code patterns; the projection device completes another single pattern scan in a second scanning cycle, such that the N memory cell sets each capture a different striped-light pattern, and the N striped-light patterns form the set of N-step phase-shifted patterns, wherein, based on the set of Gray code patterns, one depth map of the imaging area is generated from the set of N-step phase-shifted patterns.
12. The depth data measuring head according to claim 1, wherein the projection device includes: a light-emitting device for generating linear light; and a reflecting device for reflecting the linear light, thereby projecting the linear light moving in a direction perpendicular to the stripe direction to the capturing area at a predetermined frequency, the length direction of the linear light is the length direction of the projected striped light, and the reflecting device includes one of the following: a mechanical galvanometer that vibrates back and forth at the predetermined frequency; a micromirror device that reciprocates at the predetermined frequency; and a mechanical rotating mirror that rotates unidirectionally at the predetermined frequency.
13. The depth data measuring head according to claim 1, wherein the image sensor includes a first image sensor and a second image sensor whose relative positions are fixed, wherein the first image sensor and the second image sensor each include the N memory cell sets and are exposed synchronously with each other.
14. The depth data measuring head according to claim 1, wherein the projection device completes k pattern scans within k scanning cycles, each of the scanning cycles includes multiple repeating subcycles, and each subcycle includes N subphases T.sub.1-T.sub.N; in each subphase T.sub.i, the linear light is projected with brightness changes at the projection cycle t.sub.p, the duration of the projection cycle t.sub.p is the same as the duration of the exposure switch cycle t.sub.e, and the projection cycle t.sub.p includes a bright area, wherein, in the subphases T.sub.1-T.sub.N, the position of the bright area in the projection cycle t.sub.p changes at an interval of 2π/(k×N) phase, so that when a single pattern scan is completed within each scanning cycle, the N memory cell sets of the image sensor each image a different striped-light pattern, and when the k pattern scans are completed within the k scanning cycles, the k×N striped-light patterns constitute a set of k×N-step phase-shifted patterns with a 2π/(k×N) phase shift between each other, wherein k is an integer greater than or equal to 2.
15. The depth data measuring head according to claim 1, wherein each subcycle includes N subphases T.sub.1-T.sub.N; in each subphase T.sub.i, the linear light is projected with brightness changes at the projection cycle t.sub.p, the duration of the projection cycle t.sub.p is the same as the duration of the exposure switch cycle t.sub.e, and the projection cycle t.sub.p includes a bright area, wherein, in the subphases T.sub.1-T.sub.N, the position of the bright area in the projection cycle t.sub.p changes at an interval of 2π/N phase, so that when a single pattern scan is completed within each scanning cycle, the N memory cell sets of the image sensor each image a different striped-light pattern, and the N striped-light patterns constitute the set of N-step phase-shifted patterns with a 2π/N phase shift between each other.
16. A depth data measuring device, comprising: a depth data measuring head, comprising: a projection device for projecting linear light that moves in a first direction to an imaging area, wherein the length of the linear light is in a second direction perpendicular to the first direction; and an image sensor including N memory cell sets, each memory cell set being exposed with an exposure switch cycle t.sub.e that is phase-shifted by 2π/N relative to the others, where N is an integer greater than 1, wherein the projection device completes a single pattern scan within a scanning cycle, which includes multiple repeating subcycles; in each subcycle, the linear light changes brightness with a projection cycle t.sub.p, which has the same duration as the exposure switch cycle t.sub.e, the projection cycle t.sub.p includes N waveform projection areas, each with a width of 2π/N, where the light intensity in each waveform projection area is encoded such that, when a single pattern scan is completed within the scanning cycle, the N memory cell sets in the image sensor each capture a different striped-light pattern, and the N striped-light patterns form a set of N-step phase-shifted patterns, each phase-shifted by 2π/N relative to the others; and a processor connected to the depth data measuring head, configured to obtain one depth map of the imaging area from the N striped-light patterns obtained when a single pattern scan is completed within the scanning cycle.
17. The depth data measuring device according to claim 16, wherein the depth data measuring head includes: a first depth imaging measuring head and a second depth imaging measuring head whose relative positions are fixed, wherein after the first depth imaging measuring head completes a first pattern scan, the second depth imaging measuring head performs a second pattern scan, and a first set of N-step phase-shifted patterns obtained by the first pattern scan and a second set of N-step phase-shifted patterns obtained by the second pattern scan are synthesized into depth information of an object to be imaged in an imaging area based on the relative position.
18. A depth data measuring method, comprising: projecting linear light that moves in a first direction to an imaging area, wherein the length of the linear light is in a second direction perpendicular to the first direction, and the linear light projection completes a single pattern scan in one scanning cycle, which includes multiple repeating subcycles; in each subcycle, the linear light changes brightness with a projection cycle t.sub.p, which has the same duration as an exposure switch cycle t.sub.e, the projection cycle t.sub.p includes N waveform projection areas, each with a width of 2π/N, where the light intensity in each waveform projection area is encoded, and N is an integer greater than 1; capturing the imaging area using an image sensor including N memory cell sets to obtain N image frames under the linear light scanning projection, each memory cell set being exposed with the exposure switch cycle t.sub.e phase-shifted by 2π/N relative to the others; and obtaining depth data of the object to be measured in the imaging area based on the image frames, wherein the light intensity in each waveform projection area is encoded such that, when a single pattern scan is completed within the scanning cycle, the N memory cell sets in the image sensor each capture a different striped-light pattern, and the N striped-light patterns form a set of N-step phase-shifted patterns, each phase-shifted by 2π/N relative to the others.
19. The depth data measuring method according to claim 18, wherein the depth data measuring head includes a first depth imaging measuring head and a second depth imaging measuring head whose relative positions are fixed, and the method includes: using a first depth imaging measuring head of a depth data measuring device to perform a first pattern scan on an imaging area to obtain a first set of N-step phase-shifted patterns; using a second depth imaging measuring head of the depth data measuring device to perform a second pattern scan on the imaging area to obtain a second set of N-step phase-shifted patterns; synthesizing depth information of an object to be imaged in the imaging area from the first set of N-step phase-shifted patterns and the second set of N-step phase-shifted patterns based on the relative positions of the first depth imaging measuring head and the second depth imaging measuring head.
20. The method according to claim 19, further comprising: capturing, by the depth data measuring device in relative motion with respect to the object to be imaged, multiple first sets of N-step phase-shifted patterns and multiple second sets of N-step phase-shifted patterns; and synthesizing the depth information generated from the multiple first sets and the multiple second sets of N-step phase-shifted patterns into model information of the object to be imaged based on a calibration point.
Description
BRIEF DESCRIPTION OF FIGURES
[0023] The above and other objects, features and advantages of the present disclosure will become more apparent by describing the exemplary embodiments of the present disclosure in more detail with reference to the accompanying drawings, wherein, in the exemplary embodiments of the present disclosure, the same reference numerals generally represent same parts.
DETAILED DESCRIPTION
[0045] Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
[0046] According to the principle of structured light measurement, accurate determination of the scanning angle is the key to the entire measurement system. For point- and line-shaped structured light, the scanning angle can be calculated and obtained through mechanical devices such as rotating mirrors, while the purpose of image encoding and decoding is to determine the scanning angle in an encoded structured light (i.e., surface structured light) system.
[0047] To improve matching accuracy, the number of projected patterns in the temporal code can be increased.
[0049] The projection device 310 is used to scan and project structured light with stripe coding to the capturing area. For example, in three consecutive image frame projection cycles, the projection device 310 can successively project three different striped-light patterns.
[0051] In the present disclosure, the direction in which the light exits the measuring head is defined as the z direction, the vertical direction of the shooting plane as the x direction, and the horizontal direction as the y direction. Therefore, the stripe structured light projected by the projection device can be the result of linear light extending in the x direction moving in the y direction. Although in other embodiments the stripe structured light can equally be obtained by linear light extending in the horizontal y direction moving in the x direction and imaged synchronously, vertical striped light is used for illustration in the present disclosure.
[0053] In practical applications, the laser generator is used to generate linear and/or infrared laser light, and the laser generator performs high-speed switching to scan and project light and dark structured light corresponding to the stripe code. High-speed switching can include high-speed switching of laser generators and high-speed code switching.
[0054] In one embodiment, the laser generator can continuously emit laser light with the same intensity, and the projected striped-light pattern is realized by turning the laser generator on and off. In this case, since the laser generator projects light of only one intensity with a varying periodic duty cycle, and each pixel of the image sensor integrates the projected light merely to determine the presence or absence of irradiation, the equipped image sensor can be a black-and-white image sensor.
[0055] In another embodiment, the laser generator itself can emit laser light with varying light intensity, for example, laser light whose output intensity changes sinusoidally over a larger period according to the applied power. Such sinusoidally varying laser light can be combined with striped-light projection, whereby a pattern of alternating light and dark with varying brightness between the bright stripes can be scanned and projected. In this case, the image sensor needs to be capable of imaging different light intensities, so it can be a multi-level grayscale image sensor. Clearly, grayscale projection and imaging can provide more precise pixel-to-pixel matching than black-and-white projection and imaging, thereby improving the accuracy of depth data measurement.
[0056] In one embodiment, the laser generator 411 can be a linear laser generator that generates a line of light extending in the x direction (a direction perpendicular to the plane of the figure).
[0057] It should be understood that, in order to realize the projection of the striped-light pattern, the linear light itself needs to change from bright to dark (or, in a simple implementation, from on to off) during its continuous movement in the y direction.
[0058] In one embodiment, the above-mentioned reflection mechanism 412 can be a micromirror device (also referred to as a digital micromirror device, DMD), and can be implemented as a MEMS (Micro Electro Mechanical System).
[0059] In order to obtain a high-precision depth map, the depth data measuring head described above needs to successively project and image multiple striped-light patterns, which limits the speed at which a depth map can be synthesized.
[0060] In view of this, the present invention proposes a new depth data measurement solution which utilizes an image sensor provided with different memory cell sets capable of phase-shifted exposure. By skillfully adjusting the brightness variation of the projected linear light, the different memory cell sets of the image sensor can each obtain a different phase-shifted stripe image in a single scan of the linear light, thereby capturing multiple striped images in a single linear light scan. As a result, the synthesis speed of the depth map can be greatly improved, making the solution suitable for shooting moving target objects.
[0061] In one embodiment, the present invention can be implemented as a depth imaging measuring head including a projection device and an image sensor. The projection device can be used to project linear light moving along a first direction (e.g., the y direction) to the imaging area, the length direction of the linear light being a second direction (e.g., the x direction) perpendicular to the first direction.
[0062] Each pixel in the image sensor includes N memory cells, each of the N memory cells of a pixel belongs to one of the N memory cell sets, and each memory cell set is exposed with an exposure switch cycle t.sub.e phase-shifted by 2π/N from the others, wherein N is an integer greater than 1.
[0063] Here, for ease of understanding, N=4 is taken as an example to illustrate the structure and exposure of the image sensor.
[0064] The memory cell is, for example, a unit that accumulates and stores the charge generated by a photodiode based on the received light and outputs a signal based on the stored charge amount. The N memory cells corresponding to the same pixel all obtain charge from one photosensitive element, but each is exposed with an exposure switch cycle t.sub.e that is phase-shifted by 2π/N from the others. In each pixel, each memory cell corresponds to one of the N memory cell sets. Thus, in a 600×800-pixel image sensor, assuming N=4, 600×800×4 memory cells are included. These 600×800×4 memory cells belong to 4 sets, each set including 600×800 memory cells, each corresponding to a different pixel. The memory cells of each set can be exposed with the same cycle, with a phase difference of π/2 between adjacent sets. Since the N memory cells share one photosensitive element, multi-image acquisition can be achieved in a single scan without reducing the original resolution of the image sensor.
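As an illustrative sketch (not part of the claimed subject matter), the multi-tap pixel described above can be simulated in software. The code below assumes, as one consistent reading of the text, that each memory cell ("tap") is exposed for half of each cycle and that successive exposure windows start 1/N of a cycle apart; the function name, the half-cycle duty, and the tap-to-pattern labeling are illustrative assumptions.

```python
def simulate_pixel_taps(quarter_intensities, n_taps=4, steps=400):
    """One projection cycle: a single photodiode sees a piecewise-constant
    waveform (one intensity per quarter of the cycle); each of the n_taps
    charge stores integrates it only while its own exposure window is open.
    Assumed: each window spans half the cycle, windows start 1/n_taps apart."""
    taps = [0.0] * n_taps
    dt = 1.0 / steps
    for s in range(steps):
        phase = s / steps                          # position in cycle, 0..1
        light = quarter_intensities[int(phase * n_taps)]
        for j in range(n_taps):
            if (phase - j / n_taps) % 1.0 < 0.5:   # is window j open?
                taps[j] += light * dt
    return taps

# e.g. the t=0 waveform from the text: Q1=A/2, Q2=Q3=0, Q4=A/2 (with A=1)
taps = simulate_pixel_taps([0.5, 0.0, 0.0, 0.5])
```

Each tap accumulates the light of two adjacent waveform projection areas, so one photodiode yields four differently weighted exposures per cycle without loss of spatial resolution.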
[0067] When the image sensor used is capable of performing grouped phase-shifted exposure as described above, the projection of the linear light can be encoded accordingly.
[0068] Specifically, the projection device can complete a pattern scan within a scanning cycle. Within the scanning cycle, it is assumed that the linear light scans the imaging area at a uniform speed over multiple repeating subcycles. A light projection embodiment within each subcycle is described in detail below.
[0069] In each subcycle, the linear light changes its brightness with a projection cycle t.sub.p, and the duration of the projection cycle t.sub.p is the same as the duration of the exposure switch cycle t.sub.e. The projection cycle t.sub.p includes N waveform projection areas with a width of 2π/N, and the projected light intensity of each waveform projection area is encoded so that, when a pattern scan is completed within the scanning cycle, the N memory cell sets of the image sensor each image a different striped-light pattern, and the N striped-light patterns constitute a set of N-step phase-shifted patterns with a 2π/N phase shift between each other.
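The purpose of such an N-step phase-shifted pattern set is phase recovery. As a hedged sketch (assuming the common four-step convention in which the captures follow I1=B+C·cos t, I2=B+C·sin t, I3=B−C·cos t, I4=B−C·sin t; names and the synthetic values below are illustrative), the wrapped scan phase at a pixel can be decoded as follows:

```python
import math

def recover_scan_phase(i1, i2, i3, i4):
    """Standard four-step decode: I1-I3 = 2C*cos t and I2-I4 = 2C*sin t,
    so the wrapped phase is atan2(I2-I4, I1-I3), independent of the
    background level B and the modulation amplitude C."""
    return math.atan2(i2 - i4, i1 - i3)

# synthesize the four captures for a known phase, then decode it back
t_true, B, C = 1.1, 5.0, 2.0
caps = [B + C * math.cos(t_true), B + C * math.sin(t_true),
        B - C * math.cos(t_true), B - C * math.sin(t_true)]
t_rec = recover_scan_phase(*caps)
```

Because the background B cancels in the differences, the decode is robust to ambient light that is common to all four captures.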
[0070] In one embodiment, each waveform projection area corresponds to a projected rectangular wave with a width of 2π/N or a light intensity of 0 (0 can also be regarded as a rectangular wave with a light intensity of 0), and the light intensity of each waveform projection area is determined based on the light intensity distribution of the set of N-step phase-shifted patterns corresponding to the subphase.
[0071] In one embodiment, a set of N-step phase-shifted patterns is a sine wave four-step phase-shifted pattern, and the light intensity value of each waveform projection area in each projection cycle t.sub.p is obtained based on the exposure of N memory cell sets corresponding to the waveform projection area, wherein the light intensity value is not less than zero.
[0072] For the convenience of explanation, it is assumed here that N=4, and that the laser intensities of the four waveform projection areas within each linear light projection cycle t.sub.p are Q.sub.1/Q.sub.2/Q.sub.3/Q.sub.4 respectively. Since each memory cell set is exposed for half of each exposure switch cycle and the exposure windows of the four sets are phase-shifted by π/2 from each other, each set integrates the light of two adjacent waveform projection areas. Therefore, within a subcycle, the integrated brightness of the four memory cell sets per projection cycle is as follows:

I.sub.1=Q.sub.4+Q.sub.1; I.sub.2=Q.sub.1+Q.sub.2; I.sub.3=Q.sub.2+Q.sub.3; I.sub.4=Q.sub.3+Q.sub.4 (1)

[0073] Therefore, the values of Q.sub.1~Q.sub.4 can be determined according to the pattern type corresponding to the required 4-step phase-shifted imaging.
[0074] In the example of the sine-wave four-step phase-shifted pattern, the total integrated brightness that the four memory cell sets should accumulate at scanning position t satisfies:

O*I.sub.1=(Q/2)(1+cos t); O*I.sub.2=(Q/2)(1+sin t); O*I.sub.3=(Q/2)(1-cos t); O*I.sub.4=(Q/2)(1-sin t) (2)

Here, Q can be regarded as the integrated brightness of the brightest part of the phase-shifted stripes. When the linear light is scanned at a uniform speed, the value of t corresponds to pixels at different positions in the image sensor.
[0075] The brightness values of Q.sub.1~Q.sub.4 with respect to t can be inversely calculated through formula (2). Since there are N=4 unknowns Q.sub.1~Q.sub.4 in formula (2), and the rank of formula (2) is N-1=3, Q.sub.1~Q.sub.4 actually have countless solutions. One solution is given as follows:

Q.sub.1=(A/2)(1+sin t); Q.sub.2=0; Q.sub.3=(A/2)(1-cos t); Q.sub.4=(A/2)(cos t-sin t) (3)

Where O*A=Q, and O is the number of exposure cycles received by each pixel while it is scanned by the linear light (for example, in the following example, each pixel column completes 100 exposures during the 2 us in which it is scanned by the linear light, so O can be regarded as 100).

[0076] However, since the light intensity cannot be negative, the values of Q.sub.1~Q.sub.4 need to be kept non-negative. Formula (3) is valid in the range of t=0~π/4; when the linear light is scanned into the range of t=π/4~π/2, another solution can be given based on formula (2):

Q.sub.1=(A/2)(1+cos t); Q.sub.2=(A/2)(sin t-cos t); Q.sub.3=(A/2)(1-sin t); Q.sub.4=0 (4)
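The piecewise solutions referred to as formulas (3) and (4) can be checked numerically. The closed forms below are a reconstruction consistent with the worked values quoted later in the text (t=0, t=π/3, t=π/2); since the system is underdetermined, other non-negative solutions exist, and the function name is illustrative.

```python
import math

def waveform_intensities(t, A=1.0):
    """Non-negative piecewise solution for the four rectangular-wave
    intensities Q1..Q4 within one projection cycle.
    For sin t <= cos t (e.g. t in [0, pi/4]) take Q2 = 0;
    for sin t >  cos t (e.g. t in (pi/4, 3pi/4]) take Q4 = 0."""
    s, c = math.sin(t), math.cos(t)
    if s <= c:
        return [(A / 2) * (1 + s), 0.0, (A / 2) * (1 - c), (A / 2) * (c - s)]
    return [(A / 2) * (1 + c), (A / 2) * (s - c), (A / 2) * (1 - s), 0.0]

q0 = waveform_intensities(0.0)          # text quotes [A/2, 0, 0, A/2]
q3 = waveform_intensities(math.pi / 3)  # text quotes [3A/4, 0.183A, 0.067A, 0]
q2 = waveform_intensities(math.pi / 2)  # text quotes [A/2, A/2, 0, 0]
```

Note that the branch switch at t=π/4 is exactly where cos t=sin t, so the two branches agree there and the projected waveform varies continuously.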
[0079] Assuming that the scanning cycle is 3.84 ms, the image sensor includes 1920 pixel columns, and the linear light projection cycle t.sub.p is 20 ns, the time each pixel is scanned by the linear light, that is, the dwell time t.sub.c, can be equal to the quotient of the scanning cycle divided by the number of columns C, that is, 3.84 ms/1920=2 us; or, considering that the linear light has a certain width, the dwell time t.sub.c is not less than 2 us. Each pixel column can therefore complete 100 exposures during the 2 us in which it is scanned by the linear light. In this case, O=100.
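The timing arithmetic above can be sketched as a quick sanity check (the variable names are illustrative; the numbers are those quoted in the text):

```python
# Timing check for the quoted example: 3.84 ms scan, 1920 pixel columns,
# 20 ns projection / exposure-switch cycle.
scan_cycle_s = 3.84e-3
columns = 1920
t_p_s = 20e-9

t_dwell_s = scan_cycle_s / columns        # dwell time per column: 2 us
exposures_per_column = t_dwell_s / t_p_s  # O in the text: 100
```

This also satisfies the claim-6 condition that the dwell time exceed 10 projection cycles, here by a factor of 100.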
[0080] In one embodiment, for the first 100 projection cycles t.sub.p at the beginning of the current subcycle, the linear light can be made to maintain the waveform Q.sub.1=A/2, Q.sub.2=Q.sub.3=0, Q.sub.4=A/2 corresponding to t=0 (here A=Q/O=Q/100). In a preferred embodiment, based on the slight change of t, the values of Q.sub.1~Q.sub.4 can be fine-tuned in each projection cycle t.sub.p, for example using the solution values of formula (2).
[0081] As mentioned above, the subcycle is repeated a predetermined number of times, for example, M times, whereby a scanning cycle is completed. In one example, the subcycle is repeated 16 times, thereby obtaining a four-step phase-shifted pattern set with a phase difference of π/2. In the embodiment where the values of Q.sub.1~Q.sub.4 are fine-tuned in each projection cycle t.sub.p based on the solution values of formula (2), each projection cycle t.sub.p has a small increment of Δt=2π/6000=π/3000 compared to the previous projection cycle t.sub.p, which leads to slightly different values.
[0083] As the linear light scans from the position corresponding to t=π/6 to the position corresponding to t=π/3, the values of Q.sub.1 to Q.sub.4 can continue to be solved. Since cos t=sin t at t=π/4, and cos t<sin t over the range π/4~3π/4, formula (3), which would give Q.sub.4<0, is no longer applicable, and formula (4) can be used to obtain Q.sub.1=3A/4, Q.sub.2=0.183A, Q.sub.3=0.067A, and Q.sub.4=0 at t=π/3. At t=π/3, it can be considered that the linear light has scanned to the 10th pixel column of the 60 pixel columns of the current cycle.
[0084] As the linear light scans from the position corresponding to t=π/3 to the position corresponding to t=π/2, the values of Q.sub.1~Q.sub.4 can be solved based on formula (4). At t=π/2, it can be considered that the linear light has scanned to the 15th pixel column of the 60 pixel columns of the current cycle, and based on formula (4), Q.sub.1=A/2, Q.sub.2=A/2, Q.sub.3=0, and Q.sub.4=0 at t=π/2. It should be noted that, since pixels 1-4 perform 4-step phase-shifted imaging, Q.sub.1+Q.sub.2+Q.sub.3+Q.sub.4=A holds in each projection cycle t.sub.p, and since pixels 1-4 are 2π/N phases apart from each other and are also exposed in the projection cycle t.sub.p with an on time accounting for 10%, in each projection cycle t.sub.p, pixels 1-4 can obtain a total integrated brightness of 2A*t.sub.p/4.
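The quoted checkpoint values can be verified against the sine targets of the four memory cell sets. The pairing of intensities to sets used here (Q4+Q1, Q1+Q2, Q2+Q3, Q3+Q4) is the one reconstructed from the text's worked values and should be read as an assumption:

```python
import math

# Q values quoted in the text at three scan positions (A = 1), and the
# four-step sine targets each memory cell set should accumulate per cycle.
quoted = {
    0.0:         (0.5, 0.0, 0.0, 0.5),
    math.pi / 3: (0.75, 0.183, 0.067, 0.0),
    math.pi / 2: (0.5, 0.5, 0.0, 0.0),
}
ok = True
for t, (q1, q2, q3, q4) in quoted.items():
    targets = [0.5 * (1 + math.cos(t)), 0.5 * (1 + math.sin(t)),
               0.5 * (1 - math.cos(t)), 0.5 * (1 - math.sin(t))]
    sets = [q4 + q1, q1 + q2, q2 + q3, q3 + q4]  # per-set integrated light
    ok = ok and all(abs(g - w) < 1e-3 for g, w in zip(sets, targets))
```

All three checkpoints also satisfy Q1+Q2+Q3+Q4=A, as the paragraph states.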
[0085] As described above, by encoding the light intensities of the waveform projection areas in each projection cycle t.sub.p as a function of the scanning position, a sine-wave four-step phase-shifted pattern set can be acquired within a single scan of the linear light.
[0086] In addition, it should be understood that, although subphases T.sub.1-T.sub.4 can be distinguished, when realizing the above sine-wave 4-step phase-shifted pattern it is only necessary to calculate the change in the brightness value of the projected laser within a subcycle according to formula (2); there is no need to additionally divide the subcycle into the subphases T.sub.1-T.sub.4.
[0087] In one embodiment, other striped-light patterns and N-step phase-shifted imaging can be combined to obtain a more accurate depth image. Due to the cyclic repetition of the 4-step phase-shifted pattern, the recovered phase is ambiguous across stripe cycles, so cross-cycle depth jumps cannot be identified. In this case, Gray code can be used to perform preliminary imaging of the subject in the capturing space before 4-step phase-shifted imaging is performed. Specifically, the projection device can complete a pattern scan in a first scanning cycle (for example, with the waveform of each linear light projection cycle solved from the corresponding light-dark diagram), so that each of the N memory cell sets of the image sensor images a different striped-light pattern and the N striped-light patterns constitute a set of Gray code patterns. The projection device then completes a pattern scan in a second scanning cycle, so that each of the N memory cell sets images a different striped-light pattern and the N striped-light patterns constitute the set of N-step phase-shifted patterns, wherein, based on the set of Gray code patterns, a depth map of the imaging area is generated from the set of N-step phase-shifted patterns.
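As a brief sketch of why Gray code suits the preliminary imaging described above (the function name is illustrative): adjacent codewords of a reflected Gray code differ in exactly one bit, so a pixel decoded near a stripe boundary can be wrong by at most one period index.

```python
def gray_codes(n_bits):
    """Reflected binary Gray code sequence. Projecting one stripe pattern
    per bit lets each pixel decode which of 2**n_bits stripe periods it
    lies in, resolving the cross-cycle ambiguity of phase-shifted patterns."""
    return [i ^ (i >> 1) for i in range(2 ** n_bits)]

codes = gray_codes(4)  # 4 Gray-code patterns distinguish 16 stripe periods
```

The coarse period index from the Gray-code scan then anchors the fine (but cyclic) phase from the N-step phase-shifted scan.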
[0088] In addition, when multiple scanning cycles are introduced, not only can other striped-light patterns plus N-step phase-shifted patterns be implemented, but also k×N-step phase-shifted patterns, where k is an integer greater than or equal to 2. Specifically, the projection device completes k pattern scans within k scanning cycles; each scanning cycle includes multiple repeating subcycles, and each subcycle includes N subphases T.sub.1-T.sub.N. In each subphase T.sub.i, the linear light is projected with brightness changes at the projection cycle t.sub.p, the duration of the projection cycle t.sub.p is the same as the duration of the exposure switch cycle t.sub.e, and the projection cycle t.sub.p includes a bright area, wherein, in the subphases T.sub.1-T.sub.N, the position of the bright area in the projection cycle t.sub.p changes at an interval of 2π/(k×N) phase, so that when a single pattern scan is completed within each scanning cycle, the N memory cell sets of the image sensor each image a different striped-light pattern, and when the k pattern scans are completed within the k scanning cycles, the k×N striped-light patterns constitute a set of k×N-step phase-shifted patterns with a 2π/(k×N) phase shift between each other.
[0089] For ease of understanding, k=2, N=4 is used as an example; that is, an image sensor including 4 pixel sets exposed with exposure switch cycles t.sub.e at a phase interval of 2π/4 is used, and two scans are performed to acquire an eight-step phase-shifted pattern set.
[0090] In order to achieve the acquisition of the eight-step phase-shifted pattern using an image sensor with 4 memory cell sets, the projection device needs to perform two scans. In the first scan, that is, in the first scanning cycle, multiple repeating subcycles are included, and each subcycle includes 4 subphases T.sub.1-T.sub.4. The position of the bright area in the projection cycle t.sub.p changes at intervals of 2π/4 phase, so that when a pattern scan is completed within the first scanning cycle, the first 4 patterns of the eight-step phase-shifted pattern are obtained. In the second scan, that is, in the second scanning cycle, multiple repeating subcycles are likewise included, and each subcycle likewise includes 4 subphases T.sub.1-T.sub.4. The position of the bright area in the projection cycle t.sub.p again changes at intervals of 2π/4 phase (but the phase of the bright area is different from that in the first scan), so that when a pattern scan is completed in the second scanning cycle, the last four patterns of the eight-step phase-shifted pattern are obtained. Therefore, the eight-step phase-shifted pattern can be obtained through two scans, thereby achieving higher depth imaging accuracy.
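The two-scan schedule described above can be sketched numerically. The following is a minimal Python illustration, not the patent's own notation: the function name and the assumption that scan k is offset by 2π·k/(scans·N) relative to scan 0 are ours, chosen so that all scans together cover scans·N evenly spaced bright-area phases.

```python
import math

def bright_area_phases(num_scans, n_sets):
    """Phase position of the bright area for each subphase of each scan.

    Assumes scan k is offset by 2*pi*k / (num_scans * n_sets) relative to
    scan 0, so all scans together cover num_scans * n_sets evenly spaced
    phase positions (e.g. an eight-step pattern from two four-set scans).
    """
    phases = []
    for k in range(num_scans):
        offset = 2 * math.pi * k / (num_scans * n_sets)
        for i in range(n_sets):
            phases.append((offset + 2 * math.pi * i / n_sets) % (2 * math.pi))
    return sorted(phases)

# Two scans with N = 4 memory cell sets yield eight phases spaced pi/4 apart.
print(bright_area_phases(2, 4))
```

Under these assumptions, the first scan contributes phases 0, π/2, π, 3π/2 and the second contributes π/4, 3π/4, 5π/4, 7π/4, together forming the eight-step set.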
[0091] In addition, although the example of N=4 is given above for convenience of explanation, in other embodiments N may take other values. Specifically, N=2.sup.n, where n is an integer greater than or equal to 1. Thus, for example, a higher-precision 8-step phase shift can be achieved using an image sensor with 8 memory cell sets, a 16-step phase shift using an image sensor with 16 memory cell sets, and so on.
[0092] The image sensor may include N pixel sets, corresponding to the N memory cell sets. Alternatively, the N memory cells included in each pixel of the image sensor may be N charge memory cells; the exposure positions of the N charge memory cells corresponding to the same pixel are the same, but their exposure cycles are phase-shifted by 2π/N from each other. When a pattern scan is completed within the scanning cycle, the set of N-step phase-shifted patterns is obtained from the N charge memory cell sets and is used to generate a depth map of the imaging area.
[0093] In the example shown in the figure:
[0094] So, based on formulas (1) and (5), Q.sub.1 to Q.sub.4 can be solved to obtain an optimized solution as follows, wherein O*A=Q, and O is, for example, the number of exposure cycles received by each pixel when the linear light is scanned.
[0095] In the optimized solution based on formula (6), each repeating subcycle can be divided into 4 subphases T.sub.1-T.sub.4, where the subphases T.sub.1-T.sub.4 correspond respectively to the 0 to π/2 part, the π/2 to π part, the π to 3π/2 part, and the 3π/2 to 2π part of a 2π subcycle T, and the position of the bright area in the projection cycle t.sub.p changes at intervals of 2π/4 phase.
[0096] In the example of N=4, each subcycle includes 4 subphases T.sub.1-T.sub.4.
[0097] As shown, the laser is projected with a projection cycle t.sub.p of the same length as the exposure switch cycle t.sub.e, and it can be considered that each laser projection cycle t.sub.p is always synchronized with the exposure switch cycle t.sub.e of the first memory cell set (although the laser will be turned on at different phases in different subphases), and is switched at a duty cycle of 25% (i.e., the bright area occupies 2π/N of the cycle); that is, the waveform is a rectangular wave with a duty cycle of 25%. In the four subphases T.sub.1-T.sub.4, the projection start time of the laser in each projection cycle t.sub.p is respectively aligned with the exposure start time of the 1st to 4th memory cell sets.
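The alignment of the 25%-duty rectangular wave with the four exposure windows can be checked with a small discrete simulation. This is our own sketch (the function name and its sampling resolution are assumptions, not part of the disclosure); it shows that only the memory cell set whose exposure window coincides with the bright area accumulates light in a given subphase.

```python
def exposure_per_set(n_sets, bright_start_idx, res=256):
    """Fraction of one projection cycle's light that each of the n_sets
    memory cell sets integrates, for a rectangular wave of duty cycle
    1/n_sets whose bright area starts at the exposure start of set
    `bright_start_idx`. The cycle is sampled at n_sets*res instants;
    integer arithmetic keeps the slot boundaries exact."""
    total = n_sets * res
    start = bright_start_idx * res
    acc = [0.0] * n_sets
    for s in range(total):
        if (s - start) % total < res:      # instant falls in the bright area
            acc[s // res] += 1.0 / total   # credit the set whose window is open
    return acc

# In subphase T_1 the bright area coincides with set 0's exposure window, so
# only set 0 accumulates light (25% of the cycle for a 25% duty cycle).
print(exposure_per_set(4, 0))
```

Calling `exposure_per_set(4, 2)` instead models subphase T.sub.3, where only the third memory cell set is exposed.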
[0098] Specifically, first, as shown in the figure, in subphase T.sub.1 the projection start time of each projection cycle t.sub.p is aligned with the exposure start time of the first memory cell set.
[0099] As shown in the figure, in subphase T.sub.2 the projection start time is aligned with the exposure start time of the second memory cell set.
[0100] As shown in the figure, in subphase T.sub.3 the projection start time is aligned with the exposure start time of the third memory cell set.
[0101] As shown in the figure, in subphase T.sub.4 the projection start time is aligned with the exposure start time of the fourth memory cell set; when subphase T.sub.4 is completed, the subphase T.sub.1 of the next subcycle is entered.
[0102] In a simple implementation, the number of projection cycles in each subphase can be made the same, that is, m.sub.1=m.sub.2=m.sub.3=m.sub.4, so that the durations of T.sub.1-T.sub.4 are the same. In this case, if the linear light scans the imaging plane at a uniform speed, the subcycle producing the phase-shifted diagram shown in the figure can be repeated a predetermined number of times, for example, M times, thereby completing a scanning cycle. For example, in the illustrated example, the subcycle is repeated 16 times, thereby obtaining a four-step phase-shifted diagram of light and dark stripes with a phase difference of π/2.
[0103] Therefore, when obtaining the N-step phase-shifted diagram of light and dark stripes, each subcycle includes N subphases T.sub.1-T.sub.N, and in each subphase T.sub.i, the linear light changes brightness with a projection cycle t.sub.p whose duration is the same as that of the exposure switch cycle t.sub.e, and the projection cycle t.sub.p includes a bright area. In the subphases T.sub.1-T.sub.N, the position of the bright area in the projection cycle t.sub.p changes at an interval of 2π/N phase, thereby enabling the N memory cell sets of the image sensor to each image a different striped-light pattern when a pattern scan is completed within the scanning cycle, with a phase shift of 2π/N between the N striped-light patterns. Thus, through one scan of the linear light, for example, from left to right (corresponding to one scanning cycle), a set of N-step phase-shifted patterns can be directly obtained from the N memory cell sets of the image sensor.
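The set of light/dark stripe patterns described above can be sketched programmatically. As an illustrative aside (the function and its parameterization are our own, not part of the disclosure), the following Python generates one row of each of four binary stripe patterns with π/2 phase offsets, matching the 32-stripe, 1920-column example used elsewhere in this description:

```python
def stripe_pattern(columns, stripes, shift_steps, n_steps=4):
    """One row of a binary light/dark stripe pattern: `stripes` stripes
    (half bright, half dark) across `columns` pixels, phase-shifted by
    shift_steps * (2*pi / n_steps). Integer arithmetic keeps the stripe
    boundaries exact."""
    period = columns // (stripes // 2)        # pixels per bright+dark pair
    offset = shift_steps * period // n_steps  # phase shift expressed in pixels
    return [1 if (x + offset) % period < period // 2 else 0
            for x in range(columns)]

# Four patterns with pi/2 offsets: 32 stripes over 1920 columns.
patterns = [stripe_pattern(1920, 32, k) for k in range(4)]
```

In this sketch `patterns[0]` and `patterns[2]` are exact inverses of each other, as expected for stripe patterns separated by a π phase shift.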
[0104] The dwell time t.sub.c corresponding to the linear light scanning over each column of pixels is not less than the quotient of the scanning cycle divided by the number of columns C. In order to ensure sufficient exposure, the dwell time t.sub.c is more than 10 times the projection cycle t.sub.p, and preferably more than 50 times. For example, assuming that the scanning cycle is 3.84 ms (i.e., the frame rate is 1000/3.84 ≈ 260 frames/s) and the image sensor includes 1920 columns (in the case of the pixel unit shown in the figure), the dwell time t.sub.c is the scanning cycle divided by the number of columns C, i.e., 3.84 ms/1920 = 2 us, or, considering that the linear light has a certain width, the dwell time t.sub.c is not less than 2 us. When the linear light projection cycle t.sub.p is 20 ns and the duty cycle is 25%, each pixel column can complete 100 exposures in the 2 us during which the linear light scans over it, and the two memory cell sets corresponding to the bright area position of the current projection cycle t.sub.p can achieve an exposure of 5 ns × 100 = 0.5 us. In addition, since the time for the linear light to scan through a column of pixels is long enough compared with the exposure cycle of the pixels (for example, 100 times in the above example), for the currently irradiated pixel column it can be approximately considered that the linear light is a light source whose position does not move during these 2 us.
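The timing arithmetic in this worked example can be verified directly. The following Python only restates the numbers given above (3.84 ms scanning cycle, 1920 columns, 20 ns projection cycle, 25% duty cycle):

```python
# Timing sanity check for the worked example above.
scan_cycle_s = 3.84e-3   # scanning cycle: 3.84 ms
columns = 1920           # pixel columns in the image sensor
t_p_s = 20e-9            # projection cycle t_p: 20 ns
duty = 0.25              # 25% duty cycle (bright area is 1/4 of the cycle)

frame_rate_fps = 1.0 / scan_cycle_s          # ~260 frames per second
dwell_s = scan_cycle_s / columns             # dwell time per column: 2 us
cycles_per_column = dwell_s / t_p_s          # projection cycles per column: 100
bright_exposure_s = cycles_per_column * t_p_s * duty  # 0.5 us bright exposure

print(frame_rate_fps, dwell_s, cycles_per_column, bright_exposure_s)
```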
[0105] In order to achieve the four-step phase shift, the duration of each subphase T.sub.i must be no less than the time for the linear light to scan through a pixel unit column, for example when using the image sensor with 1920 columns in the above example to obtain the four-step phase-shifted diagram with 32 stripes (16 bright stripes and 16 dark stripes) shown in the figure.
[0106] As shown above, in combination with the multiple subcycles that constitute a scanning cycle, the set of N-step phase-shifted patterns obtained is a striped-light pattern with obvious distinction between bright and dark areas, repeated multiple times as shown in the figure.
[0107] The above describes the embodiment of obtaining the light and dark stripes shown in the figure.
[0108] The projection and imaging solution of the present invention can be used in a monocular solution (i.e., a solution equipped with one image sensor) or a binocular solution (i.e., a solution equipped with two image sensors with fixed relative positions for synchronous imaging). When the image sensor includes a first image sensor and a second image sensor with fixed relative positions, the first image sensor and the second image sensor can each include the N memory cell sets and be exposed synchronously with each other. In other words, in the binocular solution, when a pattern scan is completed within the scanning cycle, the first image sensor can obtain N striped-light images, the second image sensor can obtain N striped-light images, and these 2N striped-light images can be used to obtain depth data.
[0109] Further, in order to realize scanning projection, the projection device of the present invention includes: a light emitting device for generating linear light; and a reflecting device for reflecting the linear light so as to project, at a predetermined frequency, linear light moving in the direction perpendicular to the stripe direction onto the capturing area, the length direction of the linear light being the length direction of the projected stripes. The reflecting device includes one of the following: a mechanical galvanometer that reciprocates at the predetermined frequency; a micromirror device that reciprocates at the predetermined frequency; and a mechanical rotating mirror that rotates unidirectionally at the predetermined frequency. Here, the projected linear light can have a high-order Gaussian or flat-top Gaussian distribution, thereby providing a highly consistent brightness distribution in the width direction of the linear light.
[0110] The present invention also discloses a measuring device using the above-mentioned measuring head. Specifically, a depth data measuring device can include the depth data measuring head as described above, and a processor connected to the depth data measuring head, which is configured to obtain a depth map of the imaging area from the N striped-light patterns obtained when a pattern scan is completed within the scanning cycle. In the binocular solution, the processor can determine the depth data of the object in the capturing area according to the predetermined relative positions of the first and second image sensors and the N first two-dimensional image frames and the N second two-dimensional image frames obtained by imaging the structured light. In different embodiments, the measuring head can have a relatively independent package, or it can be packaged in the measuring device together with the processor.
[0111]
[0112] The processor 1530 is connected to the measuring head, for example, to each of the projection device 1510 and the two image sensors 1520, and is configured to determine the depth data of the shooting object in the capturing area according to the predetermined relative positions of the first and second image sensors 1520_1 and 1520_2 and the N first two-dimensional image frames and N second two-dimensional image frames obtained by imaging the structured light.
[0113]
[0114]
[0115] In step S1610, projecting linear light that moves in a first direction to an imaging area, wherein the length of the linear light is in a second direction perpendicular to the first direction. The linear light projection completes a single pattern scan in one scanning cycle, which includes multiple repeating subcycles. In each subcycle, the linear light changes brightness with a projection cycle t.sub.p, which has the same duration as the exposure switch cycle t.sub.e; the projection cycle t.sub.p includes N waveform projection areas, each with a width of 2π/N, where the light intensity in each waveform projection area is encoded, and N is an integer greater than 1.
[0116] In step S1620, capturing the imaging area using an image sensor in which each pixel includes N memory cells to obtain N image frames under the linear light scanning projection, wherein the N memory cells of each pixel each belong to one of N memory cell sets, and each memory cell set is exposed with an exposure switch cycle t.sub.e that is phase-shifted by 2π/N relative to the others.
[0117] In step S1630, obtaining depth data of the object to be measured in the imaging area based on the image frames.
[0118]
[0119] In step S1610, projecting linear light to an imaging area, wherein the linear light projection completes a single pattern scan in one scanning cycle, which includes multiple repeating subcycles. In each subcycle, the projection cycle t.sub.p includes N waveform projection areas with a width of 2π/N and light intensity encoded based on the imaging pattern.
[0120] In step S1620, capturing the imaging area using an image sensor including N pixel sets uniformly distributed on the imaging surface to obtain N image frames, wherein each pixel set is exposed with an exposure switch cycle t.sub.e that is phase-shifted by 2π/N relative to the others.
[0121] In step S1630, obtaining depth data of the object to be measured in the imaging area based on the obtained image frames.
[0122] Regardless of the storage solution, the projected light intensity of each waveform projection area is encoded so that when a pattern scan is completed within the scanning cycle, each of the N memory cell sets of the image sensor images a different striped-light pattern, and the N striped-light patterns constitute a set of N-step phase-shifted patterns with a 2π/N phase shift between each other.
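As background on how such a set is used to compute depth (this is the standard N-step phase-shifting formula from structured-light profilometry, offered as context rather than a formula recited in this disclosure), the wrapped phase at a pixel can be recovered from its N intensity samples:

```python
import math

def recover_phase(intensities):
    """Wrapped phase from N-step phase-shifted intensity samples
    I_k = A + B*cos(phi + 2*pi*k/N), using the standard formula
    phi = atan2(-sum_k I_k*sin(2*pi*k/N), sum_k I_k*cos(2*pi*k/N)).
    The offset A and modulation B cancel out of the ratio."""
    n = len(intensities)
    s = sum(I * math.sin(2 * math.pi * k / n) for k, I in enumerate(intensities))
    c = sum(I * math.cos(2 * math.pi * k / n) for k, I in enumerate(intensities))
    return math.atan2(-s, c)

# Four samples of a pixel whose true phase is 0.7 rad (offset A=2, amplitude B=1).
samples = [2 + math.cos(0.7 + 2 * math.pi * k / 4) for k in range(4)]
print(recover_phase(samples))  # recovers ~0.7
```

The recovered phase at each pixel is then triangulated (monocularly against the projector, or between the two sensors in the binocular solution) to yield depth.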
[0123] Furthermore, when imaging using actively projected structured light, problems with shadows and blind spots may occur.
[0124]
[0125] In view of this, the present invention can be implemented as a depth data measuring device with dual cameras (i.e., dual measuring heads).
[0126] Further, the measuring head 1910 and the measuring head 1920 may perform sequential structured light projection and imaging under the control of the same control device (e.g., a processor, not shown in the figure). Here, sequential means that the measuring head 1910 first performs a scan and imaging, and then the measuring head 1920 performs another scan and imaging, each performing linear light projection and imaging in a different time period. Since the exposure cycle of each measuring head is shorter than the framing time of depth detection (for example, if the exposure cycle of a frame of depth is 5 ms and the depth detection frame rate is 30 fps, the idle interval in each depth frame is about 28 ms), the exposure of the other measuring heads can be performed in the exposure interval of the current measuring head, with exposure imaging and depth calculation performed alternately. In other words, the depth data measuring device of the present invention may include the two measuring heads as described above, or may include more measuring heads that perform exposure imaging and depth calculation alternately, and the imaging information of these measuring heads may be stitched together in the same xyz-axis space. Due to the use of inter-frame interleaved detection, the system's depth detection frame rate is not affected and can still reach 30 fps.
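The interleaving budget above can be sketched with simple arithmetic. This is our own rough check (the function name is an assumption, and readout/compute overlap is ignored), showing how many heads fit in one depth-frame interval:

```python
def interleave_capacity(depth_fps, exposure_ms):
    """Rough budget check for inter-frame interleaving: how many measuring
    heads can expose one after another within a single depth-frame interval
    (ignoring readout and depth-calculation overlap)."""
    frame_interval_ms = 1000.0 / depth_fps
    return int(frame_interval_ms // exposure_ms)

# With 30 fps depth detection and a 5 ms exposure per head, the ~33 ms frame
# interval leaves room to interleave several heads without lowering the rate.
print(interleave_capacity(30, 5.0))
```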
[0127] As shown, measuring head 1910 and measuring head 1920 can have the same configuration, for example, exactly the same configuration, and are configured to successively capture the same imaging area (the object to be imaged in the imaging area) from different angles. Measuring head 1910 and measuring head 1920 can each be implemented as a measuring head including N memory cell sets and capable of capturing N images in a single scan as described above.
[0128] Here, sequential capturing means that after the measuring head 1910 completes the first pattern scanning (the projection device 1911 performs scanning and projection of linear light, and the image sensor 1912 performs imaging of the projected linear light), the measuring head 1920 performs the second pattern scanning (the projection device 1921 performs scanning and projection of linear light, and the image sensor 1922 performs imaging of the projected linear light), and the first set of N-step phase-shifted patterns obtained by the first pattern scanning and the second set of N-step phase-shifted patterns obtained by the second pattern scanning are synthesized into the depth information of the object to be imaged in the imaging area based on the relative position.
[0129] Further, the housing of the depth data measuring device 1900 may also include a handle 1931, so that the device may be implemented as a handheld modeling device. In this implementation, the depth data measuring device captures multiple first sets of N-step phase-shifted patterns and multiple second sets of N-step phase-shifted patterns while in relative motion with respect to the object to be imaged, and may synthesize the depth information generated from these multiple first and second sets into the model information of the object to be imaged based on calibration points.
[0130]
[0131] The processor 2030 is connected to the two measuring heads, for example, to the projection device 2011 and the image sensor 2012, and to the projection device 2021 and the image sensor 2022. The projection device 2011 may perform a scanning projection as described above under the control of the processor 2030, and the N memory cell sets of the image sensor 2012 perform corresponding imaging, thereby obtaining a set of N-step phase-shifted images after a scanning projection. Subsequently, the projection device 2021 may perform a scanning projection as described above under the control of the processor 2030, and the N memory cell sets of the image sensor 2022 perform corresponding imaging, thereby obtaining a set of N-step phase-shifted images after a scanning projection. The two sets of N-step phase-shifted images can each be used to synthesize a depth map of the object to be imaged, and the two depth maps can be stitched together based on the positional relationship of the two measuring heads, thereby obtaining more comprehensive data of the object to be imaged.
[0132]
[0133] In step S2110, using a first depth imaging measuring head of a depth data measuring device to perform a first pattern scan on an imaging area to obtain a first set of N-step phase-shifted patterns.
[0134] In step S2120, using a second depth imaging measuring head of the depth data measuring device to perform a second pattern scan on the imaging area to obtain a second set of N-step phase-shifted patterns.
[0135] In step S2130, synthesizing depth information of the object to be imaged in the imaging area from the first set of N-step phase-shifted patterns and the second set of N-step phase-shifted patterns based on the relative positions of the first depth imaging measuring head and the second depth imaging measuring head.
[0136] The scanning imaging of the first depth imaging measuring head and the second depth imaging measuring head each includes: projecting linear light that moves in a first direction to the imaging area, wherein the length of the linear light is in a second direction perpendicular to the first direction, and the linear light projection completes the single pattern scan in one scanning cycle, which includes multiple repeating subcycles; in each subcycle, the linear light changes brightness with a projection cycle t.sub.p, which has the same duration as the exposure switch cycle t.sub.e, and the projection cycle t.sub.p includes N waveform projection areas, each with a width of 2π/N, where the light intensity in each waveform projection area is encoded, and N is an integer greater than 1; capturing the imaging area using an image sensor including N memory cell sets uniformly distributed on an imaging surface to obtain N image frames under the linear light scanning projection, each memory cell set being exposed in an exposure switch cycle t.sub.e that is phase-shifted by 2π/N relative to the others; and obtaining depth data of the object to be measured in the imaging area based on the image frames. The light intensity in each waveform projection area is encoded such that, during a single pattern scan within the scanning cycle, the N memory cell sets of the image sensor each capture a different striped-light pattern, and the N striped-light patterns form a set of N-step phase-shifted patterns, each having a phase shift of 2π/N relative to the others.
[0137] The above steps S2110 to S2130 can be repeated to perform dynamic imaging of, for example, a moving imaging object. In this case, the depth data measuring device of the present invention can maintain a constant position.
[0138] In another embodiment, the depth data measuring device of the present invention can actively change its position, for example in the handheld implementation described above.
[0139] The depth data measuring head, measuring device and measuring method according to the present invention have been described in detail above with reference to the accompanying drawings. The depth data measurement solution of the present invention uses an image sensor with different sets of pixels capable of phase-shifted exposure to image the projected phase-shifted linear light, so that in a single scan of the linear light, different sets of pixels of the image sensor can each obtain different phase-shifted striped light images, thereby achieving the acquisition of multiple striped light images in a single linear light scan. Thus, the synthesis speed of the depth map can be greatly improved, and it is suitable for capturing moving target objects.
[0140] The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram may represent a module, program segment, or part of code that includes one or more executable instructions. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved. It should also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks therein, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
[0141] Having described various embodiments of the present disclosure, the foregoing description is exemplary, not exhaustive, and is not limited to the disclosed embodiments. Many modifications and alterations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen to best explain the principles of each embodiment, its practical application, or its improvement over technology found in the market, or to enable others of ordinary skill in the art to understand each embodiment disclosed herein.