DISTANCE MEASUREMENT SYSTEM AND METHOD FOR CALIBRATING DISTANCE MEASUREMENT SENSOR
20210356593 · 2021-11-18
Inventors
CPC classification
G01S17/58
PHYSICS
G01S17/42
PHYSICS
G01S17/894
PHYSICS
International classification
G01S17/42
PHYSICS
G01S17/58
PHYSICS
Abstract
There is provided a distance measurement system in which a plurality of distance measurement sensors are installed to generate a distance image of an object in a measurement region, the system including a cooperative processing device that performs alignment between the distance measurement sensors, and composes distance data from the plurality of distance measurement sensors to display the distance image of the object. In order to perform the alignment between the distance measurement sensors, trajectories (referred to as motion lines) of a person moving in the measurement region are acquired by the plurality of distance measurement sensors, and the cooperative processing device performs calibration of an installation position of each of the distance measurement sensors such that the motion lines acquired by the distance measurement sensors coincide with each other in a common coordinate system.
Claims
1. A distance measurement system in which a plurality of distance measurement sensors are installed to generate a distance image of an object in a measurement region, wherein the distance measurement sensor is of a type that measures a distance to the object based on a transmission time of light, the system comprises a cooperative processing device that performs alignment between the distance measurement sensors, and composes distance data from the plurality of distance measurement sensors to display the distance image of the object, and in order to perform the alignment between the distance measurement sensors, trajectories (hereinafter, referred to as motion lines) of a person moving in the measurement region are acquired by the plurality of distance measurement sensors, and the cooperative processing device performs calibration of an installation position of each of the distance measurement sensors such that the motion lines acquired by the distance measurement sensors coincide with each other in a common coordinate system.
2. The distance measurement system according to claim 1, wherein the cooperative processing device includes a coordinate conversion unit that uses sensor installation information of the plurality of distance measurement sensors to convert the distance data from the plurality of distance measurement sensors into position data in the common coordinate system, an image composition unit that composes measurement data to generate one distance image, a display unit that displays the composed distance image, a person detection unit that detects the motion lines of the person, which are effective for the calibration, from the distance data input from the plurality of distance measurement sensors, and a calibration unit that corrects the sensor installation information to be used by the coordinate conversion unit, based on a result of the distance image in which the motion lines of the person are composed.
3. The distance measurement system according to claim 2, wherein the person detection unit acquires body length information of the person as accompanying information of the person detected, and when the motion lines of the person which are acquired by the plurality of distance measurement sensors are aligned with each other, the calibration unit calculates a similarity between the motion lines, and with reference to the body length information of the person, which is acquired by the person detection unit, and time information of when the distance data is acquired, performs alignment between the motion lines such that the body length information or the time information coincide with each other.
4. The distance measurement system according to claim 2, wherein the person detection unit acquires distances from the distance measurement sensors to the person as accompanying information of the person detected, and evaluates reliability of the motion lines of the person detected, according to the distances from the distance measurement sensors to the person, and the calibration unit performs alignment between the motion lines that are evaluated as having high reliability by the person detection unit.
5. The distance measurement system according to claim 2, wherein the person detection unit acquires point clouds included in a person region, as accompanying information of the person detected, and evaluates reliability of the motion lines of the person detected, according to the point clouds included in the person region, and the calibration unit performs alignment between the motion lines that are evaluated as having high reliability by the person detection unit.
6. The distance measurement system according to claim 2, wherein the person detection unit acquires directions of detection of the person in viewing angles as accompanying information of the person detected, and evaluates reliability of the motion lines of the person detected, according to the directions of detection of the person in the viewing angles, and the calibration unit performs alignment between the motion lines that are evaluated as having high reliability by the person detection unit.
7. The distance measurement system according to claim 2, wherein the person detection unit acquires whether or not an obstacle is present in front of the person, as accompanying information of the person detected, and evaluates reliability of the motion lines of the person detected, according to whether or not the obstacle is present in front of the person, and the calibration unit performs alignment between the motion lines that are evaluated as having high reliability by the person detection unit.
8. The distance measurement system according to claim 4, wherein the motion lines are displayed on the display unit in a state where densities or colors of the motion lines are changed according to the reliability of the motion lines evaluated by the person detection unit.
9. The distance measurement system according to claim 5, wherein the motion lines are displayed on the display unit in a state where densities or colors of the motion lines are changed according to the reliability of the motion lines evaluated by the person detection unit.
10. The distance measurement system according to claim 6, wherein the motion lines are displayed on the display unit in a state where densities or colors of the motion lines are changed according to the reliability of the motion lines evaluated by the person detection unit.
11. The distance measurement system according to claim 7, wherein the motion lines are displayed on the display unit in a state where densities or colors of the motion lines are changed according to the reliability of the motion lines evaluated by the person detection unit.
12. The distance measurement system according to claim 2, wherein a user adjusting unit is provided such that a user selects the motion lines, which are effective for the calibration, from the motion lines of the person which are detected by the person detection unit, and the user performs fine adjustment when the sensor installation information is corrected by the calibration unit.
13. A method for calibrating a distance measurement sensor when a plurality of the distance measurement sensors are installed to generate a distance image of an object in a measurement region, wherein the distance measurement sensor is of a type that measures a distance to the object based on a transmission time of light, and in order to perform alignment between the distance measurement sensors, the method comprises a step of detecting a person, who moves in the measurement region, with the plurality of distance measurement sensors to acquire trajectories (hereinafter, referred to as motion lines) of the person; and a step of performing calibration of sensor installation information of each of the distance measurement sensors such that the motion lines acquired by the distance measurement sensors coincide with each other in a common coordinate system.
14. The method for calibrating a distance measurement sensor according to claim 13, wherein in the step of acquiring the motion lines, body length information of the person is acquired as accompanying information of the person detected, and in the step of performing the calibration, a similarity between the motion lines acquired by the plurality of distance measurement sensors is calculated, and with reference to the body length information of the person detected and time information of when the distance is measured, alignment between the motion lines is performed such that the body length information or the time information coincide with each other.
15. The method for calibrating a distance measurement sensor according to claim 13, wherein in the step of acquiring the motion lines, distances from the distance measurement sensors to the person are acquired as accompanying information of the person detected, and reliability of the motion lines of the person detected is evaluated according to the distances from the distance measurement sensors to the person, and in the step of performing the calibration, alignment between the motion lines that are evaluated as having high reliability in the step of acquiring the motion lines is performed.
16. The method for calibrating a distance measurement sensor according to claim 13, wherein in the step of acquiring the motion lines, point clouds included in a person region are acquired as accompanying information of the person detected, and reliability of the motion lines of the person detected is evaluated according to the point clouds included in the person region, and in the step of performing the calibration, alignment between the motion lines that are evaluated as having high reliability in the step of acquiring the motion lines is performed.
17. The method for calibrating a distance measurement sensor according to claim 13, wherein in the step of acquiring the motion lines, directions of detection of the person in viewing angles are acquired as accompanying information of the person detected, and reliability of the motion lines of the person detected is evaluated according to the directions of detection of the person in the viewing angles, and in the step of performing the calibration, alignment between the motion lines that are evaluated as having high reliability in the step of acquiring the motion lines is performed.
18. The method for calibrating a distance measurement sensor according to claim 13, wherein in the step of acquiring the motion lines, whether or not an obstacle is present in front of the person is acquired as accompanying information of the person detected, and reliability of the motion lines of the person detected is evaluated according to whether or not the obstacle is present in front of the person, and in the step of performing the calibration, alignment between the motion lines that are evaluated as having high reliability in the step of acquiring the motion lines is performed.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0022] Hereinbelow, an embodiment of the present invention will be described. In the calibration of distance measurement sensors of the present embodiment, trajectory data (motion line data) of a person moving in a measurement space is acquired by each of the distance measurement sensors, and alignment (correction of installation position information) between the sensors is performed such that the trajectory data acquired by the distance measurement sensors coincide with each other in a common coordinate system.
[0027] The object 9 is present at a position spaced apart by a distance D from the light emitting unit 11 and the light receiving unit 12. Here, when the speed of light is c and the time difference from when the light emitting unit 11 emits the irradiation light 31 to when the light receiving unit 12 receives the reflected light 32 is t, the distance D to the object 9 is obtained by D=c×t/2. Incidentally, in practical distance measurement performed by the distance calculation unit 14, instead of measuring the time difference t directly, an irradiation pulse of a predetermined width is emitted, and the two-dimensional sensor 12a receives the reflected pulse while shifting the timing of an exposure gate. The distance D is then calculated from the amounts of received light (accumulated amounts) at the different timings (exposure gate method).
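The relationship above can be sketched numerically. The two-gate ratio below is one common indirect-TOF scheme and is an illustrative assumption, not necessarily the exact gating performed by the distance calculation unit 14:

```python
C = 299_792_458.0  # speed of light [m/s]

def distance_from_time(t: float) -> float:
    """Direct TOF: round-trip time t [s] -> distance [m], D = c*t/2."""
    return C * t / 2.0

def distance_from_gates(q1: float, q2: float, pulse_width: float) -> float:
    """Gated (indirect) TOF sketch: a pulse of width `pulse_width` [s] is
    emitted; gate 1 opens with the pulse and gate 2 immediately after it.
    The fraction of reflected light accumulated in gate 2 encodes the delay:
    t = pulse_width * q2 / (q1 + q2)."""
    t = pulse_width * q2 / (q1 + q2)
    return distance_from_time(t)
```

With equal charge in both gates the delay equals half the pulse width, so a 20 ns pulse yields roughly 1.5 m.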
[0029] In the cooperative processing device 2, arithmetic processing such as coordinate conversion, image composition, and calibration is performed. A program used for the arithmetic processing is stored in a ROM and deployed to a RAM to be executed by a CPU, whereby the above functions are realized (not illustrated). Incidentally, regarding the person detection process and the calibration process, an operator (user) can also appropriately perform adjustment by using a user adjusting unit (not illustrated) while looking at an image of the motion lines displayed on the display unit 24.
[0030] Next, a calibration method will be described. In the present embodiment, a motion line of a person is used as a measurement object (marker) for the calibration process; however, for comparison, first, a method using a reflective tape will be described.
[0034] In the calibration process, the information of the installation position and the azimuth angle of the sensor is corrected such that the measured positions 8a and 8b of the reflective tape 8 coincide with each other. Then, coordinate conversion is performed based on the corrected installation information, and the virtual measurement images are displayed again. This process is repeated until the measured positions 8a and 8b coincide with each other. Hereinafter, the procedure of calibration will be described.
[0038] The above calibration method using the reflective tape 8 requires an operation of affixing the reflective tape, which serves as a marker, to the measurement site. As the number of sensors increases, so does the labor of this affixing operation, and depending on the measurement environment the floor surface may not be flat or an obstacle may be present, making it difficult to affix the tape. Therefore, the present embodiment is characterized in that motion line data of a moving person is used instead of the reflective tape. Hereinafter, a calibration method using motion line data will be described.
[0042] In the calibration process, the information of the installation position and the azimuth angle of the sensor is corrected such that the motion lines 9a and 9b of the person 9 coincide with each other. Then, coordinate conversion is performed based on the corrected installation information, and the motion lines are displayed again. This process is repeated until the motion lines coincide with each other. Hereinafter, the procedure of calibration will be described.
[0046] As described above, in the present embodiment, calibration is performed using the motion line data which is the movement trajectory of the person, and there is no need for the operator to affix the reflective tape to the floor surface as in the comparative example. Therefore, the load on the operator in the calibration operation can be reduced, and calibration can be easily executed regardless of the measurement environment. In addition, trajectory data for calibration of various shapes can be easily obtained, so that an improvement in accuracy of calibration can be expected.
[0047] In addition, in the present embodiment, since the motion line data of the head of the person is used, calibration can be performed at the height position of the head of the person. Therefore, as compared to the calibration on the floor surface to which the reflective tape is affixed as in the comparative example, the calibration method in the present embodiment is more proper as calibration in the case of the measurement object being a person, and a further improvement in accuracy can be expected.
[0048] In the present embodiment, to acquire the motion line data of a person, either a specific person may be asked to move through the measurement space, or any person who happens to move there may be used. Consequently, the distance measurement sensors obtain a variety of motion line data, and the motion line data effective for calibration needs to be extracted from it. The method of displaying the motion line data also needs to be devised so that the operator (user) can extract the effective motion line data. In consideration of the above, the present embodiment performs the following processes.
[0049] (1) Body height information is acquired from the distance data as accompanying information of the persons detected, motion line data of persons having the same body height is extracted, and the motion lines are aligned with each other. Accordingly, even when a large number of unspecified persons move in the measurement space, the measurement target can be narrowed down to the same person and alignment can be performed.
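A minimal sketch of this narrowing-down step, assuming each motion line carries a body-height value as accompanying information (the function name and the 5 cm tolerance are illustrative assumptions):

```python
def match_by_height(lines_a, lines_b, tol=0.05):
    """Pair motion lines from two sensors whose accompanying body-height
    values agree within `tol` metres. Each entry is (height_m, line_data)."""
    pairs = []
    for ha, pa in lines_a:
        for hb, pb in lines_b:
            if abs(ha - hb) <= tol:
                pairs.append((pa, pb))
    return pairs
```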
[0050] (2) Referring to the time information of when the distance data was acquired, which accompanies the motion line data, the motion lines are aligned with each other such that points on the motion lines measured at coinciding times coincide in position. Accordingly, when the motion line data is displayed, animation display is performed with the times synchronized.
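Comparing points at coinciding times presupposes samples at common timestamps. One way to obtain them, sketched under the assumption that a motion line is a time-sorted list of (t, x, y) samples that may be linearly interpolated:

```python
from bisect import bisect_right

def resample(line, times):
    """Linearly interpolate a motion line [(t, x, y), ...] (sorted by t)
    at the given timestamps, so that two sensors' motion lines can be
    compared point-for-point at identical times."""
    ts = [p[0] for p in line]
    out = []
    for t in times:
        # index of the sample interval [t0, t1] that brackets t
        i = min(max(bisect_right(ts, t) - 1, 0), len(line) - 2)
        t0, x0, y0 = line[i]
        t1, x1, y1 = line[i + 1]
        a = (t - t0) / (t1 - t0) if t1 != t0 else 0.0
        out.append((t, x0 + a * (x1 - x0), y0 + a * (y1 - y0)))
    return out
```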
[0051] (3) The reliability of the motion line data is evaluated, and motion line data having high reliability is extracted. Reliability here means the degree of measurement accuracy of the detected person data: reliability is high when the person is close to the sensor, when the person yields a large point cloud, and when the direction of detection is close to the center of the viewing angle. Conversely, the more distant the person is from the sensor, and the closer the person is to an end portion of the viewing angle, the more the received light intensity of the TOF method decreases, and the reliability of the measured value decreases accordingly. Further, when the apparent area of the person decreases, the point cloud (the number of detection pixels of the light receiving unit) decreases, and when an obstacle is present in front of the person, part of the motion line data may be missing (hidden by occlusion); both lower the reliability. After the reliability of the motion line data is evaluated, the display unit 24 displays the motion lines distinctively according to the evaluation result. For example, a motion line having high reliability is displayed darkly, and a motion line having low reliability is displayed lightly (alternatively, the display color may be changed).
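The factors listed in (3) could be combined into a single score as follows; the equal weights, the maximum range, and the field-of-view half-angle are illustrative assumptions, not values from the embodiment:

```python
def reliability(distance_m, point_count, angle_off_center_deg, occluded,
                max_range_m=20.0, half_fov_deg=35.0):
    """Heuristic reliability score in [0, 1] for one person detection:
    nearer to the sensor, larger point cloud, and closer to the center of
    the viewing angle all raise the score; occlusion discards the sample."""
    if occluded:
        return 0.0
    range_score = max(0.0, 1.0 - distance_m / max_range_m)   # nearer is better
    cloud_score = min(1.0, point_count / 200.0)              # more pixels is better
    angle_score = max(0.0, 1.0 - abs(angle_off_center_deg) / half_fov_deg)
    return (range_score + cloud_score + angle_score) / 3.0
```

A display could then map the score to drawing density, as paragraph [0051] suggests.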
[0052] (4) When motion line data of a plurality of the sensors is displayed on the display unit 24, the display of the motion line data can be turned on and off for each of the sensors. In addition, a plurality of motion line data measured in the past is saved, and desired data is read therefrom to be displayed. Calibration adjustment is performed using the plurality of data, so that the accuracy of calibration is improved.
[0053] The reliability of motion line data described in (3) will be described with reference to the drawing.
[0055] In addition, when motion line data is used, the shape of a motion line may also be taken into consideration. Namely, when the length of the motion line is short, it is difficult to align directions (rotation). Therefore, a predetermined length or more is needed. In addition, when the shape of the motion line is linear, alignment in a direction perpendicular thereto can be clearly performed, but alignment in a direction parallel thereto is unclear. Therefore, it can be said that the shape of the motion line is preferably curved, and the reliability is high.
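The length and straightness criteria above can be checked as sketched below; the thresholds (2 m minimum path length, 0.98 chord-to-path ratio) are illustrative assumptions:

```python
import math

def shape_suitability(points, min_length=2.0, straightness_limit=0.98):
    """Return True if a motion line [(x, y), ...] is long enough and curved
    enough to constrain both translation and rotation. For a nearly straight
    line the end-to-end (chord) distance almost equals the path length, and
    alignment along the line's own direction is ambiguous."""
    path = sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))
    if path < min_length:
        return False  # too short to fix the rotation
    chord = math.dist(points[0], points[-1])
    return chord / path < straightness_limit  # ratio < 1.0 means the line bends
```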
[0057] S101: The cooperative processing device 2 sets installation parameters of each of the distance measurement sensors 1. The installation parameters include the installation position (x, y, z), the measurement direction (azimuth angle) (θx, θy, θz) and the like of the sensor.
[0058] S102: Each of the sensors 1 acquires distance data in the measurement space over a predetermined time according to an instruction from the cooperative processing device 2, and transmits the distance data to the cooperative processing device 2.
[0059] S103: The person detection unit 25 of the cooperative processing device 2 detects a person from the received distance data. In the detection of the person, the position of the head of the person is detected by an image recognition technique. In addition, the time, the body height, the point cloud (the number of pixels included in a person region) and the like of the person detected are acquired and retained as accompanying information. When a plurality of persons are detected, position information or accompanying information of each of the persons is acquired.
[0060] S104: Further, the person detection unit 25 evaluates the reliability of the person detected (motion line data). The evaluation is an evaluation for extracting data having the highest accuracy for use in the calibration process, and conditions such as whether or not a person is close to the sensor, whether or not a person has a large point cloud, and whether or not the direction of detection is close to the center of the viewing angle are evaluated.
[0061] S105: The coordinate conversion unit 22 converts the position data of the person detected by each of the sensors into the common coordinate space. In the coordinate conversion, the installation parameters set in S101 are used.
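A reduced 2D sketch of this conversion, assuming installation parameters of position (x, y) and azimuth θ only (the embodiment also carries z and the remaining angles from S101):

```python
import math

def to_common(point, params):
    """Convert a sensor-local 2D point into the common coordinate system by
    rotating it by the sensor's azimuth theta and translating it by the
    sensor's installation position (x, y)."""
    x, y, theta = params
    c, s = math.cos(theta), math.sin(theta)
    px, py = point
    return (x + c * px - s * py, y + s * px + c * py)
```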
[0062] S106: It is determined whether or not the person data which is coordinate-converted is satisfactory. Namely, it is determined whether or not the accompanying information (time and body height) of the person detected by the sensors coincide with each other between the sensors. When the data is satisfactory, the process proceeds to S107, and when the data is not satisfactory, the process returns to S102, and distance data is acquired again.
[0063] S107: The image composition unit 23 composes the position data of the person from the sensors, which is coordinate-converted in S105, in the common coordinate space with the times synchronized, and draws the composed position data on the display unit 24. Namely, the motion lines acquired by the sensors are displayed. When a plurality of persons are detected, a plurality of sets of motion lines are displayed.
[0064] S108: The calibration unit 26 calculates a similarity between the motion lines acquired by the sensors. Namely, a place where the shapes (patterns) of the motion lines are similar to each other is extracted. For this reason, portions of the motion lines from the sensors, at which the times correspond to each other, are compared, and the similarity between the motion lines is obtained by a pattern matching method.
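A minimal similarity measure for two time-aligned motion lines, offered as an illustrative stand-in for the pattern matching of S108 (sliding this comparison over time offsets would extend it to the search for the place where the shapes correspond):

```python
import math

def similarity(line_a, line_b):
    """Mean point-to-point distance between two time-aligned motion lines
    [(x, y), ...] of equal length; smaller values mean higher similarity."""
    assert len(line_a) == len(line_b)
    return sum(math.dist(a, b) for a, b in zip(line_a, line_b)) / len(line_a)
```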
[0065] S109: The calibration unit 26 performs alignment (translation or rotation) on the sensors such that the motion lines coincide with each other in the portions having high similarity (correspondence). Namely, the installation position and the measurement direction (azimuth angle) among the installation parameters of each sensor are corrected to (x′, y′, z′) and (θx′, θy′, θz′). Here, when there are a plurality (three or more) of sensors, one sensor is determined as a reference, and the other sensors are aligned with it one by one. Alternatively, sensors that are not yet corrected are aligned in order with a sensor that has already been corrected.
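The translation-and-rotation correction can be estimated in closed form. The 2D Kabsch/Procrustes sketch below is an illustrative assumption about how such a correction might be computed, taking `src` as one sensor's motion line points and `dst` as the reference sensor's time-corresponding points:

```python
import math

def fit_rigid_2d(src, dst):
    """Least-squares rotation + translation mapping src points onto dst
    (2D Kabsch/Procrustes). Returns (theta, tx, ty): the azimuth and
    position correction that best superimposes the two motion lines."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    sxx = sxy = 0.0  # cross-covariance terms of the centered point sets
    for (ax, ay), (bx, by) in zip(src, dst):
        ax -= csx; ay -= csy; bx -= cdx; by -= cdy
        sxx += ax * bx + ay * by  # cosine component
        sxy += ax * by - ay * bx  # sine component
    theta = math.atan2(sxy, sxx)
    c, s = math.cos(theta), math.sin(theta)
    # translation that carries the rotated src centroid onto the dst centroid
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, tx, ty
```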
[0066] S110: The motion line positions are coordinate-converted by the coordinate conversion unit 22 again, and the calibration result is drawn on the display unit 24. The operator looks at the motion line positions after correction, to determine whether or not the motion line positions are satisfactory. When the motion line positions are satisfactory, the calibration process ends, and when the motion line positions are not satisfactory, the process returns to S107, and alignment is repeatedly performed.
[0067] In the above flow, in the evaluation of the reliability in S104 and the calibration step of S109, the operator can also complementally perform alignment by using the user adjusting unit while looking at the motion lines displayed on the display unit 24. Namely, in S104, the operator determines the reliability of the motion lines to select a motion line having high reliability, so that the efficiency of the subsequent calibration process can be improved. In addition, in the calibration step of S109, the operator can manually fine-adjust the installation parameters to further improve the accuracy of the calibration process.
[0068] As described above, in the calibration of the distance measurement sensors of the present embodiment, the trajectory data (motion line data) of the person moving in the measurement space is acquired by each of the distance measurement sensors, and alignment (correction of the installation position information) between the sensors is performed such that the trajectory data acquired by the distance measurement sensors coincide with each other in the common coordinate system. Accordingly, the load on the operator in installing the marker (reflective tape) for the calibration operation can be reduced, and calibration can be easily executed regardless of the measurement environment.