SENSING ON UAVS FOR MAPPING AND OBSTACLE AVOIDANCE
20190068954 · 2019-02-28
Inventors
- Alberto Daniel Lacaze (Potomac, MD, US)
- Karl Nicholas Murphy (Rockville, MD, US)
- Raymond Paul WILHELM, III (Gaithersburg, MD, US)
CPC Classification
B64U2201/00
PERFORMING OPERATIONS; TRANSPORTING
H04N13/254
ELECTRICITY
G01B11/245
PHYSICS
B64U10/14
PERFORMING OPERATIONS; TRANSPORTING
B64U2101/30
PERFORMING OPERATIONS; TRANSPORTING
G01B11/2545
PHYSICS
G01S17/48
PHYSICS
G01S17/42
PHYSICS
H04N2013/0081
ELECTRICITY
H04N13/243
ELECTRICITY
H04N13/271
ELECTRICITY
International Classification
H04N13/243
ELECTRICITY
G01S7/481
PHYSICS
G01S17/42
PHYSICS
H04N13/271
ELECTRICITY
Abstract
Structured light approaches utilize a laser to project features, which are then captured with a camera. By knowing the disparity between the laser emitter and the camera, the system can triangulate to find the range. Four, 185 degree field-of-view cameras provide overlapping views over nearly the whole unit sphere. The cameras are separated from each other to provide parallax. A near-infrared laser projection unit sends light out into the environment, which is reflected and viewed by the cameras. The laser projection system will create vertical lines, while the cameras will be displaced from each other horizontally. This relative shift of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space. At each point in time, a vertical stripe of the world will be triangulated. Over time, the laser line will be rotated over all yaw angles to provide a full 360 degree range.
Claims
1. A sensing device for UAVs, comprising a quadrotor; one or more line time-of-flight sensors; a computer or microprocessor to process range information; and the computer or microprocessor sending the range information to one or more recipients.
2. The sensing device for UAVs of claim 1, wherein the processing is used for obstacle avoidance.
3. The sensing device for UAVs of claim 1, wherein the processing is used for mapping the surroundings.
4. The sensing device for UAVs of claim 1, wherein the line time-of-flight sensor is rotated; and the rotation is accomplished by a mechanism on the vehicle.
5. The sensing device for UAVs of claim 1, wherein the line time-of-flight sensor is rotated; and the rotation is accomplished by moving the body of the vehicle.
6. The sensing device for UAVs of claim 1, wherein the line time-of-flight sensor is rotated; and the rotation is accomplished by at least one of a mechanism on the vehicle and moving the body of the vehicle, or a combination of the two.
7. The sensing device for UAVs of claim 1, further comprising a structured light sensor.
8. The sensing device for UAVs of claim 7, wherein multiple lines are used, one horizontal line and one vertical line, to increase the coverage.
9. The sensing device for UAVs of claim 1, wherein the UAV is a quadrotor.
10. A sensing device for UAVs, comprising: a plurality of fisheye cameras; the cameras are separated from each other to provide parallax; four, 185 degree field-of-view cameras provide overlapping views over nearly the whole unit sphere; a plurality of laser line scanners; the near-infrared laser projection unit sends light out into the environment, which is reflected and viewed by the cameras; the laser projection system creates vertical lines, while the cameras will be displaced from each other horizontally; this relative shift (stereo disparity) of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space; at each point in time, a vertical stripe of the world will be triangulated; over time, the laser line will be rotated over all yaw angles to provide full 360 degree range sensing capabilities; the two laser line projectors are used to create a line that can then be sensed with the omnidirectional cameras; each imager is composed of a camera module, a spectral filter, and a wide-angle compound lens; an optical bandpass filter can be installed to attenuate incoming ambient light; if no filter is installed, the imaging system can be used as a visible light imager to provide full 360 degree RGB imagery in addition to point clouds; a laser projection unit consists of a solid-state laser diode, laser pulsing circuitry, aspheric collimation lens, beam splitter, small rotating mirror, and laser line lens; the laser circuitry pulses the laser while also providing a frame trigger to each imager; the laser light is collimated into a beam using a small aspheric lens directly in front of the laser; the laser beam is then split into an upward and downward beam; each beam is reflected off a small rotating mirror coupled to a laser line lens; the upward beam creates a laser line that extends from horizontal to positive 80 degrees pitch; the downward beam creates a laser line that extends from horizontal to negative 80 degrees pitch; the structured light sensor will be able to measure 360 degrees horizontally and 160 degrees vertically; at each point in time, the sensor will generate approximately 2080 vertical range measurements; with each imager capturing approximately 180 images/second, the sensor will be able to generate over 370k points per second; the yaw scan rate can be varied, depending upon the current mission needs; the sensor can be operated with a fine yaw resolution and slow scan rate, providing detailed scans of the environment; or, the sensor can be operated with a faster yaw rate, providing faster updates at a coarser rate; and since this device relies on triangulation, the range accuracy will be dependent on range.
11. The sensing device for UAVs of claim 10, comprising: a UAV; one or more range sensors that are used to sense the surrounding environment; a time-of-flight line sensor to perform the same task as shown with the structured light sensor; and a vertical sensing plane aligned with the direction of travel.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention.
DETAILED DESCRIPTION OF THE INVENTION
[0031] In the following detailed description of exemplary embodiments of the invention, reference is made to the accompanying drawings (where like numbers represent like elements), which form a part hereof, and in which is shown by way of illustration specific exemplary embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, but other embodiments may be utilized and logical, mechanical, electrical, and other changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
[0032] In the following description, numerous specific details are set forth to provide a thorough understanding of the invention. However, it is understood that the invention may be practiced without these specific details. In other instances, well-known structures and techniques known to one of ordinary skill in the art have not been shown in detail in order not to obscure the invention. Referring to the figures, it is possible to see the various major elements constituting the apparatus of the present invention.
[0033] Structured light approaches utilize a laser to project features, which are then captured with a camera. By knowing the disparity between the laser emitter and the camera, the system can triangulate to find the range. In sharp contrast to conventional stereo and structure from motion, poor lighting actually improves the range and accuracy of this sensor. There is also no need to have rich features in the environment, since the laser projects its own features. Therefore, it will even work on featureless walls and floors.
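The triangulation described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the rectified-geometry assumption, the function name, and all numeric values (focal length, baseline, disparity) are assumptions chosen for the example. Under those assumptions, range follows the standard stereo relation Z = f·B/d.

```python
# Structured-light triangulation sketch (illustrative only; the
# parameter values below are assumptions, not taken from the patent).
# For a rectified emitter/camera pair, Z = f * B / d, where f is the
# focal length in pixels, B the baseline in meters, and d the observed
# disparity of the projected laser feature in pixels.

def triangulate_range(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Range (m) of a projected laser feature from its observed disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite range")
    return focal_px * baseline_m / disparity_px

# Example: 600 px focal length, 10 cm baseline, 20 px disparity -> 3.0 m.
print(triangulate_range(600.0, 0.10, 20.0))
```

Because the laser supplies the feature, the disparity measurement does not depend on scene texture, which is why the approach works on featureless walls and floors.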
[0034] One such approach is presented in
[0036] In order to accommodate these sensors on a quadrotor, modifications will be made to the locations of the camera and the laser emitters. The core electronics and software have already been designed, but never used in this combination. The sensor is designed to meet the unique needs of an autonomous multicopter for indoor and outdoor environments, including: a large field of view for obstacle avoidance and mapping; a light-weight system with minimal moving parts; accurate ranges at short distances, with decreasing accuracy at longer ranges; use of eye-safe lasers, while providing resilience to ambient light; and a predicted weight under 150 grams.
[0037] The proposed configuration makes use of multiple fisheye cameras and laser line scanners. Four, 185 degree field-of-view cameras provide overlapping views over nearly the whole unit sphere. The cameras are separated from each other to provide parallax. A near-infrared laser projection unit sends light out into the environment, which is reflected and viewed by the cameras. The laser projection system will create vertical lines, while the cameras will be displaced from each other horizontally. This relative shift (stereo disparity) of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space.
[0038] At each point in time, a vertical stripe of the world will be triangulated. Over time, the laser line will be rotated over all yaw angles to provide full 360 degree range sensing capabilities as illustrated by
[0041] Each imager is composed of a camera module, a spectral filter, and a wide-angle compound lens. The camera must be small in size and weight, while providing high sensitivity and a wide dynamic range. Depending on mission requirements, an optical bandpass filter can be installed to attenuate incoming ambient light. If no filter is installed, the imaging system can be used as a visible light imager to provide full 360 degree RGB imagery in addition to point clouds.
[0042] A laser projection unit consists of a solid-state laser diode, laser pulsing circuitry, aspheric collimation lens, beam splitter, small rotating mirror, and laser line lens. The laser circuitry pulses the laser while also providing a frame trigger to each imager. The laser light is collimated into a beam 403 and 404 using a small aspheric lens directly in front of the laser. The laser beam is then split into an upward and downward beam 403 and 404. Each beam 403 and 404 is reflected off a small rotating mirror coupled to a laser line lens. The upward beam 403 creates a laser line that extends from horizontal to positive 80 degrees pitch, while the downward beam 404 creates a laser line that extends from horizontal to negative 80 degrees pitch.
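The scan geometry of the split beams can be sketched numerically. The pitch limits (±80 degrees) come from the text; the sampling step and function name are assumptions for illustration. Each sample is a unit direction vector along the vertical laser line at a given yaw.

```python
import math

# Sketch of the laser line geometry described above. The +/-80 degree
# pitch span is from the text; the 10-degree sampling step is an
# assumption chosen for this example. Rotating the mirror sweeps the
# line through all yaw angles.

def line_directions(yaw_deg: float, pitch_step_deg: float = 10.0):
    """Unit direction vectors along one vertical laser line at a given yaw."""
    dirs = []
    pitch = -80.0
    while pitch <= 80.0:
        p, y = math.radians(pitch), math.radians(yaw_deg)
        dirs.append((math.cos(p) * math.cos(y),
                     math.cos(p) * math.sin(y),
                     math.sin(p)))
        pitch += pitch_step_deg
    return dirs

print(len(line_directions(0.0)))  # 17 samples spanning -80 to +80 degrees
```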
[0043] The proposed field-of-view (shown in
[0044] The structured light sensor will be able to measure 360 degrees horizontally and 160 degrees vertically. At each point in time, the sensor will generate approximately 2080 vertical range measurements. With each imager capturing approximately 180 images/second, the sensor will be able to generate over 370 k points per second.
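The throughput figure quoted above follows directly from the two stated rates; a quick check:

```python
# Back-of-the-envelope check of the throughput figures quoted above
# (both input numbers are from the text).
measurements_per_frame = 2080   # vertical range samples per instant
frames_per_second = 180         # approximate per-imager capture rate

points_per_second = measurements_per_frame * frames_per_second
print(points_per_second)  # 374400, i.e. "over 370 k points per second"
```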
[0045] The yaw scan rate can be varied, depending upon the current mission needs. The sensor can be operated with a fine yaw resolution and slow scan rate, providing detailed scans of the environment; or, the sensor can be operated with a faster yaw rate, providing faster updates at a coarser rate.
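The resolution/update-rate trade-off can be made concrete with a small calculation. The 180 images/second frame rate is from the text; the specific yaw step sizes below are assumptions, as is the simplifying premise that one vertical line is captured per frame.

```python
# Illustration of the yaw-rate trade-off (yaw step sizes are assumed
# example values, not from the text). With one vertical line per frame,
# a full 360-degree sweep takes (360 / yaw_step) frames.

FRAME_RATE = 180.0  # images/second per imager, from the text

def full_scan_seconds(yaw_step_deg: float) -> float:
    """Time for one complete 360-degree sweep at the given yaw resolution."""
    return (360.0 / yaw_step_deg) / FRAME_RATE

print(full_scan_seconds(0.5))  # fine resolution: 720 frames -> 4.0 s
print(full_scan_seconds(2.0))  # coarse resolution: 180 frames -> 1.0 s
```

Halving the yaw step doubles the scan time, which is the detailed-scan versus fast-update trade-off described above.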
[0046] Since this device relies on triangulation, the range accuracy will be dependent on range. The expected range error 600 is shown in
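The range dependence of the accuracy can be sketched from the triangulation relation. This error model is a standard textbook approximation, not a figure from the patent, and all numeric values are assumptions: differentiating Z = f·B/d with respect to d gives |dZ| ≈ (Z²/(f·B))·|dd|, so range error grows roughly with the square of range.

```python
# Sketch of a triangulation error model (a generic approximation;
# parameter values are assumptions, not data from the patent).
# From Z = f*B/d: |dZ| ~ (Z**2 / (f*B)) * |dd|.

def range_error(range_m: float, focal_px: float, baseline_m: float,
                disparity_err_px: float = 0.5) -> float:
    """Approximate range error (m) for a triangulation sensor."""
    return (range_m ** 2) / (focal_px * baseline_m) * disparity_err_px

# Example: 600 px focal length, 10 cm baseline, half-pixel disparity noise.
print(range_error(1.0, 600.0, 0.10))   # ~0.008 m error at 1 m
print(range_error(5.0, 600.0, 0.10))   # ~0.21 m error at 5 m
```

The quadratic growth is why the sensor is most accurate at the short ranges that matter for obstacle avoidance.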
[0047] A second approach is to use a time-of-flight line sensor to perform the same task as shown with the structured light sensor. The line sensors can be organized as seen in
[0048] One more possible configuration is the same as shown in
[0049] The system is composed of a quadrotor, or other UAV, and one or more range sensors that are used to sense the surrounding environment.
[0050] Thus, it is appreciated that the optimum dimensional relationships for the parts of the invention, to include variation in size, materials, shape, form, function, and manner of operation, assembly and use, are deemed readily apparent and obvious to one of ordinary skill in the art, and all equivalent relationships to those illustrated in the drawings and described in the above description are intended to be encompassed by the present invention.
[0051] Furthermore, other areas of art may benefit from this method and adjustments to the design are anticipated. Thus, the scope of the invention should be determined by the appended claims and their legal equivalents, rather than by the examples given.