Sensing on UAVs for mapping and obstacle avoidance
11323687 · 2022-05-03
Inventors
- Alberto Daniel Lacaze (Potomac, MD, US)
- Karl Nicholas Murphy (Cocoa Beach, FL, US)
- Raymond Paul Wilhelm, III (Gaithersburg, MD, US)
CPC classification
B64U2201/00
PERFORMING OPERATIONS; TRANSPORTING
H04N13/254
ELECTRICITY
G01B11/245
PHYSICS
B64U2101/30
PERFORMING OPERATIONS; TRANSPORTING
G01B11/2545
PHYSICS
G01S17/48
PHYSICS
G01S17/42
PHYSICS
B64C39/024
PERFORMING OPERATIONS; TRANSPORTING
H04N2013/0081
ELECTRICITY
H04N13/243
ELECTRICITY
H04N13/271
ELECTRICITY
International classification
H04N13/243
ELECTRICITY
G01B11/245
PHYSICS
G01B11/25
PHYSICS
G01S7/481
PHYSICS
H04N13/271
ELECTRICITY
G01S17/42
PHYSICS
Abstract
Structured light approaches utilize a laser to project features, which are then captured with a camera. By knowing the disparity between the laser emitter and the camera, the system can triangulate to find the range. Four 185 degree field-of-view cameras provide overlapping views over nearly the whole unit sphere. The cameras are separated from each other to provide parallax. A near-infrared laser projection unit sends light out into the environment, which is reflected and viewed by the cameras. The laser projection system will create vertical lines, while the cameras will be displaced from each other horizontally. This relative shift of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space. At each point in time, a vertical stripe of the world will be triangulated. Over time, the laser line will be rotated over all yaw angles to provide a full 360 degree range.
Claims
1. A low-light environment obstacle avoidance-enabled UAV, comprising: a plurality of fisheye cameras displaced from each other horizontally, each camera comprising a wide-angle compound lens; a plurality of near-infrared laser projection line scanners displaced to simultaneously project a plurality of vertical laser lines in an environment surrounding the UAV, thereby generating vertical line light pattern projections, each laser projection line scanner comprising a solid-state laser diode, an aspheric collimation lens coupled in front of a laser emitter, and a beam splitter that splits a laser beam emitted from the emitter into upward and downward vertical line portions; a UAV rotation mechanism operable to alter a yaw orientation of the plurality of near-infrared laser projection line scanners; and a microprocessor in communication with the plurality of fisheye cameras, the plurality of near-infrared laser projection line scanners, and the UAV rotation mechanism, the microprocessor being configured to: (i) operate the UAV rotation mechanism to alter a first yaw orientation of the plurality of near-infrared laser projection line scanners at a first point in time to a second yaw orientation of the plurality of near-infrared laser projection line scanners at a second point in time, (ii) process images captured by the plurality of fisheye cameras to: (a) triangulate, at the first point in time and based on first identified locations of first vertical line light pattern projections from the near-infrared laser projection line scanners, a first plurality of point locations in a 3-D space comprising the environment surrounding the UAV, and (b) triangulate, at the second point in time and based on second identified locations of second vertical line light pattern projections from the near-infrared laser projection line scanners, a second plurality of point locations in the 3-D space comprising the environment surrounding the UAV, thereby defining a point cloud, and (iii) conduct, based on the point cloud, an obstacle avoidance navigation of the UAV.
2. The low-light environment obstacle avoidance-enabled UAV of claim 1, wherein the downward vertical line portion of the laser beam creates a laser line that extends from horizontal to negative 80 degrees pitch.
3. The low-light environment obstacle avoidance-enabled UAV of claim 1, wherein the upward vertical line portion of the laser beam creates a laser line that extends from horizontal to positive 80 degrees pitch.
4. The low-light environment obstacle avoidance-enabled UAV of claim 1, wherein each camera comprises a spectral filter.
5. The low-light environment obstacle avoidance-enabled UAV of claim 1, wherein each camera comprises an optical bandpass filter.
6. The low-light environment obstacle avoidance-enabled UAV of claim 1, wherein the vertical laser lines are projected along a vertical sensor plane and wherein the vertical sensor plane is aligned with a direction of travel of the UAV.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention.
DETAILED DESCRIPTION OF THE INVENTION
(12) In the following detailed description of exemplary embodiments of the invention, reference is made to the accompanying drawings (where like numbers represent like elements), which form a part hereof, and in which is shown by way of illustration specific exemplary embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, but other embodiments may be utilized and logical, mechanical, electrical, and other changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
(13) In the following description, numerous specific details are set forth to provide a thorough understanding of the invention. However, it is understood that the invention may be practiced without these specific details. In other instances, well-known structures and techniques known to one of ordinary skill in the art have not been shown in detail in order not to obscure the invention. Referring to the figures, it is possible to see the various major elements constituting the apparatus of the present invention.
(14) Structured light approaches utilize a laser to project features, which are then captured with a camera. By knowing the disparity between the laser emitter and the camera, the system can triangulate to find the range. In sharp contrast to conventional stereo and structure-from-motion approaches, poor lighting actually improves the range and accuracy of this sensor. There is also no need for rich features in the environment, since the laser “projects its own features.” Therefore, it will work even on featureless walls and floors.
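The triangulation just described reduces to a few lines. The following is a minimal sketch, assuming an idealized pinhole camera at a known baseline from the laser emitter; the function name and the numbers in the example are illustrative, not taken from the patent.

```python
# Minimal sketch of structured-light triangulation, assuming an idealized
# pinhole camera displaced a known baseline from the laser emitter.
# All names and values here are illustrative, not from the patent.

def range_from_disparity(disparity_px: float,
                         focal_length_px: float,
                         baseline_m: float) -> float:
    """Classic triangulation: range falls off as 1/disparity."""
    if disparity_px <= 0:
        raise ValueError("laser feature not resolved (zero/negative disparity)")
    return focal_length_px * baseline_m / disparity_px

# Example: a laser stripe shifted 25 px in a camera with a 600 px focal
# length and a 10 cm baseline lies 2.4 m away.
print(range_from_disparity(disparity_px=25, focal_length_px=600, baseline_m=0.10))
```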
(15) One such approach is presented in the accompanying drawings.
(17) In order to accommodate these sensors on a quadrotor, modifications will be made to the locations of the cameras and the laser emitters. The core electronics and software have already been designed, but they have never been used in this combination. The sensor is designed to meet the unique needs of an autonomous multicopter for indoor and outdoor environments, including: a large field of view for obstacle avoidance and mapping; a lightweight system with minimal moving parts; accurate ranges at short distances, with decreasing accuracy at longer ranges; use of eye-safe lasers, while providing resilience to ambient light; and a predicted weight under 150 grams.
(18) The proposed configuration makes use of multiple fisheye cameras and laser line scanners. Four 185 degree field-of-view cameras provide overlapping views over nearly the whole unit sphere. The cameras are separated from each other to provide parallax. A near-infrared laser projection unit sends light out into the environment, which is reflected and viewed by the cameras. The laser projection system will create vertical lines, while the cameras will be displaced from each other horizontally. This relative shift (stereo disparity) of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space.
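For a concrete picture of how the horizontally displaced cameras resolve the laser lines, the sketch below triangulates a single laser-line point from two bearing rays, one per camera, using the common midpoint-of-closest-approach method. Converting a fisheye pixel to a unit bearing ray requires a camera calibration the text does not specify, so the ray inputs here are assumed to be already computed.

```python
import numpy as np

# One way to triangulate a laser-line point seen by two horizontally
# displaced fisheye cameras: convert each pixel to a unit bearing ray via
# an (assumed, pre-computed) fisheye calibration, then take the midpoint
# of the closest approach between the two rays.

def triangulate_rays(origin_a, dir_a, origin_b, dir_b):
    """Midpoint of the shortest segment between two skew rays."""
    d_a = dir_a / np.linalg.norm(dir_a)
    d_b = dir_b / np.linalg.norm(dir_b)
    w0 = origin_a - origin_b
    a, b, c = d_a @ d_a, d_a @ d_b, d_b @ d_b
    d, e = d_a @ w0, d_b @ w0
    denom = a * c - b * b          # ~0 when rays are parallel (no parallax)
    t_a = (b * e - c * d) / denom
    t_b = (a * e - b * d) / denom
    return 0.5 * ((origin_a + t_a * d_a) + (origin_b + t_b * d_b))
```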
(19) At each point in time, a vertical stripe of the world will be triangulated. Over time, the laser line will be rotated over all yaw angles to provide full 360 degree range sensing capabilities, as illustrated in the accompanying drawings.
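The yaw sweep can be pictured as a simple scan loop. In this hypothetical sketch, set_yaw and capture_and_triangulate stand in for the rotation mechanism and the per-stripe triangulation; neither name comes from the patent.

```python
import numpy as np

# Sketch of the scan loop implied above: at each yaw step the vertical
# laser stripe yields one column of range points, and a full sweep of yaw
# angles accumulates a 360-degree point cloud.

def sweep_scan(set_yaw, capture_and_triangulate, yaw_step_deg=1.0):
    cloud = []
    for yaw in np.arange(0.0, 360.0, yaw_step_deg):
        set_yaw(yaw)                             # rotate the projected line
        cloud.extend(capture_and_triangulate())  # one vertical stripe of 3D points
    return np.asarray(cloud)                     # (N, 3) point cloud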
(22) Each imager is composed of a camera module, a spectral filter, and a wide-angle compound lens. The camera must be small in size and weight, while providing high sensitivity and a wide dynamic range. Depending on mission requirements, an optical bandpass filter can be installed to attenuate incoming ambient light. If no filter is installed, the imaging system can be used as a visible light imager to provide full 360 degree RGB imagery in addition to point clouds.
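One plausible way such an imager could isolate the projected line in software, given the bandpass filter described above, is per-row peak detection. This sketch assumes the laser return is the brightest pixel in each image row, which the text does not state explicitly.

```python
import numpy as np

# Hypothetical stripe extraction: with an NIR bandpass filter attenuating
# ambient light, the laser line dominates each image row, so a per-row
# intensity peak above a noise threshold localizes the stripe.

def find_stripe_columns(frame: np.ndarray, threshold: float = 50.0):
    """frame: (rows, cols) grayscale image. Returns col index per row, or -1."""
    cols = np.argmax(frame, axis=1)                 # brightest pixel in each row
    peaks = frame[np.arange(frame.shape[0]), cols]  # its intensity
    cols[peaks < threshold] = -1                    # reject rows with no laser return
    return cols
```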
(23) A laser projection unit consists of a solid-state laser diode, laser pulsing circuitry, an aspheric collimation lens, a beam splitter, a small rotating mirror, and a laser line lens. The laser circuitry pulses the laser while also providing a frame trigger to each imager. The laser light is collimated into a beam using a small aspheric lens directly in front of the laser. The laser beam is then split into an upward beam 403 and a downward beam 404. Each beam 403, 404 is reflected off a small rotating mirror coupled to a laser line lens. The upward beam 403 creates a laser line that extends from horizontal to positive 80 degrees pitch, while the downward beam 404 creates a laser line that extends from horizontal to negative 80 degrees pitch.
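The geometry of the split beam can be illustrated by sampling direction vectors over the resulting negative 80 to positive 80 degree fan at a given yaw. The coordinate convention (x forward, y left, z up) is an assumption for illustration.

```python
import numpy as np

# Geometry sketch of the projected line: the split beams fan from -80 to
# +80 degrees pitch in a vertical plane at the scanner's current yaw.

def laser_line_directions(yaw_deg: float, n_samples: int = 161):
    pitch = np.radians(np.linspace(-80.0, 80.0, n_samples))
    yaw = np.radians(yaw_deg)
    # Unit vectors under an assumed x-forward, y-left, z-up convention.
    return np.stack([np.cos(pitch) * np.cos(yaw),
                     np.cos(pitch) * np.sin(yaw),
                     np.sin(pitch)], axis=1)
```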
(24) The proposed field of view is shown in the accompanying drawings.
(25) The structured light sensor will be able to measure 360 degrees horizontally and 160 degrees vertically. At each point in time, the sensor will generate approximately 2080 vertical range measurements. With each imager capturing approximately 180 images per second, the sensor will be able to generate over 370,000 points per second.
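The quoted throughput follows directly from the two stated figures, as this back-of-envelope check shows.

```python
# Back-of-envelope check of the stated throughput figures.
points_per_stripe = 2080      # vertical range measurements per instant
frames_per_second = 180       # per-imager capture rate
print(points_per_stripe * frames_per_second)  # 374400, i.e. over 370,000 points/s
```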
(26) The yaw scan rate can be varied depending upon the current mission needs. The sensor can be operated with a fine yaw resolution and a slow scan rate, providing detailed scans of the environment; or, the sensor can be operated with a faster yaw rate, providing faster updates at a coarser resolution.
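The trade-off reduces to one division: at a fixed per-imager frame rate, the yaw angle between successive stripes is the scan rate divided by the frame rate. The scan rates below are illustrative, not from the patent.

```python
# Yaw resolution vs. scan rate at a fixed 180 Hz frame rate (illustrative).
def yaw_resolution_deg(scan_rate_deg_per_s: float, frame_rate_hz: float = 180.0):
    return scan_rate_deg_per_s / frame_rate_hz

print(yaw_resolution_deg(36.0))    # slow scan: 0.2 deg between stripes
print(yaw_resolution_deg(360.0))   # fast scan: 2.0 deg, full sweep each second
```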
(27) Since this device relies on triangulation, the range accuracy will be dependent on range. The expected range error 600 is shown in the accompanying drawings.
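The range dependence follows from propagating a fixed disparity uncertainty through the triangulation relation range = f·B/d, which makes the expected error grow roughly with the square of range. The constants below are illustrative, not the patent's error curve 600.

```python
# Why triangulation accuracy degrades with distance: a fixed disparity
# uncertainty pushed through range = f*B/d yields error ~ range**2.
def expected_range_error_m(range_m, focal_px=600.0, baseline_m=0.10,
                           disparity_sigma_px=0.25):
    return (range_m ** 2) * disparity_sigma_px / (focal_px * baseline_m)

for r in (1.0, 2.0, 5.0, 10.0):
    print(r, expected_range_error_m(r))  # ~0.004, 0.017, 0.10, 0.42 m
```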
(28) A second approach is to use a time-of-flight line sensor to perform the same task as shown with the structured light sensor. The line sensors can be organized as seen in the accompanying drawings.
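Unlike the structured-light approach, a time-of-flight sensor derives range from the pulse round-trip time rather than from parallax, as the one-line relation below shows.

```python
# Time-of-flight ranging: range = c * t / 2 (half the round trip).
C = 299_792_458.0  # speed of light, m/s

def tof_range_m(round_trip_s: float) -> float:
    return C * round_trip_s / 2.0

print(tof_range_m(33.3e-9))  # ~5 m for a ~33 ns round trip
```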
(29) One more possible configuration is the same as that shown in the accompanying drawings.
(30) The system is composed of a quadrotor, or other UAV, and one or more range sensors that are used to sense the surrounding environment.
(31) Thus, it is appreciated that the optimum dimensional relationships for the parts of the invention, to include variation in size, materials, shape, form, function, and manner of operation, assembly and use, are deemed readily apparent and obvious to one of ordinary skill in the art, and all equivalent relationships to those illustrated in the drawings and described in the above description are intended to be encompassed by the present invention.
(32) Furthermore, other areas of art may benefit from this method and adjustments to the design are anticipated. Thus, the scope of the invention should be determined by the appended claims and their legal equivalents, rather than by the examples given.