Patent classifications
G01S7/51
Augmenting panoramic LIDAR results with color
Methods and systems can augment 360-degree panoramic LIDAR results (e.g., from a spinning LIDAR system) with color obtained from color cameras. A color-pixel-lookup table can specify the correspondence between LIDAR pixels (depth/ranging pixels) and color pixels, a correspondence that may vary with the distance to the viewed object. The operation of the color cameras can be triggered by the angular positions of the LIDAR system. For example, a color image of a particular camera can be captured when the LIDAR system is at a particular angular position, which can be predetermined based on properties of the cameras (e.g., shutter speed). Alternatively or in addition, a common internal clock can be used to assign timestamps to LIDAR and color pixels as they are captured. The corresponding color pixel(s) with the closest timestamp, e.g., as determined using the color-pixel-lookup table, can be used for colorization.
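The timestamp-matching step described in the abstract can be sketched as follows. This is an illustrative reading, not the patented implementation; the data layout (pixel IDs, per-frame pixel dictionaries) and function name are assumptions.

```python
from bisect import bisect_left

def colorize(lidar_pixels, color_frames, lookup):
    """Assign a color to each LIDAR pixel (hypothetical data layout).

    lidar_pixels: list of (pixel_id, timestamp) tuples
    color_frames: list of (timestamp, {color_pixel_coord: (r, g, b)}),
                  sorted by timestamp
    lookup:       dict pixel_id -> color_pixel_coord (the
                  color-pixel-lookup table from the abstract)
    """
    frame_times = [t for t, _ in color_frames]
    colored = []
    for pixel_id, t in lidar_pixels:
        # Pick the color frame whose capture time is closest to the
        # LIDAR pixel's timestamp (common internal clock assumed).
        i = bisect_left(frame_times, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(frame_times)]
        j = min(candidates, key=lambda k: abs(frame_times[k] - t))
        _, pixels = color_frames[j]
        colored.append((pixel_id, pixels[lookup[pixel_id]]))
    return colored
```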
Laser scanner
A laser scanner comprises a scanner main unit, wherein the scanner main unit has a distance measuring light projecting component for projecting a distance measuring light, a light receiving component for receiving a reflected distance measuring light, a distance measuring component for performing a distance measurement based on a light receiving signal from the light receiving component, an optical axis deflecting unit for deflecting a distance measuring optical axis, a projecting direction detecting unit for detecting a deflection angle of the distance measuring optical axis, a storage component which stores three-dimensional design drawing data of an object to be measured, and a control component for controlling the optical axis deflecting unit and the distance measuring component, wherein the control component acquires scanning data by scanning a measurement range and extracts an actual ridge line or an actual intersection point of the object to be measured based on the scanning data.
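A minimal sketch of ridge extraction from scan data: a ridge line crossing a scan shows up as an abrupt change in slope between consecutive range samples. This is illustrative only; the patent's control component works on full 3-D scan data against design-drawing data, and the threshold parameter is an assumption.

```python
import math

def find_ridge_indices(profile, angle_threshold_deg=20.0):
    """Flag indices in a 1-D scan profile where the slope between
    consecutive samples changes sharply, a crude proxy for a ridge
    line crossing the scan line.
    """
    ridges = []
    for i in range(1, len(profile) - 1):
        s1 = profile[i] - profile[i - 1]      # incoming slope
        s2 = profile[i + 1] - profile[i]      # outgoing slope
        a1, a2 = math.atan(s1), math.atan(s2)
        if abs(math.degrees(a2 - a1)) > angle_threshold_deg:
            ridges.append(i)
    return ridges
```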
Information presentation system, moving vehicle, information presentation method, and non-transitory storage medium
An information presentation system according to the present disclosure makes a presentation unit present each of multiple different pieces of information such that an image displayed in a first form and a sound output in a second form are synchronized with each other on an individual basis. The information presentation system also makes the presentation unit shift, when presenting two different pieces of information, selected from the multiple different pieces of information, as two sounds in the second form, a timing to output one of the two sounds by at least a certain period of time with respect to a timing to output the other of the two sounds, in order to prevent the two sounds from overlapping with each other.
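The timing shift for overlapping sounds can be sketched as a simple scheduler: later sounds are delayed until the previous one has finished plus a minimum gap. Function and parameter names are assumptions, not from the patent text.

```python
def schedule_sounds(requests, min_gap=0.0):
    """Given (start_time, duration) requests for sounds, delay later
    sounds so that no two overlap, keeping at least min_gap between
    the end of one sound and the start of the next.
    """
    scheduled = []
    next_free = float("-inf")
    for start, duration in sorted(requests):
        start = max(start, next_free)   # shift the later sound if needed
        scheduled.append((start, duration))
        next_free = start + duration + min_gap
    return scheduled
```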
Rapid robust detection decreaser
A system for tracking targets. A sequence of sensor observations is processed with two thresholds: a first threshold and a second, higher threshold. Signals that exceed the first threshold are identified as low-confidence target detections and stored for possible future use. When a signal exceeds the higher second threshold, it is identified as a high-confidence detection, and one or more candidate tracks are formed, including the high-confidence detection and one or more low-confidence detections from within a neighborhood of the high-confidence detection.
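The two-threshold scheme above can be sketched in a few lines for 1-D observations. This is an illustrative reading with assumed names; the patent's neighborhood is presumably defined over the sensor's full observation space.

```python
def detect(observations, low_thresh, high_thresh, neighborhood):
    """Two-threshold detection sketch.

    observations: list of (position, amplitude) pairs.
    Signals between the thresholds are stored as low-confidence
    detections; each signal above high_thresh seeds a candidate
    track supported by stored low-confidence detections within
    `neighborhood` of its position.
    """
    low_conf = [(p, a) for p, a in observations
                if low_thresh < a <= high_thresh]
    tracks = []
    for p, a in observations:
        if a > high_thresh:
            nearby = [(q, b) for q, b in low_conf
                      if abs(q - p) <= neighborhood]
            tracks.append({"seed": (p, a), "support": nearby})
    return tracks
```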
DYNAMIC OBJECT DETECTION INDICATOR SYSTEM FOR AN AUTOMATED VEHICLE
A system includes a tracking system, a controller-circuit, and a device. The tracking system is configured to detect and track an object, and includes one or more of a computer vision system, a radar system, and a LIDAR system. The controller-circuit is disposed in a host vehicle, and is configured to receive detection signals from the tracking system, process the detection signals, determine whether an object is detected based on the processed detection signals, and, in accordance with a determination that an object is detected, output command signals. The device is adapted to be mounted to the host vehicle, and is configured to receive the command signals and thereby provide a dynamic visual indication adapted to change in accordance with orientation changes between the host vehicle and the object. The dynamic visual indication is viewable from outside of the host vehicle.
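One way the indication could change with orientation is to map the relative bearing from host vehicle to object onto discrete indicator segments around the vehicle. The segmentation and all names here are hypothetical, not from the patent text.

```python
import math

def indicator_segment(host_heading_deg, host_pos, obj_pos, n_segments=8):
    """Map the bearing from host vehicle to a detected object onto
    one of n_segments indicator segments, so the visual indication
    changes as the host/object orientation changes.
    """
    dx = obj_pos[0] - host_pos[0]
    dy = obj_pos[1] - host_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))       # world frame
    relative = (bearing - host_heading_deg) % 360.0  # host frame
    return int(relative // (360.0 / n_segments))
```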
DRIVER VISUALIZATION AND SEMANTIC MONITORING OF A VEHICLE USING LIDAR DATA
Methods are provided for using a light ranging system of a vehicle. A computing system receives, from light ranging devices, ranging data including distance vectors to environmental surfaces. A distance vector can correspond to a pixel of a three-dimensional image stream. The system can identify a pose of a virtual camera relative to the light ranging devices. The light ranging devices are separated from the pose by first vectors, which are used to translate some of the distance vectors. The system may determine colors associated with the translated distance vectors and display pixels of the three-dimensional image stream using the colors at pixel positions specified by the translated distance vectors. The system may use one or more vehicle models with the ranging data to provide semantic labels that describe a region that has been, or is likely to be, in a collision.
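The translation step can be sketched as adding each device's offset vector (the "first vector" separating it from the virtual camera pose) to that device's distance vectors. This sketch handles pure translation only and ignores any rotation of the virtual pose; the data layout is an assumption.

```python
def translate_to_virtual_camera(distance_vectors, device_offsets):
    """Re-express ranging returns in a virtual camera's frame.

    distance_vectors: dict device_id -> list of (x, y, z) returns,
                      each in its device's frame
    device_offsets:   dict device_id -> (x, y, z) offset from the
                      virtual camera position to that device
    """
    translated = []
    for dev, vectors in distance_vectors.items():
        ox, oy, oz = device_offsets[dev]
        for x, y, z in vectors:
            # Point relative to the virtual camera = device offset
            # plus the point's position relative to the device.
            translated.append((x + ox, y + oy, z + oz))
    return translated
```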
REAL-TIME MONITORING OF SURROUNDINGS OF MARINE VESSEL
Real-time monitoring of surroundings of a marine vessel. One or more observation sensor modules are configured and positioned to generate sensor data extending around the marine vessel. One or more data processors are configured to map and visualize the sensor data in relation to a virtual model of the marine vessel. A user interface is configured to display the virtual model together with the visualized sensor data from a user selectable point of view to a mariner of the marine vessel.