Patent classifications
G01S7/51
Sensor module for applications with handheld rangefinder instrument
A sensor module for connection to a handheld rangefinder instrument and for providing sensor data for the rangefinder instrument includes an electronic interface for connection to an applicable interface of the rangefinder instrument, and at least one electronic sensor component for generating the sensor data. A system for handheld measurement of distances to a surface region of an object includes a sensor module of this kind and a handheld rangefinder instrument having a first laser rangefinder. When the interfaces are connected to one another, the sensor module is configured for transmitting the sensor data to the rangefinder instrument, and the interfaces are configured for transmitting electric power from the rangefinder instrument to the sensor module, in which the sensor module is configured to operate the at least one sensor component by means of the electric power transmitted by the rangefinder instrument.
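The claimed interaction (the module returns sensor data while drawing its operating power from the instrument over the shared interface) can be sketched as follows. This is a minimal illustrative model, not the patent's implementation; all class and method names, and the power figures, are assumptions.

```python
# Hypothetical sketch of the claimed module/instrument interface: the
# rangefinder supplies electric power over the shared interface, and the
# module uses that power to operate its sensor component and return data.

class RangefinderInstrument:
    def __init__(self, supply_watts: float):
        self.supply_watts = supply_watts  # power available on the interface

class SensorModule:
    def __init__(self, draw_watts: float):
        self.draw_watts = draw_watts      # power the sensor component needs
        self.instrument = None

    def connect(self, instrument: RangefinderInstrument) -> bool:
        # The module operates only if the instrument can power it.
        if instrument.supply_watts >= self.draw_watts:
            self.instrument = instrument
            return True
        return False

    def read_sensor(self) -> dict:
        # Sensor data is generated only while powered by the instrument.
        if self.instrument is None:
            raise RuntimeError("module not connected to a rangefinder")
        return {"inclination_deg": 1.5}  # placeholder sensor reading

rf = RangefinderInstrument(supply_watts=2.0)
mod = SensorModule(draw_watts=0.5)
mod.connect(rf)
print(mod.read_sensor())
```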
Floor surveying system
A floor surveying system comprises a self-contained mobile unit that performs simultaneous localization and mapping (SLAM) using wheel odometry data and data from a range finding laser device (RFLD), a digital camera, or both. Point cloud data for mapping is collected using an RFLD positioned near the floor and scanning in a plane perpendicular to the floor. 3D point cloud data representing the floor surface and surfaces in apposition to the floor are displayed and used to generate a floor map or floor plan.
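The geometry described above can be sketched briefly: because the RFLD scans in a plane perpendicular to the floor, each (angle, range) return maps to a forward offset and a height, which the current odometry pose places in world coordinates. The differential-drive dead-reckoning model and all numbers below are assumptions for illustration; the abstract only says "wheel odometry data".

```python
import math

# Hypothetical sketch: fuse wheel-odometry poses with a vertically
# scanning range finder to accumulate a 3D floor point cloud.

def odometry_pose(x, y, heading, d_left, d_right, wheel_base):
    # Differential-drive dead reckoning (an assumed odometry model).
    d = (d_left + d_right) / 2.0
    heading += (d_right - d_left) / wheel_base
    return x + d * math.cos(heading), y + d * math.sin(heading), heading

def scan_to_world(pose, scan, sensor_height=0.1):
    # Each return in the vertical scan plane becomes a 3D world point.
    x, y, heading = pose
    points = []
    for angle, rng in scan:
        forward = rng * math.cos(angle)            # along robot heading
        z = sensor_height + rng * math.sin(angle)  # height above floor
        points.append((x + forward * math.cos(heading),
                       y + forward * math.sin(heading),
                       z))
    return points

pose = (0.0, 0.0, 0.0)
pose = odometry_pose(*pose, d_left=0.10, d_right=0.10, wheel_base=0.3)
cloud = scan_to_world(pose, [(0.0, 1.0), (math.pi / 2, 0.5)])
print(cloud)
```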
Solution path overlay interfaces for autonomous vehicles
Methods and systems for controlling a vehicle are disclosed herein. A method includes receiving vehicle data and external data from a vehicle control system of a vehicle and generating an environment representation of an area of the transportation network proximate to the vehicle location. The method includes displaying the environment representation in a graphical user interface (GUI) and receiving a solution path via the GUI, the solution path indicating a route and one or more stop points. The method includes transmitting the route, including a respective geolocation of each of the one or more stop points, to the vehicle. The vehicle receives the route and begins traversing the transportation network based on the solution path. The method includes receiving updated vehicle data and/or updated external data from the vehicle and updating the environment representation based thereon. The method includes displaying the updated environment representation via the GUI.
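The round trip described above (a route with stop points drawn in the GUI, then transmitted to the vehicle with a geolocation per stop point) can be sketched as a simple data structure. Every class and field name here is illustrative, not from the patent.

```python
# Hypothetical sketch of the claimed solution-path message: the route
# is sent to the vehicle together with the respective geolocation of
# each stop point the operator placed in the GUI.

from dataclasses import dataclass, field

@dataclass
class StopPoint:
    lat: float
    lon: float

@dataclass
class SolutionPath:
    route: list                      # ordered (lat, lon) waypoints
    stop_points: list = field(default_factory=list)

    def to_vehicle_message(self) -> dict:
        # Payload transmitted to the vehicle control system.
        return {
            "route": self.route,
            "stops": [(p.lat, p.lon) for p in self.stop_points],
        }

path = SolutionPath(route=[(37.0, -122.0), (37.1, -122.0)],
                    stop_points=[StopPoint(37.05, -122.0)])
msg = path.to_vehicle_message()
print(msg)
```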
Point cloud display method and apparatus
A point cloud display method includes determining, from a first point cloud, points describing a target object, where the first point cloud describes the surrounding area of a vehicle in which an in-vehicle system is located and the target object is an object to be identified by the in-vehicle system; generating a second point cloud based on the points; and displaying the second point cloud.
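The core step (selecting the subset of the first point cloud that describes the target object and emitting it as a second cloud) reduces to a filter, assuming per-point object labels are available from an upstream perception step. The labelling source and all names below are assumptions.

```python
# Hypothetical sketch of the claimed method: keep only the points of
# the first point cloud that describe the target object, producing a
# second point cloud for display.

def extract_target_cloud(first_cloud, labels, target_label):
    """first_cloud: list of (x, y, z); labels: per-point object labels
    (assumed to come from the in-vehicle perception stack)."""
    return [pt for pt, lab in zip(first_cloud, labels) if lab == target_label]

first_cloud = [(1.0, 0.0, 0.2), (5.0, 2.0, 0.5), (5.1, 2.1, 0.6)]
labels = ["road", "pedestrian", "pedestrian"]

second_cloud = extract_target_cloud(first_cloud, labels, "pedestrian")
print(second_cloud)  # points describing the target object only
```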
Driver visualization and semantic monitoring of a vehicle using LiDAR data
Methods are provided for using a light ranging system of a vehicle. A computing system receives, from light ranging devices, ranging data including distance vectors to environmental surfaces. A distance vector can correspond to a pixel of a three-dimensional image stream. The system can identify a pose of a virtual camera relative to the light ranging devices. The light ranging devices are separated from the pose by first vectors, which are used to translate some of the distance vectors. The system may determine colors associated with the translated distance vectors and display pixels of the three-dimensional image stream using the colors at pixel positions specified by the translated distance vectors. The system may use one or more vehicle models with the ranging data to provide semantic labels that describe a region that has been, or is likely to be, in a collision.
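The translation step can be sketched in a few lines: each ranging device is offset from the virtual-camera pose by a "first vector", so its distance vectors are shifted into the virtual camera's frame before being drawn as colored pixels. Rotation is omitted for brevity, and the coloring rule is invented for illustration; both are assumptions beyond the abstract.

```python
# Hypothetical sketch: shift a device's distance vectors by the first
# vector (device position relative to the virtual camera), then assign
# each translated point a display color.

def translate_to_virtual_camera(distance_vectors, first_vector):
    ox, oy, oz = first_vector
    return [(x + ox, y + oy, z + oz) for x, y, z in distance_vectors]

def color_by_height(points):
    # Toy coloring rule (an assumption): points above 1 m render red.
    return ["red" if z > 1.0 else "gray" for _, _, z in points]

device_points = [(2.0, 0.0, 0.5), (3.0, 1.0, 1.5)]
first_vector = (0.0, -1.0, 0.2)   # illustrative device offset

translated = translate_to_virtual_camera(device_points, first_vector)
print(list(zip(translated, color_by_height(translated))))
```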
Threat detection and notification system for public safety vehicles
The present invention is directed to a threat detection and notification system which preferably comprises a LIDAR unit capable of detecting an object in its field of view during a first and subsequent refresh scans, and outputting point cloud data representative of the object detected in a refresh frame corresponding to every refresh scan. A server is operatively connected to the LIDAR unit and is capable of receiving the point cloud data and determining, for the object detected within each refresh frame, a predetermined set of object-specific attributes. A user control interface is operatively connected to the server and is capable of creating a watch zone defining an area of interest less than the LIDAR unit's field of view, and is further capable of defining a plurality of watch mode parameters associated with the watch zone. A LIDAR alert engine is operatively connected to the server and is capable of determining whether the object detected in each refresh frame is located within the watch zone, calculating at least one indicant of motion of the object, and comparing at least one object-specific attribute and the at least one indicant of motion of the object to the defined watch mode parameters. The LIDAR alert engine is capable of alerting a predetermined user to the presence of the object in the event the object is within the watch zone and the aforementioned comparison indicates an alert condition.
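The alert logic above (object inside the watch zone, plus at least one indicant of motion compared against the watch mode parameters) can be sketched as follows. A rectangular zone, speed as the indicant of motion, and the threshold value are all illustrative assumptions, not the patent's definitions.

```python
import math

# Hypothetical sketch of the claimed alert check: per refresh frame,
# test whether the detected object lies inside the operator-defined
# watch zone, compute an indicant of motion (here, speed between
# frames), and alert when the watch-mode parameters are met.

def in_zone(pt, zone):
    (xmin, ymin), (xmax, ymax) = zone  # rectangular watch zone (assumed)
    return xmin <= pt[0] <= xmax and ymin <= pt[1] <= ymax

def speed(prev, curr, dt):
    # Indicant of motion: displacement between refresh frames over time.
    return math.hypot(curr[0] - prev[0], curr[1] - prev[1]) / dt

def check_alert(prev, curr, dt, zone, params):
    # Alert condition: inside the zone AND moving at or above the
    # watch-mode speed threshold.
    return in_zone(curr, zone) and speed(prev, curr, dt) >= params["min_speed"]

zone = ((0.0, 0.0), (10.0, 10.0))
params = {"min_speed": 1.0}          # m/s, an assumed watch-mode parameter

alert = check_alert(prev=(4.0, 4.0), curr=(4.0, 5.5), dt=1.0,
                    zone=zone, params=params)
print("ALERT" if alert else "no alert")
```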