G01S7/51

MEASUREMENT SYSTEM, MEASUREMENT METHOD, AND MEASUREMENT PROGRAM
20200363530 · 2020-11-19

A measurement system includes: a storage configured to store design information including at least a dimension and a designed position of each of building components at a construction site; a component identifier configured to identify at least one of the building components constructed at the construction site as an identified building component, based on the designed position stored in the storage; a scanner configured to measure a distance and/or angle from the identified building component identified by the component identifier; a point cloud data generator configured to generate three-dimensional point cloud data, based on a result of the distance and/or angle measured by the scanner; and an actually measured position calculator configured to calculate an actually measured position of the identified building component, based on the three-dimensional point cloud data.
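The claimed comparison between designed and actually measured positions can be illustrated with a minimal sketch (not the patented implementation): the actually measured position of an identified component is taken here as the centroid of its scan points, and the deviation from the designed position stored in the design information is the Euclidean distance between the two. The sample coordinates are hypothetical.

```python
import numpy as np

def measured_position(points: np.ndarray) -> np.ndarray:
    """Actually measured position: centroid of the component's 3D scan points."""
    return points.mean(axis=0)

def deviation(designed: np.ndarray, measured: np.ndarray) -> float:
    """Euclidean distance between designed and actually measured positions."""
    return float(np.linalg.norm(measured - designed))

# Hypothetical column scanned slightly off its designed position.
cloud = np.array([[1.01, 2.0, 0.0],
                  [0.99, 2.0, 0.0],
                  [1.00, 2.02, 0.0],
                  [1.00, 1.98, 0.0]])
designed = np.array([1.0, 2.0, 0.05])
print(deviation(designed, measured_position(cloud)))  # ≈ 0.05
```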

Scalable feature classification for laser scanning data and digital elevation models

Scalable feature classification for 3D point cloud data is provided. In one aspect, a method for rasterizing 3D point cloud data includes: obtaining the 3D point cloud data; generating a digital elevation model (DEM) from the 3D point cloud data; decomposing the DEM into local and global fluctuations to obtain a local DEM; generating geo-referenced shapes by automatically thresholding the local DEM; cropping and normalizing the local DEM using minimum bounding boxes derived from the geo-referenced shapes and manual annotations from subject matter experts to create a cropped DEM; and linking geo-spatially tagged labels from the subject matter experts to the cropped DEM. These data can then be fed directly into a system having an ensemble of artificial neural networks. By way of example, a scalable ecosystem built on the geo-spatial platform IBM PAIRS is presented.
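The local/global decomposition and automatic thresholding steps can be sketched as follows. This is a simplified stand-in for the patented pipeline, assuming the global fluctuation is approximated by a k×k mean filter and the "shapes" are reduced to a boolean mask of above-threshold cells; the filter size and threshold are illustrative.

```python
import numpy as np

def box_filter(dem: np.ndarray, k: int) -> np.ndarray:
    """Crude k×k mean filter (edge-padded) — stands in for the global trend."""
    pad = k // 2
    padded = np.pad(dem, pad, mode="edge")
    out = np.zeros_like(dem, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + dem.shape[0], dx:dx + dem.shape[1]]
    return out / (k * k)

def local_dem(dem: np.ndarray, k: int = 3) -> np.ndarray:
    """Local fluctuations: DEM minus its smoothed (global) component."""
    return dem - box_filter(dem, k)

def threshold_mask(local: np.ndarray, t: float) -> np.ndarray:
    """Candidate shapes as a boolean mask of above-threshold cells."""
    return local > t

# Flat 5×5 terrain at 10 m with a single 3 m bump: only the bump survives.
dem = np.full((5, 5), 10.0)
dem[2, 2] = 13.0
mask = threshold_mask(local_dem(dem), 1.0)
```

In practice the mask would be vectorized into geo-referenced polygons whose minimum bounding boxes drive the cropping step.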

System and method for calibrating at least one camera and a light detection and ranging sensor

A system for calibrating at least one camera and a light detection and ranging (LiDAR) sensor includes one or more processors and a memory in communication with the one or more processors that stores an initial calibration module and a user calibration module. The initial calibration module includes instructions that cause the one or more processors to obtain a camera image from the at least one camera, determine when the camera image includes at least one object having at least one edge, obtain a three-dimensional point cloud image from the LiDAR sensor that includes the at least one object having at least one edge, and generate a combined image that includes at least portions of the camera image and at least portions of the three-dimensional point cloud image. The user calibration module includes instructions that cause the one or more processors to display the combined image on a display, receive at least one input from a user interface, and adjust a calibration parameter of at least one of the LiDAR sensor and the at least one camera in response to the at least one input from the user interface.
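The combined-image step rests on projecting LiDAR points into the camera frame; the sketch below shows the standard pinhole projection that such an overlay would use (this is textbook geometry, not the patent's specific code). The intrinsic matrix and point coordinates are illustrative; a user nudging the extrinsic translation `t` would visibly shift the projected overlay until it lines up with the image edges.

```python
import numpy as np

def project(points: np.ndarray, K: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Project 3D LiDAR points (N,3) into pixel coordinates via a pinhole model."""
    cam = points @ R.T + t        # LiDAR frame -> camera frame (extrinsics)
    uv = cam @ K.T                # apply camera intrinsics
    return uv[:, :2] / uv[:, 2:3] # perspective divide

# Illustrative intrinsics: 500 px focal length, principal point (320, 240).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)        # initial extrinsic rotation
t = np.zeros(3)      # extrinsic translation — the user-adjusted parameter
pts = np.array([[0.0, 0.0, 10.0]])
print(project(pts, K, R, t))  # point on the optical axis -> image centre (320, 240)
```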

SYSTEM AND METHOD FOR ENHANCING A 3D RENDERING OF A LIDAR POINT CLOUD
20200357190 · 2020-11-12

A system and method for improved visualization include a processor for receiving a signal from a LIDAR system measuring a target site. The received signal includes a collection of points representing a 3D space and a signal strength for each point. The processor may execute at least one of intensity, color, size, and opacity modules stored in a memory to adjust at least one attribute of a point of the point cloud based on the signal strength of that point, generating an improved 3D image of the point cloud for display. By correlating the received signal strength with at least one of intensity, color, size, and opacity, an improved and more visually pleasing 3D image of the point cloud may be obtained and displayed.
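The strength-to-attribute mapping can be sketched as a simple per-point style function. The particular ranges (gray level, 0.2–1.0 opacity, 1–3 px size) are illustrative assumptions, not values from the patent:

```python
def style_point(strength: float, s_min: float, s_max: float):
    """Map a LiDAR return strength to (gray level, opacity, size) for rendering."""
    x = (strength - s_min) / (s_max - s_min)
    x = min(1.0, max(0.0, x))      # clamp to [0, 1]
    color = int(255 * x)           # brighter = stronger return
    opacity = 0.2 + 0.8 * x        # weak returns fade out
    size = 1.0 + 2.0 * x           # strong returns drawn larger
    return color, opacity, size

# A mid-range return gets mid-gray, 60% opacity, 2 px size.
color, opacity, size = style_point(75.0, s_min=50.0, s_max=100.0)
```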

METHODS AND SYSTEMS FOR MACHINE STATE RELATED VISUAL FEEDBACK IN A ROBOTIC DEVICE
20200356094 · 2020-11-12

A robotic device including a plurality of light emitting modules and a processor is disclosed. The processor is configured to identify one or more machine states of the robotic device and a status of each of the one or more machine states, select at least one of the one or more machine states, determine a visual pattern corresponding to a status of the at least one selected machine state, and cause the plurality of light emitting modules to output the visual pattern. The one or more machine states include one or more of the following: a navigation state, a sensor function state, a collision alert state, or an error state.
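The state-selection and pattern-lookup logic can be sketched as a priority table. The state names, pattern names, and priority order below are hypothetical placeholders for the mapping the patent describes:

```python
# Hypothetical (state, status) -> LED pattern table.
PATTERNS = {
    ("collision_alert", "active"): "flash_red",
    ("error", "active"): "solid_red",
    ("sensor", "degraded"): "pulse_yellow",
    ("navigation", "moving"): "sweep_blue",
}
# Safety-relevant states outrank routine ones.
PRIORITY = ["collision_alert", "error", "sensor", "navigation"]

def select_pattern(states: dict) -> str:
    """Pick the pattern for the highest-priority machine state whose
    current status has a mapped visual pattern."""
    for name in PRIORITY:
        status = states.get(name)
        if (name, status) in PATTERNS:
            return PATTERNS[(name, status)]
    return "idle_white"  # fallback when nothing is mapped

# A collision alert overrides the navigation pattern.
print(select_pattern({"navigation": "moving", "collision_alert": "active"}))
```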

Night image display apparatus and image processing method thereof

The present invention relates to a night image output apparatus, comprising: a photographing unit comprising an optical pulse output unit for outputting optical pulses and an image sensor for forming a plurality of images using optical pulses reflected by an external object; a display unit for outputting a final image made by synthesizing the plurality of images; and a control unit for calculating object distance information for each pixel of the final image, by using the light quantity ratio of the plurality of images at each pixel together with a known relation between light quantity ratio and distance.
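The per-pixel distance calculation can be illustrated with a toy range-gating model: two gated exposures capture different fractions of the reflected pulse depending on where the object sits in the gate, and the ratio of their light quantities maps to distance. The linear ratio-to-distance relation below is an assumed simplification, not the apparatus's actual calibration.

```python
def distance_from_ratio(q_a: float, q_b: float, d_min: float, d_max: float) -> float:
    """Toy gated-imaging model: the fraction of reflected light captured in
    exposure A versus the total encodes position within the gate [d_min, d_max].
    Assumes q_a + q_b > 0 (some reflected light was received)."""
    r = q_a / (q_a + q_b)            # light quantity ratio for this pixel
    return d_min + (1.0 - r) * (d_max - d_min)

# Equal light in both exposures -> object at the middle of the gate.
print(distance_from_ratio(1.0, 1.0, d_min=10.0, d_max=50.0))  # 30.0
```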
