G01S7/51

Image identification system
10983681 · 2021-04-20

Embodiments may relate to a graphical user interface (GUI). The GUI may include a first portion that displays an image related to images of a location. The GUI may also include a second portion that displays an image related to detection and ranging information of the location. The two images may be linked such that an interaction with an object in one portion of the GUI causes changes in the other portion of the GUI. Other embodiments may be described or claimed.
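
The linked-interaction behavior described above can be pictured with a minimal Python sketch. The class, method, and object names below are illustrative assumptions, not taken from the patent; the point is only that a selection made in one portion of the GUI is propagated to the other portion.

class LinkedPortion:
    """One portion of the GUI (e.g., the camera-image or the ranging view)."""

    def __init__(self, name):
        self.name = name
        self.peer = None          # the other portion of the GUI
        self.highlighted = set()  # object ids currently highlighted

    def link(self, other):
        # Link the two portions so interactions propagate between them.
        self.peer, other.peer = other, self

    def on_click(self, object_id):
        # A user interaction with an object in this portion...
        self.highlighted.add(object_id)
        # ...causes a corresponding change in the other portion.
        if self.peer is not None:
            self.peer.highlight(object_id)

    def highlight(self, object_id):
        self.highlighted.add(object_id)
        print(f"{self.name}: highlighting object {object_id}")

image_view = LinkedPortion("camera image portion")
ranging_view = LinkedPortion("detection-and-ranging portion")
image_view.link(ranging_view)
image_view.on_click("vehicle_17")  # also highlights vehicle_17 in ranging_view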

AERIAL LINE EXTRACTION SYSTEM AND METHOD

A technique facilitates selecting and designating an arbitrary one of a plurality of aerial lines. The aerial line extraction system includes: an area-of-interest cropping unit that crops, from three-dimensional point cloud data, a region where an aerial line is assumed to exist as an area of interest, using a support of the aerial line as a reference; an element segmenting unit that segments the area of interest into a plurality of subdivided areas, obtains a histogram by counting the three-dimensional points existing in each of the subdivided areas, and obtains a segmentation plane of the area of interest on the basis of the histogram; and an element display unit that divides the area of interest into a plurality of segmented areas at the segmentation plane and displays the three-dimensional points included in each of the segmented areas in a distinguishable manner.
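
A rough Python/NumPy sketch of the cropping-and-histogram splitting described above is given below. It assumes the point cloud is an (N, 3) array, that the area of interest is split along a single horizontal axis, and that the segmentation plane is placed at the least-occupied histogram bin; these are simplifying assumptions, not details taken from the patent.

import numpy as np

def extract_aerial_line_segments(points, support_xy, radius=15.0, bins=40):
    # points:     (N, 3) array of x, y, z coordinates from the 3D point cloud
    # support_xy: ground-plane position of the aerial line's support (pole)

    # Area-of-interest cropping: keep points within `radius` of the support.
    d = np.linalg.norm(points[:, :2] - np.asarray(support_xy), axis=1)
    aoi = points[d < radius]

    # Histogram: count the points falling into each subdivided area along x.
    counts, edges = np.histogram(aoi[:, 0], bins=bins)

    # Segmentation plane: here, the x position of the least-occupied bin.
    cut = edges[np.argmin(counts)]

    # Split the area of interest at the segmentation plane so that each
    # segmented area (candidate aerial line) can be displayed distinguishably.
    return aoi[aoi[:, 0] < cut], aoi[aoi[:, 0] >= cut]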

DOUBLE-END LASER RANGEFINDER

A double-end laser rangefinder includes a ranging board, a control board, and a key board, which are integrated to form a mainboard. Two lens-mounting bases are coaxially mounted at the two ends, a first end and a second end, of a main frame. Two lens groups are mounted on the two lens-mounting bases, respectively, and each lens group includes a transmitting lens and a receiving lens. The receiving lenses at the first end and the second end are coaxial, and the transmitting lenses at the first end and the second end are likewise coaxial. This coaxial arrangement of the light paths of the two transmitting lenses and of the two receiving lenses is realized through the main frame.

AUGMENTING PANORAMIC LIDAR RESULTS WITH COLOR
20210041570 · 2021-02-11

Methods and systems can augment 360-degree panoramic LIDAR results (e.g., from a spinning LIDAR system) with color obtained from color cameras. A color-pixel-lookup table can specify the correspondence between LIDAR pixels (depth/ranging pixels) and color pixels, and may do so for different object viewing distances. Operation of the color cameras can be triggered by the angular position of the LIDAR system. For example, a color image of a particular camera can be captured when the LIDAR system is at a particular angular position, which can be predetermined based on properties of the cameras (e.g., shutter speed). Alternatively or in addition, a common internal clock can be used to assign timestamps to LIDAR and color pixels as they are captured. The corresponding color pixel(s) with the closest timestamp, e.g., as determined using the color-pixel-lookup table, can then be used for colorization.
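
The timestamp-matching alternative can be sketched in a few lines of Python. The data structures below (per-pixel dicts, a list of timestamped camera frames, a precomputed lookup table keyed by LIDAR pixel id) are assumptions made for illustration only, not the patent's actual implementation.

import bisect

def colorize(lidar_pixels, camera_frames, color_pixel_lut):
    # lidar_pixels:    list of dicts with "id" and "timestamp" keys
    # camera_frames:   list of (timestamp, image) tuples sorted by timestamp,
    #                  where image is indexable as image[row][col]
    # color_pixel_lut: maps a LIDAR pixel id to its (row, col) in a color image
    frame_times = [t for t, _ in camera_frames]
    colored = []
    for px in lidar_pixels:
        # Pick the camera frame whose timestamp is closest to this LIDAR pixel's.
        i = bisect.bisect_left(frame_times, px["timestamp"])
        nearby = [j for j in (i - 1, i) if 0 <= j < len(camera_frames)]
        j = min(nearby, key=lambda k: abs(frame_times[k] - px["timestamp"]))
        _, image = camera_frames[j]
        # Look up the corresponding color pixel and attach its color.
        row, col = color_pixel_lut[px["id"]]
        colored.append({**px, "color": image[row][col]})
    return colored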

Method for operating a laser distance measuring device
10962631 · 2021-03-30

A method for operating a laser distance measuring device, in particular a hand-held laser distance measuring device, includes determining a first distance to a first target point with a laser distance measuring unit of the device by emitting a laser beam in a first distance measuring direction. The method further includes subsequently determining at least one second distance to an intended second target point. An image of at least the target environment of the second target point, captured by a camera of the laser distance measuring device, is displayed on a display of the device. At least part of a connection line is displayed overlaid on the image, and the connection line connects the first target point and the second target point in the displayed image. A laser distance measuring device implements the method in one embodiment.
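
The overlay step can be illustrated with a short Python sketch. It assumes a simple pinhole camera model with a known focal length and principal point, and that the first and second target points are available as 3D coordinates in the camera frame; the function names and numeric values are hypothetical.

def project_to_image(point_xyz, focal_px, cx, cy):
    # Simple pinhole projection of a 3D point (camera frame, meters) onto the
    # image plane; focal length and principal point are given in pixels.
    x, y, z = point_xyz
    return (cx + focal_px * x / z, cy + focal_px * y / z)

def connection_line_endpoints(first_target, second_target,
                              focal_px=800.0, cx=320.0, cy=240.0):
    # Pixel endpoints of the connection line between the first target point
    # and the second target point, to be drawn over the displayed camera image.
    return (project_to_image(first_target, focal_px, cx, cy),
            project_to_image(second_target, focal_px, cx, cy))

# Example: first target 4.2 m ahead and 1.0 m to the left of the camera,
# second target straight ahead at 3.0 m.
print(connection_line_endpoints((-1.0, 0.0, 4.2), (0.0, 0.0, 3.0)))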

Autonomous vehicle system for blending sensor data
10915112 · 2021-02-09

Systems and methods for generating blended sensor-based (LIDAR-based) visualizations are provided. In one example embodiment, a computer-implemented method includes obtaining sensor data associated with a surrounding environment of an autonomous vehicle, where the sensor data is acquired via a LIDAR system of the autonomous vehicle. The method includes generating a first style sheet associated with the surrounding environment of the autonomous vehicle based at least in part on the sensor data, and generating a second style sheet associated with the surrounding environment based at least in part on the sensor data. The method further includes providing an output for display via a user interface of a display device, the output including a visualization of the surrounding environment of the autonomous vehicle based at least in part on the first style sheet and the second style sheet.
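
The abstract does not define what a style sheet contains, so the Python sketch below assumes each one maps an object class seen in the LIDAR data to a rendering color, and that the displayed visualization blends the two sheets per class. Every name and structure here is an assumption made for illustration.

def build_style_sheet(sensor_points, palette):
    # One style entry per object class present in the LIDAR sweep; each point
    # is assumed to carry a "class" label from upstream perception.
    return {p["class"]: palette.get(p["class"], "#808080") for p in sensor_points}

def blend_styles(sheet_a, sheet_b, weight=0.5):
    # Blend two style sheets by mixing their hex colors per object class.
    def mix(c1, c2):
        rgb = [round((1 - weight) * int(c1[i:i + 2], 16)
                     + weight * int(c2[i:i + 2], 16)) for i in (1, 3, 5)]
        return "#{:02x}{:02x}{:02x}".format(*rgb)
    return {cls: mix(color, sheet_b.get(cls, color))
            for cls, color in sheet_a.items()}

# Example: an intensity-based sheet and a semantic sheet from the same sweep.
points = [{"class": "vehicle"}, {"class": "pedestrian"}]
a = build_style_sheet(points, {"vehicle": "#ff0000", "pedestrian": "#00ff00"})
b = build_style_sheet(points, {"vehicle": "#0000ff", "pedestrian": "#ffff00"})
print(blend_styles(a, b))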
