Patent classifications
G01S7/51
Image identification system
Embodiments may relate to a graphical user interface (GUI). The GUI may include a first portion that displays an image related to images of a location. The GUI may also include a second portion that displays an image related to detection and ranging information of the location. The two images may be linked such that an interaction with an object in one portion of the GUI causes changes in the other portion of the GUI. Other embodiments may be described or claimed.
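The linked two-portion behavior described above can be sketched as a shared selection state with observers: an interaction in either portion updates the same state, which then notifies both views. This is a minimal illustration of the linking idea; the class and method names are invented for the sketch and do not come from the patent.

```python
class LinkedViews:
    """Shared selection linking an image portion and a ranging portion.

    Selecting an object in one view notifies every registered view,
    so both portions of the GUI stay in sync.
    """

    def __init__(self):
        self.selected = None   # the currently selected object, shared by both portions
        self.listeners = []    # callbacks for each view to redraw/highlight

    def register(self, callback):
        """Register a view's update callback."""
        self.listeners.append(callback)

    def select(self, object_id):
        """Record a selection and propagate it to all linked views."""
        self.selected = object_id
        for notify in self.listeners:
            notify(object_id)


# Usage: a click in the image view also highlights the object
# in the detection-and-ranging view, and vice versa.
views = LinkedViews()
events = []
views.register(lambda obj: events.append(("image_view", obj)))
views.register(lambda obj: events.append(("ranging_view", obj)))
views.select("object_17")
```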
Measurement system, measurement method, and measurement program
A measurement system includes: a storage configured to store design information including at least a dimension and a designed position of each of building components at a construction site; a component identifier configured to identify at least one of the building components constructed at the construction site as an identified building component, based on the designed position stored in the storage; a scanner configured to measure a distance and/or angle from the identified building component identified by the component identifier; a point cloud data generator configured to generate three-dimensional point cloud data, based on a result of the distance and/or angle measured by the scanner; and an actually measured position calculator configured to calculate an actually measured position of the identified building component, based on the three-dimensional point cloud data.
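One simple way to realize the "actually measured position" step above is to segment the scanner's point cloud near a component's designed position and take the centroid of the nearby points. This is a sketch under that assumption; the function name, the fixed search radius, and the centroid choice are illustrative, not taken from the patent.

```python
import numpy as np

def actually_measured_position(point_cloud, designed_position, radius=0.5):
    """Estimate the as-built position of a building component.

    point_cloud       : (N, 3) array of XYZ points from the scanner
    designed_position : XYZ position of the component per the design data
    radius            : search radius in metres around the designed position

    Returns the centroid of points within `radius` of the designed
    position, or None if no points fall inside the search sphere.
    """
    pts = np.asarray(point_cloud, dtype=float)
    target = np.asarray(designed_position, dtype=float)
    dist = np.linalg.norm(pts - target, axis=1)
    nearby = pts[dist <= radius]
    if nearby.size == 0:
        return None  # component not found near its designed position
    return nearby.mean(axis=0)


# Example: a column designed at (2.0, 3.0, 0.0) was built slightly offset;
# a far-away point (from some other component) is excluded by the radius.
cloud = np.array([[2.1, 3.0, 0.0], [2.1, 3.1, 0.0], [9.0, 9.0, 9.0]])
pos = actually_measured_position(cloud, (2.0, 3.0, 0.0))
```

Comparing `pos` against the designed position then gives the construction deviation for that component.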
LIDAR-BASED IMMERSIVE 3D REALITY CAPTURE SYSTEMS, AND RELATED METHODS AND APPARATUS
LiDAR-based immersive 3D reality capture systems and methods are disclosed. The reality capture system includes a set of LiDAR sensors disposed around an environment and configured to capture one or more events occurring within the environment. The reality capture system also includes a corresponding set of cameras disposed around the environment. Each camera is mounted on a same gimbal with a corresponding LiDAR sensor and has a same optical axis as the corresponding LiDAR sensor. The reality capture system further includes a base station viewpoint generator coupled to the set of LiDAR sensors and the cameras to generate a video feed based on data received from the LiDAR sensors and the cameras. The reality capture system additionally includes a virtual reality device coupled to the base station viewpoint generator to receive and display the video feed generated by the base station viewpoint generator.
Information handling system infrared proximity detection with distance reduction detection
An information handling system manages operation of an infrared time of flight sensor to provide accurate and timely user presence and absence detection through monitoring of the time of flight distance detection for indications of object velocity that validates or invalidates a transition between the user presence and user absence states. An integrated sensor hub in a central processing unit stores distances received from the infrared time of flight sensor in a distance table in association with a time stamp of the distance measurement. During monitoring of distances received from the infrared time of flight sensor, if the integrated sensor hub detects a user absence or presence, validation of the transition is performed by analyzing the stored distances to determine a vector of velocity at the state transition.
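The validation step described above can be sketched as a small table of timestamped distances from which an approach/retreat velocity is estimated; a presence/absence transition is accepted only if the implied speed is physically plausible. The class, its thresholds, and the window size are illustrative assumptions, not details from the patent.

```python
from collections import deque

class PresenceValidator:
    """Validate presence/absence transitions using object velocity.

    Stores (timestamp, distance) pairs from an infrared time-of-flight
    sensor and estimates the velocity along the sensor axis; a state
    transition is treated as valid only when the implied speed is
    plausible for a person walking toward or away from the system.
    """

    def __init__(self, max_samples=8, max_speed_m_s=3.0):
        self.table = deque(maxlen=max_samples)  # (timestamp_s, distance_m)
        self.max_speed = max_speed_m_s          # assumed plausible human speed

    def record(self, timestamp_s, distance_m):
        """Store one distance measurement with its time stamp."""
        self.table.append((timestamp_s, distance_m))

    def velocity(self):
        """Signed velocity over the stored window in m/s.

        Negative means the object is closing on the sensor.
        """
        if len(self.table) < 2:
            return 0.0
        (t0, d0), (t1, d1) = self.table[0], self.table[-1]
        return (d1 - d0) / (t1 - t0)

    def transition_is_valid(self):
        """Accept a state change only at a physically plausible speed."""
        return abs(self.velocity()) <= self.max_speed


# Usage: distances shrinking at 0.6 m/s look like a user approaching,
# so a transition to the "present" state would be validated.
v = PresenceValidator()
v.record(0.0, 1.2)
v.record(0.5, 0.9)
```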
Systems and methods for commercial inventory mapping including determining if goods are still available
The following relates generally to light detection and ranging (LIDAR) and artificial intelligence (AI). In some embodiments, a system: receives sensor data via wireless communication; updates an electronic inventory of goods within a store based upon the received sensor data associated with the item movement or purchase; receives an electronic order of goods from a customer mobile device via wireless communication or data transmission over one or more radio frequency links; determines if the goods in the electronic order received from the customer are still available; generates a LIDAR-based virtual map of the store; determines a location of the goods in the electronic order that are still available; overlays the determined location of the goods onto the LIDAR-based virtual map of the store; and displays an updated LIDAR-based virtual map of the store displaying aisles of the store and the determined location of the goods within the store.
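The availability check and map overlay in the flow above can be sketched in a few lines: split the order against the live inventory, then map each still-available item to a shelf location for drawing onto the virtual store map. Function names, the inventory shape, and the `(aisle, shelf)` location format are assumptions made for the sketch.

```python
def fulfillable_items(order, inventory):
    """Split an electronic order into available and missing items.

    order     : list of item names from the customer's order
    inventory : dict mapping item name -> units currently on hand
    """
    available, missing = [], []
    for item in order:
        (available if inventory.get(item, 0) > 0 else missing).append(item)
    return available, missing


def overlay_on_map(available, shelf_locations):
    """Map each still-available item to its location on the store map.

    shelf_locations : dict mapping item name -> (aisle, shelf), e.g. as
    registered against the LiDAR-based virtual map of the store.
    """
    return {item: shelf_locations[item]
            for item in available if item in shelf_locations}


# Usage: eggs sold out after the order was placed, so only milk and
# bread are located and overlaid on the map.
inventory = {"milk": 4, "eggs": 0, "bread": 2}
shelves = {"milk": ("aisle 3", "shelf B"), "bread": ("aisle 1", "shelf A")}
avail, missing = fulfillable_items(["milk", "eggs", "bread"], inventory)
locations = overlay_on_map(avail, shelves)
```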
Eyewear display system
Provided is an eyewear display system that includes: a scanner including a measuring unit configured to acquire three-dimensional coordinates, a point cloud data acquiring unit configured to acquire point cloud data, and a communication unit; an eyewear device including a display, a relative position detection sensor, a relative direction detection sensor, and a communication unit; a storage device including CAD design data of an observation site; and a data processing device configured to synchronize a coordinate space of the scanner, a coordinate space of the eyewear device, and a coordinate space of the CAD design data, and convert observation data OD and/or observation data prediction PD into data in the coordinate space of the CAD design data, such that the eyewear device displays the observation data OD or observation data prediction PD on the display by superimposing it on an actual landscape.
LIDAR SYSTEM TO ADJUST DOPPLER EFFECTS
Doppler correction of phase-encoded LIDAR includes generating a code indicating a sequence of phases for a phase-encoded signal and determining a first Fourier transform of the signal. A laser optical signal is used as a reference and modulated based on the code to produce a transmitted phase-encoded optical signal. A returned optical signal is received in response. The returned optical signal is mixed with the reference, and the mixed optical signals are detected to produce an electrical signal. A cross spectrum is determined between in-phase and quadrature components of the electrical signal, and a Doppler shift is determined from a peak in the cross spectrum. A device is operated based on the Doppler shift. In some embodiments, a second Fourier transform of the electrical signal is corrected by the Doppler frequency shift to produce a corrected Fourier transform, which is then cross-correlated with the first Fourier transform; a range is determined based on a peak in the cross correlation.
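The cross-spectrum step above can be illustrated numerically: take the FFTs of the in-phase and quadrature components of the detected electrical signal, multiply one by the conjugate of the other, and read the Doppler frequency off the peak. This is a simplified, noise-free sketch of the idea (a pure tone stands in for a Doppler-shifted return), not the patented implementation.

```python
import numpy as np

def doppler_from_cross_spectrum(signal_iq, sample_rate_hz):
    """Estimate a Doppler frequency shift from a complex (I + jQ) signal.

    Forms the cross spectrum between the in-phase and quadrature
    components and returns the absolute frequency of its largest peak.
    """
    i = np.real(signal_iq)                       # in-phase component
    q = np.imag(signal_iq)                       # quadrature component
    cross = np.fft.fft(i) * np.conj(np.fft.fft(q))  # cross spectrum
    freqs = np.fft.fftfreq(len(signal_iq), d=1.0 / sample_rate_hz)
    peak = np.argmax(np.abs(cross))
    return abs(freqs[peak])


# Usage: a 1 kHz complex exponential stands in for the beat signal
# produced by mixing a Doppler-shifted return with the reference.
fs = 16_000                                   # sample rate, Hz
t = np.arange(1024) / fs
sig = np.exp(2j * np.pi * 1000.0 * t)         # simulated 1 kHz Doppler beat
shift = doppler_from_cross_spectrum(sig, fs)
```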
CLEANING ROBOT
A cleaning robot is provided. The cleaning robot includes a body, a light detection and ranging (LiDAR) module having a LiDAR sensor rotatably supported by the body, and a light-emitting display module mounted on the LiDAR module, wherein the light-emitting display module is configured to display an image based on an afterimage effect according to the rotation of the LiDAR module.