G01S7/51

INFORMATION HANDLING SYSTEM INFRARED PROXIMITY DETECTION WITH FREQUENCY DOMAIN MODULATION

An information handling system manages operation of an infrared time of flight sensor to provide accurate and timely user presence and absence detection through modulation in the frequency domain of infrared light, such as by hopping or multiplexing through plural infrared frequencies sensed by the time of flight sensor. An application of the information handling system retrieves calibration information from the infrared time of flight sensor and applies the calibration information to select the plural infrared frequencies. If one or more predetermined conditions are detected, such as a change in ambient light characteristics, the application commands an update of the calibration information to enhance user presence and absence detection.
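The abstract's scheme can be sketched as follows. This is a minimal illustrative model, not the patented implementation: the calibration format, noise ranking, and ambient-light threshold are all assumptions.

```python
# Hypothetical sketch of frequency-domain modulation for an IR time-of-flight
# sensor: select modulation frequencies from calibration data, hop through
# them per measurement, and reselect when ambient light shifts materially.

AMBIENT_CHANGE_THRESHOLD = 0.25  # fractional ambient change forcing recalibration

def select_frequencies(calibration, count=3):
    """Pick the `count` modulation frequencies with the lowest calibrated noise."""
    ranked = sorted(calibration.items(), key=lambda kv: kv[1])  # (freq_hz, noise)
    return [freq for freq, _noise in ranked[:count]]

class FrequencyHopper:
    def __init__(self, calibration, ambient_baseline):
        self.calibration = calibration
        self.ambient_baseline = ambient_baseline
        self.frequencies = select_frequencies(calibration)
        self._index = 0

    def next_frequency(self, ambient_now):
        # Recalibrate when ambient light characteristics change materially;
        # a real system would re-read the sensor's calibration here.
        if abs(ambient_now - self.ambient_baseline) / self.ambient_baseline > AMBIENT_CHANGE_THRESHOLD:
            self.ambient_baseline = ambient_now
            self.frequencies = select_frequencies(self.calibration)
        freq = self.frequencies[self._index % len(self.frequencies)]
        self._index += 1
        return freq

# Example calibration: modulation frequency (Hz) -> measured noise figure.
cal = {20_000_000: 0.05, 40_000_000: 0.02, 60_000_000: 0.08, 80_000_000: 0.03}
hopper = FrequencyHopper(cal, ambient_baseline=100.0)
print([hopper.next_frequency(100.0) for _ in range(4)])  # cycles the 3 quietest frequencies
```

Hopping through several frequencies lets the sensor reject interference that corrupts any single modulation frequency.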

User interface for displaying point clouds generated by a lidar device on a UAV

Techniques are disclosed for real-time mapping in a movable object environment. A system for real-time mapping in a movable object environment may include at least one movable object including a computing device, a scanning sensor electronically coupled to the computing device, and a positioning sensor electronically coupled to the computing device. The system may further include a client device in communication with the at least one movable object, the client device including a visualization application which is configured to receive point cloud data from the scanning sensor and position data from the positioning sensor, record the point cloud data and the position data to a storage location, generate a real-time visualization of the point cloud data and the position data as they are received, and display the real-time visualization using a user interface provided by the visualization application.
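The client-side pipeline the abstract describes (receive, record, visualize) can be sketched roughly as below; the message layout, storage, and viewer callback are all assumed for illustration.

```python
# Rough sketch (all names assumed) of the visualization application's loop:
# receive point-cloud and position messages from the movable object, record
# them to storage, and feed a running real-time visualization.

def run_mapping_client(messages, storage, viewer):
    """messages: iterable of (point_cloud, position); storage: list; viewer: callable."""
    for point_cloud, position in messages:
        storage.append((point_cloud, position))   # record to the storage location
        viewer(point_cloud, position)             # update the live visualization

recorded = []
frames_drawn = []
stream = [([(0.0, 0.0, 1.5)], (10.0, 20.0)),      # one point-cloud chunk + UAV position
          ([(0.1, 0.0, 1.4)], (10.5, 20.0))]
run_mapping_client(stream, recorded, lambda pc, pos: frames_drawn.append(pos))
print(len(recorded), len(frames_drawn))  # → 2 2
```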

INFORMATION HANDLING SYSTEM INFRARED PROXIMITY DETECTION WITH DISTANCE REDUCTION DETECTION

An information handling system manages operation of an infrared time of flight sensor to provide accurate and timely user presence and absence detection through monitoring of time of flight distance measurements for indications of object velocity that validate or invalidate a transition between the user presence and user absence states. An integrated sensor hub in a central processing unit stores distances received from the infrared time of flight sensor in a distance table in association with a time stamp of the distance measurement. During monitoring of distances received from the infrared time of flight sensor, if the integrated sensor hub detects a user absence or presence, validation of the transition is performed by analyzing the stored distances to determine a vector of velocity at the state transition.
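The distance-table validation described above can be sketched as follows. The table layout, window size, and speed threshold are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch: keep a table of (timestamp, distance) samples from the
# ToF sensor and, at a presence/absence state transition, estimate object
# velocity from recent samples to validate or reject the transition.
from collections import deque

MIN_VALID_SPEED_M_S = 0.1  # below this, treat the transition as spurious noise

class DistanceTable:
    def __init__(self, maxlen=32):
        self.samples = deque(maxlen=maxlen)  # (timestamp_s, distance_m)

    def add(self, timestamp_s, distance_m):
        self.samples.append((timestamp_s, distance_m))

    def velocity(self, window=4):
        """Average velocity (m/s) over the last `window` samples; negative = approaching."""
        recent = list(self.samples)[-window:]
        if len(recent) < 2:
            return 0.0
        (t0, d0), (t1, d1) = recent[0], recent[-1]
        return (d1 - d0) / (t1 - t0)

def validate_transition(table):
    """A state transition is plausible only if the object showed real velocity."""
    return abs(table.velocity()) >= MIN_VALID_SPEED_M_S

table = DistanceTable()
for i, d in enumerate([1.00, 0.90, 0.80, 0.70]):  # user approaching at ~1 m/s
    table.add(timestamp_s=i * 0.1, distance_m=d)
print(table.velocity(), validate_transition(table))  # → -1.0 True
```

A near-zero velocity at the moment of a detected absence would suggest a false trigger (e.g. a momentary occlusion) rather than the user actually leaving.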

Interface for improved high definition map generation

Systems and methods are disclosed related to generating interactive user interfaces that enable a user to alter 3D point cloud data and/or associated pose graph data generated from LiDAR scans prior to generation of a high definition map. A user may make selections in a 2D map representation with overlaid graph node indicators in order to alter graph connections, remove nodes, view corresponding 3D point clouds, and otherwise edit intermediate results from LiDAR scans in order to improve the quality of a high definition map subsequently generated from the user-manipulated data.
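The graph edits the abstract mentions (altering connections, removing nodes) can be sketched with a plain adjacency structure; the data layout below is an assumption for illustration, not the patent's representation.

```python
# Hypothetical sketch of user edits to an intermediate pose graph: cut a
# spurious connection or delete a node (e.g. from a corrupted LiDAR scan)
# before the high-definition map is generated from the edited data.

class PoseGraph:
    def __init__(self):
        self.edges = {}  # node_id -> set of connected node_ids

    def add_edge(self, a, b):
        self.edges.setdefault(a, set()).add(b)
        self.edges.setdefault(b, set()).add(a)

    def remove_edge(self, a, b):
        self.edges.get(a, set()).discard(b)
        self.edges.get(b, set()).discard(a)

    def remove_node(self, node):
        for neighbor in self.edges.pop(node, set()):
            self.edges[neighbor].discard(node)

graph = PoseGraph()
for a, b in [("n1", "n2"), ("n2", "n3"), ("n1", "n3")]:
    graph.add_edge(a, b)

graph.remove_edge("n1", "n3")   # user cuts a spurious loop-closure edge
graph.remove_node("n2")         # user deletes a node from a bad scan
print(sorted(graph.edges))      # → ['n1', 'n3']
```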

System and method for robust depth calculation with ToF sensors using multiple exposure times

A system and method for performing robust depth calculations with time of flight (ToF) sensors using multiple exposure times is disclosed. A three-dimensional (3D) depth sensor assembly captures a first array of n point values, where each point value of the first array has a respective first-array depth component and a respective first-array quality component. The 3D depth sensor assembly then captures a second array of n point values, where each point value of the second array has a respective second-array depth component and a respective second-array quality component. A processor then renders a 3D point cloud comprising a third array of n point values, where each point value of the third array has a respective third-array depth component. The respective third-array depth component for each point value of the third array is based on either the corresponding respective first-array depth component or the corresponding respective second-array depth component.
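The per-point selection rule described above can be sketched directly. The tuple layout and the "higher quality wins" rule are assumptions for illustration; the patent only states that each third-array depth comes from one of the two corresponding captured depths.

```python
# Hedged sketch of merging two exposures: for each of the n points, keep the
# depth from whichever exposure produced the higher quality component.

def merge_exposures(first, second):
    """first/second: lists of (depth_m, quality); returns merged list of depths."""
    assert len(first) == len(second)
    merged = []
    for (d1, q1), (d2, q2) in zip(first, second):
        merged.append(d1 if q1 >= q2 else d2)
    return merged

short_exposure = [(1.20, 0.9), (2.50, 0.2), (0.80, 0.7)]  # favors near/bright points
long_exposure  = [(1.25, 0.4), (2.45, 0.8), (0.85, 0.3)]  # favors far/dark points
print(merge_exposures(short_exposure, long_exposure))  # → [1.2, 2.45, 0.8]
```

Combining exposures this way keeps valid depths for both highly reflective nearby surfaces (which saturate a long exposure) and dim distant surfaces (which are too noisy in a short one).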

Image identification system
10983681 · 2021-04-20

Embodiments may relate to a graphical user interface (GUI). The GUI may include a first portion that displays an image related to images of a location. The GUI may also include a second portion that displays an image related to detection and ranging information of the location. The two images may be linked such that an interaction with an object in one portion of the GUI causes changes in the other portion of the GUI. Other embodiments may be described or claimed.
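The linkage between the two GUI portions can be sketched with a simple observer pattern; the patent does not specify a mechanism, so the wiring and names below are assumptions.

```python
# Minimal sketch of two linked GUI portions: selecting an object in one
# portion notifies the other so both highlight the same object.

class Portion:
    def __init__(self, name):
        self.name = name
        self.highlighted = None
        self._linked = []

    def link(self, other):
        self._linked.append(other)

    def select(self, object_id):
        # A user interaction in this portion...
        self.highlighted = object_id
        for other in self._linked:
            other.highlight(object_id)  # ...causes changes in the linked portion

    def highlight(self, object_id):
        # Passive update; does not re-notify, so no notification loop occurs.
        self.highlighted = object_id

image_view = Portion("camera imagery")
ranging_view = Portion("detection and ranging")
image_view.link(ranging_view)
ranging_view.link(image_view)

image_view.select("vehicle-17")   # click an object in the image portion
print(ranging_view.highlighted)   # → vehicle-17
```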