G01B11/2545

Broken Wheel Detection System
20200369302 · 2020-11-26

A broken wheel detection system detects broken wheels on rail vehicles, even when the vehicles are moving at high speed, by determining the positions, lengths, or orientations of structured light lines projected onto passing wheels.
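The core check can be sketched as follows: fit the detected light-stripe points to a line segment and compare its length against the nominal line a sound wheel would produce. All names, the farthest-pair line fit, and the length-deficit criterion are illustrative assumptions, not the patent's implementation.

```python
import math

def line_metrics(points):
    """Approximate the detected light stripe by its two extreme points
    and return (length, orientation in radians)."""
    # The farthest pair of points defines the stripe's endpoints.
    p, q = max(
        ((a, b) for a in points for b in points),
        key=lambda ab: math.dist(ab[0], ab[1]),
    )
    length = math.dist(p, q)
    angle = math.atan2(q[1] - p[1], q[0] - p[0])
    return length, angle

def is_broken(points, nominal_length, tol=0.1):
    """Flag a wheel if the projected line is shorter than expected,
    suggesting a missing rim section (hypothetical criterion)."""
    length, _ = line_metrics(points)
    return length < (1.0 - tol) * nominal_length
```

A production system would fit against a wheel-pose-dependent model of the expected line, but the length/orientation comparison above captures the stated principle.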

Apparatus and method for three-dimensional inspection
10841561 · 2020-11-17

An apparatus for three-dimensional inspection includes a carrier, an image sensing component, and a processor. The carrier is configured to hold an object. The image sensing component is configured to capture a first image, a second image, and a third image of the object along a first axis, a second axis, and a third axis respectively, where the first axis, the second axis, and the third axis are not parallel with each other. The processor is configured to analyze the first image and the second image to obtain first directional stereo information, and to analyze the third image and a determined image of the object to obtain second directional stereo information.

CAMERA MODULE

A camera module includes a circuit board, two photosensitive chips fixed on a surface of the circuit board, two lens assemblies respectively mounted over the two photosensitive chips, two filter assemblies each including a visible light filter and an infrared filter, and an infrared projection unit fixed on a surface of the circuit board and projecting patterned infrared light. The filter assemblies respectively correspond to the photosensitive chips and the lens assemblies. In each filter assembly, the visible light filter and the infrared filter are switched into position between the lens assembly and the photosensitive chip. When the visible light filters are between the lenses and the photosensitive chips, the photosensitive chips acquire visible light to form a colored 3D image. When the infrared filters are between the lenses and the photosensitive chips, the photosensitive chips acquire reflected patterned infrared light to form an infrared 3D image.

Three-dimensional triangulational scanner having high dynamic range and fast response

A triangulation scanner system and method of operation are provided. The system includes a projector that projects a first pattern of light at a first light level during first time intervals and projects the first pattern of light at a second light level during second time intervals, the second light level being different from the first light level. A first camera has a first photosensitive array, the first photosensitive array having a first pixel with an optical detector, a first memory, and a second memory. The first memory stores a first signal from the optical detector during the first time intervals, and the second memory stores a second signal from the optical detector during the second time intervals. A processor determines three-dimensional coordinates of a first point based at least in part on the projected first pattern of light, the first stored signal, and the second stored signal.
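The dual-memory pixel supports a high-dynamic-range readout: use the brighter exposure unless it saturated, otherwise fall back to the dimmer one, with both rescaled to a common scale by the known light-level ratio. This is a minimal sketch of that fusion rule; the function name, 12-bit full scale, and `gain` parameter are assumptions.

```python
def fuse_hdr(sig_low, sig_high, gain, full_scale=4095):
    """Combine the two per-pixel memories of the triangulation scanner.

    sig_low  -- sample captured at the first (lower) light level
    sig_high -- sample captured at the second (higher) light level
    gain     -- ratio of second light level to first light level

    Prefer the high-level sample while it is still linear; once it
    saturates, use the low-level sample. The result is expressed in
    first-light-level units.
    """
    if sig_high < full_scale:      # high-exposure sample not saturated
        return sig_high / gain     # rescale to the common (low) scale
    return float(sig_low)          # saturated: trust the low sample
```

The fused value then feeds the same triangulation as a single-exposure signal would, extending usable dynamic range across dark and specular surface regions.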

Global positioning of a sensor with respect to different tiles for a global three-dimensional surface reconstruction
10832441 · 2020-11-10

A measuring system can three-dimensionally reconstruct the surface geometry of an object by, from a first pose with a sensor, generating a first three-dimensional representation of a first portion of the object and, with a first camera, generating a first image covering at least part of the first portion, and then, from a second pose with the sensor, generating a second three-dimensional representation of a second portion of the object and, with the first camera, generating a second image covering at least part of the second portion. A stationary first projector, arranged externally, can be configured to project a texture onto both the first and second portions of the object. A stitching computer can be configured to generate a unitary three-dimensional representation of both the first and second portions of the object from the first and second three-dimensional representations based on the first and second images.
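Stitching two per-pose reconstructions into one global representation reduces to estimating the rigid transform relating them from point correspondences (here assumed to come from the projected texture matched between the two camera images). A standard way to solve this is the Kabsch/Procrustes algorithm; the sketch below is that generic solver, not the patent's specific stitching computer.

```python
import numpy as np

def rigid_align(src, dst):
    """Kabsch/Procrustes: return the rotation R and translation t that
    best map point set `src` onto corresponding point set `dst`."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered correspondences.
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Correct an improper rotation (reflection) if the SVD yields one.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t
```

Applying `R` and `t` to the second reconstruction places both portions in the first pose's frame, after which the two point sets can be merged into the unitary representation.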

Three-dimensional information detection device

Provided is a three-dimensional information detection device whose detection accuracy for three-dimensional information of a measurement target object is not lowered even when the optical system, which includes a projection unit, a first imaging unit, and a second imaging unit, is covered with a light-transmissive cover so that the optical system is not exposed to the outside. The three-dimensional information detection device includes the projection unit that projects an image pattern onto a measurement target object, the first imaging unit and the second imaging unit that respectively image the image pattern, a transmissive cover that covers the optical system including the projection unit, the first imaging unit, and the second imaging unit, and a calculation unit that calculates three-dimensional information of the measurement target object on the basis of the image pattern imaged by the first imaging unit and the second imaging unit. At least a part of a mounting surface of the optical system is formed by a light-absorbing member, and the first imaging unit and the second imaging unit are disposed outside a region upon which projection light of the image pattern that is specularly reflected by the cover is directly incident.

3-D ENVIRONMENT SENSING BY MEANS OF PROJECTOR AND CAMERA MODULES

A camera device for a vehicle for 3-D environment sensing includes at least two camera modules having at least partly overlapping sensing ranges, a camera control unit, an evaluation unit and a point light projector. The point light projector is arranged and configured in such a way that the point light projector projects a light pattern of measurement points into the vehicle environment. The at least two camera modules are arranged and configured in such a way that at least part of the projected light pattern is imaged in the overlapping sensing range. The evaluation unit is configured to determine the 3-D position of measurement points in the vehicle environment from image data captured with the at least two camera modules. The point light projector is configured to produce a series of pseudo-noise patterns as the light pattern, the pseudo-noise patterns being projected into the vehicle environment in temporal succession.
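The temporal series of pseudo-noise patterns can be sketched as a sequence of reproducible pseudo-random dot grids, one per projector frame. The function name, dot density, and seeding scheme below are illustrative assumptions; a real projector would use patterns optimized for unambiguous stereo matching.

```python
import random

def pn_pattern_sequence(n_frames, rows, cols, density=0.1, seed=0):
    """Generate a temporal series of pseudo-random dot patterns, as the
    point light projector would emit frame by frame. Each pattern is a
    rows x cols boolean grid (True = measurement point lit); a fixed
    seed makes the whole sequence reproducible."""
    rng = random.Random(seed)
    frames = []
    for _ in range(n_frames):
        frames.append([[rng.random() < density for _ in range(cols)]
                       for _ in range(rows)])
    return frames
```

Projecting the frames in temporal succession gives each measurement point a distinctive on/off signature over time, which helps the evaluation unit match points across the two camera modules' overlapping views.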

Method for the three dimensional measurement of moving objects during a known movement
10823552 · 2020-11-03

A 3D measurement method includes: projecting a pattern sequence onto a moving object; capturing a first image sequence with a first camera and, synchronously with the first image sequence, a second image sequence with a second camera; and determining corresponding image points in the two sequences. For each pair of image points to be checked for correspondence, a trajectory of a potential object point is computed from imaging parameters and from known movement data; if the image points correspond, the potential object point is imaged by both of them. Object positions derived therefrom at each capture point in time are imaged into the respective image planes of the two cameras, yielding predicted image-point trajectories in the two cameras; the image points are compared with each other along these predetermined trajectories and examined for correspondence. Lastly, 3D measurement of the moving object is performed by triangulation.
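The trajectory check can be illustrated in a simplified single-camera form: move a candidate object point along the known motion, project it at each capture time with a pinhole model, and require every prediction to match the observed image-point trajectory. The constant-velocity motion, unit focal length, and tolerance are assumptions for the sketch, not the patent's formulation.

```python
def project(point, f=1.0):
    """Pinhole projection of a 3D point (camera at origin, optical axis +z)."""
    x, y, z = point
    return (f * x / z, f * y / z)

def trajectory_consistent(p0, velocity, times, observed, f=1.0, tol=1e-3):
    """Check a candidate object point against the known movement:
    translate p0 by the (assumed constant) velocity to each capture
    time, project it, and require every predicted image point to match
    the observed trajectory within `tol`."""
    for t, obs in zip(times, observed):
        p = tuple(c + v * t for c, v in zip(p0, velocity))
        u, w = project(p, f)
        if abs(u - obs[0]) > tol or abs(w - obs[1]) > tol:
            return False
    return True
```

In the two-camera method, the same test is run in both image planes; a pair of image points that stays consistent along both predicted trajectories is accepted as a correspondence and triangulated.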

Super-resolving depth map by moving pattern projector

The subject disclosure is directed towards active depth sensing based upon moving a projector or projector component to project a moving light pattern into a scene. Via the moving light pattern captured over a set of frames, e.g., by a stereo camera system, and by estimating light intensity at sub-pixel locations in each stereo frame, depth information may be computed at a sub-pixel level, i.e., at a resolution higher than the native camera resolution.
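A building block for such sub-pixel estimation is evaluating intensity at fractional pixel coordinates within a frame; bilinear interpolation is one common choice, shown below as an assumption rather than the disclosure's specific estimator.

```python
def bilinear(img, x, y):
    """Estimate light intensity at sub-pixel location (x, y) of one
    frame by bilinear interpolation of the four neighbouring pixels.
    `img` is a row-major 2D grid; (x, y) must lie inside the grid
    with x0+1 and y0+1 valid indices."""
    x0, y0 = int(x), int(y)
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * img[y0][x0]
            + dx * (1 - dy) * img[y0][x0 + 1]
            + (1 - dx) * dy * img[y0 + 1][x0]
            + dx * dy * img[y0 + 1][x0 + 1])
```

Because the projected pattern shifts by sub-pixel amounts between frames, combining such estimates across the frame set lets depth be localized more finely than any single frame's pixel grid allows.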

Ranging objects external to an aircraft using multi-camera triangulation

Apparatus and associated methods relate to ranging an object near an aircraft by triangulation using two simultaneously-captured images of the object. The two images are captured from two distinct vantage points on the aircraft. Because the two images are captured from distinct vantage points, the object is imaged at different pixel-coordinate locations in the two images. The two images are correlated with one another to determine the pixel-coordinate locations corresponding to the object. Range to the object is calculated based on the determined pixel-coordinate locations and the two vantage points from which the two images were captured. Only a subset of each image is used for the correlation; this subset includes pixel data from pixels upon which spatially-patterned light, projected onto the object by a light projector and reflected by the object, is incident.
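For the rectified two-camera geometry, the range calculation reduces to the classic disparity relation: range = baseline × focal length / disparity. The sketch below assumes rectified cameras and pixel-unit focal length; the patent's exact formulation may differ.

```python
def range_from_disparity(baseline, focal_px, u_left, u_right):
    """Stereo triangulation range: two cameras separated by `baseline`
    (same units as the returned range) observe the object at
    horizontal pixel coordinates u_left and u_right after
    rectification; `focal_px` is the focal length in pixels."""
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("object must yield positive disparity")
    return baseline * focal_px / disparity
```

For example, a 1 m baseline, 1000 px focal length, and 10 px disparity give a 100 m range; the projected spatial pattern serves to make the disparity measurement reliable on otherwise low-texture objects.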