Patent classifications
G01B11/2545
TRIANGULATION SCANNER HAVING FLAT GEOMETRY AND PROJECTING UNCODED SPOTS
A projector projects an uncoded pattern of uncoded spots onto an object, which is imaged by a first camera and a second camera, 3D coordinates of the spots on the object being determined by a processor based on triangulation, the processor further determining correspondence among the projected and imaged spots based at least in part on the nearness of intersection of lines drawn from the projected and imaged spots through their respective perspective centers.
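The nearness-of-intersection test above can be sketched as follows: each projected spot and each imaged spot defines a 3D ray through its device's perspective center, and a correspondence is accepted when the two rays nearly intersect. This is a minimal illustration, not the patented implementation; the function names, the brute-force matching loop, and the `max_dist` threshold are all assumptions for the sketch.

```python
def sub(a, b): return [x - y for x, y in zip(a, b)]
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]
def norm(a): return dot(a, a) ** 0.5

def line_line_distance(p1, d1, p2, d2):
    """Shortest distance between two 3D lines given as (point, direction)."""
    n = cross(d1, d2)
    if norm(n) < 1e-12:                      # parallel rays: point-to-line distance
        return norm(cross(sub(p2, p1), d1)) / norm(d1)
    return abs(dot(sub(p2, p1), n)) / norm(n)

def match_spots(proj_rays, cam_rays, max_dist=0.01):
    """Pair each projector ray with the camera ray whose line passes nearest it,
    keeping only pairs whose rays nearly intersect (distance <= max_dist)."""
    matches = []
    for i, (p1, d1) in enumerate(proj_rays):
        dists = [line_line_distance(p1, d1, p2, d2) for (p2, d2) in cam_rays]
        j = min(range(len(dists)), key=dists.__getitem__)
        if dists[j] <= max_dist:
            matches.append((i, j, dists[j]))
    return matches
```

In practice a second camera adds a third ray per spot, which further disambiguates matches in a dense uncoded pattern.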
Systems and methods for estimating depth from projected texture using camera arrays
Systems and methods in accordance with embodiments of the invention estimate depth from projected texture using camera arrays that include at least two two-dimensional arrays of cameras, each comprising several cameras; an illumination system configured to illuminate a scene with a projected texture; a processor; and memory containing an image processing pipeline application and an illumination system controller application. The illumination system controller application directs the processor to control the illumination system to illuminate a scene with a projected texture. The image processing pipeline application directs the processor to: utilize the illumination system controller application to control the illumination system to illuminate a scene with a projected texture; capture a set of images of the scene illuminated with the projected texture; and determine depth estimates for pixel locations in an image from a reference viewpoint using at least a subset of the set of images.
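The depth-estimation step above can be illustrated for the simplest case, a single rectified camera pair: a pixel in the reference image is matched along the corresponding row of a second image by minimizing a sum-of-absolute-differences cost, and the winning disparity is converted to depth via the pinhole relation Z = f·B/d. This is a hedged sketch under stated assumptions (1D rows, rectified geometry, illustrative function names), not the patented multi-array pipeline.

```python
def best_disparity(ref_row, other_row, x, max_disp, win=1):
    """Disparity (pixel shift) at ref_row[x] that minimizes the SAD cost
    over a (2*win+1)-pixel window against other_row."""
    best_d, best_cost = 0, float("inf")
    for d in range(max_disp + 1):
        if x - d - win < 0 or x + win >= len(other_row):
            continue
        cost = sum(abs(ref_row[x + k] - other_row[x - d + k])
                   for k in range(-win, win + 1))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

def depth_from_disparity(d, focal_px, baseline_m):
    """Pinhole stereo depth: Z = f * B / disparity (meters)."""
    return float("inf") if d == 0 else focal_px * baseline_m / d
```

The projected texture matters here: on a textureless wall every window cost is identical and matching is ambiguous, which is exactly what the illumination system is meant to prevent.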
Three-dimensional coordinate scanner and method of operation
A noncontact optical three-dimensional measuring device that includes a first projector, a first camera, a second projector, and a second camera; a processor electrically coupled to the first projector, the first camera, the second projector, and the second camera; and computer readable media that, when executed by the processor, cause a first digital signal to be collected at a first time and a second digital signal to be collected at a second time different than the first time. The device determines three-dimensional coordinates of a first point on the surface based at least in part on the first digital signal and a first distance, and three-dimensional coordinates of a second point on the surface based at least in part on the second digital signal and a second distance.
SYSTEM AND METHOD FOR MEASURING THREE-DIMENSIONAL SURFACE FEATURES
In some embodiments, a system for measuring surface features may include a pattern projector, at least one digital imaging device, and an image processing device. The pattern projector may project, during use, a pattern of light on a surface of an object. In some embodiments, the pattern projector moves, during use, the pattern of light along the surface of the object. In some embodiments, the pattern projector moves the pattern of light in response to electronic control signals. At least one of the digital imaging devices may record, during use, at least one image of the projected pattern of light. The image processing device, during use, converts projected patterns of light recorded in at least one of the images into three-dimensional data points representing a surface geometry of the object, using the relative positions and relative angles between the at least one imaging device and the pattern projector.
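The conversion from relative positions and angles to 3D data points reduces, for a single projector-camera pair, to classical triangulation: the baseline between the two devices and the angle each ray makes with that baseline fix a triangle whose apex is the surface point. A minimal sketch of that geometry is below; the function name and angle convention (angles measured from the baseline, in radians) are assumptions for illustration.

```python
import math

def triangulate_depth(baseline_m, proj_angle, cam_angle):
    """Perpendicular distance of a surface point from the projector-camera
    baseline, given the angle each device's ray makes with the baseline."""
    if proj_angle + cam_angle >= math.pi:
        raise ValueError("rays do not converge in front of the baseline")
    return (baseline_m * math.sin(proj_angle) * math.sin(cam_angle)
            / math.sin(proj_angle + cam_angle))
```

For example, with a 0.2 m baseline and both rays at 60 degrees the triangle is equilateral, so the depth equals the triangle's height, 0.2·√3/2 ≈ 0.173 m.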
IMAGE DEVICE FOR GENERATING PANORAMA DEPTH IMAGES AND RELATED IMAGE DEVICE
An image device for generating panoramic depth images includes at least two image capturing groups. Each image capturing group of the at least two image capturing groups includes at least three image capturers. A size of each image capturer of the at least three image capturers is a first length, a distance between two adjacent image capturers of the at least three image capturers is a second length, and a ratio of the second length to the first length is not less than a predetermined value. Depths from at least three depth maps corresponding to the at least two image capturing groups are used to generate a panoramic depth image.
Method and apparatus for laser projection, and machining method
A laser projection method including the steps of: irradiating, from a laser projection unit, a workpiece that is a measurement object with a laser while controlling a plurality of mirror angles; imaging the workpiece with a stereo camera, extracting a contour of the workpiece, and calculating three-dimensional coordinates; calculating a positional relationship between the laser projection unit and the workpiece by comparing the calculated three-dimensional coordinates of the workpiece contour with the mirror angles; and performing coordinate transformation of CAD data and drawing the CAD data from the laser projection unit onto the workpiece, based on the positional relationship between the laser projection unit and the workpiece. The machining method includes the steps of: selecting a component of a tool; assembling the component; imaging the assembled tool; and determining whether or not a desired tool has been assembled.
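The coordinate-transformation step above amounts to applying a rigid transform, the rotation R and translation t recovered from the projector-workpiece positional relationship, to each CAD point before drawing it: p' = R·p + t. A minimal sketch, with illustrative names and a plain list-of-lists rotation matrix, is:

```python
def transform_point(R, t, p):
    """Apply a rigid transform (3x3 rotation R, translation t) to point p."""
    return [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]

def transform_contour(R, t, cad_points):
    """Map a list of CAD contour points into the workpiece frame."""
    return [transform_point(R, t, p) for p in cad_points]
```

For instance, a 90-degree rotation about z plus a unit shift along x maps the CAD point (1, 0, 0) to (1, 1, 0).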
Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device
A method for scanning and obtaining three-dimensional (3D) coordinates is provided. The method includes providing a 3D measuring device having a projector, a first camera and a second camera. The method records images of a light pattern emitted by the projector onto an object. A deviation in a measured parameter from an expected parameter is determined. The calibration of the 3D measuring device may be changed when the deviation is outside of a predetermined threshold.
Three-dimensional coordinate scanner and method of operation
A system and method of determining 3D coordinates of an object is provided. The method includes determining a first set of 3D coordinates for a plurality of points on the object with a structured light scanner. An inspection plan is determined for the object, which includes features to be inspected with a remote probe. The points are mapped onto a CAD model. The features are identified on the plurality of points mapped onto the CAD model. A visible light is projected with the scanner proximate a first feature of the features. A sensor on the remote probe is contacted to at least one first point on the first feature on the object. A first position and orientation of the remote probe are determined with the scanner. A second set of 3D coordinates of the at least one first point are determined on the first feature on the object.
Three-dimensional measurement method, device, and storage medium
A three-dimensional measurement method uses the three-step phase-shift method to embed marker-line information into a sinusoidal stripe pattern, obtaining a target stripe pattern. The target stripe pattern is projected onto the surface of the object to be measured, and the wrapped phase image, mean intensity image, and modulated intensity image are computed from the stripe patterns collected by the left and right cameras. A mask image computed from the mean intensity image and modulation intensity image is used to extract the marker line. Spatial phase unwrapping starting from the marker line in the wrapped phase image is performed to obtain the spatial phase. Spatial phase matching, based on the unique correspondence between the left and right cameras established from the spatial phase of the marker line, is performed to obtain the best matching point in the right camera.
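The per-pixel quantities named above follow from the standard three-step phase-shift equations: with fringe images I₁, I₂, I₃ shifted by -120°, 0°, +120°, the wrapped phase is atan2(√3(I₁-I₃), 2I₂-I₁-I₃), the mean intensity is (I₁+I₂+I₃)/3, and the modulation is √(3(I₁-I₃)² + (2I₂-I₁-I₃)²)/3. This sketch shows the textbook formulas for one pixel; it is not the patent's marker-line embedding itself.

```python
import math

def three_step_phase(i1, i2, i3):
    """Wrapped phase, mean intensity, and modulation from three fringe
    intensities captured with phase shifts of -120, 0, and +120 degrees."""
    num = math.sqrt(3.0) * (i1 - i3)      # = 3*B*sin(phi)
    den = 2.0 * i2 - i1 - i3              # = 3*B*cos(phi)
    phase = math.atan2(num, den)          # wrapped to (-pi, pi]
    mean = (i1 + i2 + i3) / 3.0           # background intensity A
    modulation = math.sqrt(num * num + den * den) / 3.0  # fringe amplitude B
    return phase, mean, modulation
```

Pixels with low modulation (shadowed or saturated regions) are the ones the mask image, built from the mean and modulation maps, would typically exclude before phase unwrapping.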
APPARATUS AND METHOD FOR THREE-DIMENSIONAL INSPECTION
An apparatus for three-dimensional inspection includes a carrier, an image sensing component, and a processor. The carrier is configured to hold an object. The image sensing component is configured to capture a first image, a second image, and a third image of the object along a first axis, a second axis, and a third axis respectively, and the first axis, the second axis, and the third axis are not parallel with each other. The processor is configured to analyze the first image and the second image to obtain first directional stereo information, and to analyze the third image and a determined image of the object to obtain second directional stereo information.