Patent classifications
G01B11/2545
SPECKLE REDUCTION METHODS IN LINE SCANNERS
A system includes a first light source that emits a beam of light; an electrical modulator that imparts a time-varying modulation on the beam of light; a beam-shaping system that shapes the beam of light and projects the shaped beam of light onto an object; an image sensor that captures the beam of light reflected from the object; and processors that determine three-dimensional (3D) coordinates of points on the object.
HANDHELD SCANNER THAT TRANSFERS DATA AND POWER OVER ETHERNET
A system includes a handheld unit having a light source, an image sensor, one or more first processors, an Ethernet cable, and a frame. The light source projects light onto an object, and the image sensor captures an image of light reflected from the object. One or more first processors are directly coupled to the frame. An accessory device has one or more second processors that receive data extracted from the captured image over the Ethernet cable and, in response, determine three-dimensional (3D) coordinates of points on the object. The accessory device also sends electrical power over the Ethernet cable to the handheld unit.
DEVICE AND METHOD FOR MEASURING TOOLS
A device for determining a dimension of a tool having a cutting edge includes a first light source configured to emit light parallel to a first axis, an image sensor which is associatable with a second axis extending orthogonally to the image sensor and an analyzing unit. The first axis and the second axis are inclined relative to each other. The device is configured such that the light emitted from the first light source is reflectable by the cutting edge of the tool in such a way that light spots arranged in a line on the image sensor are generatable by the reflected light. The analyzing unit is configured to determine positions of the light spots. The dimension of the tool is determinable based on the positions of the light spots.
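The dimension determination described above can be sketched in simplified form. The mapping from pixel span to millimetres via the inclination of the two axes is an assumed toy model for illustration only; the function name, pixel size, and angle below are hypothetical and not taken from the patent:

```python
import math

def tool_dimension(spot_positions_px, px_size_mm, inclination_deg):
    """Span of the light-spot line on the sensor, projected through the
    assumed tilt between the light axis and the sensor axis."""
    span_px = max(spot_positions_px) - min(spot_positions_px)
    # With the first axis inclined relative to the second, the on-sensor
    # span relates to the physical dimension via the inclination angle
    # (a deliberately simplified geometric model).
    return span_px * px_size_mm / math.sin(math.radians(inclination_deg))

# Spots detected at these pixel positions, 0.005 mm pixels, 30 deg tilt:
dim = tool_dimension([120.0, 340.0, 560.0], 0.005, 30.0)
# span = 440 px -> 2.2 mm on-sensor -> 4.4 mm after the tilt projection
```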
Field calibration of stereo cameras with a projector
Calibration in the field is described for stereo and other depth camera configurations using a projector. One example includes imaging the first and the second feature in a first camera of the camera system, wherein the distance from the first camera to the projector is known, imaging the first and the second feature in a second camera of the camera system, wherein the distance from the second camera to the projector is known, determining a first disparity between the first camera and the second camera to the first feature, determining a second disparity between the first camera and the second camera to the second feature, and determining an epipolar alignment error of the first camera using the first and the second disparities.
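The disparity steps above can be sketched with the standard pinhole stereo relation. All pixel coordinates, intrinsics, and expected depths below are illustrative assumptions; the residual-based error term is one plausible realization, not the patent's specific method:

```python
def disparity(x_left: float, x_right: float) -> float:
    """Horizontal disparity of one feature between the two cameras (pixels)."""
    return x_left - x_right

def depth_from_disparity(d: float, focal_px: float, baseline_m: float) -> float:
    """Classic pinhole stereo relation: z = f * B / d."""
    return focal_px * baseline_m / d

# Feature 1 and feature 2 as imaged by both cameras (pixel x-coordinates).
d1 = disparity(x_left=320.0, x_right=300.0)    # 20 px
d2 = disparity(x_left=400.0, x_right=390.0)    # 10 px

f_px, baseline = 800.0, 0.10                   # assumed focal length / baseline
z1 = depth_from_disparity(d1, f_px, baseline)  # 4.0 m
z2 = depth_from_disparity(d2, f_px, baseline)  # 8.0 m

# If the known camera-to-projector distances predict depths z1_exp, z2_exp
# for the projected features, the residual between measured and expected
# depths serves as an indicator of epipolar misalignment.
z1_exp, z2_exp = 4.0, 8.2
alignment_error = abs(z1 - z1_exp) + abs(z2 - z2_exp)
```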
Borescope with pattern projection
A borescope includes an electronic image capture unit having two image capture sensors as a borescope lens at an end of a shaft that is designed for being inserted into a borescope opening, a position and alignment of the image capture sensors in relation to one another being suitable for ascertaining three-dimensional (3D) information using triangulation; and a pattern projector configured to project a pattern into a common recording region of the image capture sensors. The pattern projector includes: a fundamentally optically imaging light-guide bundle, which is made up of statistically distributed optical fibers having differing transmittances, to whose input surface a light source is coupled and whose output surface is aligned with the region captured by the image capture sensors.
Method and system for measuring an object by means of stereoscopy
The invention relates to a method and a system for measuring an object (2) by means of stereoscopy, in which method a pattern (3) is projected onto the object surface by means of a projector (9) and the pattern (3), which is designated as a scene and is projected onto the object surface, is captured by at least two cameras (4.1, 4.2, 4.3, 4.4), wherein correspondences of the scene are found in the images captured by the cameras (4.1, 4.2, 4.3, 4.4) by means of a computing unit (5) using image processing, and the object (2) is measured by means of the correspondences found. According to the invention, the cameras (4.1, 4.2, 4.3, 4.4) are intrinsically and extrinsically calibrated, and a two-dimensional and temporal coding is generated during the pattern projection, by (a) projecting a (completely) two-dimensionally coded pattern (3) and capturing the scene using the cameras (4.1, 4.2, 4.3, 4.4), and (b) projecting a temporally encoded pattern having a two-dimensionally different coding several times in succession and using the cameras (4.1, 4.2, 4.3, 4.4) to capture several scenes in succession, the capturing of said scenes being triggered simultaneously in each case.
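The combined two-dimensional and temporal coding above can be sketched as a per-pixel code word that makes correspondence search between calibrated cameras a simple lookup. The toy codes and pixel coordinates below are illustrative assumptions, not the patented patterns:

```python
# Each pixel accumulates (spatial code from the fully 2D-coded pattern,
# bit sequence from the successive temporally coded patterns).
def pixel_codeword(spatial_code, temporal_bits):
    """Combine the spatial code with the per-frame temporal bits."""
    return (spatial_code, tuple(temporal_bits))

# Both cameras decode the same projected codes; matching code words in
# two intrinsically and extrinsically calibrated cameras identifies
# corresponding pixels for triangulation.
cam1 = {(10, 4): pixel_codeword(7, [1, 0, 1])}  # pixel (10,4) in camera 1
cam2 = {(52, 4): pixel_codeword(7, [1, 0, 1])}  # pixel (52,4) in camera 2

matches = [(p1, p2) for p1, c1 in cam1.items()
                    for p2, c2 in cam2.items() if c1 == c2]
# matches holds one correspondence: pixel (10,4) <-> pixel (52,4)
```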
Fringe projection for in-line inspection
A structured light pattern comprising at least two sub-patterns in the direction of motion is projected onto an object during in-line inspection. Images of the object are captured for each sub-pattern. The sub-patterns are used to establish correspondence and to construct a profile using dense per-pixel camera-projection correspondence.
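One common way to obtain dense per-pixel camera-projector correspondence from multiple sub-patterns is N-step phase-shifted fringes; the abstract does not prescribe this specific algorithm, so the sketch below is a hedged illustration with synthetic data:

```python
import math

def wrapped_phase(intensities):
    """Recover the wrapped fringe phase at one pixel from N shifted images,
    assuming I_k = A + B*cos(phi + 2*pi*k/N)."""
    n = len(intensities)
    s = sum(I * math.sin(2 * math.pi * k / n) for k, I in enumerate(intensities))
    c = sum(I * math.cos(2 * math.pi * k / n) for k, I in enumerate(intensities))
    return math.atan2(-s, c)  # sign follows the N-step formula above

# Four captured intensities at one pixel for a 4-step shift of a fringe
# whose true phase is 0.5 rad (synthetic: offset 100, amplitude 50):
true_phase = 0.5
samples = [100 + 50 * math.cos(true_phase + 2 * math.pi * k / 4)
           for k in range(4)]
phi = wrapped_phase(samples)
# phi recovers the projected phase, which maps this pixel to a projector
# column: the dense correspondence used to triangulate the profile.
```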
DETERMINING OBJECT PROPERTIES WITH RESPECT TO PARTICULAR OPTICAL MEASUREMENT
A method of identifying a surface point or region of an object to be measured by means of an optical sensor providing defined measuring conditions regarding emission of measuring light and reception of reflected measuring light in a defined spatial relationship. The method comprises defining a point or region of interest of the object, determining an optical property of the defined point or of the defined region and deriving object information based on the optical property. The determination of the optical property is performed by optically pre-measuring the point or region using the optical sensor by illuminating the point or the region with the measuring light, capturing at least one image by means of the optical sensor of at least one illumination (Lr,Li) at the object and analysing respective illuminations (Lr,Li) regarding position or appearance plausibility with respect to the measuring conditions of the optical sensor.
SYSTEMS AND METHODS FOR AUGMENTATION OF SENSOR SYSTEMS AND IMAGING SYSTEMS WITH POLARIZATION
A multi-modal sensor system includes: an underlying sensor system; a polarization camera system configured to capture polarization raw frames corresponding to a plurality of different polarization states; and a processing system including a processor and memory, the processing system being configured to control the underlying sensor system and the polarization camera system, the memory storing instructions that, when executed by the processor, cause the processor to: control the underlying sensor system to perform sensing on a scene and the polarization camera system to capture a plurality of polarization raw frames of the scene; extract first tensors in polarization representation spaces based on the plurality of polarization raw frames; and compute a characterization output based on an output of the underlying sensor system and the first tensors in polarization representation spaces.
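The "first tensors in polarization representation spaces" above are more general than any single formula, but one common realization is to compute Stokes parameters and the degree and angle of linear polarization from four raw frames at 0/45/90/135 degree polarizer angles. The sketch below uses that standard construction with synthetic per-pixel values:

```python
import math

def polarization_features(i0, i45, i90, i135):
    """Stokes parameters and derived quantities for one pixel, from raw
    intensities at the four canonical polarizer angles."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
    s1 = i0 - i90                        # 0/90 deg contrast
    s2 = i45 - i135                      # 45/135 deg contrast
    dolp = math.hypot(s1, s2) / s0       # degree of linear polarization
    aolp = 0.5 * math.atan2(s2, s1)      # angle of linear polarization (rad)
    return dolp, aolp

# One pixel of a fully linearly polarized beam at 0 deg:
# bright through the 0 deg filter, extinguished at 90 deg.
dolp, aolp = polarization_features(100.0, 50.0, 0.0, 50.0)
# dolp -> 1.0 (fully polarized), aolp -> 0.0 rad
```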
CAPTURE DEVICE ASSEMBLY, THREE-DIMENSIONAL SHAPE MEASUREMENT DEVICE, AND MOTION DETECTION DEVICE
An image capture device assembly includes: a light source 110 that emits a reference light pattern; an image capture device 120; and a control device 130 that controls the light source and the image capture device. The light source 110 emits the reference light pattern to a subject 140 with high brightness and low brightness, respectively, under the control of the control device 130, the image capture device 120 captures an image of the reference light pattern and the subject 140 in a high brightness irradiation state and outputs a first image signal to the control device 130, or captures an image of at least the subject 140 in a low brightness irradiation state and outputs a second image signal to the control device 130, and the control device 130 generates a reference light pattern image signal from a difference between the first image signal and the second image signal.
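The difference operation performed by the control device 130 can be sketched directly: subtracting the low-brightness frame from the high-brightness frame cancels ambient illumination and isolates the reference light pattern. The tiny 2x2 frames and count values below are illustrative assumptions:

```python
def pattern_image(high_frame, low_frame):
    """Per-pixel difference of the two frames; negative results clamp to
    zero, leaving only the projected reference light pattern."""
    return [[max(h - l, 0) for h, l in zip(row_h, row_l)]
            for row_h, row_l in zip(high_frame, low_frame)]

# Ambient light contributes equally to both frames (10 counts everywhere),
# while the pattern adds 40 counts only on the pixels it illuminates.
high = [[50, 10], [10, 50]]   # pattern falls on pixels (0,0) and (1,1)
low  = [[10, 10], [10, 10]]
ref = pattern_image(high, low)
# ref -> [[40, 0], [0, 40]]: the reference light pattern image signal
```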