G01B11/2545

Aerial device having a three-dimensional measurement device

A three-dimensional (3D) coordinate measuring system is provided. The system includes an aerial measuring device that has an aerial drone and a 3D measurement device. The 3D measurement device is rotatably attached to the aerial drone, and the aerial drone is movable from a first position to a stationary second position. The 3D measurement device is configured to optically measure points on the surface of an object. The system further includes one or more processors configured to execute nontransitory computer readable instructions. The computer readable instructions comprise: moving the aerial measuring device from the first position; landing the aerial measuring device at the second position; rotating the 3D measurement device to optically measure a first object point; and determining first 3D coordinates of the first object point with the 3D measurement device.
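The claimed instruction sequence (move, land, rotate the scanner, measure) can be sketched as a short control script. Every class, method, and value below is a hypothetical stand-in for illustration, not the patent's actual API:

```python
from dataclasses import dataclass

@dataclass
class Point3D:
    x: float
    y: float
    z: float

class AerialMeasuringDevice:
    """Hypothetical stand-in for the drone with a rotatable 3D scanner."""
    def __init__(self):
        self.landed = False
        self.scanner_angle_deg = 0.0

    def fly_to(self, position):
        # Move from the first position toward the second position.
        self.position = position

    def land(self):
        # Become stationary at the second position.
        self.landed = True

    def aim_scanner(self, angle_deg):
        # Rotate the 3D measurement device toward the object point.
        self.scanner_angle_deg = angle_deg

    def measure_point(self):
        # A real device would triangulate a coordinate optically;
        # this stub returns a fixed dummy point.
        assert self.landed, "measure only from the stationary second position"
        return Point3D(1.0, 2.0, 0.5)

device = AerialMeasuringDevice()
device.fly_to((10.0, 5.0, 0.0))
device.land()
device.aim_scanner(35.0)
p = device.measure_point()
```

The point of the sketch is the ordering constraint in the claim: measurement happens only after the drone is stationary at the second position.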

Method and system for determining a three-dimensional definition of an object by reflectometry

A method and system for scanning an outer surface of a three-dimensional object, the outer surface being reflective, the method comprising the following steps: (a) projecting a light pattern on the object with a relative movement between the light pattern and the object; (b) recording with cameras images of the light pattern reflected by the outer surface during the relative movement; (c) processing the recorded reflection images by identifying the outline of the light pattern and determining from the outline characteristics of the outer surface; wherein the light pattern comprises at least one homogenously illuminated strip extending transversally to a direction of the relative movement with at least one border with a non-straight profile so as to form a non-constant width of the strip; and in step (c) the speed of the relative movement is taken into account for determining a three-dimensional definition of the outer surface.

Apparatus, method and recording medium storing command for inspection
12051606 · 2024-07-30

The present disclosure provides an apparatus. The apparatus according to the present disclosure comprises: at least one first light source configured to irradiate illumination light onto an object on a reference surface; at least one second light source configured to irradiate pattern light onto the object; a plurality of cameras configured to capture one or more illumination images and one or more pattern images; and one or more processors configured to determine a plurality of outlines indicating edges of the object based on two or more images captured in different directions among the one or more illumination images and the one or more pattern images; determine a virtual plane corresponding to an upper surface of the object based on the plurality of outlines; and determine an angle between the virtual plane and the reference surface.
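The final step, computing the angle between the fitted virtual plane and the reference surface, reduces to the angle between the two plane normals. A minimal sketch, with illustrative normals that are not from the source:

```python
import numpy as np

def plane_angle_deg(n1, n2):
    """Angle between two planes, from their (not necessarily unit) normals."""
    n1, n2 = np.asarray(n1, float), np.asarray(n2, float)
    c = abs(n1 @ n2) / (np.linalg.norm(n1) * np.linalg.norm(n2))
    return float(np.degrees(np.arccos(np.clip(c, -1.0, 1.0))))

# A virtual plane tilted 10 degrees about the x-axis, versus a horizontal
# reference surface with a straight-up normal.
tilt = np.radians(10.0)
virtual_normal = (0.0, -np.sin(tilt), np.cos(tilt))
reference_normal = (0.0, 0.0, 1.0)
angle = plane_angle_deg(virtual_normal, reference_normal)  # 10 degrees
```

Taking the absolute value of the dot product makes the result independent of which way each normal points, and the clip guards against round-off pushing the cosine slightly outside [-1, 1].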

SHAPE MEASURING DEVICE

A shape measuring device includes: a first light radiating unit to radiate first line light and second line light; a first image capturing unit to capture an object; and a measuring unit to measure a shape of the object on the basis of a first image and second images.

Systems and Methods for Estimating Depth from Projected Texture using Camera Arrays
20190063905 · 2019-02-28

Systems and methods in accordance with embodiments of the invention estimate depth from projected texture using camera arrays. One embodiment of the invention includes: at least one two-dimensional array of cameras comprising a plurality of cameras; an illumination system configured to illuminate a scene with a projected texture; a processor; and memory containing an image processing pipeline application and an illumination system controller application. The illumination system controller application directs the processor to control the illumination system to illuminate a scene with a projected texture. The image processing pipeline application directs the processor to: utilize the illumination system controller application to control the illumination system to illuminate a scene with a projected texture; capture a set of images of the scene illuminated with the projected texture; and determine depth estimates for pixel locations in an image from a reference viewpoint using at least a subset of the set of images. Generating a depth estimate for a given pixel location in the image from the reference viewpoint includes: identifying pixels in the at least a subset of the set of images that correspond to the given pixel location based upon expected disparity at a plurality of depths along a plurality of epipolar lines aligned at different angles; comparing the similarity of the corresponding pixels identified at each of the plurality of depths; and selecting, as the depth estimate for the given pixel location, the depth from the plurality of depths at which the identified corresponding pixels have the highest degree of similarity.
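The depth-selection step can be illustrated with a heavily simplified, one-dimensional sketch: for each candidate depth, predict the disparity it implies, compare the corresponding pixels across views, and keep the depth with the highest similarity (lowest cost). All arrays and constants below are synthetic, and the single-neighbor, single-epipolar-line setup is a drastic reduction of the multi-camera scheme in the abstract:

```python
import numpy as np

# Synthetic 1-D "images": a reference view and one neighbor whose texture
# appears shifted by a true disparity of 4 pixels.
focal_px, baseline_m = 100.0, 0.08
true_depth = focal_px * baseline_m / 4.0  # depth implying 4 px disparity

rng = np.random.default_rng(0)
ref = rng.random(64)
neighbor = np.roll(ref, 4)  # neighbor[i] == ref[i - 4]

def depth_estimate(pixel, candidate_depths):
    """Pick the candidate depth whose predicted disparity best matches."""
    best_depth, best_cost = None, np.inf
    for z in candidate_depths:
        d = int(round(focal_px * baseline_m / z))  # expected disparity at z
        # Similarity measure: absolute intensity difference between the
        # reference pixel and the disparity-shifted neighbor pixel.
        cost = abs(ref[pixel] - neighbor[(pixel + d) % len(ref)])
        if cost < best_cost:
            best_depth, best_cost = z, cost
    return best_depth

candidates = [focal_px * baseline_m / d for d in range(1, 9)]
z = depth_estimate(20, candidates)  # recovers true_depth
```

A real pipeline would aggregate a similarity cost over many cameras and several epipolar-line orientations rather than a single pixel difference, but the select-the-depth-with-minimum-cost structure is the same.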

SENSING ON UAVS FOR MAPPING AND OBSTACLE AVOIDANCE

Structured light approaches utilize a laser to project features, which are then captured with a camera. By knowing the disparity between the laser emitter and the camera, the system can triangulate to find the range. Four 185-degree field-of-view cameras provide overlapping views over nearly the whole unit sphere. The cameras are separated from each other to provide parallax. A near-infrared laser projection unit sends light out into the environment, which is reflected and viewed by the cameras. The laser projection system will create vertical lines, while the cameras will be displaced from each other horizontally. This relative shift of the lines, as viewed by different cameras, enables the lines to be triangulated in 3D space. At each point in time, a vertical stripe of the world will be triangulated. Over time, the laser line will be rotated over all yaw angles to provide a full 360 degree range.
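The triangulation step described above follows the classic stereo relation: range equals focal length times baseline divided by disparity. A minimal sketch with illustrative numbers (none of these values come from the source):

```python
def range_from_disparity(baseline_m, focal_px, disparity_px):
    """Classic triangulation relation: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 50 cm baseline between emitter and camera,
# 500 px focal length, and a 10 px horizontal shift of the laser line.
z = range_from_disparity(baseline_m=0.5, focal_px=500.0, disparity_px=10.0)
# z = 25.0 meters
```

The relation also shows why the baseline matters: for a fixed pixel resolution, a larger separation between emitter and camera yields a larger disparity at a given range, and therefore finer range resolution.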

Aerial Vehicle Imaging and Targeting System
20190068953 · 2019-02-28

The subject disclosure relates to a tracking system to mount to an aircraft and to image and track a target aircraft. The tracking system may include a structured light source operatively coupled to a processor, an inertial measurement unit (IMU) operatively coupled with the processor, a mirror to steer light from the light source toward the target aircraft, and a stereo-vision system having a first camera and a second camera. The IMU may be configured to generate position data representing a position of the aircraft. The stereo-vision system may be operatively coupled to the processor and configured to determine a 3D position of the target aircraft as a function of the position data. The processor may be configured to adjust the position of the mirror as a function of the position data.

MEASUREMENT SYSTEM HAVING A COOPERATIVE ROBOT AND THREE-DIMENSIONAL IMAGER
20190063907 · 2019-02-28

A measurement system and a method of measuring an object are provided. The system includes a measurement platform having a planar surface. At least two optical sensors are coupled to the measurement platform; each emits light in a plane and determines a distance to an object based on a reflection of the light. A linear rail is coupled to the measurement platform, and a cooperative robot is coupled to move along the linear rail. A 3D measuring system is coupled to the end of the robot. A controller is coupled to the at least two optical sensors, the robot, and the 3D measuring system; the controller reduces the speed of the robot and the 3D measuring system to less than a threshold in response to a distance, measured by at least one of the optical sensors, to a human operator being less than a first distance threshold.
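The controller's safety rule amounts to capping the commanded speed whenever any sensor reports an operator within the distance threshold. A minimal sketch with illustrative threshold values (none of these numbers appear in the source):

```python
# Illustrative safety parameters, not from the patent.
SAFE_SPEED = 0.25          # m/s speed cap near a human operator
DISTANCE_THRESHOLD = 1.5   # m, the "first distance threshold"

def commanded_speed(requested_speed, sensor_distances):
    """Cap speed when any plane-scanning sensor sees someone too close."""
    if min(sensor_distances) < DISTANCE_THRESHOLD:
        return min(requested_speed, SAFE_SPEED)
    return requested_speed

fast = commanded_speed(1.0, [3.2, 4.1])   # no one nearby: full speed
slow = commanded_speed(1.0, [3.2, 1.2])   # operator detected: capped
```

Taking the minimum over all sensor readings implements the "at least one sensor" condition in the claim: a single close reading is enough to trigger the slowdown.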

META PROJECTOR AND ELECTRONIC APPARATUS INCLUDING THE SAME

A meta projector is provided, including an edge emitting device configured to emit light through a side surface thereof, a meta-structure layer spaced apart from the upper surface of the edge emitting device that includes a plurality of nanostructures having a sub-wavelength dimension smaller than a wavelength of the light emitted from the edge emitting device, and a path changing member configured to change a path of the light emitted from the edge emitting device so as to direct the path toward the meta-structure layer. The meta projector may thus be configured to emit a light pattern of structured light, based on directing the light emitted from the edge emitting device through the meta-structure layer.

DYNAMIC VISION SENSOR AND PROJECTOR FOR DEPTH IMAGING
20190045173 · 2019-02-07

Systems, devices, and techniques related to matching features between a dynamic vision sensor and one or both of a dynamic projector or another dynamic vision sensor are discussed. Such techniques include casting a light pattern with projected features having differing temporal characteristics onto a scene and determining the correspondence(s) based on matching changes in detected luminance and temporal characteristics of the projected features.