G01B11/2545

Optimizing material handling tasks

Various examples are directed to systems and methods for utilizing depth videos to analyze material handling tasks. A material handling facility may comprise a depth video system and a control system programmed to receive a plurality of depth videos including performances of the material handling task. For each of the plurality of depth videos, training data may identify sub-tasks of the material handling task and corresponding portions of the video including the sub-tasks. The plurality of depth videos and the training data may be used to train a model to identify the sub-tasks from depth videos. The control system may apply the model to a captured depth video of a human agent performing the material handling task at a workstation to identify a first sub-task of the material handling task being performed by the human agent.
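The training setup described above can be sketched minimally: depth-video frames labeled with sub-tasks are reduced to simple features, and a model maps a new frame to a sub-task. The per-frame mean-depth feature, the nearest-centroid model, and the sub-task names below are illustrative stand-ins, not the patent's actual model.

```python
# Minimal sketch: train on labeled depth frames, then identify the
# sub-task of a new frame. A depth frame is a flat list of depth values;
# feature choice and sub-task labels are illustrative assumptions.

def frame_feature(depth_frame):
    """Mean depth of a frame -- a stand-in for richer learned features."""
    return sum(depth_frame) / len(depth_frame)

def train(labeled_frames):
    """labeled_frames: list of (depth_frame, sub_task) pairs.
    Returns a nearest-centroid model: sub_task -> mean feature value."""
    sums, counts = {}, {}
    for frame, task in labeled_frames:
        f = frame_feature(frame)
        sums[task] = sums.get(task, 0.0) + f
        counts[task] = counts.get(task, 0) + 1
    return {task: sums[task] / counts[task] for task in sums}

def identify_sub_task(model, depth_frame):
    """Pick the sub-task whose centroid is closest to the frame's feature."""
    f = frame_feature(depth_frame)
    return min(model, key=lambda task: abs(model[task] - f))

training = [([0.5, 0.6, 0.5], "pick"), ([1.4, 1.5, 1.6], "stow")]
model = train(training)
print(identify_sub_task(model, [0.55, 0.5, 0.6]))  # -> pick
```

In practice the control system would use far richer spatiotemporal features and a learned classifier, but the train-then-identify flow is the same.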

CALIBRATION METHOD OF DEPTH IMAGE CAPTURING DEVICE
20190149788 · 2019-05-16 ·

A calibration method of a depth image capturing device including a projecting device and an image sensing device is provided. At least three groups of images of a calibration board having multiple feature points are captured. Intrinsic parameters of the image sensing device are calibrated according to the at least three groups of images. Multiple sets of coordinate values of corresponding points corresponding to the feature points in a projection pattern of the projecting device are obtained. Intrinsic parameters of the projecting device are obtained by calibration. Multiple sets of three-dimensional coordinate values of multiple feature points are obtained. An extrinsic parameter between the image sensing device and the projecting device is obtained according to multiple sets of three-dimensional coordinate values, the multiple sets of coordinate values of corresponding points, the intrinsic parameters of the image sensing device, and the intrinsic parameters of the projecting device.
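The quantities being calibrated above fit the standard pinhole camera model: intrinsic parameters (focal lengths fx, fy and principal point cx, cy) map camera-frame coordinates to pixels, and an extrinsic parameter set (rotation R, translation t) relates one device's frame to another's. The sketch below only illustrates that model; the numeric values are assumptions, not values from the patent.

```python
# Pinhole projection sketch: extrinsics map a world point into the
# camera frame, intrinsics map the camera-frame point to pixels.
# All parameter values below are illustrative.

def project(point_world, R, t, fx, fy, cx, cy):
    """Project a 3D world point to pixel coordinates (u, v)."""
    # Extrinsics: X_cam = R @ X_world + t
    xc = sum(R[0][k] * point_world[k] for k in range(3)) + t[0]
    yc = sum(R[1][k] * point_world[k] for k in range(3)) + t[1]
    zc = sum(R[2][k] * point_world[k] for k in range(3)) + t[2]
    # Intrinsics: perspective divide, then scale and offset to pixels
    u = fx * xc / zc + cx
    v = fy * yc / zc + cy
    return u, v

# Identity rotation, camera displaced 0.5 m along the optical axis
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.0, 0.0, 0.5]
u, v = project([0.1, 0.0, 1.0], R, t, fx=600, fy=600, cx=320, cy=240)
print(u, v)  # approx. (360.0, 240.0)
```

Calibration runs this model in reverse: given many feature points with known board coordinates and observed pixel positions, it solves for the intrinsics of each device and the extrinsics between them.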

META PROJECTORS AND ELECTRONIC APPARATUSES INCLUDING THE SAME

A meta projector includes a light source array configured to emit light along an optical path. The light source array includes a first light-emitting array including a plurality of first light-emitting elements configured to emit first light having a first set of light properties and a second light-emitting array including a plurality of second light-emitting elements configured to emit second light having a second set of light properties, the second set of light properties being different from the first set. The meta projector includes a meta-structure layer aligned with the optical path. The meta-structure layer includes a plurality of nanostructures having a sub-wavelength shape dimension that is smaller than a wavelength of light emitted from the light source array. The meta-structure layer is configured to differently modulate the first light and the second light in relation to each other.

MEDICAL SCENE MODEL
20190139300 · 2019-05-09 ·

A method of deriving one or more medical scene model characteristics for use by one or more software applications is disclosed. The method includes receiving one or more sensor data streams. Each sensor data stream of the one or more sensor data streams includes position information relating to a medical scene. A medical scene model including a three-dimensional representation of a state of the medical scene is dynamically updated based on the one or more sensor data streams. Based on the medical scene model, the one or more medical scene model characteristics are derived.

System and Method for Filling Apparel with Gases, Fluids, or Fluid-Like Solids to Enable the Accurate Three-Dimensional Capture of Apparel by Three-Dimensional Scanning and Stereo Photogrammetry
20190137258 · 2019-05-09 ·

The present invention provides a system for accurate three-dimensional (3D) capture of apparel by 3D scanning and stereo photogrammetry. One or more flexible deflated bladders are inserted into an item of apparel and inflated by a filling-media source. The bladder is inflated to measure at least one of the size of the apparel, the tensile strength of its fabric, or its cross-section perimeter. A rotating structure is anchored onto a stable structure to uniformly rotate all the fixed points of the apparel. One or more 3D scanners are configured to scan the inflated apparel on the stable structure and export the scanned data to one or more software applications. A processor retrieves the 3D scanned dimensional data from a searchable database and compares the given apparel's dimensions with one or more apparels' dimensions in the database to provide the customer with a size recommendation via a display.

Determining object properties with respect to particular optical measurement

A method of identifying a surface point or region of an object to be measured by means of an optical sensor that provides defined measuring conditions regarding emission of measuring light and reception of reflected measuring light in a defined spatial relationship. The method comprises defining a point or region of interest of the object, determining an optical property of the defined point or region, and deriving object information based on the optical property. The determination of the optical property is performed by optically pre-measuring the point or region using the optical sensor: illuminating the point or region with the measuring light, capturing at least one image by means of the optical sensor of at least one illumination (Lr, Li) at the object, and analyzing the respective illuminations (Lr, Li) regarding position or appearance plausibility with respect to the measuring conditions of the optical sensor.

Three-dimensional measurement apparatus and control method for the same

A three-dimensional measurement apparatus including: a projection unit configured to project a pattern onto a measurement target object from one or more projection directions; and an image capturing unit configured to obtain one or more captured images by capturing an image of the measurement target object from one or more view points. The three-dimensional measurement apparatus obtains a position of the pattern projected onto the measurement target object based on the one or more captured images obtained by the image capturing unit, and calculates three-dimensional coordinates of a surface of the measurement target object based on the position of the pattern obtained from the one or more captured images and a position of the pattern estimated based on a parameter set that represents internal scattering of the measurement target object.

MOVING FLYING OBJECT FOR SCANNING AN OBJECT, AND SYSTEM FOR ANALYZING DAMAGE TO THE OBJECT
20190128772 · 2019-05-02 ·

An aircraft that includes a helicopter drone on which a 3D scanner is mounted via an actively rotatable joint is provided. The 3D scanner has at least one high-resolution camera for recording a multiplicity of overlapping images of the object from different recording positions and recording directions, so that comparison of the images allows a position and orientation of the 3D scanner relative to the object to be ascertained. In addition, the aircraft has a coordination device for coordinated control of the 3D scanner, the joint and the helicopter drone. The system for damage analysis has an aircraft and an image processing module generating a data representation of a surface profile of the object on the basis of the recorded images. In addition, the system includes a rating device for checking the surface profile and for outputting a damage statement on the basis of the check.

SYSTEMS AND METHODS FOR GATHERING DATA AND INFORMATION ON SURFACE CHARACTERISTICS OF AN OBJECT
20190128666 · 2019-05-02 ·

A system is provided for gathering information and data on the surface characteristics of an object. The system includes a projector, a table, a first camera, and a second camera. The projector is suspended above the table and arranged to project a random pattern of optical indicators onto the table. The optical indicators can be dots, lines, or other such indicators. The table is arranged to hold the object to be inspected. The first camera is positioned above and to one side of the table and angled toward the table; the second camera is positioned above and to the opposite side of the table and angled toward the table. The first and second cameras are arranged to capture images of the optical indicators projected onto the object. The system is further arranged to gather information and data from the captured images and to determine the surface characteristics of the object from the gathered information and data.
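With a random dot pattern seen by two cameras, surface geometry is recoverable by matching each projected dot between the two views and triangulating. For a rectified stereo pair this reduces to the familiar relation Z = f·B/d, where f is the focal length in pixels, B the camera baseline, and d the pixel disparity of a matched dot. The sketch below shows only that relation; the numbers are illustrative assumptions, not values from the patent.

```python
# Depth from disparity for a rectified stereo pair: Z = f * B / d.
# Parameter values are illustrative.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Return depth in meters for a matched feature's pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# f = 800 px, baseline 0.2 m, a projected dot shifted 40 px between views
print(depth_from_disparity(800, 0.2, 40))  # approx. 4.0 m
```

The random pattern exists precisely to make this matching unambiguous on otherwise featureless surfaces: each dot neighborhood is locally unique, so disparities, and hence depths, can be computed densely across the object.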

METHOD FOR THE THREE DIMENSIONAL MEASUREMENT OF A MOVING OBJECT DURING A KNOWN MOVEMENT
20190128665 · 2019-05-02 ·

A 3D measurement method including: projecting a pattern sequence onto a moving object; capturing a first image sequence with a first camera and a second image sequence, synchronously to the first image sequence, with a second camera; and determining corresponding image points in the two sequences. For each pair of image points that is to be checked for correspondence, a trajectory of a potential object point is computed from imaging parameters and from known movement data; the potential object point is imaged by both image points in case they correspond. Object positions derived therefrom at each of the capture points in time are imaged into the respective image planes of the two cameras. Corresponding image point positions are determined as trajectories in the two cameras, and the image points are compared with each other along the predetermined image point trajectories and examined for correspondence. Lastly, 3D measurement of the moved object is performed by triangulation.
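The final triangulation step can be sketched concisely: once a pair of image points is confirmed as corresponding, a ray is back-projected from each camera center through its image point, and the 3D point is taken at the closest approach of the two rays. The camera positions and normalized ray directions below are illustrative assumptions, not values from the patent.

```python
# Midpoint triangulation: the 3D point is the midpoint of the shortest
# segment between the two back-projected rays c1 + s*d1 and c2 + u*d2.
# Poses and directions are illustrative.

def triangulate_midpoint(c1, d1, c2, d2):
    """Return the midpoint of closest approach between two rays."""
    dot = lambda p, q: sum(pi * qi for pi, qi in zip(p, q))
    w0 = [c1[i] - c2[i] for i in range(3)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b          # zero only for parallel rays
    s = (b * e - c * d) / denom
    u = (a * e - b * d) / denom
    p1 = [c1[i] + s * d1[i] for i in range(3)]
    p2 = [c2[i] + u * d2[i] for i in range(3)]
    return [(p1[i] + p2[i]) / 2 for i in range(3)]

# Two cameras 0.2 m apart, both viewing a point at (0, 0, 1)
print(triangulate_midpoint([-0.1, 0, 0], [0.1, 0, 1],
                           [0.1, 0, 0], [-0.1, 0, 1]))  # approx. [0, 0, 1]
```

For exactly corresponding points the two rays intersect and the midpoint is that intersection; for noisy correspondences the midpoint gives a least-disruptive compromise between the two rays.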