G05B2219/37567

RECONFIGURABLE, FIXTURELESS MANUFACTURING SYSTEM AND METHOD ASSISTED BY LEARNING SOFTWARE
20200130189 · 2020-04-30

Systems and methods for AI-assisted reconfigurable, fixtureless manufacturing are disclosed. The invention eliminates geometry-setting tools (hard points, pins, and nets, traditionally known as 3-2-1 fixturing schemes) and replaces physical geometry setting with virtual datums driven by learning AI algorithms. A first type of part and a second type of part may be located by a machine vision system and moved by material handling devices and robots to locations within an assembly area. The parts may be aligned with one another, and the alignment may be checked by the machine vision system, which is configured to locate datums, in the form of features, of the parts and compare such datums to stored virtual datums. The parts may be joined while being held by the material handling devices or robots to form a subassembly in a fixtureless fashion. The material handling devices are able to grasp a number of different types of parts, so that a number of different types of subassemblies can be assembled. The system enables one skilled in the art to develop a product design with self-locating parts that eliminates or minimizes the need for dedicated geometry-setting line tools and fixtures. This leads to a manufacturing process that utilizes Industry 4.0 technologies to eliminate or significantly reduce the need for geometry-setting line tools.
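As a rough illustration of the virtual-datum check described above, the following sketch compares vision-located part features against stored virtual datums with per-feature tolerances. All names, coordinates, and tolerances are hypothetical; the patent does not specify a data format.

```python
import numpy as np

# Hypothetical stored virtual datums: feature name -> (nominal xyz in mm, tolerance in mm)
VIRTUAL_DATUMS = {
    "hole_A": (np.array([100.0, 50.0, 0.0]), 0.5),
    "slot_B": (np.array([250.0, 50.0, 0.0]), 0.8),
}

def alignment_ok(measured: dict) -> bool:
    """Compare machine-vision feature locations to stored virtual datums.

    measured maps feature names to xyz coordinates (mm) reported by the
    vision system; alignment passes only if every datum is within tolerance.
    """
    for name, (nominal, tol) in VIRTUAL_DATUMS.items():
        if name not in measured:
            return False  # a required datum was not located
        if np.linalg.norm(measured[name] - nominal) > tol:
            return False  # feature deviates beyond its virtual-datum tolerance
    return True

# Example: parts held by material-handling robots, checked before joining
measured = {"hole_A": np.array([100.2, 50.1, 0.0]),
            "slot_B": np.array([249.7, 50.3, 0.1])}
print(alignment_ok(measured))  # True -> parts may be joined fixturelessly
```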

Automating robot operations

A method to control operation of a robot includes generating at least one virtual image by an optical 3D measurement system and with respect to a 3D measurement coordinate system, the at least one virtual image capturing a surface region of a component. The method further includes converting a plurality of point coordinates of the virtual image into point coordinates with respect to a robot coordinate system by a transformation instruction and controlling a tool element of the robot using the point coordinates with respect to the robot coordinate system so as to implement the operation.
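The transformation instruction can be read as a rigid-body transform between the two coordinate systems. A minimal sketch, assuming calibration has already produced a rotation R and translation t (both placeholders here):

```python
import numpy as np

# Hypothetical calibration result: rigid transform from the 3D measurement
# coordinate system to the robot coordinate system (rotation R, translation t).
R = np.eye(3)                      # placeholder rotation
t = np.array([500.0, 0.0, 250.0])  # placeholder translation in mm

def measurement_to_robot(points_meas: np.ndarray) -> np.ndarray:
    """Apply the transformation instruction p_robot = R @ p_meas + t
    to an (N, 3) array of virtual-image point coordinates."""
    return points_meas @ R.T + t

# Surface points captured in the virtual image, expressed in robot coordinates
surface_points = np.array([[10.0, 20.0, 5.0], [12.0, 21.0, 5.2]])
robot_points = measurement_to_robot(surface_points)
# robot_points can now drive the tool element along the component surface
```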

Remote control system and remote control method
11904481 · 2024-02-20 · ·

A remote control system includes: an imaging unit that shoots an environment in which a device to be operated including an end effector is located; a recognition unit that recognizes objects that can be grasped by the end effector based on a shot image of the environment shot by the imaging unit; an operation terminal that displays the shot image and receives handwritten input information input to the displayed shot image; and an estimation unit that, based on the objects that can be grasped and the handwritten input information input to the shot image, estimates an object to be grasped which has been requested to be grasped by the end effector from among the objects that can be grasped and estimates a way of performing a grasping motion by the end effector, the grasping motion having been requested to be performed with regard to the object to be grasped.
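A minimal sketch of one way the estimation unit's target selection could work, assuming graspable objects are reported as image-space bounding boxes and the handwritten input arrives as stroke points; the grasping-motion estimation is omitted, and all names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class GraspableObject:
    name: str
    box: tuple  # (x_min, y_min, x_max, y_max) in image pixels

def estimate_target(objects, stroke):
    """Pick the graspable object whose bounding box contains the most
    handwritten-stroke points; stroke is a list of (x, y) pixels drawn
    on the shot image at the operation terminal."""
    def hits(obj):
        x0, y0, x1, y1 = obj.box
        return sum(1 for (x, y) in stroke if x0 <= x <= x1 and y0 <= y <= y1)
    best = max(objects, key=hits)
    return best if hits(best) > 0 else None

objects = [GraspableObject("cup", (40, 60, 120, 160)),
           GraspableObject("bottle", (200, 50, 260, 180))]
stroke = [(80, 100), (85, 105), (90, 112)]  # mark drawn over the cup
print(estimate_target(objects, stroke).name)  # -> "cup"
```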

Road-building machine and method for operating a road-building machine
20190169804 · 2019-06-06

A method by means of which the paving result of the road covering can be improved. For the production of a road covering, road-building material is supplied by a pivoting conveyor of a feeder to the supply container of a road paver. When there is relative movement between the feeder and the road paver, the pivoting conveyor has to be manually readjusted, which can result in irregularities in the road covering. The improvement is achieved in that the container of the at least one construction vehicle is detected by at least one sensor unit of the road-building machine.
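A minimal sketch of how the detected container position might feed an automatic correction of the pivoting conveyor, assuming the sensor unit reports a lateral offset of the container; the proportional gain and geometry are invented for illustration:

```python
import math

def conveyor_pivot_command(container_offset_m: float,
                           conveyor_length_m: float = 6.0,
                           gain: float = 0.8) -> float:
    """Proportional correction of the feeder's pivoting-conveyor angle.

    container_offset_m: lateral offset of the road paver's supply
    container relative to the conveyor tip, as detected by the sensor
    unit (positive = container has drifted right).
    Returns an angle adjustment in degrees toward the container.
    """
    raw_angle = math.degrees(math.atan2(container_offset_m, conveyor_length_m))
    return gain * raw_angle

# Relative movement between feeder and paver shifts the container 0.4 m right;
# the pivot is corrected automatically instead of by manual readjustment.
print(round(conveyor_pivot_command(0.4), 2))  # ~3.05 degrees
```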

CONTROLLING A ROBOT IN THE PRESENCE OF A MOVING OBJECT

A method, system, and one or more computer-readable storage media for controlling a robot in the presence of a moving object are provided herein. The method includes capturing a number of frames from a three-dimensional camera system and analyzing a frame to identify a connected object. The frame is compared to a previous frame to identify a moving connected object (MCO). If an unexpected MCO is in the frame, a determination is made as to whether the unexpected MCO is in an actionable region. If so, the robot is instructed to take an action.
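A minimal sketch of the described pipeline, assuming depth frames as numpy arrays and using frame differencing with connected-component labeling; the filtering of expected versus unexpected MCOs is omitted, and the thresholds are illustrative:

```python
import numpy as np
from scipy import ndimage

ACTIONABLE = (slice(100, 300), slice(100, 300))  # region the robot must guard

def check_frame(depth_prev: np.ndarray, depth_curr: np.ndarray,
                motion_thresh: float = 30.0, min_pixels: int = 50) -> bool:
    """Compare a depth frame to the previous one, label connected regions
    of change (candidate MCOs), and report whether any sufficiently large
    MCO intersects the actionable region."""
    moving = np.abs(depth_curr.astype(float) - depth_prev) > motion_thresh
    labels, n = ndimage.label(moving)        # connected moving components
    for i in range(1, n + 1):
        component = labels == i
        if component.sum() < min_pixels:
            continue                          # noise, not a real object
        if component[ACTIONABLE].any():
            return True                       # MCO in the actionable region
    return False

# if check_frame(prev, curr): instruct the robot to slow, stop, or reroute
```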

Robotic autonomous navigation and orientation tracking system and methods

A system and apparatus for navigating and tracking a robotic platform includes a non-contact velocity sensor module set positioned on the robotic platform for measuring the velocity of the platform relative to a target surface. The non-contact velocity sensor module set may include a coherent light source whose light is emitted toward the target surface and reflected back to the sensor. The change in intensity of the reflected coherent light may be measured to determine the velocity of the robotic platform based on its relationship with the principles of a Doppler frequency shift. A communication unit may also be utilized to transmit data collected from the non-contact velocity sensor set to a computer for data processing. A computer provided on the robotic platform then processes the data collected from the non-contact velocity sensor set. A user may then monitor the determined trajectory path of the robotic platform and transmit navigation instructions to the robotic platform based on the received trajectory-path data.
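The velocity recovery from the Doppler frequency shift reduces to a one-line relation, v = f_d λ / 2 for a round-trip reflection, assuming the beam is aligned with the direction of travel. A small worked example with a hypothetical 1550 nm source:

```python
def doppler_velocity(freq_shift_hz: float, wavelength_m: float) -> float:
    """Velocity of the platform relative to the target surface from the
    Doppler shift of reflected coherent light: v = f_d * lambda / 2
    (the factor 2 accounts for the round trip to the surface and back)."""
    return freq_shift_hz * wavelength_m / 2.0

# A 645 kHz shift at 1550 nm corresponds to roughly 0.5 m/s over the surface.
print(doppler_velocity(645e3, 1550e-9))  # ~0.4999 m/s
```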

ROBOT APPARATUS, METHOD OF CONTROLLING ROBOT APPARATUS, AND COMPUTER PROGRAM

A robot apparatus includes: a grasping section configured to grasp an object; a recognition section configured to recognize a graspable part and a handing-over area part of the object; a grasp planning section configured to plan a path of the grasping section for handing over the object to a recipient by the handing-over area part; and a grasp control section configured to control grasp operation of the object by the grasping section in accordance with the planned path.
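A minimal sketch of the grasp planning idea, assuming the graspable part and the handing-over area part have been recognized as 3D points; the straight-line path here is an illustrative stand-in for whatever planner the apparatus actually uses:

```python
import numpy as np

def plan_handover_path(grasp_point: np.ndarray,
                       handover_point: np.ndarray,
                       recipient_point: np.ndarray,
                       steps: int = 20) -> np.ndarray:
    """Plan a path that grasps the object by its graspable part and
    presents its handing-over area part to the recipient.

    The offset from handover_point to grasp_point is preserved so the
    recipient-facing part, not the gripper, arrives at the recipient.
    """
    offset = grasp_point - handover_point        # gripper-to-handover offset
    goal = recipient_point + offset              # gripper pose at handover
    ts = np.linspace(0.0, 1.0, steps)[:, None]
    return grasp_point + ts * (goal - grasp_point)  # (steps, 3) waypoints

# Grasp a mug by its body (graspable part) and offer its handle
# (handing-over area part) toward the recipient's hand.
path = plan_handover_path(np.array([0.4, 0.0, 0.2]),    # grasp point
                          np.array([0.45, 0.05, 0.2]),  # handle point
                          np.array([0.7, 0.3, 0.3]))    # recipient hand
```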

Safety in dynamic 3D healthcare environment

The present invention relates to safety in a dynamic 3D healthcare environment. The invention in particular relates to a medical safety-system for dynamic 3D healthcare environments, a medical examination system with motorized equipment, an image acquisition arrangement, and a method for providing safe movements in dynamic 3D healthcare environments. In order to provide improved safety in dynamic 3D healthcare environments with a facilitated adaptability, a medical safety-system (10) for dynamic 3D healthcare environments is provided, comprising a detection system (12), a processing unit (14), and an interface unit (16). The detection system comprises at least one sensor arrangement (18) adapted to provide depth information of at least a part of an observed scene (22). The processing unit comprises a correlation unit (24) adapted to correlate the depth information. The processing unit comprises a generation unit (26) adapted to generate a 3D free space model (32). The interface unit is adapted to provide the 3D free space model.
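A minimal sketch of building a 3D free space model from depth information, assuming a single ceiling-mounted depth sensor and a voxelized scene; the real system correlates depth information from multiple sensor arrangements, which is omitted here:

```python
import numpy as np

def free_space_model(depth_map: np.ndarray, cell_m: float = 0.05,
                     z_max_m: float = 2.5) -> np.ndarray:
    """Build a coarse 3D free-space model from one overhead depth image.

    depth_map gives, per (x, y) cell, the distance from the ceiling-mounted
    sensor to the first obstacle; every voxel between the sensor and that
    surface is observed free, everything at or below it is not.
    """
    n_z = int(z_max_m / cell_m)
    z_edges = np.arange(n_z) * cell_m            # voxel depths from sensor
    # free[z, y, x] is True where the voxel lies above the measured surface
    free = z_edges[:, None, None] < depth_map[None, :, :]
    return free  # shape (n_z, h, w): the 3D free space model

# Motorized equipment (e.g. a C-arm) may then be restricted to trajectories
# whose swept volume stays inside voxels marked free.
depth = np.full((64, 64), 2.0)   # empty room: surface 2 m below the sensor
model = free_space_model(depth)
print(model.sum(), "free voxels")
```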