APPARATUS, CONTROLLER, AND METHOD FOR GENERATING IMAGE DATA OF MOVEMENT PATH OF INDUSTRIAL MACHINE
20210132576 · 2021-05-06
CPC classification
B25J9/1656
PERFORMING OPERATIONS; TRANSPORTING
Y02P90/02
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
G05B2219/36047
PHYSICS
G05B2219/36053
PHYSICS
G05B2219/33099
PHYSICS
B25J9/1674
PERFORMING OPERATIONS; TRANSPORTING
Abstract
An apparatus is configured to facilitate identifying a factor for a defect if the defect occurs in a finished surface of a workpiece. The apparatus includes a movement path generation section configured to generate a movement path of an industrial machine when performing a work on the workpiece; a running information acquisition section configured to acquire running information of the industrial machine when performing the work on the workpiece; and an image data generation section configured to generate image data in which a first point on the movement path corresponding to a change point of first running information, and a second point on the movement path corresponding to a change point of second running information different from the first running information, are highlighted on the movement path in display forms visually different from each other.
Claims
1. An apparatus configured to generate image data of a movement path of an industrial machine, the apparatus comprising: a movement path generation section configured to generate the movement path of the industrial machine when performing a work on a workpiece; a running information acquisition section configured to acquire running information of the industrial machine when performing the work on the workpiece; and an image data generation section configured to generate image data in which a first point on the movement path corresponding to a change point of first running information, and a second point on the movement path corresponding to a change point of second running information different from the first running information are highlighted on the movement path in display forms visually different from each other.
2. The apparatus of claim 1, wherein the running information includes: a position, a velocity, acceleration, a jerk or a movement direction of the industrial machine; or a running mode of the industrial machine.
3. The apparatus of claim 2, wherein the running information includes time-series data that indicates, in time series, a change of the position, the velocity, the acceleration, the jerk, the movement direction, or the running mode with respect to time.
4. The apparatus of claim 3, further comprising a change point acquisition section configured to acquire, in the time-series data: the change point at which a change amount of the position, the velocity, the acceleration, the jerk, or the movement direction within a predetermined time exceeds a predetermined threshold value; or the change point at which a first running mode is switched to a second running mode.
5. The apparatus of claim 1, wherein the running information acquisition section acquires the running information from a work program for performing the work on the workpiece, or from feedback information detected when the industrial machine is operated in accordance with the work program.
6. The apparatus of claim 1, wherein the movement path generation section generates: a first movement path defined by a work program for performing the work on the workpiece; and a second movement path when the industrial machine is operated in accordance with the work program, wherein the image data generation section generates the image data in which the first movement path and the second movement path are displayed.
7. The apparatus of claim 6, wherein the image data generation section generates the image data such that a difference between the first movement path and the second movement path can be enlarged and displayed.
8. A controller of the industrial machine, comprising the apparatus of claim 1.
9. A method for generating image data of a movement path of an industrial machine, the method comprising: generating the movement path of the industrial machine when performing a work on a workpiece; acquiring running information of the industrial machine when performing the work on the workpiece; and generating the image data in which a first point on the movement path corresponding to a change point of first running information, and a second point on the movement path corresponding to a change point of second running information different from the first running information are highlighted on the movement path in display forms visually different from each other.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0020] Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. In the various embodiments described below, similar elements are denoted by the same reference numeral, and redundant description thereof will be omitted. Referring first to
[0021] The industrial machine 12 is e.g. a machine tool (a lathe, milling machine, machining center, etc.), or an industrial robot (a vertical articulated robot, horizontal articulated robot, parallel link robot, etc.), and performs a predetermined work on a workpiece. The industrial machine 12 includes a movement mechanism 16 and a tool 18. The movement mechanism 16 includes at least one electric motor 20, and relatively moves the tool 18 with respect to a workpiece to be worked.
[0022] The electric motor 20 is e.g. a servomotor, and the controller 14 sends a command to the electric motor 20 so as to operate the movement mechanism 16 by driving the electric motor 20, thereby moving the tool 18 relative to the workpiece. The tool 18 is e.g. a cutting tool, a laser processing head, a welding torch, or a coating material applicator, and performs a predetermined work (e.g., cutting, laser processing, welding, or coating) on the workpiece.
[0023] As an example, when the industrial machine 12 is the machine tool, the movement mechanism 16 moves a processing table, on which the workpiece is set, in a horizontal direction, and moves a spindle head, on which the tool 18 (cutting tool) is set, in a vertical direction. As another example, when the industrial machine 12 is the industrial robot (articulated robot), the movement mechanism 16 includes a rotary barrel provided at a robot base so as to be rotatable about a vertical axis, a robot arm rotatably provided at the rotary barrel, and a wrist provided at a tip of the robot arm, and moves the tool 18 attached to the wrist to any position in a three-dimensional space.
[0024] The controller 14 controls the operation of the industrial machine 12. Specifically, the controller 14 is a computer including a processor 22, a memory 24, and an I/O interface 26. The processor 22 includes e.g. a CPU, or GPU, and executes arithmetic processing to perform various functions described later. The processor 22 is communicably connected to the memory 24 and the I/O interface 26 via a bus 28.
[0025] The memory 24 includes e.g. a ROM or RAM, and stores various types of data temporarily or permanently. The I/O interface 26 communicates with an external device, receives data from the external device, and transmits data to the external device, under the control of the processor 22. In the present embodiment, a display device 30 and an input device 32 are communicably connected to the I/O interface 26 in a wireless or wired manner.
[0026] The display device 30 includes e.g. an LCD or an organic EL display, and the processor 22 transmits image data to the display device 30 via the I/O interface 26 to display the image on the display device 30. The input device 32 includes e.g. a keyboard, mouse, or touch sensor, and transmits information inputted by the operator to the processor 22 via the I/O interface 26. The display device 30 and the input device 32 may be provided integrally with the controller 14, or may be provided separately from the controller 14.
[0027] In the present embodiment, the processor 22 functions as an apparatus 50 configured to generate the image data of a movement path of the industrial machine 12. The function of the apparatus 50 will be described below. The processor 22 generates a movement path MP of the industrial machine 12 when performing a work on a workpiece. For example, the processor 22 generates the movement path MP of the tool 18 (specifically, a tool tip point, TCP, or the like), which is moved by the movement mechanism 16, with respect to the workpiece.
[0028] In the present embodiment, the processor 22 generates a movement path MP.sub.1 (first movement path) defined by a work program WP for performing the work on the workpiece. The work program WP is a computer program (e.g., a G code program) including a plurality of statements that define a plurality of target positions at which the tool 18 is to be arranged with respect to the workpiece, a minute line segment connecting two adjacent target positions, a target velocity of the tool 18 with respect to the workpiece, etc.
[0029] The processor 22 analyzes the work program WP and generates a command to be transmitted to the electric motor 20 in order to perform the work on the workpiece. In this way, the processor 22 operates the movement mechanism 16 in accordance with the work program WP so as to perform the work on the workpiece by the tool 18. The work program WP is stored in the memory 24.
[0030] The processor 22 generates the movement path MP.sub.1 in the three-dimensional space by analyzing the work program WP. The movement path MP.sub.1 is an aggregate of the minute line segments defined in the work program WP. The movement path MP.sub.1 is a movement path in control of the movement mechanism 16 (tool 18). For example, assume that the industrial machine 12 forms a workpiece A in
[0031] In this case, an example of the movement path MP.sub.1 of the tool 18 with respect to the workpiece A when forming a connection portion A3 between a base portion A1 and a main body portion A2 of the workpiece A is illustrated in
[0032] The processor 22 generates such movement path MP.sub.1 from the work program WP. In
[0033] The processor 22 acquires running information of the industrial machine 12 when performing the work on the workpiece A. In the present embodiment, the processor 22 acquires the running information from the work program WP. The running information acquired from the work program WP includes e.g. information of a position P.sub.T1, a velocity V.sub.T1, acceleration a.sub.T1, a jerk J.sub.T1 and a movement direction d.sub.T1 of the tool 18 with respect to the workpiece A; information of a position (rotation angle) P.sub.M1, a velocity (rotation speed) V.sub.M1, acceleration (angular acceleration) a.sub.M1, a jerk (angular jerk) J.sub.M1 and a movement direction (rotational direction) d.sub.M1 of a rotary shaft of the electric motor 20; and information of a running mode RM of the industrial machine 12.
[0034] The processor 22 analyzes the work program WP, and acquires, as the running information, time-series data indicating, in time series, changes of the positions P.sub.T1 and P.sub.M1, the velocities V.sub.T1 and V.sub.M1, the acceleration a.sub.T1 and a.sub.M1, the jerks J.sub.T1 and J.sub.M1, the movement directions d.sub.T1 and d.sub.M1, and the running mode RM with respect to time t, based on the information of each statement included in the work program WP (e.g., the target position, minute line segment, target velocity). Accordingly, the processor 22 functions as a running information acquisition section 54 (
[0035]
[0036] In the example illustrated in
[0037] A line L3 illustrated in the lower side of the graph in
[0038] This phenomenon in which the difference Δ.sub.MP becomes large may similarly occur at a time point at which a change amount Δ.sub.1 of the positions P.sub.T1 and P.sub.M1, the velocities V.sub.T1 and V.sub.M1, the acceleration a.sub.T1 and a.sub.M1, the jerks J.sub.T1 and J.sub.M1, or the movement directions d.sub.T1 and d.sub.M1 increases. At the time point at which the change amount Δ.sub.1 increases (i.e., the difference Δ.sub.MP increases), the actual position (or path) of the tool 18 with respect to the workpiece during the work may deviate from the target position (or path) defined by the work program, which may result in the generation of a defect in the finished surface of the workpiece A (e.g., an unintended line, pattern, or the like may be formed in the finished surface of the workpiece A when the workpiece is machined).
[0039] On the other hand, in the example illustrated in
[0040] On the other hand, the second running mode RM.sub.2 is a work mode, for example. The work mode is a running mode in which the work on the workpiece is performed by the tool 18 while the movement mechanism 16 moves the tool 18 at a velocity V.sub.T1_W (<V.sub.T1_P). In the work mode, the control gain G.sub.C is set to G.sub.C_W (>G.sub.C_P), and the time constant is set to τ.sub.W (<τ.sub.P). Accordingly, the responsiveness of the control of the movement mechanism 16 in the work mode is higher than that in the positioning mode.
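The effect of the mode-dependent time constant described above can be illustrated with a short sketch. The function name, the numeric gains, velocities, and time constants below are illustrative assumptions and do not appear in the disclosure; only the qualitative relation (smaller time constant, faster response) is taken from the text.

```python
# Illustrative mode table: the work mode uses a higher gain and a smaller
# time constant than the positioning mode, as described in [0039]-[0040].
# All concrete values here are assumptions for demonstration only.
MODES = {
    "positioning": {"velocity": 200.0, "gain": 10.0, "time_constant": 0.10},
    "work":        {"velocity": 50.0,  "gain": 30.0, "time_constant": 0.02},
}

def first_order_response(target, tau, dt=0.001, steps=200):
    """Simple first-order lag model: a smaller time constant tau yields a
    faster (more responsive) approach to the commanded value."""
    y = 0.0
    for _ in range(steps):
        y += (target - y) * dt / tau
    return y

for name, params in MODES.items():
    # The work mode (smaller tau) gets closer to the target in the same time.
    print(name, round(first_order_response(1.0, params["time_constant"]), 3))
```

Under this model, the work-mode response approaches the commanded value noticeably faster than the positioning-mode response over the same interval, which is the behavior the paragraph attributes to the smaller time constant τ.sub.W.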
[0041] As discussed above, at the time point t.sub.0 at which the running mode RM of the industrial machine 12 is switched between the first running mode RM.sub.1 and the second running mode RM.sub.2, vibrations are generated in the tool 18, whereby the difference Δ.sub.MP may increase as indicated by the arrow C in
[0042] In the present embodiment, the processor 22 acquires, in the time-series data of the positions P.sub.T1 and P.sub.M1, the velocities V.sub.T1 and V.sub.M1, the acceleration a.sub.T1 and a.sub.M1, the jerks J.sub.T1 and J.sub.M1, and the movement directions d.sub.T1 and d.sub.M1, a change point at which the change amount Δ.sub.1 of each of these pieces of running information within the predetermined time Δt exceeds a predetermined threshold value Δ.sub.th1.
[0043] Specifically, regarding the positions P.sub.T1 and P.sub.M1, the processor 22 retrieves, from the time-series data of the positions P.sub.T1 and P.sub.M1, the change point at which the change amount (i.e., the distance) between two positions within the predetermined time Δt exceeds the threshold value Δ.sub.th1, and acquires each time point t.sub.0 at which such a change point occurs, for example.
[0044] Regarding the velocities V.sub.T1 and V.sub.M1, the acceleration a.sub.T1 and a.sub.M1, and the jerks J.sub.T1 and J.sub.M1, for example, the processor 22 retrieves, from the respective time-series data, the change point at which an absolute value of the change amount within the predetermined time Δt exceeds the threshold value Δ.sub.th1, and acquires each time point t.sub.0 at which such a change point occurs.
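The threshold-based change point retrieval described in paragraphs [0042] to [0044] can be sketched as follows. The function name, sample data, and window representation are assumptions for illustration; the disclosure does not specify an implementation.

```python
def find_change_points(samples, dt, threshold):
    """Return the sample indices (time points t0) at which the absolute
    change of a sampled quantity (position, velocity, acceleration, or
    jerk) within a window of dt samples exceeds the given threshold."""
    change_points = []
    for i in range(len(samples) - dt):
        if abs(samples[i + dt] - samples[i]) > threshold:
            change_points.append(i + dt)
    return change_points

# Example: a velocity trace with two abrupt changes.
velocity = [0.0, 0.0, 0.1, 0.9, 1.0, 1.0, 0.2, 0.0]
print(find_change_points(velocity, dt=1, threshold=0.5))  # → [3, 6]
```

Each index returned corresponds to a time point t.sub.0 at which a change point occurs; the same routine could be applied to any of the per-axis time-series quantities.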
[0045] Further, regarding the movement direction d.sub.T1 of the tool 18, for example, the processor 22 calculates, as a change amount of the movement direction d.sub.T1 within the time Δt from a first time point t.sub.1 to a second time point t.sub.2 (>t.sub.1) (i.e., Δt=t.sub.2−t.sub.1), an inner product IP of a movement direction d.sub.T1_t1 at the first time point t.sub.1 and a movement direction d.sub.T1_t2 at the second time point t.sub.2. For example, if each of the movement directions d.sub.T1_t1 and d.sub.T1_t2 is regarded as a unit vector, and the angle between the movement directions d.sub.T1_t1 and d.sub.T1_t2 is represented as θ, the inner product IP can be calculated by the equation IP=cos θ (−1≤IP≤1).
[0046] As the change amount (angle θ) from the movement direction d.sub.T1_t1 to the movement direction d.sub.T1_t2 becomes larger from 0 degrees toward 180 degrees, the inner product IP becomes smaller within the range of −1≤IP≤1. The processor 22 retrieves, from the time-series data of the movement direction d.sub.T1, a change point at which the inner product IP falls below the predetermined threshold value Δ.sub.th1, and acquires the time point t.sub.0 at which such a change point occurs.
[0047] Alternatively, the processor 22 may calculate the angle θ between the movement direction d.sub.T1_t1 and the movement direction d.sub.T1_t2 as the change amount of the movement direction d.sub.T1 within the time Δt, retrieve, from the time-series data of the movement direction d.sub.T1, a change point at which the angle θ exceeds the predetermined threshold value Δ.sub.th1, and then acquire the time point t.sub.0 at which the change point occurs.
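The inner-product test of paragraphs [0045] to [0047] can be sketched as below. The function name, the sample directions, and the use of a threshold angle (converted to a cosine threshold) are illustrative assumptions.

```python
import math

def direction_change_points(directions, threshold_angle_deg):
    """directions: list of (x, y, z) unit vectors d_T1 sampled over time.
    A change point is flagged when the angle between consecutive
    directions exceeds the threshold angle; equivalently, when the inner
    product IP = cos(theta) falls below cos(threshold)."""
    cos_threshold = math.cos(math.radians(threshold_angle_deg))
    points = []
    for i in range(1, len(directions)):
        ax, ay, az = directions[i - 1]
        bx, by, bz = directions[i]
        ip = ax * bx + ay * by + az * bz  # inner product, lies in [-1, 1]
        if ip < cos_threshold:
            points.append(i)
    return points

# A path moving along +X that abruptly turns to +Y (a 90-degree change).
path_dirs = [(1, 0, 0), (1, 0, 0), (0, 1, 0), (0, 1, 0)]
print(direction_change_points(path_dirs, threshold_angle_deg=45))  # → [2]
```

The 90-degree turn yields IP = 0, which is below cos 45° ≈ 0.707, so only that sample is flagged; straight-line motion (IP = 1) is never flagged.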
[0048] Regarding the movement direction (rotation direction) d.sub.M1 of the rotary shaft of the electric motor 20, for example, the processor 22 determines that the change amount of the movement direction d.sub.M1 exceeds the predetermined threshold value Δ.sub.th1 (=180 degrees) when the movement direction d.sub.M1 is reversed; the processor 22 retrieves, from the time-series data of the movement direction d.sub.M1, a change point at which the movement direction d.sub.M1 is reversed, and then acquires the time point t.sub.0 at which such a change point occurs.
[0049] On the other hand, regarding the running mode RM, the processor 22 retrieves a change point at which the first running mode RM.sub.1 is switched to the second running mode RM.sub.2 from the time-series data of the running mode RM (
[0050] Next, the processor 22 specifies points on the movement path MP.sub.1 corresponding to the respective change points of the running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1, and RM. In the present embodiment, the processor 22 acquires a total of eleven kinds of running information, i.e., the positions P.sub.T1 and P.sub.M1, the velocities V.sub.T1 and V.sub.M1, the acceleration a.sub.T1 and a.sub.M1, the jerks J.sub.T1 and J.sub.M1, the movement directions d.sub.T1 and d.sub.M1, and the running mode RM.
[0051] Thus, regarding these eleven kinds of running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1, and RM, the processor 22 retrieves and specifies the points on the movement path MP.sub.1 corresponding to the time points t.sub.0 at which the change points of these pieces of running information occur. In this regard, since the time-series data of the running information are associated with the statements included in the work program WP, the processor 22 can identify the statement of the work program WP that corresponds to the time point t.sub.0, and thereby identify the point on the movement path MP.sub.1 corresponding to the time point t.sub.0 via that statement.
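The association between a change time t.sub.0, the active statement of the work program, and the corresponding point on MP.sub.1 can be sketched as follows. The data layout (per-statement start times and end points) is an assumption for illustration; the disclosure only states that the time-series data are associated with the statements.

```python
import bisect

# Hypothetical per-statement data derived from analyzing the work program WP:
# the start time of each statement (minute line segment) and the path point
# of MP1 that the statement defines. Values are illustrative only.
segment_start_times = [0.0, 0.5, 1.2, 2.0, 3.1]
segment_end_points = [(0, 0), (10, 0), (10, 5), (4, 5), (4, 9)]

def path_point_for(t0):
    """Find the statement active at the change time t0, then return the
    point on the movement path MP1 that the statement defines."""
    idx = bisect.bisect_right(segment_start_times, t0) - 1
    return segment_end_points[idx]

print(path_point_for(1.5))  # statement starting at t=1.2 is active → (10, 5)
```

A binary search over statement start times is one straightforward way to realize the t.sub.0-to-statement lookup; any equivalent association would serve.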
[0052] In this way, the processor 22 identifies the plurality of points on the movement path MP.sub.1 corresponding to the respective change points of the running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1, and RM. Then, the processor 22 generates image data ID.sub.1 in which the identified plurality of points on the movement path MP.sub.1 are highlighted in display forms visually different from one another.
[0053] An example of an image of the image data ID.sub.1 is illustrated in
[0054] The arrow D.sub.1, the arrow D.sub.2, and the arrow D.sub.3 depicted in the image data ID.sub.1 are displayed in different colors, different shapes, or different visual effects (e.g. flashing) from one another, and thus can be visually distinguished from one another. Accordingly, the operator is able to visually recognize which running information each of the arrows D.sub.1, D.sub.2, and D.sub.3 corresponds to.
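One way the image data generation section 58 might assign visually distinct display forms to the change points is sketched below. The mapping of running information to colors and arrow markers is entirely an assumption; the disclosure only requires that the forms be visually different.

```python
# Hypothetical assignment of a distinct display form per kind of running
# information (colors/markers are illustrative, not from the disclosure).
DISPLAY_FORMS = {
    "velocity V_T1":   {"color": "red",   "marker": "arrow D1"},
    "direction d_T1":  {"color": "green", "marker": "arrow D2"},
    "running mode RM": {"color": "blue",  "marker": "arrow D3"},
}

def highlight(points_by_info):
    """Attach a distinct display form to each change point so the operator
    can tell, from the highlight alone, which running information it
    belongs to."""
    marks = []
    for info, points in points_by_info.items():
        form = DISPLAY_FORMS[info]
        for p in points:
            marks.append({"point": p, "info": info, **form})
    return marks

marks = highlight({"velocity V_T1": [(10, 5)], "running mode RM": [(4, 9)]})
for m in marks:
    print(m["marker"], "at", m["point"], "for", m["info"])
```

In a real implementation the marks would be rendered into the image data ID.sub.1 transmitted to the display device 30; here they are simply printed.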
[0055] Note that, in the image data ID.sub.1 illustrated in
[0056] As stated above, in the present embodiment, the processor 22 functions as the movement path generation section 52, the running information acquisition section 54, the change point acquisition section 56, and the image data generation section 58 of the apparatus 50. These four sections constitute the apparatus 50.
[0057] The processor 22 transmits the generated image data ID.sub.1 to the display device 30 via the I/O interface 26, and the display device 30 displays the image of the image data ID.sub.1 as illustrated in
[0058] According to the present embodiment, if a defect occurs in a finished surface of the workpiece A when the industrial machine 12 performs the work on the workpiece A, the operator can easily identify a factor for the defect. More specifically, assume that, when the industrial machine 12 performs the work on the workpiece A in accordance with the work program WP, the position of the defect generated in the finished surface of the workpiece A matches (or is close to) the position of a point highlighted in the image data ID.sub.1.
[0059] In this case, the operator can immediately identify which of the change points of the running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1, and RM the point that matches (or is close to) the position of the defect generated in the workpiece A corresponds to. Thus, the operator may estimate that the factor for the defect results from the parameter relating to the identified running information.
[0060] As a result, by changing the work program WP or the parameter (e.g., the statement of the work program, the control gain G.sub.C, or the time constant τ) relating to the identified running information, the operator is able to improve the precision of the work performed on the workpiece A and reduce the time necessary for startup of the mechanical system 10.
[0061] The processor 22 may select, in response to the input information to the input device 32, a point to be highlighted in the image of the image data ID.sub.1 from the plurality of kinds of running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1 and RM, and may highlight in the image only the point corresponding to the selected running information.
[0062] Such an embodiment will be described with reference to
[0063] In the example illustrated in
[0064] The operator operates the input device 32 (e.g., a mouse) so as to input the input information for selecting the running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1 or RM to be highlighted, while viewing the image of the image data ID.sub.1 displayed on the display device 30.
[0065] In response to the input information to the input device 32, the processor 22 highlights the point corresponding to the selected running information, and displays the check mark in the check column of the selected running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1, or RM. In the example depicted in
[0066] The processor 22 may display, in the image data ID.sub.1, in response to the input information to the input device 32, the relating parameter of the running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1, or RM corresponding to the point highlighted in the image data ID.sub.1. Such an embodiment will be described with reference to
[0067] The image data ID.sub.1 illustrated in
[0068] In the example illustrated in
[0069] The operator operates the input device 32 (e.g., the mouse) so as to input the input information for selecting the point highlighted in the image data ID.sub.1 (arrows D.sub.1, D.sub.2, D.sub.3), while viewing the image of the image data ID.sub.1 displayed on the display device 30. In response to the input information to the input device 32, the processor 22 displays, in the image data ID.sub.1, the relating parameter of the running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1, or RM corresponding to the selected point (arrows D.sub.1, D.sub.2, D.sub.3).
[0070] According to this configuration, in a case where the position of the defect generated in the workpiece A matches (or is close to) the position of the point highlighted in the image data ID.sub.1, the operator is able to refer to the relating parameter of the running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1, or RM that may be the factor for the defect, and verify the factor in more detail.
[0071] Next, a mechanical system 100 according to another embodiment will be described with reference to
[0072] The sensor 104 includes e.g. a rotation detection sensor such as an encoder or a Hall element, or a force sensor such as a torque sensor or a force detection sensor, and is configured to detect the position P.sub.M2 of the rotary shaft of the electric motor 20, the torque applied to the rotary shaft, or the force applied to the movement mechanism 16. The sensor 104 transmits the detected position P.sub.M2, torque, or force, as feedback information FB, to the processor 22 via the I/O interface 26.
[0073] The processor 22 receives the feedback information FB from the sensor 104 while operating the industrial machine 102 in accordance with the work program WP. For example, the processor 22 acquires the feedback information FB detected by the sensor 104 while operating the industrial machine 102 in accordance with the work program WP so as to actually perform the work on the workpiece A by the tool 18.
[0074] Alternatively, the processor 22 may acquire the feedback information FB detected by the sensor 104 while operating the industrial machine 102 in accordance with the work program WP without performing the actual work by the tool 18 (e.g., without activating the tool 18, or with the tool 18 being removed from the movement mechanism 16).
[0075] The processor 22 determines, from the feedback information FB from the sensor 104 (specifically, the position P.sub.M2), a position P.sub.T2 of the tool 18 relative to the workpiece A while operating the industrial machine 102 in accordance with the work program WP. Then, the processor 22 functions as the movement path generation section 52, and generates, from the position P.sub.T2, a movement path MP.sub.2 (second movement path) of the industrial machine 102 in the three-dimensional space.
[0076] The movement path MP.sub.2 corresponds to the actual movement path of the movement mechanism 16 when the industrial machine 102 is operated in accordance with the work program WP. An example of the movement path MP.sub.2 is illustrated in
[0077] Then, similarly to the above-described method for acquiring the change points of the running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1 and d.sub.M1, the processor 22 functions as the change point acquisition section 56 to acquire, in the time-series data of the positions P.sub.T2 and P.sub.M2, velocities V.sub.T2 and V.sub.M2, acceleration a.sub.T2 and a.sub.M2, jerks J.sub.T2 and J.sub.M2, and movement directions d.sub.T2 and d.sub.M2, a change point at which a change amount Δ.sub.2 of each of these pieces of running information within the predetermined time Δt exceeds a predetermined threshold value Δ.sub.th2.
[0078] The processor 22 acquires, in the time-series data, the change point of each of the positions P.sub.T2 and P.sub.M2, velocities V.sub.T2 and V.sub.M2, acceleration a.sub.T2 and a.sub.M2, jerks J.sub.T2 and J.sub.M2, and movement directions d.sub.T2 and d.sub.M2, and the time point t.sub.0 at which each change point occurs, and then identifies the points on the movement path MP.sub.2 corresponding to the respective change points.
[0079] The time-series data of the running information P.sub.T2, P.sub.M2, V.sub.T2, V.sub.M2, a.sub.T2, a.sub.M2, J.sub.T2, J.sub.M2, d.sub.T2 and d.sub.M2, and the movement path MP.sub.2 are data acquired from the feedback information FB, and are associated with each other via the time t. Accordingly, the processor 22 is able to retrieve and identify the points on the movement path MP.sub.2 corresponding to the time points t.sub.0 at which the change points of the running information P.sub.T2, P.sub.M2, V.sub.T2, V.sub.M2, a.sub.T2, a.sub.M2, J.sub.T2, J.sub.M2, d.sub.T2, and d.sub.M2 acquired from the feedback information FB occur.
[0080] Then, the processor 22 functions as the image data generation section 58 to generate image data ID.sub.2 in which the points on the movement path MP.sub.2 corresponding to the change points of the running information P.sub.T2, P.sub.M2, V.sub.T2, V.sub.M2, a.sub.T2, a.sub.M2, J.sub.T2, J.sub.M2, d.sub.T2 and d.sub.M2 are highlighted on the movement path MP.sub.2 in display forms visually different from one another. An example of the image data ID.sub.2 is illustrated in
[0081] In the example illustrated in
[0082] The arrow D.sub.1, the arrow D.sub.2, and the arrow D.sub.3 depicted in the image data ID.sub.2 are displayed in different colors, different shapes, or different visual effects (flashing or the like) from one another, and thereby can be visually identified. Accordingly, the operator is able to visually recognize which of the running information each of the arrows D.sub.1, D.sub.2, and D.sub.3 in the image data ID.sub.2 corresponds to.
[0083] Note that, in the image data ID.sub.2 illustrated in
[0084] As described above, in the image data ID.sub.2, the points on the movement path MP.sub.2 corresponding to the change points of the plurality of pieces of running information different from one another acquired from the feedback information FB are highlighted on the movement path MP.sub.2 in display forms visually different from one another. The processor 22 may switch the image to be displayed on the display device 30 between the image illustrated in
[0085] Note that, as illustrated in
[0086] Further, as illustrated in
[0087] Furthermore, the processor 22 of the mechanical system 100 may acquire, as the running information, the time-series data of the positions P.sub.T1 and P.sub.M1, velocities V.sub.T1 and V.sub.M1, acceleration a.sub.T1 and a.sub.M1, jerks J.sub.T1 and J.sub.M1, movement directions d.sub.T1 and d.sub.M1, and the running mode RM; and the time-series data of the positions P.sub.T2 and P.sub.M2, velocities V.sub.T2 and V.sub.M2, acceleration a.sub.T2 and a.sub.M2, jerks J.sub.T2 and J.sub.M2, and movement directions d.sub.T2 and d.sub.M2.
[0088] In this case, the processor 22 may highlight, in the image data ID.sub.2, the points on the movement path MP.sub.2 corresponding to the change points of the running information P.sub.T2, P.sub.M2, V.sub.T2, V.sub.M2, a.sub.T2, a.sub.M2, J.sub.T2, J.sub.M2, d.sub.T2 and d.sub.M2, and also the points on the movement path MP.sub.2 corresponding to the change points of the running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1, and RM.
[0089] Further, in the mechanical system 100, the processor 22 may generate both the movement paths MP.sub.1 and MP.sub.2, display the movement paths MP.sub.1 and MP.sub.2 in the image data ID.sub.2, and highlight the points on the movement path MP.sub.1 corresponding to the change points of the running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1 and RM, as well as the points on the movement path MP.sub.2 (or the movement path MP.sub.1) corresponding to the change points of the running information P.sub.T2, P.sub.M2, V.sub.T2, V.sub.M2, a.sub.T2, a.sub.M2, J.sub.T2, J.sub.M2, d.sub.T2, and d.sub.M2.
[0090] The processor 22 may acquire the difference Δ.sub.MP between the movement path MP.sub.1 and the movement path MP.sub.2, and display the difference Δ.sub.MP in the image data ID.sub.2. Such an embodiment is illustrated in
[0091] For example, the processor 22 displays a region E corresponding to the difference Δ.sub.MP between the movement path MP.sub.1 and the movement path MP.sub.2 as a colored region (e.g., a red-colored region). The processor 22 may enlarge and display the difference Δ.sub.MP (i.e., the region E) in the image data ID.sub.2, in response to the input information to the input device 32.
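A minimal sketch of computing the difference Δ.sub.MP between the commanded path MP.sub.1 and the actual path MP.sub.2, and of collecting the region E to be colored, is given below. Sampling both paths at common time points, the function names, and the display threshold are assumptions for illustration.

```python
import math

def path_difference(mp1, mp2):
    """Pointwise Euclidean distance between two equally sampled paths
    (MP1: commanded, MP2: actual), i.e., the difference delta_MP."""
    return [math.dist(p, q) for p, q in zip(mp1, mp2)]

def region_to_color(diff, threshold):
    """Indices forming the region E where delta_MP exceeds the display
    threshold; these samples would be rendered as a colored region."""
    return [i for i, d in enumerate(diff) if d > threshold]

# Example: the actual path deviates noticeably at one sample.
mp1 = [(0, 0), (1, 0), (2, 0), (3, 0)]
mp2 = [(0, 0), (1, 0.02), (2, 0.3), (3, 0.01)]
diff = path_difference(mp1, mp2)
print(region_to_color(diff, threshold=0.1))  # → [2]
```

The flagged indices mark where the colored region E would be drawn; an enlarged view of just those samples corresponds to the zoom function described above.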
[0092] In the above-described embodiments, the processor 22 may acquire, as the running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1, P.sub.T2, P.sub.M2, V.sub.T2, V.sub.M2, a.sub.T2, a.sub.M2, J.sub.T2, J.sub.M2, d.sub.T2, or d.sub.M2, time-series data of digital signals in which discrete numerical data exist in time sequence, not being limited to the time-series data of continuous analog signals as illustrated in
[0093] The running information RM may not be limited to the time-series data as illustrated in
[0094] The change point acquisition section 56 may be omitted from the above-described embodiments. In this case, the operator may manually identify and input the change points of the running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1, RM, P.sub.T2, P.sub.M2, V.sub.T2, V.sub.M2, a.sub.T2, a.sub.M2, J.sub.T2, J.sub.M2, d.sub.T2, or d.sub.M2 from the time-series data thereof, for example.
[0095] In the above-described embodiments, the points on the movement path MP corresponding to the change points of the running information P.sub.T1, P.sub.M1, V.sub.T1, V.sub.M1, a.sub.T1, a.sub.M1, J.sub.T1, J.sub.M1, d.sub.T1, d.sub.M1, RM, P.sub.T2, P.sub.M2, V.sub.T2, V.sub.M2, a.sub.T2, a.sub.M2, J.sub.T2, J.sub.M2, d.sub.T2, and d.sub.M2 are highlighted by the arrows D.sub.1, D.sub.2, and D.sub.3. However, the point on the movement path MP corresponding to the change point may be highlighted in any display form (e.g., a colored dot, a triangular point, or a square point). Although the present disclosure has been described through the embodiments, the embodiments are not intended to limit the invention according to the claims.