GRAPHICAL INTERFACE AND METHOD FOR MANAGING SAID GRAPHICAL INTERFACE DURING THE TOUCH-SELECTION OF A DISPLAYED ELEMENT

20170364243 · 2017-12-21

Abstract

A touch interface includes a display screen. The interface detects an approach and a position of a finger of a user with respect to the screen. The interface displays on the screen at least one graphical element associated with a touch-selection zone, surrounding an anchor point of the graphical element on the screen. The interface estimates a trajectory of a point of the finger and an impact point of the trajectory on the screen, and moves the graphical element in the direction of the impact point, when a distance between the anchor point and the impact point falls below a first threshold.

Claims

1-10. (canceled)

11. A touch interface, comprising: a display screen, the interface being configured to detect an approach and a position of a finger of a user with respect to the screen, the interface being configured to display on the screen at least one graphical element associated with a touch-selection zone, surrounding an anchor point of the graphical element on the screen, wherein the interface is configured to estimate a trajectory of a point of the finger and an impact point of the trajectory on the screen, and is configured to move the graphical element in the direction of the impact point, when a distance between the anchor point and the impact point falls below a first threshold.

12. The touch interface as claimed in claim 11, in which the interface is configured, when crossing the first threshold, to calculate a translation vector from the anchor point to a temporary centering point, and to perform a corresponding translation of the display of the graphical element.

13. The touch interface as claimed in claim 12, in which the first threshold is a variable function of an angular position of the impact point around the anchor point.

14. The touch interface as claimed in claim 12, in which the interface is configured to calculate a position of the temporary centering point as a barycenter between the initial anchor point and the impact point, a relative distance between the centering point and the impact point being an increasing function of the distance between the finger and the screen.

15. The touch interface as claimed in claim 12, in which the interface is configured, when crossing the first threshold, to display the translated graphical element by dilating the graphical element, along at least one direction, according to an enlargement factor.

16. The touch interface as claimed in claim 12, in which the interface is configured, when crossing the first threshold, to display the translated graphical element at a new position corresponding to the crossing of the first threshold, then, as long as the distance between the anchor point and the impact point remains below the first threshold, to periodically calculate a new translation vector, each time taking into account an updated impact point, and to display the graphical element translated by the corresponding vector.

17. The touch interface as claimed in claim 11, in which the interface is configured to move, with the displayed graphical element, the touch-selection zone.

18. The touch interface as claimed in claim 11, in which the interface is configured, when the distance between the anchor point and the impact point rises above a second threshold, to bring back the display of the graphical element to an initial position around the anchor point.

19. The touch interface as claimed in claim 17, in which the interface is configured to display on the touch screen at least one first graphical element associated with a first touch-selection zone, a first anchor point, and a first distance-threshold function, and to display at least one second graphical element associated with a second touch-selection zone, a second anchor point, and a second distance-threshold function, the first and second distance-threshold functions defining, around the first and the second anchor points respectively, a first boundary of a first field of influence and a second boundary of a second field of influence, the interface being configured to allow, at least at times, a selection by contact of the finger at a contact point in the first touch-selection zone, while the first touch-selection zone temporarily overlaps the second field of influence and the contact point is located in the second field of influence.

20. A method for managing a touch interface, configured to detect an approach and a position of a finger of a user with respect to a screen of the interface, comprising: displaying at least one graphical element on the screen associated with a touch-selection zone, surrounding an anchor point of the graphical element on the screen, and located inside the same zone of influence; estimating repeatedly a trajectory of a point of the finger and an impact point of the trajectory on the screen; moving, when the impact point enters the zone of influence, the displayed graphical element and the associated touch-selection zone in a direction of the impact point; and displaying, as long as the impact point remains in the zone of influence, the graphical element and the associated touch-selection zone at a position which is a function of the updated impact point, and which becomes closer to the impact point as the finger approaches the screen.

Description

[0032] Other objects, features and advantages of the invention will appear on reading the following description, given solely as a non-restrictive example with reference to the accompanying drawings in which:

[0033] FIG. 1 illustrates a motor vehicle provided with an interface according to the invention,

[0034] FIG. 2 illustrates a human-machine interface according to the invention, and

[0035] FIG. 3 is a characteristic graph of one of the modes of operation of the interface in FIG. 2.

[0036] As illustrated in FIG. 1, a touch interface according to the invention may be, for example, on board a motor vehicle 3 driven by a user 4 who, by moving their finger and touching some points of a screen of a touch interface 1, is thus able to transmit instructions to an electronic control unit 2 for operating different equipment of the vehicle, e.g. a ventilation system 5 of the vehicle or any other equipment of the vehicle.

[0037] The electronic control unit 2 may also send back to the touch interface messages expressing the operating state of the vehicle so that the user 4 of the vehicle can take these data into account.

[0038] FIG. 2 illustrates the operating principle of a touch interface 1 according to the invention. The touch interface 1 typically includes a touch screen 6, delimited by edges 7, and a detection system (not represented) for detecting the position of a user's finger 11, notably of a particular point D_xyz of this finger, and for detecting whether this finger is or is not in contact with the touch screen. "Touch screen" here designates any input system operated by moving a finger and by bringing this finger close to a validation surface. The invention may, for example, be applied to detection systems optically projecting information onto an inert surface and observing the volume neighboring this surface by means of various optical or infrared sensors, for example, so as to detect the position of a finger and whether the finger is or is not in contact with the surface.

[0039] The touch screen 6 is typically delimited, by means of boundaries 10 designated here by F1_4, F1_2, F4_i, F2_i, F2_3, Fi_j, into regions or zones of influence, which are referenced in FIG. 2 by R1, R2, R3, R4, Ri and Rj. Each region corresponds to a selection zone of a menu displayed on the touch screen. The regions capable of giving rise to an act of validation each display a graphical element, referenced here by 8 and more particularly, according to the region, by C1_0, C2_0, C3_0, C4_0, Ci_0, Cj_0.

[0040] These graphical elements indexed with 0 correspond to an initial display of the menu on the graphical screen. The graphical interface is configured for detecting the movement of the finger 11, and in particular of one end D_xyz of this finger, which at an instant t is located at the point D_xyz(t) and at the next instant t+dt is located at a point D_xyz(t+dt).

[0041] The graphical interface is capable of determining, for example by extrapolation of the successive detected points, a trajectory which is re-estimated at each instant, denoted by traj(t) in FIG. 2 for the trajectory estimated at the instant t, and by traj(t+dt) for the trajectory estimated at the instant t+dt. Each of these calculated trajectories defines a point P_xy, designated here as an impact point, although the impact at first remains theoretical. This point P_xy is the intersection of the trajectory and a contact surface of the screen, which may coincide with the display surface of the screen 6.
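The extrapolation just described can be sketched as follows. This is a minimal illustration, assuming a simple linear extrapolation of the last two sampled finger positions D_xyz(t) and D_xyz(t+dt), with the screen lying in the plane z = 0; the names and the two-sample scheme are illustrative, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Point3D:
    x: float
    y: float
    z: float  # height above the screen plane (z = 0)

def estimate_impact_point(d_prev: Point3D, d_curr: Point3D):
    """Linearly extrapolate the finger's last two sampled positions and
    return the (x, y) intersection of that line with the screen plane z = 0,
    or None if the finger is not moving toward the screen."""
    dz = d_curr.z - d_prev.z
    if dz >= 0:  # finger stationary or moving away from the screen
        return None
    # Parameter t such that z(t) = d_curr.z + t * dz == 0
    t = -d_curr.z / dz
    x = d_curr.x + t * (d_curr.x - d_prev.x)
    y = d_curr.y + t * (d_curr.y - d_prev.y)
    return (x, y)
```

Re-running this on each new sample pair yields the successively updated impact points P_xy(t), P_xy(t+dt), and so on.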

[0042] When the impact point is located sufficiently close to one of the graphical elements, the invention provides for modifying the display of the graphical element and bringing it closer to the impact point, in order to facilitate the work of the user who may thus continue to validate the corresponding option of the menu without moving their finger aside from the trajectory in progress. To do this, for each graphical element a virtual anchor point 9 is arbitrarily defined, which may not appear in the display, and which serves both for estimating the distance between the graphical element and the impact point and for calculating the subsequent movements of the display of the graphical element.

[0043] In FIG. 2, these anchor points are identified respectively by the references B1_0 for the graphical element C1_0, B2_0 for the graphical element C2_0, . . . Bj_0 for the graphical element Cj_0.

[0044] These anchor points may, for convenience, correspond to a surface barycenter of the graphical element, or to a barycenter of an outline of the graphical element. According to a variant embodiment, they may optionally be located arbitrarily near one of the boundaries of the graphical element.

[0045] For determining whether the display of the graphical element C1_0 of the region R1 has to be moved, the distance denoted here by Gap1(t) may be compared either with a constant threshold, or with a threshold which depends on the direction of the straight line linking the anchor point B1_0 and the impact point P_xy(t). For example, it is possible to verify whether the impact point is located inside the boundaries delimiting the region in which the graphical element considered is located in its initial state, in the absence of any interaction with the finger.
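The two comparison strategies mentioned above (a constant threshold on the gap, or a test against the region's own boundaries) could be sketched as below, assuming axis-aligned rectangular regions; the patent does not fix the region shapes, so the rectangle is an assumption for illustration.

```python
import math

def gap(anchor, impact):
    """Euclidean distance Gap1(t) between the anchor point and the
    predicted impact point."""
    return math.hypot(impact[0] - anchor[0], impact[1] - anchor[1])

def inside_region(impact, region_bounds):
    """Region-boundary variant of the first threshold: the movement is
    triggered when the impact point lies inside the region's boundaries.
    region_bounds is a hypothetical (x_min, y_min, x_max, y_max) rectangle."""
    x_min, y_min, x_max, y_max = region_bounds
    x, y = impact
    return x_min <= x <= x_max and y_min <= y <= y_max
```

A direction-dependent threshold, as described above, would simply replace the constant compared against `gap()` with a function of the angle of the anchor-to-impact line.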

[0046] In FIG. 2, the distance at an instant t between the impact point P_xy(t) of the trajectory traj(t) calculated at this instant and the anchor point B1_0 is denoted by Gap1(t). As a function of this distance, and also of the distance of the finger to the screen, a movement denoted here by U1(t) is applied to the graphical element C1_0; at a given instant this movement corresponds to a vector joining the anchor point B1_0 and a temporary centering point B1(t), which occupies with respect to the moved graphical element C1(t) the same barycentric position that the anchor point B1_0 initially occupies with respect to the graphical element C1_0 in its initial display configuration.
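The translation toward the temporary centering point B1(t) can be sketched with a hypothetical helper that moves the anchor point a given distance along the line joining it to the impact point (function and parameter names are illustrative, not from the patent):

```python
import math

def centering_point(anchor, impact, u):
    """Temporary centering point B1(t): the anchor point translated by a
    distance u toward the impact point, i.e. a barycenter of the anchor
    point and the impact point weighted by u / Gap1(t)."""
    ax, ay = anchor
    px, py = impact
    gap = math.hypot(px - ax, py - ay)
    if gap == 0.0:
        return anchor  # impact point already on the anchor: no movement
    return (ax + (px - ax) * u / gap, ay + (py - ay) * u / gap)
```

Redrawing the graphical element so that its barycenter sits at `centering_point(...)` performs the display translation U1(t) described above.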

[0047] At a subsequent instant t+dt, the recalculated trajectory defines a new impact point P_xy(t+dt), the position of which is used in conjunction with the distance of the finger to the screen for calculating a new position of the centering point B1(t+dt) of the graphical element C1(t+dt) displayed at this moment.

[0048] In order to improve the perception by the user of the graphical element which is on the point of being selected, it is possible, as soon as the movement of the display of the graphical element is activated, to accompany this movement with a dilation of the dimensions of the graphical element, e.g. a homothety along all directions or optionally, according to the available space on the screen, a dilation along one of the directions of the screen.

[0049] The size of the graphical element may then be maintained constant as long as the movement of the display of the graphical element remains effective. Depending on the size of the graphical element and the amplitude of the movement, the graphical element may overlap one of the boundaries between regions. For example, in FIG. 2, the graphical element C1(t+dt) is on the point of overlapping the boundary F1_2. A touch-selection zone is associated with each graphical element; when touched by the user's finger, it triggers an action corresponding to one of the options in the menu displayed on the screen. Preferably, the touch-selection zone coincides with the surface occupied by the graphical element.
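The dilation described in paragraphs [0048] and [0049] can be sketched with a hypothetical helper that enlarges the element's bounding rectangle about its centering point, with an independent enlargement factor per axis so that the element can be stretched along a single direction when screen space is limited (names and the rectangle representation are illustrative assumptions):

```python
def dilate_rect(rect, factor_x, factor_y, center):
    """Dilate a rectangle (x_min, y_min, x_max, y_max) about a fixed center
    point.  A factor of 1.0 leaves that axis unchanged; equal factors give
    a homothety along all directions."""
    cx, cy = center
    x_min, y_min, x_max, y_max = rect
    return (cx + (x_min - cx) * factor_x,
            cy + (y_min - cy) * factor_y,
            cx + (x_max - cx) * factor_x,
            cy + (y_max - cy) * factor_y)
```

For example, `dilate_rect(rect, 2.0, 1.0, center)` doubles the width of the element while leaving its height, and its centering point, unchanged.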

[0050] The interface according to the invention may be configured so that, if, following the trajectory performed by the finger, the graphical element and the associated touch-selection zone overlap one of the boundaries and the finger comes into contact with the screen at a point of the graphical element displayed at this instant, a validation on the associated touch-selection zone is then taken into account, even if the contact point is located at this instant beyond the boundary of the region associated with the graphical element.

[0051] In this way, the user's input is facilitated, since in a way the effective boundary of the region allowed for selecting an element from the menu changes shape to some extent according to the trajectory of the user's finger, so as to widen the total selection region allowed by temporarily moving its boundaries.

[0052] FIG. 3 illustrates an example of a graph 20 linking the amplitude of the movement, denoted here by U1(t), of a graphical element on an interface according to the invention, to a distance h(t) between a user's finger and the screen and a distance Gap1(t) between the impact point of the trajectory of the finger and the initial anchor point of the graphical element. The mapped surface 21 is here chosen so as to cancel any movement of the graphical element when the distance of the finger to the screen exceeds a certain threshold h0, which may typically be the detection threshold distance of the touch interface. The mapped surface 21 is also chosen so as to cancel any movement of the graphical element when the impact point approaches the anchor point, since there is then no longer any need to move the graphical element. Typically, the value U1(t) of the movement may be chosen as a product of the distance Gap1(t) and a function chosen to decrease with the distance of the finger to the screen, vanishing at a distance threshold value h0. One possible form for defining the movement vector U1(t) is to multiply the distance Gap1(t) directly by a concave or convex function of the distance h from the finger to the screen. This concave or convex function may be, for example, a power of the difference between 1 and the ratio of the distance h of the finger to the threshold distance h0.

[0053] If a power ½ is chosen for this function, the expression provided in equation (1) is obtained, corresponding to the graph in FIG. 3:


U1(t) = Dist(B1_0, P_xy(t)) × √(1 − h(t)/h0) = Gap1(t) × √(1 − h(t)/h0)   Equation (1).

[0054] The advantage of choosing such a form of convex function is that it "slows down" the movement of the graphical element when the finger is in the immediate neighborhood of the screen, which avoids disturbing the user before the final selection. Another variant of the function U(t) may be envisaged, in which a power function is also applied to the distance Gap1(t) between the anchor point and the impact point, so as, for example, to slow down the movement of the graphical element when approaching the bounds of the region associated with the graphical element considered. The distance of the finger to the screen may be taken as the orthogonal distance h of the finger to the screen, as represented in FIGS. 2 and 3, but variant embodiments may be envisaged in which the distance of the finger to the screen is taken as the distance between the point D_xyz of the finger closest to the screen and the impact point P_xy(t) of the trajectory at this instant.
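Equation (1) can be sketched as below, with the exponent left as a parameter so as to cover the power-function variants discussed above; the function and parameter names are illustrative assumptions, not taken from the patent.

```python
def movement_amplitude(gap, h, h0, power=0.5):
    """Amplitude U1(t) of the displacement of the graphical element toward
    the impact point, per equation (1): the anchor-to-impact distance `gap`
    scaled by (1 - h/h0) ** power, where h is the finger-to-screen distance
    and h0 the detection threshold.  Zero when the finger is out of range."""
    if h >= h0:
        return 0.0
    return gap * (1.0 - h / h0) ** power
```

With `power=0.5` (the square root of equation (1)), the amplitude grows quickly as the finger enters the detection volume and changes only slowly near the screen, which is the "slowing down" effect described above.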

[0055] From another point of view, a relative distance may be defined between the centering point B1(t) of the graphical element and the impact point P_xy(t), as a distance ratio Δ1(t) defined by:

Δ1(t) = (Gap1(t) − U1(t)) / Gap1(t)   Equation (2).

[0056] This relative distance gives the remaining gap that the graphical element still has to travel in order to be centered on the impact point.

[0057] This relative distance decreases as the finger approaches the screen, and vanishes when the finger touches the screen.
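Equation (2) can be sketched as follows, under the same illustrative naming assumptions as above, to check the limit behavior just described:

```python
def relative_distance(gap, u):
    """Relative distance of equation (2): the fraction of the anchor-to-impact
    gap still to be traveled by the graphical element.  1.0 means the element
    has not moved; 0.0 means it is centered on the impact point."""
    if gap == 0.0:
        return 0.0
    return (gap - u) / gap
```

Substituting U1(t) from equation (1) gives Δ1(t) = 1 − (1 − h/h0)^(1/2): equal to 1 at the detection threshold h = h0 (no movement yet) and 0 at h = 0 (element centered on the contact point).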

[0058] The invention is not limited to the exemplary embodiments described and may have many variants. The position of the finger with respect to the interface may be detected by any touch means or any means of selection by positioning the end of a finger. The functions for calculating the movement of a graphical element may differ from those cited in the examples. Time-outs may be introduced in some steps of the process of modified display of the graphical element or elements. Transparent display modes of one graphical element with respect to another may be provided if two graphical elements have to be displayed over intersecting zones.