Merge lp:~kevin-wells/libgrip/touch-points into lp:libgrip
Status: Rejected
Rejected by: Stephen M. Webb
Proposed branch: lp:~kevin-wells/libgrip/touch-points
Merge into: lp:libgrip
Diff against target: 197 lines (+128/-0), 2 files modified
  src/gripgesturemanager.c (+122/-0)
  src/gripgesturemanager.h (+6/-0)
To merge this branch: bzr merge lp:~kevin-wells/libgrip/touch-points
Related bugs:
| Reviewer | Review Type | Date Requested | Status |
|---|---|---|---|
| Henrik Rydberg (community) | | | Disapprove |
| Stephen M. Webb (community) | | | Approve |

Review via email: mp+54441@code.launchpad.net
Description of the change
For demonstration purposes I needed to display the actual touch points used in the gestures exposed by libgrip, but that information is not present in the event structures (although geis does expose those points as attributes).
Also, the information passed in the libgrip event structures is not enough to reconstruct how the object being transformed should react to the gesture. As an example, I am using the pinch, drag, and rotate gestures to apply transformations to a map, with the goal that the map points under the touches stay under them as the touches move around the screen. (I realize this can only work with exactly 2 touch points, but my netbook only supports 2.) The problem is that the event's only notion of position is focus_x and focus_y. For a pinch event, knowing only focus_x, focus_y, and radius_delta allows the map to be tracked accurately only if both points have moved toward or away from the focus by an equal amount (radius_delta / 2). If one point is held stationary while the other moves toward or away from the focus, tracking on the map becomes inaccurate.
While this is ultimately an issue with the gesture recognition library, geis provides finger positions, and with that additional information, this issue can be avoided. This branch adds finger_x and finger_y arrays of size fingers to the pinch, drag, and rotate event structures and populates them with point information from geis.
If this is something that is deemed useful, I can update the rectangle-mover example to show touch points and demonstrate the issue I mentioned.
Unmerged revisions

- 40. By Kevin Wells: Fix indentation
- 39. By Kevin Wells: Use slice allocators instead of g_new for finger points
- 38. By Kevin Wells: Add finger_x and finger_y to drag and rotate events
- 37. By Kevin Wells: Add finger locations to Pinch gestures
I don't have a problem with propagating the touch points in general, but be aware that the significance of the touch points depends on the type of device. For example, a number of touchpads report only the bounding box, which will give focus and radius but not information about which individual points are in motion. The actual coordinates of a touchpad (an indirect device) may also not correspond to points on the screen in the same way as touchscreen touches do.
In short, I approve this patch as it is, as a way to propagate the touch information through the libgrip structures, but note that it is insufficient for the stated purpose, since that also requires the device information to be propagated. That information can come in a separate patch.
I also note the indentation appears to be a little messed up in a couple of places, particularly the parameter lists.
I'll wait and see what others have to say.