Before this commit, we assumed the `ShowMenu` action was not bound to any
button other than the window-menu button and always placed the client-menu
under the window-menu button when atCursor="no". It would also have been
difficult to distinguish whether the action was executed from the
window-menu button or from the window icon, which will be added soon.
This commit fixes that by opening the menu under the button that was
actually clicked, passing `cursor_context` to `actions_run()`, with some
refactoring:
- `seat->pressed.resize_edges` is removed; the edges are now calculated
  from the cursor position and `seat->pressed.type` just before running
  the Resize action (see the sketch after this list). This slightly
  changes the existing logic for determining the resize edges with the
  Alt-Right + Drag mousebinding, but `seat->pressed.type` is still stored
  on button press, so issue #543 is not reintroduced.
- `seat->pressed.toplevel` is removed; `get_toplevel()` in
  `update_pressed_surface()` may now be called more often, but the
  overhead should be negligible.
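As a rough illustration of the first point, resize edges can be derived
from the cursor position relative to the window geometry by picking the
nearer edge on each axis. This is only a sketch; the function name and
parameters are hypothetical, not labwc's actual code:

```c
#include <stdint.h>
#include <wlr/util/box.h>
#include <wlr/util/edges.h>

/*
 * Illustrative sketch only: derive the resize edges from the cursor
 * position relative to the window geometry, picking the nearer edge on
 * each axis. Not labwc's actual helper.
 */
static uint32_t
resize_edges_from_cursor(double cx, double cy, const struct wlr_box *geo)
{
	uint32_t edges = WLR_EDGE_NONE;

	edges |= (cx < geo->x + geo->width / 2.0)
		? WLR_EDGE_LEFT : WLR_EDGE_RIGHT;
	edges |= (cy < geo->y + geo->height / 2.0)
		? WLR_EDGE_TOP : WLR_EDGE_BOTTOM;

	return edges;
}
```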
Unlike the raw tablet events, the cursor events transform the
coordinates based on the mapped output's orientation.
Otherwise, the two kinds of events are the same.
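For example (a rough sketch, not the actual handler; the wrapper and
event-field names assume a recent wlroots), emulating cursor motion for
an absolute tablet axis event just forwards the normalized coordinates
through wlr_cursor, which applies the output mapping:

```c
#include <math.h>
#include <wlr/types/wlr_cursor.h>
#include <wlr/types/wlr_tablet_tool.h>

/*
 * Rough sketch: forward absolute tablet motion through wlr_cursor, which
 * maps the normalized [0, 1] coordinates onto the (possibly rotated)
 * output the device is mapped to. The wrapper itself is illustrative.
 */
static void
emulate_cursor_motion(struct wlr_cursor *cursor, struct wlr_input_device *dev,
		struct wlr_tablet_tool_axis_event *event)
{
	/* NAN keeps the respective coordinate unchanged */
	double x = (event->updated_axes & WLR_TABLET_TOOL_AXIS_X) ? event->x : NAN;
	double y = (event->updated_axes & WLR_TABLET_TOOL_AXIS_Y) ? event->y : NAN;
	wlr_cursor_warp_absolute(cursor, dev, x, y);
}
```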
We should not switch to tablet notifications when an
out-of-surface move was started on a surface that is
not tablet-capable.
Also postpone proximity-in when moving to a new surface
with the tip down.
As with touch, this is guarded by checking whether the
surface accepts the tablet protocol. Also reuse the
common cursor logic.
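Roughly, the guard looks like the sketch below, assuming wlroots'
tablet-v2 helper; the flag tracking the in-progress move and the function
name are illustrative, not the actual code:

```c
#include <stdbool.h>
#include <wlr/types/wlr_tablet_v2.h>

/*
 * Sketch of the guard: only switch to tablet-v2 notifications when the
 * surface under the tool accepts the tablet protocol and no
 * cursor-emulated move was started on a surface that does not.
 * Everything except wlr_surface_accepts_tablet_v2() is illustrative.
 */
static bool
should_notify_tablet(struct wlr_tablet_v2_tablet *tablet,
		struct wlr_surface *surface, bool move_started_on_pointer_surface)
{
	if (move_started_on_pointer_surface) {
		/* Keep emulating pointer events until that move ends */
		return false;
	}
	return surface && wlr_surface_accepts_tablet_v2(tablet, surface);
}
```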
Initialize tablet tools on proximity.
Notify the idle manager about activity on events.
We currently only support cursor emulation
for absolute motion, so we ignore tools/pens
that use relative motion.
Add a log statement on proximity-in to give
some feedback.
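Putting those pieces together, a condensed sketch of a proximity-in
handler is shown below. It assumes a recent wlroots (tablet-v2,
idle-notify and logging APIs); the seat struct and its fields are
illustrative and do not mirror labwc's actual bookkeeping:

```c
#include <wlr/types/wlr_idle_notify_v1.h>
#include <wlr/types/wlr_tablet_tool.h>
#include <wlr/types/wlr_tablet_v2.h>
#include <wlr/util/log.h>

/* Illustrative seat bookkeeping, not labwc's actual struct */
struct seat {
	struct wlr_tablet_manager_v2 *tablet_manager;
	struct wlr_idle_notifier_v1 *idle_notifier;
	struct wlr_seat *wlr_seat;
};

static void
handle_tool_proximity_in(struct seat *seat,
		struct wlr_tablet_tool_proximity_event *event)
{
	/* Any tablet event counts as activity for the idle manager */
	wlr_idle_notifier_v1_notify_activity(seat->idle_notifier, seat->wlr_seat);

	/* Only absolute motion is cursor-emulated; skip relative tools */
	if (event->tool->type == WLR_TABLET_TOOL_TYPE_MOUSE
			|| event->tool->type == WLR_TABLET_TOOL_TYPE_LENS) {
		return;
	}

	/* Lazily create the tablet-v2 tool on first proximity */
	if (!event->tool->data) {
		event->tool->data = wlr_tablet_tool_create(seat->tablet_manager,
			seat->wlr_seat, event->tool);
	}

	wlr_log(WLR_DEBUG, "tablet tool proximity-in");
}
```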