Find/invent interaction pattern for window touch resizing
Open, Normal, Public

Description

Resizing windows with touch is cumbersome at the moment. It is difficult to hit the resize area (with or without borders) with a finger.

It would make sense to develop a distinct interaction pattern for touch resize, which might be completely different to how resize works with a more precise pointer event.

Any ideas or examples from other software?

romangg created this task. Oct 1 2018, 5:09 PM
romangg triaged this task as Normal priority.
romangg mentioned this in T8707: Window borders.
ngraham added a subscriber: ngraham. Oct 1 2018, 5:42 PM

Drag-resize is generally not a thing with touch.

Drag-and-drop with discrete items exists, but requires a large area, and the whole thing must be draggable.

To the extent that drag-resize exists with touch (e.g. to resize the divider line between two tiled apps on iOS, maybe Android too), the drag area must be absolutely huge.

romangg added a comment (edited). Oct 1 2018, 5:50 PM

One idea I had was to start a resize only when two opposing corners are dragged (from the quadrants of their outermost area). But this wouldn't work with windows adjacent to a screen border. A related idea would be to start a resize when one finger rests in the window to be resized (or on its window bar) and a second finger is used for the resize somewhere around the window.

One idea I had was to start a resize only when two opposing corners are dragged (from the quadrants of their outermost area). [...] Related would be to start a resize when one finger rests in the window to resize (or its window bar) and a second finger is used for resize somewhere around the window.

Hmm, that would seem to make the ability to resize dependent on a user's hand size relative to the window size. What about huge windows or people with small hands? Seems like it might not be very ergonomic.

Generally drag with touch doesn't use fancy gestures that one would have to know about and get good at, it just makes the hit areas much bigger to reflect the imprecision of a finger.

An input-specific drag interaction paradigm kind of opens a can of worms:

  • What is the new touch-drag-resize interaction paradigm?
  • How do we make the new interaction paradigm discoverable?
  • How do we teach the user how to use it?
  • How can we make sure it's accessible for people's physical bodies (carpal tunnel, small/big hands, etc)?
  • Etc.

There is a much simpler alternative: whether borders are shown or not, when touch input is used to make contact with the screen near a screen edge, make the drag resize areas outside the window as big as the titlebar currently is (it's more or less the bare minimum size for comfortable touch). That way we don't have to answer any of the above questions and it'll behave just like people expect, with the finger basically being a virtual cursor.
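A minimal hit-test sketch of this enlarged-area idea (the function name, band sizes, and geometry convention are illustrative assumptions, not KWin's actual code):

```python
def resize_edge(x, y, win, touch):
    """Return which window edge (if any) a press at (x, y) grabs.

    win is (left, top, width, height). The resize band extends *outside*
    the window so it does not steal taps from window content; for touch
    it is widened to roughly titlebar height, for mouse it stays thin.
    """
    band = 36 if touch else 4  # hypothetical sizes in pixels
    left, top, w, h = win
    right, bottom = left + w, top + h

    # Entirely outside the band -> no resize.
    if not (left - band <= x <= right + band and top - band <= y <= bottom + band):
        return None
    # Inside the window proper -> normal interaction, no resize.
    if left < x < right and top < y < bottom:
        return None

    horiz = "left" if x <= left else "right" if x >= right else ""
    vert = "top" if y <= top else "bottom" if y >= bottom else ""
    return (vert + horiz) or None
```

A touch just outside the left border would return "left", while the same point with a mouse (thin band) would miss and fall through to whatever is behind.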

fabianr added a subscriber: fabianr. Oct 2 2018, 6:53 AM

One idea I had was to start a resize only when two opposing corners are dragged (from the quadrants of their outermost area). [...] Related would be to start a resize when one finger rests in the window to resize (or its window bar) and a second finger is used for resize somewhere around the window.

Hmm, that would seem to make the ability to resize dependent on a user's hand size relative to the window size. What about huge windows or people with small hands? Seems like it might not be very ergonomic.

You can just use both hands? We are mainly talking about laptop and desktop touchscreens here, aren't we?

I would even suggest a gesture where you position two fingers in the window and move a third finger (from the other hand) in the direction you want to resize the window.
Does anybody know how others do it? If Google or Apple already have a gesture defined for that, we should probably just use it, because more people will be familiar with it.

In my opinion, just trying to replicate mouse interaction with touch input, as @ngraham suggested, is the reason Microsoft failed several times at creating a mobile/touch OS. The result will always be a cumbersome compromise. So -1 from me for that.

romangg added a comment (edited). Oct 2 2018, 12:40 PM

There is a much simpler alternative: whether borders are shown or not, when touch input is used to make contact with the screen near a screen edge, make the drag resize areas outside the window as big as the titlebar currently is (it's more or less the bare minimum size for comfortable touch). That way we don't have to answer any of the above questions and it'll behave just like people expect, with the finger basically being a virtual cursor.

Can you explain this idea more? Did you mean the "window edge" and not the "screen edge" maybe? Would the touch area "grow" into the window or to the outside?

Making the resize area too large for touch is, in my opinion, not a very good solution, since it then becomes increasingly difficult to activate windows behind the currently active one. One would need large areas of the underlying window uncovered to find a place to touch beside the resize area in order to activate it or do something with it. On the contrary, I think there should be no virtual window resize area at all for touch resize, because unlike with a cursor, which changes its icon, there is no visual indication of where this area ends.

ngraham added a comment (edited). Oct 2 2018, 1:09 PM

Microsoft eventually succeeded in creating a halfway decent desktop OS that also has some touch and very decent pen input, which is what modern convertible laptops ship with. It is not a touch-first solution like a tablet or phone and cannot be; they did indeed fail in creating a popular and user-friendly touch-first OS.

We must also note that Apple succeeded. How did they succeed, and what did they do differently from Microsoft?

Apple ditched the entire desktop WIMP paradigm and started anew. No resizable, movable, and partially overlapping windows. No window management at all, in fact; just one window per app, period. No pointer, not even a virtual one. No menus. And so on. Apple realized that the physical characteristics of the input devices (pointing device vs. finger) were so different that they had to start from scratch rather than adapting existing paradigms. And it was so successful that now all touch OSs work the same way, with touch apps being either full screen or optionally tiled. There is still no concept of "window management."

As long as we keep the paradigm of discrete windows that can be moved, resized, and overlapped, we are implementing a traditional desktop metaphor and adapting it to touch, like what Microsoft wound up with. If we want a touch-first OS, we need to strongly consider the possibility that the WIMP desktop metaphor is unsalvageable, and that the final product needs to look and feel much more like other touch-first OSs, with full-screen or tiled apps, large UI elements that have huge touch zones, no cascading menus, etc. I find it highly unlikely that we would prevail where Microsoft, Apple, and Google either failed or did not even bother to try.

So back to the task at hand: how do we implement window resizing for Plasma-when-used-as-a-desktop-OS-but-with-some-touch-support? I'll admit to being low on ideas beyond the one I already offered. But I very strongly believe that any kind of window resizing on a touch device is going to be an ugly compromise; true touch OSs realize this and don't implement independent windows in the first place.

In T9780#162302, @fabianr wrote:
You can just use both hands? We are mainly talking about laptop and desktop touchscreens here aren't we?

I would even suggest a gesture where you position 2 fingers in the window and move a third finger (from the other hand) in the direction you want to resize the window.

That's a decent option. What if you just use pinch on a window area that doesn't have any buttons or anything? There's already the ability to drag a window around like that.

I think you could do it as a full-window overlay. You might activate it with either an extra button in the titlebar or by double-tapping the titlebar.

I made a quick mockup:

You drag in the yellow areas to resize the corresponding border; you can drag in the green corner areas to resize both borders of that corner at once. Tapping in the blue area confirms the resize and closes the overlay.

This way it is easier to reach and resize the borders, and since you don't need to resize windows too often, this overlay might be appropriate. You could also extend these resize areas outside of the window border.
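The overlay regions described above amount to a simple hit test. A hypothetical sketch, with the corner size, coordinate convention, and return values invented for illustration:

```python
def overlay_hit(x, y, win, corner=60):
    """Classify a tap on the proposed resize overlay.

    Edge strips ("yellow") drag one border, corner squares ("green")
    drag two, and the middle ("blue") confirms and closes the overlay.
    The strip width equals the corner size here for simplicity.
    """
    left, top, w, h = win
    nx, ny = x - left, y - top  # window-local coordinates
    horiz = "left" if nx < corner else "right" if nx > w - corner else ""
    vert = "top" if ny < corner else "bottom" if ny > h - corner else ""
    if horiz and vert:
        return ("corner", vert + horiz)
    if horiz or vert:
        return ("edge", vert or horiz)
    return ("confirm", None)
```

For a 400x300 window, a tap near the left edge classifies as ("edge", "left"), a tap in the top-left square as ("corner", "topleft"), and a tap in the middle as ("confirm", None).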

abetts added a subscriber: abetts. Nov 16 2018, 9:43 PM

What about multitouch? Using two fingers to pinch and zoom to the desired size? If you are using a tablet, for example, the system could detect your pinch-and-zoom done vertically, diagonally, or horizontally and resize the window accordingly. It would only activate with two fingers on the screen and only in "empty" window areas.
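One way to turn such a direction-aware pinch into a resize is to scale each window dimension by the change in finger spread along that axis. A speculative sketch, with the function name and minimum size invented for illustration:

```python
def pinch_resize(size, f1_start, f2_start, f1_now, f2_now, min_size=(200, 150)):
    """Scale a window size by the change in distance between two fingers.

    Axis-independent: the horizontal finger spread scales the width and
    the vertical spread scales the height, so a purely vertical pinch
    only changes the height -- matching the idea of resizing along the
    pinch direction.
    """
    w, h = size
    dx0 = abs(f1_start[0] - f2_start[0])
    dy0 = abs(f1_start[1] - f2_start[1])
    dx1 = abs(f1_now[0] - f2_now[0])
    dy1 = abs(f1_now[1] - f2_now[1])
    # Guard against division by zero when the fingers start aligned on an axis.
    sx = dx1 / dx0 if dx0 else 1.0
    sy = dy1 / dy0 if dy0 else 1.0
    return (max(min_size[0], round(w * sx)), max(min_size[1], round(h * sy)))
```

A horizontal spread from 200 px to 300 px between the fingers would grow a 400-px-wide window to 600 px while leaving its height untouched.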

@abetts

As not all applications have these empty window areas that can currently be used for dragging windows around, your idea could be integrated into an overlay with one large surface to pinch-to-zoom.

@abetts

As not all applications have these empty window areas that can currently be used for dragging windows around, your idea could be integrated into an overlay with one large surface to pinch-to-zoom.

That could work! I was aiming at how natural pinch-to-zoom is to use. Most phones and tablets out there are multi-touch, and we wouldn't have to clear so many obstacles to get this right.