Session: "Improving Touch Interfaces"

ProbUI: Generalising Touch Target Representations to Enable Declarative Gesture Definition for Probabilistic GUIs

Paper URL: http://dl.acm.org/citation.cfm?doid=3025453.3025502

Paper abstract: We present ProbUI, a mobile touch GUI framework that merges ease of use of declarative gesture definition with the benefits of probabilistic reasoning. It helps developers to handle uncertain input and implement feedback and GUI adaptations. ProbUI replaces today's static target models (bounding boxes) with probabilistic gestures ("bounding behaviours"). It is the first touch GUI framework to unite concepts from three areas of related work: 1) Developers declaratively define touch behaviours for GUI targets. As a key insight, the declarations imply simple probabilistic models (HMMs with 2D Gaussian emissions). 2) ProbUI derives these models automatically to evaluate users' touch sequences. 3) It then infers intended behaviour and target. Developers bind callbacks to gesture progress, completion, and other conditions. We show ProbUI's value by implementing existing and novel widgets, and report developer feedback from a survey and a lab study.

Summary:

Existing mobile touch GUIs impose rigid input conditions, which makes it difficult to build systems that take user intent into account. This work proposes a framework that probabilistically infers user intent from gesture input.
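To make the paper's key insight concrete, here is a minimal Python sketch (not the actual ProbUI API, which targets Android) of how a declared behaviour can imply an HMM with 2D Gaussian emissions: each candidate target's behaviour is scored by the forward likelihood of the observed touch sequence. All names, coordinates, and parameters below are illustrative assumptions.

```python
import numpy as np

def gaussian_2d_logpdf(point, mean, cov):
    """Log-density of a 2D Gaussian emission at a touch point."""
    diff = point - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (diff @ np.linalg.inv(cov) @ diff + logdet + 2 * np.log(2 * np.pi))

def forward_loglik(points, means, covs, trans, init):
    """Forward algorithm: log P(touch sequence | behaviour HMM)."""
    n = len(means)
    log_alpha = np.array([np.log(init[s]) + gaussian_2d_logpdf(points[0], means[s], covs[s])
                          for s in range(n)])
    for p in points[1:]:
        log_alpha = np.array([
            np.logaddexp.reduce(log_alpha + np.log(trans[:, s]))
            + gaussian_2d_logpdf(p, means[s], covs[s])
            for s in range(n)
        ])
    return np.logaddexp.reduce(log_alpha)

# Hypothetical "swipe right across a button" behaviour: state 0 covers
# the button's left half, state 1 its right half (left-to-right topology).
means = [np.array([80.0, 50.0]), np.array([120.0, 50.0])]
covs = [np.eye(2) * 200.0, np.eye(2) * 200.0]
trans = np.array([[0.7, 0.3], [0.05, 0.95]])
init = np.array([0.95, 0.05])
touches = [np.array([78.0, 52.0]), np.array([96.0, 49.0]), np.array([119.0, 51.0])]
print(forward_loglik(touches, means, covs, trans, init))  # higher = better fit
```

Ranking several such models over the same touch sequence and taking the argmax would correspond to inferring the intended behaviour and target.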

BackXPress: Using Back-of-Device Finger Pressure to Augment Touchscreen Input on Smartphones

Paper URL: http://dl.acm.org/citation.cfm?doid=3025453.3025565

Paper abstract: When people hold their smartphone in landscape orientation, they use their thumbs for input on the frontal touchscreen, while their remaining fingers rest on the back of the device (BoD) to stabilize the grip. We present BackXPress, a new interaction technique that lets users create BoD pressure input with these remaining fingers to augment their interaction with the touchscreen on the front: Users can apply various pressure levels with each of these fingers to enter different temporary "quasi-modes" that are only active as long as that pressure is applied. Both thumbs can then interact with the frontal screen in that mode. We illustrate the practicality of BackXPress with several sample applications, and report our results from three user studies: Study 1 investigated which fingers can be used to exert BoD pressure and found index, middle, and ring finger from both hands to be practical. Study 2 revealed how pressure touches from these six fingers are distributed across the BoD. Study 3 examined user performance for applying BoD pressure (a) during single touches at the front and (b) for 20 seconds while touching multiple consecutive frontal targets. Participants achieved up to 92% pressure accuracy for three separate pressure levels above normal resting pressure, with the middle fingers providing the highest accuracy. BoD pressure did not affect frontal touch accuracy. We conclude with design guidelines for BoD pressure input.

Summary:

Existing back-of-device touch input used only positional (coordinate) information. This work proposes interaction techniques that use pressure input from the back of the device, and presents guidelines for designing back-of-device input.
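As a rough illustration of the quasi-mode idea, the sketch below (hypothetical thresholds and API, not the authors' implementation) quantises a raw BoD pressure reading into one of three levels above resting pressure; the mode stays active only while that pressure is held, and frontal thumb touches are interpreted in the current mode.

```python
# Assumed cut-offs for pressure levels 1..3 above the normal resting
# pressure (arbitrary units); the paper reports up to 92% accuracy for
# distinguishing three such levels.
THRESHOLDS = [1.5, 2.5, 3.5]

def pressure_to_level(pressure):
    """Quantise a raw back-of-device pressure reading into level 0-3."""
    return sum(1 for t in THRESHOLDS if pressure >= t)

class QuasiModeController:
    """A quasi-mode is active only while its pressure level is held;
    releasing the finger (back to level 0) immediately ends the mode."""
    def __init__(self):
        self.level = 0

    def on_bod_pressure(self, pressure):
        self.level = pressure_to_level(pressure)

    def on_front_touch(self, x, y):
        # Thumb input on the frontal screen is interpreted in the
        # currently active quasi-mode.
        return {"mode": self.level, "pos": (x, y)}

ctrl = QuasiModeController()
ctrl.on_bod_pressure(2.7)           # a finger presses at level 2
print(ctrl.on_front_touch(40, 80))  # {'mode': 2, 'pos': (40, 80)}
ctrl.on_bod_pressure(1.0)           # pressure released: mode ends
print(ctrl.on_front_touch(40, 80))  # {'mode': 0, 'pos': (40, 80)}
```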

Improving Gesture Recognition Accuracy on Touch Screens for Users with Low Vision

Paper URL: http://dl.acm.org/citation.cfm?doid=3025453.3025941

Paper abstract: We contribute in this work on gesture recognition to improve the accessibility of touch screens for people with low vision. We examine the accuracy of popular recognizers for gestures produced by people with and without visual impairments, and we show that the user-independent accuracy of $P, the best recognizer among those evaluated, is small for people with low vision (83.8%), despite $P being very effective for gestures produced by people without visual impairments (95.9%). By carefully analyzing the gesture articulations produced by people with low vision, we inform key algorithmic revisions for the $P recognizer, which we call $P+. We show significant accuracy improvements of $P+ for gestures produced by people with low vision, from 83.8% to 94.7% on average and up to 98.2%, and 3x faster execution times compared to $P.

Summary:

Most prior research on designing touch interfaces for users with disabilities has targeted blind users. This work proposes an algorithm that can also recognize gestures produced by people with low vision, substantially improving recognition accuracy and execution time.
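The $P family treats gestures as point clouds rather than ordered strokes. Below is a simplified Python sketch of that matching idea (greedy weighted point matching), assuming clouds are already resampled to the same size and normalised; it illustrates the baseline $P approach, not the published $P+ revisions, and the templates are made-up toy data.

```python
import math

def greedy_cloud_distance(candidate, template):
    """Match each candidate point to the closest unmatched template
    point; earlier matches get higher weight, as in $P."""
    n = len(candidate)
    unmatched = set(range(n))
    cost = 0.0
    for i, (cx, cy) in enumerate(candidate):
        j = min(unmatched,
                key=lambda k: (cx - template[k][0]) ** 2 + (cy - template[k][1]) ** 2)
        unmatched.remove(j)
        cost += (1.0 - i / n) * math.hypot(cx - template[j][0], cy - template[j][1])
    return cost

def classify(candidate, templates):
    """Return the label of the lowest-cost template."""
    return min(templates,
               key=lambda t: greedy_cloud_distance(candidate, t["points"]))["label"]

# Tiny illustrative templates (already resampled/normalised clouds).
templates = [
    {"label": "line",   "points": [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0)]},
    {"label": "corner", "points": [(0.0, 0.0), (0.0, 0.5), (0.5, 0.5)]},
]
print(classify([(0.0, 0.1), (0.5, 0.1), (1.0, 0.1)], templates))  # -> "line"
```

Because matching ignores stroke order and direction, this family of recognizers tolerates the articulation variations (extra strokes, varying stroke order) that the paper found common in gestures produced by people with low vision.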

Understanding Grip Shifts: How Form Factors Impact Hand Movements on Mobile Phones

Paper URL: http://dl.acm.org/citation.cfm?doid=3025453.3025835

Paper abstract: In this paper we present an investigation into how hand usage is affected by different mobile phone form factors. Our initial (qualitative) study explored how users interact with various mobile phone types (touchscreen, physical keyboard and stylus). The analysis of the videos revealed that each type of mobile phone affords specific handgrips and that the user shifts these grips and consequently the tilt and rotation of the phone depending on the context of interaction. In order to further investigate the tilt and rotation effects we conducted a controlled quantitative study in which we varied the size of the phone and the type of grips (Symmetric bimanual, Asymmetric bimanual with finger, Asymmetric bimanual with thumb and Single handed) to better understand how they affect the tilt and rotation during a dual pointing task. The results showed that the size of the phone does have a consequence and that the distance needed to reach action items affects the phones' tilt and rotation. Additionally, we found that the amount of tilt, rotation and reach required corresponded with the participant's grip preference. We finish the paper by discussing the design lessons for mobile UI and proposing design guidelines and applications for these insights.

Summary:

This study investigated how the amount and direction of movement of the hand (and arm) holding a mobile device change with the device's form factor, grip, and button placement, and whether grips requiring less movement are preferred by users. Based on the findings, the authors also propose new UI designs.