Paper abstract: Previous research and recent smartphone designs have introduced a wide range of input controls beyond the touchscreen. Fingerprint scanners, silent switches, and Back-of-Device (BoD) touch panels offer additional ways to perform input. However, as the number of input controls on a device grows, unintentional input or limited reachability can hinder interaction. We conducted a one-handed-use study to investigate the areas that can be reached without losing grip stability (the comfortable area) and with stretched fingers (the maximum range) on four different phone sizes. We describe the characteristics of the comfortable area and maximum range for the different phone sizes and derive four design implications for the placement of input controls to support one-handed BoD and edge interaction. Among other findings, we show that the index and middle fingers are best suited for BoD interaction and that the grip shifts towards the top edge as phone size increases.
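As a minimal illustration of how such placement implications could be applied, the sketch below tests whether a candidate BoD control position falls inside a comfortable-area polygon. The ray-casting helper and all coordinates are hypothetical placeholders, not data from the study.

```python
# Illustrative sketch: checking whether a candidate Back-of-Device (BoD)
# control position falls inside a "comfortable area" polygon.

def point_in_polygon(x: float, y: float, polygon: list[tuple[float, float]]) -> bool:
    """Ray-casting test: is (x, y) inside the polygon?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray extending right from (x, y).
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical comfortable area for the index finger on one phone size,
# in millimetres from the device's top-left corner (back view).
comfortable_area = [(10, 20), (55, 25), (60, 70), (15, 80)]

print(point_in_polygon(35, 50, comfortable_area))   # True: safe placement
print(point_in_polygon(5, 130, comfortable_area))   # False: hard to reach
```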
Paper abstract: We introduce KeyTime, a new technique and accompanying software for predicting the production times of users' stroke gestures articulated on touchscreens. KeyTime employs principles and concepts of the Kinematic Theory, such as lognormal modeling of stroke gestures' velocity profiles, to estimate gesture production times significantly more accurately than existing approaches. Our experimental results on several public datasets show that KeyTime predicts user-independent production times that correlate at r = .99 with ground truth from just one example of a gesture articulation, while delivering an average error in the predicted time magnitude that is 3 to 6 times smaller than that of CLC, the best prediction technique to date. Moreover, KeyTime reports a wide range of useful statistics, such as the trimmed mean, median, standard deviation, and confidence intervals, providing practitioners with unprecedented accuracy and sophistication for characterizing their users' a priori time performance with stroke gesture input.
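As background for the modeling step, the sketch below evaluates the lognormal stroke-velocity profile that the Kinematic Theory posits. The parameter values and the crude duration estimate are illustrative assumptions, not KeyTime's actual fitting procedure or estimator.

```python
# Minimal sketch of the lognormal stroke-velocity model underlying the
# Kinematic Theory, which KeyTime builds on. Parameters are illustrative,
# not fitted to any dataset.
import numpy as np

def lognormal_velocity(t, D=1.0, t0=0.0, mu=-1.5, sigma=0.4):
    """Speed profile of one stroke: a lognormal impulse response.

    D     -- stroke amplitude (e.g., millimetres of path length)
    t0    -- activation time of the motor command
    mu    -- log time delay of the neuromuscular system
    sigma -- log response time (spread of the impulse)
    """
    v = np.zeros_like(t)
    valid = t > t0
    dt = t[valid] - t0
    v[valid] = (D / (sigma * np.sqrt(2 * np.pi) * dt)
                * np.exp(-((np.log(dt) - mu) ** 2) / (2 * sigma ** 2)))
    return v

t = np.linspace(0, 1.5, 1500)
v = lognormal_velocity(t)

# Crude production-time estimate: duration during which the stroke's speed
# exceeds 1% of its peak (a simplistic reading, not KeyTime's estimator).
above = t[v > 0.01 * v.max()]
print(f"estimated stroke duration: {above[-1] - above[0]:.3f} s")
```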
Paper abstract: We present a mobile system that enhances mixed reality experiences and games with force feedback by means of electrical muscle stimulation (EMS). The benefit of our approach is that it adds physical forces while keeping the users' hands free to interact unencumbered, not only with virtual objects but also with physical objects such as props and appliances. We demonstrate how this supports three classes of applications along the mixed-reality continuum: (1) entirely virtual objects, such as furniture with EMS friction when pushed or an EMS-based catapult game; (2) virtual objects augmented via passive props with EMS constraints, such as a light control panel made tangible by means of a physical cup or a balance-the-marble game with an actuated tray; and (3) augmented appliances with virtual behaviors, such as a physical thermostat dial with EMS detents or an escape room that repurposes lamps as levers with detents. We present a user study in which participants rated the EMS feedback as significantly more realistic than a no-EMS baseline.
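To make the detent example concrete, the sketch below shows one plausible way the triggering logic could work: fire a short feedback pulse whenever the dial angle crosses a detent boundary. The detent spacing, pulse parameters, and trigger_ems_pulse function are all hypothetical stand-ins, not the paper's implementation or any real EMS hardware API.

```python
# Illustrative sketch of detent-style feedback logic for a physical dial,
# as in the thermostat example: pulse when the angle enters a new detent.

DETENT_SPACING_DEG = 15.0  # hypothetical: one detent every 15 degrees

def trigger_ems_pulse(intensity: float, duration_ms: int) -> None:
    # Placeholder: a real system would drive an EMS stimulator here.
    print(f"EMS pulse: intensity={intensity}, duration={duration_ms} ms")

class DetentDial:
    def __init__(self, spacing_deg: float = DETENT_SPACING_DEG):
        self.spacing = spacing_deg
        self.last_detent = None

    def update(self, angle_deg: float) -> None:
        """Call with each new dial angle reading from the sensor."""
        detent = round(angle_deg / self.spacing)
        if self.last_detent is not None and detent != self.last_detent:
            # Crossed into a new detent: render a brief resistance pulse.
            trigger_ems_pulse(intensity=0.6, duration_ms=80)
        self.last_detent = detent

dial = DetentDial()
for angle in (2.0, 6.0, 9.0, 14.0, 21.0):  # simulated sensor readings
    dial.update(angle)
```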
Paper abstract: Atomic interactions on touch interfaces, such as tap, drag, and flick, are well understood in terms of interaction design, but less is known about their physical performance characteristics. We carried out a study to gather baseline data on finger pitch and roll orientation during atomic touch input actions. Our results show differences in orientation and range across fingers, hands, and actions, and we analyse the effect of tablet angle. Our data provides designers and researchers with a new resource to better understand which interactions are possible in different settings (e.g., when using the left or right hand), to design novel interaction techniques that use orientation as input (e.g., using finger tilt as an implicit mode), and to determine whether new sensing techniques are feasible (e.g., using fingerprints to identify specific finger touches).
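As a minimal illustration of one design idea the abstract mentions, the sketch below uses finger pitch as an implicit mode switch. The threshold is an arbitrary placeholder; a real design would derive it from the per-finger, per-action orientation ranges such a dataset provides.

```python
# Illustrative sketch: finger pitch at touch-down as an implicit mode.
# The 60-degree boundary is a hypothetical assumption, not a study result.

STEEP_PITCH_DEG = 60.0

def touch_mode(pitch_deg: float) -> str:
    """Map finger pitch at touch-down to an interaction mode."""
    return "secondary" if pitch_deg >= STEEP_PITCH_DEG else "primary"

# Simulated touch-down events: (x, y, pitch in degrees)
events = [(120, 340, 25.0), (88, 512, 72.0), (300, 160, 55.0)]
for x, y, pitch in events:
    print(f"touch at ({x}, {y}) -> {touch_mode(pitch)} mode")
```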