Differences in Crowdsourced vs. Lab-based Mobile and Desktop Input Performance Data

Paper URL: http://dl.acm.org/citation.cfm?doid=3025453.3025820

Paper abstract: Research on the viability of using crowdsourcing for HCI performance experiments has concluded that online results are similar to those achieved in the lab, at least for desktop interactions. However, mobile devices, the most popular form of online access today, may be more problematic due to variability in the user's posture and in movement of the device. To assess this possibility, we conducted two experiments with 30 lab-based and 303 crowdsourced participants using basic mouse and touchscreen tasks. Our findings show that: (1) separately analyzing the crowd and lab data yields different study conclusions: touchscreen input was significantly less error prone than mouse input in the lab but more error prone online; (2) age-matched crowdsourced participants were significantly faster and less accurate than their lab-based counterparts, contrasting past work; (3) variability in mobile device movement and orientation increased as experimenter control decreased, a potential factor affecting the touchscreen error differences. This study cautions against assuming that crowdsourced data for performance experiments will directly reflect lab-based data, particularly for mobile devices.

Summary (translated from Japanese):

This paper examines whether performance experiments on input tasks such as pointing yield different results when run in the lab versus online. Differences did emerge: for example, touchscreen input produced fewer errors than mouse input in the lab, but the opposite result was observed in the online experiment.

(109 characters in the original Japanese)

Presentation slides: