EgoScanning: Quickly Scanning First-Person Videos with Egocentric Elastic Timelines

Paper URL: http://dl.acm.org/citation.cfm?doid=3025453.3025821

Paper abstract: This work presents EgoScanning, a novel video fast-forwarding interface that helps users find important events in lengthy first-person videos recorded continuously with wearable cameras. The interface features an elastic timeline that adaptively changes playback speeds and emphasizes egocentric cues specific to first-person videos, such as hand manipulations, moving, and conversations with people, detected with computer-vision techniques. The interface also allows users to specify which of these cues are relevant to the events they are interested in. Through our user study, we confirm that users can find events of interest quickly in first-person videos thanks to the following benefits of the EgoScanning interface: 1) adaptive changes of playback speed allow users to watch fast-forwarded videos more easily; 2) emphasized parts of videos can act as candidates for events actually significant to users; 3) users are able to select the egocentric cues relevant to the events they are interested in.

Summary:

This work proposes an interface that supports fast browsing of first-person videos. When the user selects cues, the corresponding scenes are played back at low speed while the other scenes are played back at high speed. A user study confirmed that the proposed interface is useful.


Presentation slides: