EagleSense: Tracking People and Devices in Interactive Spaces using Real-Time Top-View Depth-Sensing

Paper URL: http://dl.acm.org/citation.cfm?doid=3025453.3025562

Paper abstract: Real-time tracking of people's location, orientation and activities is increasingly important for designing novel ubiquitous computing applications. Top-view camera-based tracking avoids occlusion when tracking people while collaborating, but often requires complex tracking systems and advanced computer vision algorithms. To facilitate the prototyping of ubiquitous computing applications for interactive spaces, we developed EagleSense, a real-time human posture and activity recognition system with a single top-view depth-sensing camera. We contribute our novel algorithm and processing pipeline, including details for calculating silhouette-extremities features and applying gradient tree boosting classifiers for activity recognition optimized for top-view depth sensing. EagleSense provides easy access to the real-time tracking data and includes tools for facilitating the integration into custom applications. We report the results of a technical evaluation with 12 participants and demonstrate the capabilities of EagleSense with application case studies.
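The abstract describes classifying activities with gradient tree boosting over silhouette-extremity features extracted from top-view depth frames. A minimal sketch of that classification step is below, assuming scikit-learn's `GradientBoostingClassifier`; the feature names, label set, and synthetic data are invented placeholders for illustration, not EagleSense's actual features or training data.

```python
# Hedged sketch of activity classification via gradient tree boosting.
# The three features and three activity labels are hypothetical stand-ins
# for the paper's silhouette-extremity features.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Hypothetical per-person feature vector: e.g. number of detected
# extremities, mean extremity distance from the head point, and
# extremity depth variance (placeholders only).
X = rng.normal(size=(200, 3))

# Hypothetical activity labels: 0 = standing, 1 = using phone, 2 = using tablet.
y = rng.integers(0, 3, size=200)

clf = GradientBoostingClassifier(n_estimators=50, max_depth=3)
clf.fit(X, y)

# At runtime, each tracked person's feature vector yields one activity label.
labels = clf.predict(X[:5])
print(labels.shape)
```

Boosted decision trees are a plausible fit here because inference is cheap (a fixed number of shallow tree traversals per frame), which matters for the real-time tracking the system targets.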

Japanese summary:

