Session: "VR/AR/Telepresence 1"

Scenariot: Spatially Mapping Smart Things Within Augmented Reality Scenes

Paper URL: http://dl.acm.org/citation.cfm?doid=3173574.3173793

Paper abstract: The emerging simultaneous localizing and mapping (SLAM) based tracking technique allows the mobile AR device spatial awareness of the physical world. Still, smart things are not fully supported with the spatial awareness in AR. Therefore, we present Scenariot, a method that enables instant discovery and localization of the surrounding smart things while also spatially registering them with a SLAM based mobile AR system. By exploiting the spatial relationships between mobile AR systems and smart things, Scenariot fosters in-situ interactions with connected devices. We embed Ultra-Wide Band (UWB) RF units into the AR device and the controllers of the smart things, which allows for measuring the distances between them. With a one-time initial calibration, users localize multiple IoT devices and map them within the AR scenes. Through a series of experiments and evaluations, we validate the localization accuracy as well as the performance of the enabled spatial aware interactions. Further, we demonstrate various use cases through Scenariot.

Summary:

The authors propose Scenariot, a method that localizes surrounding smart things and registers them within a SLAM-based mobile AR scene using only radio communication between devices. Ultra-Wide Band (UWB) radio-frequency (RF) units are embedded into the AR device and the controllers of the smart devices, which allows the distances between them to be measured; after a one-time initial calibration, multiple IoT devices can be localized and mapped into the AR scene.
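As a rough illustration of the geometry this enables (a minimal sketch, not Scenariot's actual implementation): if the SLAM-tracked AR device records its own position each time it takes a UWB range measurement to a fixed smart thing, the thing's position can be recovered by least squares over those (position, distance) pairs. The function name and the Gauss-Newton solver below are illustrative assumptions.

```python
# Illustrative sketch: estimate a fixed UWB tag's 3D position from ranges
# measured at several SLAM-tracked AR-device positions (not Scenariot's code).
import numpy as np

def localize_tag(device_positions, ranges, iterations=20):
    """Least-squares estimate of a tag position from (position, distance) pairs.

    device_positions: (N, 3) array of AR-device positions in the SLAM frame.
    ranges:           (N,)   array of UWB distance measurements to the tag.
    """
    p = device_positions.mean(axis=0)          # initial guess: centroid of waypoints
    for _ in range(iterations):
        diff = p - device_positions            # (N, 3) offsets to each waypoint
        dist = np.linalg.norm(diff, axis=1)    # predicted distances
        residual = dist - ranges               # range errors
        J = diff / dist[:, None]               # Jacobian of distances w.r.t. p
        # Gauss-Newton update: solve J dp = -residual in the least-squares sense
        dp, *_ = np.linalg.lstsq(J, -residual, rcond=None)
        p = p + dp
        if np.linalg.norm(dp) < 1e-6:
            break
    return p

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    tag = np.array([2.0, 1.0, 0.5])                          # ground-truth tag position
    poses = rng.uniform(-3, 3, size=(8, 3))                  # AR device waypoints
    meas = np.linalg.norm(poses - tag, axis=1) + rng.normal(0, 0.02, 8)
    print(localize_tag(poses, meas))                          # close to [2.0, 1.0, 0.5]
```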

ProtoAR: Rapid Physical-Digital Prototyping of Mobile Augmented Reality Applications

Paper URL: http://dl.acm.org/citation.cfm?doid=3173574.3173927

Paper abstract: The latest generations of smartphones with built-in AR capabilities enable a new class of mobile apps that merge digital and real-world content depending on a user's task, context, and preference. But even experienced mobile app designers face significant challenges: creating 2D/3D AR content remains difficult and time-consuming, and current mobile prototyping tools do not support AR views. There are separate tools for this; however, they require significant technical skill. This paper presents ProtoAR which supplements rapid physical prototyping using paper and Play-Doh with new mobile cross-device multi-layer authoring and interactive capture tools to generate mobile screens and AR overlays from paper sketches, and quasi-3D content from 360-degree captures of clay models. We describe how ProtoAR evolved over four design jams with students to enable interactive prototypes of mobile AR apps in less than 90 minutes, and discuss the advantages and insights ProtoAR can give designers.

Summary:

The authors propose ProtoAR, a tool for rapidly creating content for mobile AR applications. It provides two kinds of functions: one captures paper sketches and builds interactions on top of them as mobile screens and AR overlays, and the other captures physical (clay) models from 360 degrees and converts them into quasi-3D content.
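A minimal sketch of the quasi-3D idea, under the assumption that the clay model is photographed at evenly spaced angles during the 360-degree capture: the prototype can then fake a 3D object by showing whichever frame is closest to the current viewing angle. The frame-selection helper below is hypothetical, not ProtoAR's code.

```python
# Illustrative sketch: pick the 360-degree capture frame nearest to the current
# viewing angle, faking a 3D object with a ring of 2D photos (quasi-3D).
def frame_for_view(view_angle_deg, num_frames=36):
    """Return the index of the captured frame to display for a viewing angle."""
    step = 360.0 / num_frames                 # angular spacing between captures
    return int(round(view_angle_deg / step)) % num_frames

# Example: with 36 captures (one every 10 degrees), a 97-degree view shows frame 10.
print(frame_for_view(97.0))   # -> 10
```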

ARcadia: A Rapid Prototyping Platform for Real-time Tangible Interfaces

Paper URL: http://dl.acm.org/citation.cfm?doid=3173574.3173983

Paper abstract: Paper-based fabrication techniques offer powerful opportunities to prototype new technological interfaces. Typically, paper-based interfaces are either static mockups or require integration with sensors to provide real-time interactivity. The latter can be challenging and expensive, requiring knowledge of electronics, programming, and sensing. But what if computer vision could be combined with prototyping domain-aware programming tools to support the rapid construction of interactive, paper-based tangible interfaces? We designed a toolkit called ARcadia that allows for rapid, low-cost prototyping of TUIs that only requires access to a webcam, a web browser, and paper. ARcadia brings paper prototypes to life through the use of marker based augmented reality (AR). Users create mappings between real-world tangible objects and different UI elements. After a crafting and programming phase, all subsequent interactions take place with the tangible objects. We evaluated ARcadia in a workshop with 120 teenage girls and found that tangible AR technologies can empower novice technology designers to rapidly construct and iterate on their ideas.

Summary:

The authors designed a toolkit called ARcadia. Requiring only a webcam, a web browser, and paper, ARcadia enables rapid, low-cost prototyping of tangible user interfaces; after the crafting and programming phase, all subsequent interactions take place through the tangible objects themselves.
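A minimal sketch of the marker-to-UI mapping idea, with the computer-vision part stubbed out: each fiducial marker ID is bound to a UI callback, and whenever a marker is visible in the webcam frame the bound callback fires with the marker's position. The detection format and callback signature below are hypothetical, not ARcadia's API.

```python
# Illustrative sketch: bind fiducial-marker IDs to UI callbacks so paper/tangible
# objects drive the interface (marker detection itself is stubbed out here).
from typing import Callable, Dict, List, Tuple

Detection = Tuple[int, float, float]   # (marker_id, x, y) in the camera frame

bindings: Dict[int, Callable[[float, float], None]] = {}

def bind(marker_id: int, callback: Callable[[float, float], None]) -> None:
    """Map a tangible marker to a UI element's behaviour."""
    bindings[marker_id] = callback

def dispatch(detections: List[Detection]) -> None:
    """Fire the bound callback for every marker visible in the current frame."""
    for marker_id, x, y in detections:
        if marker_id in bindings:
            bindings[marker_id](x, y)

# Example: marker 3 acts as a slider whose value follows its horizontal position.
bind(3, lambda x, y: print(f"slider value = {x:.2f}"))
dispatch([(3, 0.42, 0.80)])   # simulated webcam detection -> "slider value = 0.42"
```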

TeleHuman2: A Cylindrical Light Field Teleconferencing System for Life-size 3D Human Telepresence

Paper URL: http://dl.acm.org/citation.cfm?doid=3173574.3174096

Paper abstract: For telepresence to support the richness of multiparty conversations, it is important to convey motion parallax and stereoscopy without head-worn apparatus. TeleHuman2 is a "hologrammatic" telepresence system that conveys full-body 3D video of interlocutors using a human-sized cylindrical light field display. For rendering, the system uses an array of projectors mounted above the heads of participants in a ring around a retroreflective cylinder. Unique angular renditions are calculated from streaming depth video captured at the remote location. Projected images are retro-reflected into the eyes of local participants, at 1.3º intervals providing angular renditions simultaneously for left and right eyes of all onlookers, which conveys motion parallax and stereoscopy without head-worn apparatus or head tracking. Our technical evaluation of the angular accuracy of the system demonstrates that the error in judging the angle of a remote arrow object represented in TeleHuman2 is within 1 degree, and not significantly different from similar judgments of a collocated arrow object.

Summary:

Displays a remote interlocutor as a life-size, full-body 3D image. The system uses a human-sized cylindrical light-field display: projectors mounted in a ring project angular renditions onto the cylinder, computed from depth video captured at the remote site, so that onlookers see correct stereoscopic imagery with motion parallax from any viewing angle without head-worn apparatus.
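A minimal sketch of the angular-rendition geometry, assuming renditions are spaced every 1.3 degrees around the cylinder as stated in the abstract: each eye sees the rendition whose projection angle is closest to that eye's bearing from the cylinder's axis, which is how the two eyes receive different images. The helper below is an illustrative assumption, not the TeleHuman2 renderer.

```python
# Illustrative sketch: choose which angular rendition (one every 1.3 degrees
# around the cylinder) a given eye position should see.
import math

ANGULAR_STEP_DEG = 1.3   # interval between renditions reported in the paper

def rendition_index(eye_x, eye_y, cyl_x=0.0, cyl_y=0.0):
    """Index of the rendition closest to the eye's bearing from the cylinder axis."""
    bearing = math.degrees(math.atan2(eye_y - cyl_y, eye_x - cyl_x)) % 360.0
    num_renditions = int(round(360.0 / ANGULAR_STEP_DEG))
    return int(round(bearing / ANGULAR_STEP_DEG)) % num_renditions

# Left and right eyes ~6 cm apart, 1.5 m from the cylinder, map to different
# renditions, which is what yields stereoscopy without glasses or head tracking.
print(rendition_index(1.5, -0.03), rendition_index(1.5, 0.03))
```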