Session: "Innovative Fabrication Techniques"

Pineal: Bringing Passive Objects to Life with Embedded Mobile Devices

Paper URL: http://dl.acm.org/citation.cfm?doid=3025453.3025652

Paper abstract: Interactive, smart objects, customized to individuals and uses, are central to many movements, such as tangibles, the internet of things (IoT), and ubiquitous computing. Yet, rapid prototyping both the form and function of these custom objects can be problematic, particularly for those with limited electronics or programming experience. Designers often need to embed custom circuitry; program its workings; and create a form factor that not only reflects the desired user experience but can also house the required circuitry and electronics. To mitigate this, we created Pineal, a design tool that lets end-users: (1) modify 3D models to include a smart watch or phone as its heart; (2) specify high-level interactive behaviours through visual programming; and (3) have the phone or watch act out such behaviours as the objects' "smarts". Furthermore, a series of prototypes show how Pineal exploits mobile sensing and output, and automatically generates 3D printed form-factors for rich, interactive objects.

Summary:

A prototyping approach for interactive objects that embeds a smartphone inside a 3D-printed object. The phone's sensors, touch screen, speaker, and other components serve as the object's input and output. The authors implemented a design tool and demonstrated several applications.
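The summary above mentions that Pineal lets end-users specify high-level interactive behaviours through visual programming. As a minimal sketch of that idea (not Pineal's actual API; all names here are invented), such behaviours can be modeled as rules mapping phone sensor events to phone outputs:

```python
# Hypothetical sketch of rule-based "behaviours" for a phone embedded in
# a 3D-printed object: each rule maps a sensor event type to an output
# action on the phone. All names are invented for illustration.

class Behaviour:
    """Collects event -> action rules and dispatches incoming events."""

    def __init__(self):
        self.rules = []  # list of (event_type, action) pairs

    def when(self, event_type, action):
        self.rules.append((event_type, action))
        return self  # allow chaining

    def handle(self, event):
        # Run every action whose rule matches the incoming event's type.
        return [action(event) for t, action in self.rules if t == event["type"]]

# A toy brought to life by the embedded phone: shaking it plays a sound,
# touching the screen makes it vibrate.
toy = Behaviour()
toy.when("shake", lambda e: "play_sound:giggle")
toy.when("touch", lambda e: "vibrate:short")
```

For example, `toy.handle({"type": "shake"})` returns `["play_sound:giggle"]`, while an event with no matching rule returns an empty list.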

CalibMe: Fast and Unsupervised Eye Tracker Calibration for Gaze-Based Pervasive Human-Computer Interaction

Paper URL: http://dl.acm.org/citation.cfm?doid=3025453.3025950

Paper abstract: As devices around us become smart, our gaze is poised to become the next frontier of human-computer interaction (HCI). State-of-the-art mobile eye tracker systems typically rely on eye-model-based gaze estimation approaches, which do not require a calibration. However, such approaches require specialized hardware (e.g., multiple cameras and glint points), can be significantly affected by glasses, and, thus, are not fit for ubiquitous gaze-based HCI. In contrast, regression-based gaze estimations are straightforward approaches requiring solely one eye and one scene camera but necessitate a calibration. Therefore, a fast and accurate calibration is a key development to enable ubiquitous gaze-based HCI. In this paper, we introduce CalibMe, a novel method that exploits collection markers (automatically detected fiducial markers) to allow eye tracker users to gather a large array of calibration points, remove outliers, and automatically reserve evaluation points in a fast and unsupervised manner. The proposed approach is evaluated against a nine-point calibration method, which is typically used due to its relatively short calibration time and adequate accuracy. CalibMe reached a mean angular error of 0.59° (σ=0.23°) in contrast to 0.82° (σ=0.15°) for a nine-point calibration, attesting for the efficacy of the method. Moreover, users are able to calibrate the eye tracker anywhere and independently in ~10 s using a cellphone to display the collection marker.

Summary:

Proposes a fast, accurate calibration method for mobile eye trackers. The user completes calibration by moving their head while keeping their gaze fixed on a reference marker. The method is compared against an existing calibration procedure.
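The abstract notes that regression-based gaze estimation needs calibration: a mapping from pupil coordinates to scene-camera coordinates is fitted from the collected calibration points. As a deliberately simplified sketch of that principle (real pipelines typically fit a 2D polynomial and remove outliers; here a 1D linear least-squares fit on synthetic data shows the idea):

```python
# Simplified, hypothetical sketch of regression-based gaze calibration:
# fit a mapping from pupil x-coordinates to scene-camera x-coordinates
# from many calibration samples via closed-form linear least squares.

def fit_linear(pupil_x, scene_x):
    n = len(pupil_x)
    mp = sum(pupil_x) / n
    ms = sum(scene_x) / n
    # slope = cov(pupil, scene) / var(pupil); intercept from the means.
    b = sum((p - mp) * (s - ms) for p, s in zip(pupil_x, scene_x)) / \
        sum((p - mp) ** 2 for p in pupil_x)
    a = ms - b * mp
    return a, b

# Synthetic samples from a known mapping: scene_x = 5 + 2 * pupil_x.
pupil = [0.0, 0.5, 1.0, 1.5, 2.0]
scene = [5.0, 6.0, 7.0, 8.0, 9.0]
a, b = fit_linear(pupil, scene)  # recovers a = 5, b = 2
```

CalibMe's contribution is on the data-collection side: gathering many such samples quickly and without supervision, which is what makes the regression accurate.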

TrussFab: Fabricating Sturdy Large-Scale Structures on Desktop 3D Printers

Paper URL: http://dl.acm.org/citation.cfm?doid=3025453.3026016

Paper abstract: We present TrussFab, an integrated end-to-end system that allows users to fabricate large scale structures that are sturdy enough to carry human weight. TrussFab achieves the large scale by complementing 3D print with plastic bottles. It does not use these bottles as "bricks" though, but as beams that form structurally sound node-link structures, also known as trusses, allowing it to handle the forces resulting from scale and load. TrussFab embodies the required engineering knowledge, allowing non-engineers to design such structures and to validate their design using integrated structural analysis. We have used TrussFab to design and fabricate tables and chairs, a 2.5 m long bridge strong enough to carry a human, a functional boat that seats two, and a 5 m diameter dome.

Summary:

Fabricates large-scale objects that can support human body weight by joining large numbers of plastic bottles. The connectors that join the bottles are 3D printed. The authors evaluated the structures' load-bearing capacity and demonstrated a variety of practical applications, such as chairs and tables.
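The abstract says TrussFab validates designs with integrated structural analysis. As a minimal sketch of the kind of per-member check such an analysis performs (not TrussFab's actual algorithm; the capacity value is invented): in a symmetric two-bar truss carrying a load W at the apex, each beam carries an axial force F = W / (2 sin θ), which can be compared against the bottle-beam's rated capacity:

```python
import math

# Hypothetical sketch of a per-member truss check: compute the axial
# force in each beam of a symmetric two-bar truss and compare it, with
# a safety factor, against an (invented) beam capacity in newtons.

def member_force(load_n, theta_rad):
    # Joint equilibrium at the apex: F = W / (2 * sin(theta)).
    return load_n / (2 * math.sin(theta_rad))

def is_safe(load_n, theta_rad, capacity_n, safety_factor=2.0):
    return member_force(load_n, theta_rad) * safety_factor <= capacity_n

# An 80 kg person (~785 N) on a truss with 45-degree members:
f = member_force(785.0, math.radians(45))
safe = is_safe(785.0, math.radians(45), capacity_n=2000.0)
```

A full truss solver generalizes this: it writes the equilibrium equations at every node and solves the resulting linear system for all member forces, flagging members whose force exceeds capacity.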

StretchEBand: Enabling Fabric-based Interactions through Rapid Fabrication of Textile Stretch Sensors

Paper URL: http://dl.acm.org/citation.cfm?doid=3025453.3025938

Paper abstract: The increased interest in interactive soft materials, such as smart clothing and responsive furniture, means that there is a need for flexible and deformable electronics. In this paper, we focus on stitch-based elastic sensors, which have the benefit of being manufacturable with textile craft tools that have been used in homes for centuries. We contribute to the understanding of stitch-based stretch sensors through four experiments and one user study that investigate conductive yarns from textile and technical perspectives, and analyze the impact of different stitch types and parameters. The insights informed our design of new stretch-based interaction techniques that emphasize eyes-free or casual interactions. We demonstrate with StretchEBand how soft, continuous sensors can be rapidly fabricated with different parameters and capabilities to support interaction with a wide range of performance requirements across wearables, mobile devices, clothing, furniture, and toys.

Summary:

Creates stretch sensors by stitching conductive yarn onto elastic fabric. The authors characterized conductive yarns and stitch types, selecting the best candidates based on durability, ease of stitching, and resistance-change behavior. A user study examined whether users could stretch the band to a specified length.
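A stitch-based stretch sensor is read by observing how the yarn's resistance changes with elongation. As a minimal sketch of turning that reading into a length (the resistance and length values below are invented for illustration, and a real sensor's response need not be linear), a two-point calibration can interpolate between the relaxed and fully stretched readings:

```python
# Hypothetical sketch of reading a stretch sensor: map a measured
# resistance back to a stretch length via two-point linear calibration.
# All numeric values are invented for illustration.

def make_length_mapper(r_rest, r_max, len_rest, len_max):
    """Build a resistance -> length function from two calibration points."""
    def to_length(r):
        t = (r - r_rest) / (r_max - r_rest)
        t = min(max(t, 0.0), 1.0)  # clamp to the calibrated range
        return len_rest + t * (len_max - len_rest)
    return to_length

# Calibrated with a relaxed reading of 100 ohms at 10 cm and a fully
# stretched reading of 180 ohms at 14 cm:
to_length = make_length_mapper(r_rest=100.0, r_max=180.0,
                               len_rest=10.0, len_max=14.0)
```

With this calibration, `to_length(140.0)` yields 12.0 cm, and out-of-range readings are clamped to the calibrated endpoints.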