Paper abstract: Interactive, smart objects, customized to individuals and uses, are central to many movements, such as tangibles, the Internet of Things (IoT), and ubiquitous computing. Yet rapidly prototyping both the form and function of these custom objects can be problematic, particularly for those with limited electronics or programming experience. Designers often need to embed custom circuitry, program its workings, and create a form factor that not only reflects the desired user experience but can also house the required circuitry and electronics. To mitigate this, we created Pineal, a design tool that lets end-users: (1) modify 3D models to include a smart watch or phone as their heart; (2) specify high-level interactive behaviours through visual programming; and (3) have the phone or watch act out such behaviours as the objects' "smarts". Furthermore, a series of prototypes shows how Pineal exploits mobile sensing and output and automatically generates 3D-printed form factors for rich, interactive objects.
Paper abstract: As devices around us become smart, our gaze is poised to become the next frontier of human-computer interaction (HCI). State-of-the-art mobile eye-tracker systems typically rely on eye-model-based gaze estimation approaches, which do not require a calibration. However, such approaches require specialized hardware (e.g., multiple cameras and glint points), can be significantly affected by glasses, and are thus not fit for ubiquitous gaze-based HCI. In contrast, regression-based gaze estimation is a straightforward approach requiring only one eye camera and one scene camera, but it necessitates a calibration. Therefore, a fast and accurate calibration is a key development to enable ubiquitous gaze-based HCI. In this paper, we introduce CalibMe, a novel method that exploits collection markers (automatically detected fiducial markers) to allow eye-tracker users to gather a large array of calibration points, remove outliers, and automatically reserve evaluation points in a fast and unsupervised manner. The proposed approach is evaluated against a nine-point calibration method, which is typically used due to its relatively short calibration time and adequate accuracy. CalibMe reached a mean angular error of 0.59° (σ = 0.23°), in contrast to 0.82° (σ = 0.15°) for a nine-point calibration, attesting to the efficacy of the method. Moreover, users are able to calibrate the eye tracker anywhere and independently in ≈10 s using a cellphone to display the collection marker.
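The regression-based calibration this abstract contrasts with model-based approaches can be illustrated with a minimal sketch: fit a polynomial mapping from pupil coordinates to scene-camera coordinates by least squares, holding out some points for evaluation as CalibMe reserves them automatically. The quadratic feature set, the synthetic data, and all names here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def _design(pupil_xy):
    """Second-order polynomial features, a common choice for
    regression-based gaze estimation (an assumption, not CalibMe's exact model)."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_gaze_mapping(pupil_xy, target_xy):
    """Least-squares fit of pupil -> scene-coordinate mapping."""
    coeffs, *_ = np.linalg.lstsq(_design(pupil_xy), target_xy, rcond=None)
    return coeffs

def predict_gaze(coeffs, pupil_xy):
    return _design(pupil_xy) @ coeffs

# Synthetic example: a known quadratic mapping plus mild noise.
rng = np.random.default_rng(0)
pupil = rng.uniform(-1, 1, size=(200, 2))
targets = np.column_stack([
    0.5 + 2.0 * pupil[:, 0] + 0.3 * pupil[:, 0]**2,
    -0.2 + 1.5 * pupil[:, 1] + 0.1 * pupil[:, 0] * pupil[:, 1],
]) + rng.normal(0, 0.01, size=(200, 2))

# Reserve the last 50 samples as evaluation points.
coeffs = fit_gaze_mapping(pupil[:150], targets[:150])
err = np.linalg.norm(predict_gaze(coeffs, pupil[150:]) - targets[150:], axis=1)
print(f"mean held-out error: {err.mean():.4f}")
```

With many calibration points gathered from a moving marker, outlier removal (e.g., dropping samples with large residuals and refitting) becomes straightforward, which is part of what makes the unsupervised collection practical.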
Paper abstract: We present TrussFab, an integrated end-to-end system that allows users to fabricate large-scale structures that are sturdy enough to carry human weight. TrussFab achieves this scale by complementing 3D printing with plastic bottles. It does not use these bottles as "bricks", though, but as beams that form structurally sound node-link structures, also known as trusses, allowing it to handle the forces resulting from scale and load. TrussFab embodies the required engineering knowledge, allowing non-engineers to design such structures and to validate their designs using integrated structural analysis. We have used TrussFab to design and fabricate tables and chairs, a 2.5 m long bridge strong enough to carry a human, a functional boat that seats two, and a 5 m diameter dome.
Paper abstract: The increased interest in interactive soft materials, such as smart clothing and responsive furniture, means that there is a need for flexible and deformable electronics. In this paper, we focus on stitch-based elastic sensors, which have the benefit of being manufacturable with textile craft tools that have been used in homes for centuries. We contribute to the understanding of stitch-based stretch sensors through four experiments and one user study that investigate conductive yarns from textile and technical perspectives, and analyze the impact of different stitch types and parameters. The insights informed our design of new stretch-based interaction techniques that emphasize eyes-free or casual interactions. We demonstrate with StretchEBand how soft, continuous sensors can be rapidly fabricated with different parameters and capabilities to support interaction with a wide range of performance requirements across wearables, mobile devices, clothing, furniture, and toys.
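A stitch-based stretch sensor of the kind this abstract describes is typically read as a variable resistance. As a hedged sketch of how such a sensor might be wired up and interpreted (a standard voltage-divider readout and a linear gauge-factor model, both simplifying assumptions not taken from the paper):

```python
def divider_resistance(v_out, v_in=3.3, r_fixed=10_000.0):
    """Infer sensor resistance from a voltage-divider reading.

    Assumed wiring: the conductive-yarn sensor sits between v_in and
    the ADC node; a fixed resistor r_fixed connects that node to ground,
    so v_out = v_in * r_fixed / (r_sensor + r_fixed).
    """
    if not 0.0 < v_out < v_in:
        raise ValueError("v_out must lie strictly between 0 and v_in")
    return r_fixed * (v_in - v_out) / v_out

def stretch_estimate(r_now, r_rest, gauge=1.0):
    """Map relative resistance change to approximate strain, assuming a
    linear gauge factor (real stitched sensors are typically nonlinear)."""
    return (r_now - r_rest) / (r_rest * gauge)

# Example: at v_out = 1.65 V the divider is balanced, so the sensor
# reads 10 kΩ; a rise to 11 kΩ corresponds to ~10% strain here.
r = divider_resistance(1.65)
strain = stretch_estimate(11_000.0, r)
```

Continuous readings like `strain` can then be thresholded or smoothed to drive the eyes-free and casual interactions the abstract mentions.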