Paper abstract: Room-scale Virtual Reality (VR) has become an affordable consumer reality, with applications ranging from entertainment to productivity. However, the limited physical space available for room-scale VR in the typical home or office environment poses a significant problem. To solve this, physical spaces can be extended by amplifying the mapping of physical to virtual movement (translational gain). Although amplified movement has been used since the earliest days of VR, little is known about how it influences reach-based interactions with virtual objects, now a standard feature of consumer VR. Consequently, this paper explores the picking and placing of virtual objects in VR for the first time, with translational gains of between 1x (a one-to-one mapping of a 3.5m*3.5m virtual space to the same sized physical space) and 3x (10.5m*10.5m virtual mapped to 3.5m*3.5m physical). Results show that reaching accuracy is maintained for gains of up to 2x; however, going beyond this diminishes accuracy and increases simulator sickness and perceived workload. We suggest gain levels of 1.5x to 1.75x can be utilized without compromising the usability of a VR task, significantly expanding the bounds of interactive room-scale VR.
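The translational-gain mapping described above can be sketched as a simple scaling of physical position about the room center; a minimal sketch, assuming a 2D floor-plane position (the 3.5 m room size and 1x-3x gains come from the abstract, while the function and variable names are illustrative):

```python
# Translational gain: amplify physical movement about the room center so a
# small physical space maps onto a larger virtual one. Names are illustrative.

def apply_translational_gain(physical_pos, gain, center=(0.0, 0.0)):
    """Scale the user's physical position by `gain` about `center`
    to obtain the corresponding virtual position."""
    vx = center[0] + gain * (physical_pos[0] - center[0])
    vy = center[1] + gain * (physical_pos[1] - center[1])
    return (vx, vy)

# With 2x gain, walking to the corner of a 3.5 m x 3.5 m physical room
# (1.75, 1.75) places the user at (3.5, 3.5), the corner of a
# 7 m x 7 m virtual space.
print(apply_translational_gain((1.75, 1.75), 2.0))
```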
Paper abstract: Virtual reality (VR) technology strives to enable a highly immersive experience for the user by including a wide variety of modalities (e.g. visuals, haptics). Current VR hardware, however, lacks a sufficient way of communicating the perceived weight of an object, resulting in scenarios where users cannot distinguish between lifting a bowling ball or a feather. We propose a solely software-based approach to simulating weight in VR by deliberately using perceivable tracking offsets. These tracking offsets nudge users to lift their arm higher and result in a visual and haptic perception of weight. We conducted two user studies showing that participants intuitively associated the offsets with the sensation of weight and accepted them as part of the virtual world. We further show that, compared to no weight simulation, our approach led to significantly higher levels of presence, immersion and enjoyment. Finally, we report perceptual thresholds and offset boundaries as design guidelines for practitioners.
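The offset technique above can be illustrated with a minimal sketch: the rendered hand is displayed lower than the tracked hand in proportion to the held object's mass, nudging the user to lift higher. The linear mass-to-offset scaling and all names here are illustrative assumptions, not values from the paper:

```python
# Pseudo-weight via a vertical tracking offset: show the virtual hand below
# the physically tracked hand while an object is held. The offset grows with
# the object's mass; the scaling constant is an assumed example value.

def rendered_hand_height(tracked_height, object_mass_kg, offset_per_kg=0.02):
    """Return the displayed hand height (metres), offset downward in
    proportion to the mass of the held virtual object."""
    return tracked_height - object_mass_kg * offset_per_kg

# Holding a 5 kg object with the tracked hand at 1.0 m renders the
# virtual hand at 0.9 m, so the user must raise the arm to compensate.
print(rendered_hand_height(1.0, 5.0))
```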
Paper abstract: Including haptic feedback in current consumer VR applications is frequently challenging, since the technical possibilities for creating haptic feedback in consumer-grade VR are limited. While most systems include and make use of the ability to create tactile feedback through vibration, kinesthetic feedback systems have so far relied almost exclusively on external mechanical hardware to induce actual sensations. In this paper, we describe an approach to evoking such sensations using unmodified off-the-shelf hardware and a software solution for a multi-modal pseudo-haptics approach. We first explore this design space by applying user-elicited methods, and afterwards evaluate our refined solution in a user study. The results show that it is indeed possible to communicate kinesthetic feedback by visual and tactile cues only, and even to induce its perception. While visual clipping was generally unappreciated, our approach led to significant increases in enjoyment and presence.
Paper abstract: Research on virtual reality (VR) has studied users' experience of immersion, presence, simulator sickness, and learning effects. However, the momentary experience of exiting VR and transitioning back to the real world is not well understood. Do users become self-conscious of their actions upon exit? Are users nervous about their surroundings? Using explicitation interviews, we explore the moment of exit from VR across four applications. Analysis of the interviews reveals five components of experience: space, control, sociality, time, and sensory adaptation. Participants described spatial disorientation, for example, regardless of the complexity of the VR scene. Participants also described a window across which they exit VR, for example mentally first and then physically. We present six designs for easing or heightening the exit experience, as described by the participants. Based on these findings, we further discuss the 'moment of exit' as an opportunity for designing engaging and enhanced VR experiences.