Within our work, we apply context-awareness to determine how AR/VR technology should adapt instructions to suit user needs. We focus on situations where the user must carry out a complex manual activity that requires additional information to be present during the activity to achieve the desired result. The emphasis is therefore on activities that demand fine motor skills and in-depth expertise and training, for which XR is a powerful tool to support and guide users performing these tasks. The contexts we detect include user intentions, environmental conditions, and activity progression. Our work builds on these contexts, with the main focus on determining how XR should adapt for the end user from a usability perspective. The feedback we request from ISMAR consists of input in the detection, usability, and simulation categories, together with how to balance these categories to create real-time, user-friendly systems. The next steps of our work will consider how content should adjust based on cognitive load, activity space, and environmental conditions.
Posts tagged: XR
HapticPanel: An open system to render haptic interfaces in virtual reality for manufacturing industry
Virtual Reality (VR) allows simulation of machine control panels without physical access to the machine, enabling easier and faster initial exploration, testing, and validation of machine panel designs. However, haptic feedback is indispensable if we want to interact with these simulated panels in a realistic manner. We present HapticPanel, an encountered-type haptic system that provides realistic haptic feedback for machine control panels in VR. To ensure a realistic manipulation of input elements, the user's hand is continuously tracked during interaction with the virtual interface. Based on which virtual element the user intends to manipulate, a motorized panel with stepper motors moves a corresponding physical input element in front of the user's hand, enabling realistic physical interaction.
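The core loop of an encountered-type system like this is: track the hand, predict which virtual element the user intends to touch, then drive the motorized stage so the matching physical element is waiting at that spot. A minimal sketch of that loop is below; the element names, panel coordinates, and steps-per-cm constant are illustrative assumptions, not details from HapticPanel itself.

```python
import math

# Hypothetical panel layout: physical input elements the motorized stage can
# present, keyed by name, with their 2D slot positions on the stage (cm).
PHYSICAL_ELEMENTS = {"button": (0.0, 0.0), "knob": (6.0, 0.0), "slider": (12.0, 0.0)}

def predict_target(hand_pos, virtual_elements):
    """Return the virtual element nearest the tracked hand position.

    hand_pos: (x, y) in panel coordinates; virtual_elements: name -> (x, y).
    A real system would extrapolate the hand's trajectory; taking the nearest
    element is a simple stand-in for intent prediction.
    """
    return min(virtual_elements,
               key=lambda name: math.dist(hand_pos, virtual_elements[name]))

def stepper_command(element, steps_per_cm=40):
    """Convert the chosen element's physical slot into stepper motor steps."""
    x, y = PHYSICAL_ELEMENTS[element]
    return (round(x * steps_per_cm), round(y * steps_per_cm))

# Hand approaching the virtual knob: move the physical knob into place.
virtual_panel = {"button": (2.0, 3.0), "knob": (8.0, 3.0), "slider": (14.0, 3.0)}
target = predict_target((7.5, 2.0), virtual_panel)
print(target, stepper_command(target))  # the knob's slot, in motor steps
```

In a running system this prediction would fire continuously during hand tracking, so the stage finishes moving before the hand arrives.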
Purpose-centric appropriation of everyday objects as game controllers
Generic multi-button controllers are the most common input devices used for video games. In contrast, dedicated game controllers and gestural interactions increase immersion and playability. Room-sized gaming has opened up possibilities to further enhance the immersive experience, and provides players with opportunities to use full-body movements as input. We present a purpose-centric approach to appropriating everyday objects as physical game controllers for immersive room-sized gaming. Virtual manipulations supported by such physical controllers mimic real-world function and usage. Doing so opens up new possibilities for interactions that flow seamlessly from the physical into the virtual world. As a proof of concept, we present a 'Tower Defence'-styled game that uses four everyday household objects as game controllers, each of which serves as a weapon to defend the players' base from enemy bots. Players can use 1) a mop (or a broom) to sweep away enemy bots directionally; 2) a fan to scatter them away; 3) a vacuum cleaner to suck them up; 4) a mouse trap to destroy them. Each controller is tracked using a motion capture system. A physics engine integrated into the game ensures virtual objects act as though they are manipulated by the actual physical controller, thus providing players with a highly immersive gaming experience.
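The purpose-centric mapping is the interesting part: each tracked object affects nearby bots in a way that mirrors its real-world function. A rough sketch of how such a mapping could update bot positions is shown below; the force constants, radius, and timestep are illustrative assumptions, not values from the actual game, which runs on a full physics engine rather than this hand-rolled update.

```python
import math

def apply_controller(tool, tool_pos, tool_vel, bots, radius=2.0, dt=0.1):
    """Update 2D bot positions according to a physical controller's purpose.

    'mop' pushes bots along the tool's tracked velocity, 'fan' scatters them
    radially away, 'vacuum' pulls them toward the nozzle. Only bots within
    `radius` of the tool are affected.
    """
    out = []
    for bx, by in bots:
        dx, dy = bx - tool_pos[0], by - tool_pos[1]
        dist = math.hypot(dx, dy)
        if dist > radius or dist == 0.0:
            out.append((bx, by))          # out of range: unaffected
            continue
        if tool == "mop":                  # sweep: push along the mop's motion
            vx, vy = tool_vel
        elif tool == "fan":                # scatter: push away from the fan
            vx, vy = dx / dist * 5.0, dy / dist * 5.0
        elif tool == "vacuum":             # suck: pull toward the nozzle
            vx, vy = -dx / dist * 5.0, -dy / dist * 5.0
        else:
            vx, vy = 0.0, 0.0
        out.append((bx + vx * dt, by + vy * dt))
    return out

# A vacuum at the origin pulls in the nearby bot; the far one is untouched.
print(apply_controller("vacuum", (0.0, 0.0), (0.0, 0.0), [(1.0, 0.0), (5.0, 0.0)]))
```

Feeding the motion-capture pose and velocity of the real object into a mapping like this is what makes the virtual effect feel like an extension of the physical action.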
Web-powered virtual site exploration based on augmented 360 degree video via gesture-based interaction
Multi-viewer gesture-based interaction for omni-directional video
Omni-directional video (ODV) is a novel medium that offers viewers a 360º panoramic recording. This type of content will become more common within our living rooms in the near future, as immersive display technologies such as 3D television are on the rise. However, little attention has been given to how to interact with ODV content. We present a gesture elicitation study in which we asked users to perform mid-air gestures that they consider appropriate for ODV interaction, both in individual and collocated settings. We are interested in the gesture variations and adaptations that arise from individual and collocated usage. To this end, we gathered quantitative and qualitative data by means of observations, motion capture, questionnaires and interviews. This data resulted in a user-defined gesture set for ODV, alongside an in-depth analysis of the variation in gestures we observed during the study.
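Deriving a user-defined gesture set from elicited proposals typically involves measuring how much participants agree on a gesture for each command. The abstract does not spell out the analysis, but a widely used metric in elicitation studies is the Wobbrock-style agreement score, sketched here; the example gesture labels and referent are hypothetical.

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one referent in a gesture elicitation study.

    proposals: list of gesture labels elicited from participants for one
    command. A(r) = sum over groups of identical proposals of (|P_i|/|P|)^2;
    1.0 means every participant proposed the same gesture. This is a common
    analysis choice; the paper's own method may differ.
    """
    n = len(proposals)
    return sum((count / n) ** 2 for count in Counter(proposals).values())

# e.g. 10 participants propose gestures for "pan the 360-degree view"
votes = ["swipe"] * 6 + ["point"] * 3 + ["grab"]
print(round(agreement_score(votes), 2))  # 0.36 + 0.09 + 0.01
```

High-agreement gestures per referent are then promoted into the final user-defined set, with the observed variations informing alternates.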
Exploring social augmentation concepts for public speaking using peripheral feedback and real-time behavior analysis
Squeeze me and i'll change: An exploration of frustration-triggered adaptation for multimodal interaction
Complex 3D interaction in virtual environments may inhibit user interaction and cause frustration. Supporting adaptivity based on detected user frustration is one promising way to enhance user interaction. Our work proposes to provide adaptive assistance to users who are frustrated during their interaction with 3D user interfaces in virtual environments. The obtrusiveness of physiological measurements for detecting frustration inspired us to investigate the pressure patterns exerted on a 3D input device for this purpose. The experiment presented in this paper has shown great potential for utilizing finger pressure measures as an alternative to physiological measures for indicating user frustration during interaction. Furthermore, the findings in this particular context showed that adaptation of haptic interaction was effective in increasing the user's performance and making users feel less frustrated in performing their tasks in the 3D environment.
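The adaptation loop described here has two stages: infer frustration from grip pressure on the input device, then relax the interaction when frustration is detected. A minimal sketch, assuming a simple above-baseline threshold and a widened haptic snapping radius as the adaptation; the 1.5x factor, pressure units, and snapping-radius values are illustrative, not the paper's actual indicator or assistance design.

```python
from statistics import mean

def detect_frustration(pressures, baseline, factor=1.5):
    """Flag frustration when recent grip pressure rises well above baseline.

    pressures: recent pressure samples from the 3D input device (arbitrary
    units); baseline: the user's calibrated relaxed-grip level.
    """
    return mean(pressures) > factor * baseline

def adapt_assistance(frustrated, snap_radius=0.01):
    """Widen the haptic snapping radius when the user appears frustrated."""
    return snap_radius * 3 if frustrated else snap_radius

calm = [1.0, 1.1, 0.9]    # relaxed grip: no adaptation
tense = [2.0, 2.2, 1.9]   # clenched grip: widen the snap radius
print(adapt_assistance(detect_frustration(calm, baseline=1.0)),
      adapt_assistance(detect_frustration(tense, baseline=1.0)))
```

In practice the pressure-pattern classifier would be trained against ground truth from the experiment rather than a fixed threshold, but the trigger-then-adapt structure stays the same.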