Posts tagged: Mobile

An interactive design space for wearable displays

The promise of on-body interactions has led to widespread development of wearable displays. They manifest themselves in highly variable shapes and forms, and are realized using technologies with fundamentally different properties. Through an extensive survey of the field of wearable displays, we characterize existing systems based on key qualities of displays and wearables, such as location on the body, intended viewers or audience, and the information density of rendered content. We present the results of this analysis in an open, web-based interactive design space that supports exploration and refinement along various parameters. The design space, which currently encapsulates 129 cases of wearable displays, aims to inform researchers and practitioners about existing solutions and designs, and to enable the identification of gaps and opportunities for novel research and applications. Further, it seeks to provide them with a thinking tool to deliberate on how the displayed content should be adapted based on key design parameters. Through this work, we aim to facilitate progress in wearable displays, informed by existing solutions, by providing researchers with an interactive platform for discovery and reflection.

Impact of situational impairment on interaction with wearable displays

The number of wearable devices we carry is increasing, with smaller companion devices like smartwatches providing quick access for simple tasks. These devices are, however, not necessarily in the user's direct sight, and during everyday activities it is unlikely, even undesirable, that the user constantly focuses on or interacts with these screens. Furthermore, interaction is often limited because our hands are occupied carrying or holding items such as bags, papers, boxes, or tools. In this paper, we evaluate how encumbrance affects, among other factors, the time it takes to perceive and react to a notification depending on the placement of the companion device. Our experimental results can assist designers in choosing the right device for the task.

Attracktion: Field evaluation of multi-track audio as unobtrusive cues for pedestrian navigation

Listening to music while on the move is common in our headphone society. However, if we want navigation assistance from our smartphone, existing approaches either demand exclusive playback through the headphones or degrade the listening experience of the music. We present a field evaluation of Attracktion, a spatial audio navigation system that leverages access to individual stems in a multi-track recording to minimize the impact on the listening experience. We compared Attracktion against current turn-by-turn navigation instructions in a field study with 22 users and found that users perceived acoustic overlays with additional navigation information to have no impact on the listening experience. In terms of path efficiency, errors, and mental workload, Attracktion is on par with spoken turn-by-turn navigation instructions, and users liked it for the aspect of serendipity.
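The core idea of spatializing a single stem toward the next waypoint can be illustrated with a toy sketch. The function below is not the authors' implementation; it is a minimal, assumed constant-power panning law that maps the bearing of a hypothetical waypoint, relative to the user's heading, to left/right gains for one stem while the remaining stems play unmodified:

```python
import math

def stem_pan_gains(user_heading_deg, waypoint_bearing_deg):
    """Constant-power pan for a single stem toward the next waypoint.

    Bearings are in degrees, clockwise from north. The relative angle
    is clipped to +/-90 degrees so the stem never pans "behind" the
    listener. Returns (left_gain, right_gain); squared gains sum to 1.
    """
    # Signed relative angle in (-180, 180].
    rel = (waypoint_bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0
    rel = max(-90.0, min(90.0, rel))
    # Map -90..+90 degrees onto a pan position in [0, 1] (0 = hard left).
    pos = (rel + 90.0) / 180.0
    theta = pos * math.pi / 2.0
    return math.cos(theta), math.sin(theta)
```

With the waypoint straight ahead the stem sits in the center (equal gains); a waypoint 90 degrees to the left pans it fully left, giving a continuous directional cue without interrupting playback.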

TaskHerder: A wearable minimal interaction interface for mobile and long-lived task execution

Notifications have become a core component of the smartphone as our ubiquitous companion. Many of these only require minimal interaction, for which the smartwatch is a helpful companion device. However, its design and placement are influenced by its traditional ancestors. For applications where the user is constrained by a specific usage situation, or performs tasks with both hands simultaneously, interaction with the smartwatch can be cumbersome. In this paper, we propose a wearable arm strap for minimal interaction in long-lived tasks. Placed around the elbow, it lies outside the hands' proximal working space, which reduces interference. Its flexible e-ink display offers screen space for overview information at minimal energy consumption, enabling longer uptime. We designed the wearable for a professional use case, meaning that it can easily be placed over protective clothing, as its flexible round shape adjusts to various diameters. Capacitive touch sensing allows gesture input even under rough conditions, e.g., with gloves.

Whom-i-approach: A system that provides cues on approachability of bystanders for blind users

Body posture is one of many visual cues used by sighted persons to determine if someone would be open to initiating a conversation. These cues are inaccessible to individuals with blindness, leading to difficulties when deciding whom to approach for assistance. Current camera technologies, such as depth cameras, make it possible to automatically scan the environment and assess the approachability of nearby persons. We present Whom-I-Approach, a system that translates postures of bystanders into a measure of approachability and communicates this information using auditory and tactile cues. The system scans the environment and determines the approachability, based on body posture, of the persons in the vicinity of the user. Efficiency as well as perceived system usability and psychosocial attitudes were measured in a user study, showing the potential to improve competence for users with blindness prior to engagement in social interactions.
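Translating posture into an approachability measure can be sketched as a simple heuristic over skeleton features that a depth camera typically exposes. The features and weights below are hypothetical illustrations, not the system's actual model:

```python
def approachability_score(torso_yaw_deg, head_pitch_deg, arms_crossed):
    """Toy approachability measure in [0, 1] from assumed posture cues.

    torso_yaw_deg:  angle between the bystander's torso and the user
                    (0 = facing the user directly).
    head_pitch_deg: downward head tilt (0 = level gaze).
    arms_crossed:   boolean indicator of a closed posture.

    All weights are illustrative, not derived from the paper.
    """
    score = 1.0
    # Facing away from the user lowers approachability.
    score -= min(abs(torso_yaw_deg), 90.0) / 90.0 * 0.5
    # Looking down (e.g., at a phone) lowers it further.
    score -= min(abs(head_pitch_deg), 45.0) / 45.0 * 0.3
    # Crossed arms are a classic closed-posture cue.
    if arms_crossed:
        score -= 0.2
    return max(0.0, score)
```

A scalar like this can then be mapped onto the auditory or tactile channel, e.g., vibration intensity rising with the score of the nearest bystander.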

SCWT: A joint workshop on smart connected and wearable things

Back on bike: The BoB mobile cycling app for secondary prevention in cardiac patients

Persons who have suffered from a cardiac disease are often advised to integrate a sufficient level of physical exercise into their daily life. Initially, cardiac rehabilitation takes place in a closely monitored setting in a hospital or a rehabilitation center. Sustaining the effort once the patient has left the ambulatory, supervised environment is a challenge, and drop-out rates are high. Emerging approaches such as telemonitoring and telerehabilitation have shown potential to support cardiac patients in adhering to the advised physical exercise. However, most telerehabilitation solutions only support a limited range of physical exercise, such as step-counting during walking. We propose BoB (Back on Bike), a mobile application that guides cardiac patients while cycling. Design choices are explained according to three pillars: ease of use, fear reduction, and direct and indirect motivation. In this paper, we report the results from a field study with cardiac patients.

A grounded approach for applying behavior change techniques in mobile cardiac tele-rehabilitation

In mobile tele-rehabilitation applications for Coronary Artery Disease (CAD) patients, behavior change plays a central role in improving therapy adherence and preventing disease recurrence. However, creating sustainable behavior change that holds a beneficial impact over a prolonged period of time remains an important challenge. In this paper, we discuss various models and frameworks related to persuasion and behavior change, and investigate how to incorporate these into a multidisciplinary user-centered design approach for creating a mobile tele-rehabilitation application. By implementing different concepts that contribute to behavior change and applying a set of distinct persuasive design patterns, we were able to translate the high-level goals of behavior theory into a mobile application that explicitly incorporates behavior change techniques and also offers a good overall user experience. We evaluated our system, HeartHab, in a lab setting and show that our approach leads to high user acceptance and willingness to use the system in daily activities.

SmartObjects: Fourth workshop on interacting with smart objects

Augmenting social interactions: Realtime behavioural feedback using social signal processing techniques

Nonverbal and unconscious behaviour is an important component of daily human-human interaction. This is especially true in situations such as public speaking, job interviews, or information-sensitive conversations, where researchers have shown that an increased awareness of one's behaviour can improve the outcome of the interaction. With wearable technology, such as Google Glass, we now have the opportunity to augment social interactions and provide realtime feedback on one's behaviour in an unobtrusive way. In this paper, we present Logue, a system that provides realtime feedback on the presenter's openness, body energy, and speech rate during public speaking. The system analyses the user's nonverbal behaviour using social signal processing techniques and gives visual feedback on a head-mounted display. We conducted two user studies with a staged and a real presentation scenario, which showed that Logue's feedback was perceived as helpful and had a positive impact on the speaker's performance.
