Posts tagged: Intelligible UI

Gestu-wan - an intelligible mid-air gesture guidance system for walk-up-and-use displays

We present Gestu-Wan, an intelligible gesture guidance system designed to support mid-air gesture-based interaction for walk-up-and-use displays. Although gesture-based interfaces have become more prevalent, there is currently very little uniformity with regard to gesture sets and the way gestures can be executed. This leads to confusion, bad user experiences and users who would rather avoid than engage in interaction using mid-air gestures. Our approach improves the visibility of gesture-based interfaces and facilitates execution of mid-air gestures without prior training. We compare Gestu-Wan with a static gesture guide, which shows that it can help users both with performing complex gestures and with understanding how the gesture recognizer works.
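One way to make a gesture set visible step by step is to store each gesture as a sequence of intermediate postures in a prefix tree, so the guide can always show which movements remain available from the user's current pose. The sketch below is an illustrative assumption, not Gestu-Wan's actual implementation; the posture names are invented for the example.

```python
# Illustrative sketch (assumed representation, not Gestu-Wan's implementation):
# a gesture set stored as a prefix tree of intermediate postures lets a guide
# show, at every step, which movements are still available.

def build_guide(gestures):
    """gestures: {name: [posture, posture, ...]} -> nested prefix tree."""
    root = {}
    for name, postures in gestures.items():
        node = root
        for posture in postures:
            node = node.setdefault(posture, {})
        node["#"] = name  # leaf marker: a completed gesture
    return root

def next_steps(tree, performed):
    """Postures the user can move to after the ones performed so far."""
    node = tree
    for posture in performed:
        node = node.get(posture, {})
    return sorted(k for k in node if k != "#")

# Hypothetical two-gesture set sharing a common starting posture.
gestures = {
    "zoom_in": ["hands_together", "hands_apart"],
    "swipe":   ["hands_together", "hand_right"],
}
guide = build_guide(gestures)
print(next_steps(guide, ["hands_together"]))  # ['hand_right', 'hands_apart']
```

Because shared prefixes collapse into one branch, the display stays compact even when many gestures begin with the same posture.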

Augmenting social interactions: Realtime behavioural feedback using social signal processing techniques

Nonverbal and unconscious behaviour is an important component of daily human-human interaction. This is especially true in situations such as public speaking, job interviews or information-sensitive conversations, where researchers have shown that an increased awareness of one's behaviour can improve the outcome of the interaction. With wearable technology, such as Google Glass, we now have the opportunity to augment social interactions and provide realtime feedback on one's behaviour in an unobtrusive way. In this paper we present Logue, a system that provides realtime feedback on the presenter's openness, body energy and speech rate during public speaking. The system analyses the user's nonverbal behaviour using social signal processing techniques and gives visual feedback on a head-mounted display. We conducted two user studies with a staged and a real presentation scenario, which showed that Logue's feedback was perceived as helpful and had a positive impact on the speakers' performance.
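Of the three signals Logue monitors, speech rate is the simplest to illustrate: count recognized words in a sliding time window and convert to words per minute. The sketch below is a minimal assumption of how such a real-time estimator could work; it is not Logue's actual pipeline, and the window length is an invented parameter.

```python
from collections import deque

class SpeechRateMonitor:
    """Minimal sketch: estimate words per minute over a sliding window.

    Assumes an upstream recognizer emits a timestamp per recognized word;
    this is an illustration, not Logue's actual implementation.
    """

    def __init__(self, window_seconds=10.0):
        self.window = window_seconds
        self.word_times = deque()  # timestamps of recently recognized words

    def on_word(self, timestamp):
        self.word_times.append(timestamp)
        # Drop words that fell out of the sliding window.
        while self.word_times and timestamp - self.word_times[0] > self.window:
            self.word_times.popleft()

    def words_per_minute(self):
        if not self.word_times:
            return 0.0
        return len(self.word_times) * (60.0 / self.window)

monitor = SpeechRateMonitor(window_seconds=10.0)
for t in [0.0, 0.4, 0.9, 1.5, 2.1, 2.8]:  # six words in under three seconds
    monitor.on_word(t)
print(monitor.words_per_minute())  # 6 words in a 10 s window -> 36.0 wpm
```

A feedback system would compare this rate against a comfortable range and, when it drifts too high, render a "slow down" cue on the head-mounted display.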

Crossing the bridge over Norman's Gulf of Execution: Revealing feedforward's true identity

Feedback and affordances are two of the most well-known principles in interaction design. Unfortunately, the related and equally important notion of feedforward has not been given as much consideration. Nevertheless, feedforward is a powerful design principle for bridging Norman's Gulf of Execution. We reframe feedforward by disambiguating it from related design principles such as feedback and perceived affordances, and identify new classes of feedforward. In addition, we present a reference framework that provides a means for designers to explore and recognize different opportunities for feedforward.

Understanding complex environments with the feedforward torch

In contrast with design flaws in user interfaces, design flaws in physical spaces have a much higher cost and impact. Software is fairly easy to change and update, unlike legacy physical constructions, where updating the physical appearance is often not an option. We present the Feedforward Torch, a mobile projection system that targets the augmentation of legacy hardware with feedforward information. Feedforward shows users what the result of their action will be, and can thus be seen as the opposite of feedback. A first user study suggests that providing feedforward in these environments could improve their usability.

PervasiveCrystal: Asking and answering why and why not questions about pervasive computing applications

Users often become frustrated when they are unable to understand and control a pervasive computing environment. Previous studies have shown that allowing users to pose why and why not questions about context-aware applications resulted in better understanding and stronger feelings of trust. Although why and why not questions have been used before to aid in debugging and to clarify graphical user interfaces, it is currently not clear how they can be integrated into pervasive computing systems. We explain in detail how we have extended an existing pervasive computing framework with support for why and why not questions. This resulted in PervasiveCrystal, a system for asking and answering why and why not questions in pervasive computing environments.
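The core idea behind why and why not questions is that a framework which logs which rule caused each action can answer "why" by replaying its history, and "why not" by reporting which rule conditions currently fail. The sketch below assumes a simple condition-action rule model and invented rule names; it illustrates the question-answering idea, not PervasiveCrystal's actual API.

```python
# Illustrative sketch (assumed rule model, not PervasiveCrystal's API):
# log each rule firing, then answer "why" from the log and
# "why not" from the conditions that fail in the current context.

class WhyEngine:
    def __init__(self):
        self.rules = []    # (action, condition_fn, description)
        self.history = []  # (action, description) of rules that fired

    def add_rule(self, action, condition, description):
        self.rules.append((action, condition, description))

    def tick(self, context):
        """Evaluate all rules against the current context once."""
        for action, condition, description in self.rules:
            if condition(context):
                self.history.append((action, description))

    def why(self, action):
        """Explain past occurrences of an action from the firing log."""
        causes = [d for a, d in self.history if a == action]
        return causes or ["'%s' never happened" % action]

    def why_not(self, action, context):
        """Explain an absent action by the conditions that fail right now."""
        return ["condition failed: " + d
                for a, c, d in self.rules
                if a == action and not c(context)]

# Hypothetical smart-room rule: presence switches on the lights.
engine = WhyEngine()
engine.add_rule("lights_on", lambda ctx: ctx["presence"],
                "presence sensor detected someone")
engine.tick({"presence": True})
print(engine.why("lights_on"))  # ['presence sensor detected someone']
print(engine.why_not("lights_on", {"presence": False}))
```

Turning these traces into user-facing answers then becomes a presentation problem: the same log entries can feed a dialog, an overlay, or a debugging view.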

I bet you look good on the wall: Making the invisible computer visible

The design ideal of the invisible computer, prevalent in the vision of ambient intelligence (AmI), has led to a number of interaction challenges. The complex nature of AmI environments, together with limited feedback and insufficient means to override the system, can result in users who feel frustrated and out of control. In this paper, we explore the potential of visualizing the system state to improve user understanding. We use projectors to overlay the environment with a graphical representation that connects sensors and devices with the actions they trigger and the effects those actions produce. We also provide users with a simple voice-controlled command to cancel the last action. A small first-use study suggested that our technique could indeed improve understanding and support users in forming a reliable mental model.

Answering why and why not questions in ubiquitous computing

Users often find it hard to understand and control the behavior of a Ubicomp system. This gives rise to usability problems and can lead to loss of user trust, which may hamper the acceptance of these systems. We are extending an existing Ubicomp framework to allow users to pose why and why not questions about its behavior. Initial experiments suggest that these questions are easy to use and could help users in understanding how Ubicomp systems work.

A component-based infrastructure for pervasive user interaction

Since a growing number of different mobile computing devices are used in pervasive and ubiquitous environments, new approaches are needed for designing and implementing pervasive interactive software with minimal effort. In this paper we present a process that facilitates the design of next-generation interactive software for pervasive environments. We created a distributed runtime infrastructure that enables the distribution of software components across heterogeneous, networked and embedded hardware systems. Some of these components, or compositions of components, will require interaction by human users from a large range of different devices. To ease the deployment of consistent and functional user interfaces in these pervasive environments, we introduce Interaction Components into the runtime infrastructure, which enable the presentation of component and service behavior to human users.
