Posts tagged: HCI

User-driven evolution of user interface models - the FLEPR approach

In model-based user interface development, models at different levels of abstraction are used, and transformations form the bridge between them. While ideas may initially be expressed only in the more abstract models, modifications and improvements based on users' feedback will likely be made at the concrete level, which may lead to model inconsistencies that need to be fixed in every iteration. Because one-to-one mappings between models cannot always be defined, these transformations are either completely manual or require manual post-treatment. We propose interactive yet automatic transformations that address the mapping problem while still allowing for the designer's creativity. To manage consistency and semantic correctness within and between models, and thereby foster iterative development processes, we combine these transformations with techniques for tracking decisions and modifications and with intra- and inter-model validation. Our approach has been implemented for abstract and concrete user interface models using Eclipse-based frameworks for model-driven engineering, and both the approach and its tool support are illustrated by a case study.
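The idea of an interactive transformation that records its decisions can be sketched as follows. This is a minimal illustration under assumed names (`transform`, `decide`, the candidate table), not FLEPR's actual API: each abstract element may map to several concrete widgets, an interactive `decide` callback resolves the ambiguity, and every choice is logged so that a later re-transformation can replay earlier decisions instead of discarding them.

```python
# Hypothetical sketch: an abstract-to-concrete UI transformation that
# records each mapping decision so later iterations can replay them
# instead of redoing the transformation manually. All names are
# illustrative, not FLEPR's API.

def transform(abstract_elements, decide, decision_log):
    """Map each abstract element to a concrete widget.

    `decide` resolves one-to-many mappings (interactively, in a real tool);
    every choice is stored in `decision_log` to keep iterations traceable.
    """
    # One abstract element type may map to several concrete candidates.
    candidates = {
        "input": ["text_field", "slider", "spin_box"],
        "choice": ["radio_group", "dropdown"],
        "command": ["button"],
    }
    concrete = []
    for elem in abstract_elements:
        options = candidates[elem["type"]]
        # Reuse a previously recorded decision when it is still valid,
        # so concrete-level fixes survive a re-transformation.
        prior = decision_log.get(elem["id"])
        widget = prior if prior in options else decide(elem, options)
        decision_log[elem["id"]] = widget
        concrete.append({"id": elem["id"], "widget": widget})
    return concrete

# A non-interactive `decide` for demonstration: pick the first candidate.
log = {"age": "spin_box"}  # a decision kept from a previous iteration
ui = transform(
    [{"id": "age", "type": "input"}, {"id": "ok", "type": "command"}],
    decide=lambda elem, options: options[0],
    decision_log=log,
)
```

Note how the `age` element keeps its earlier `spin_box` choice even though the automatic default would have been `text_field`; this is the replay behaviour that makes iterative refinement cheap.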

Read more →

Squeeze me and I'll change: An exploration of frustration-triggered adaptation for multimodal interaction

Complex 3D interaction in virtual environments may inhibit user interaction and cause frustration. Adaptivity based on detected user frustration is one promising way to improve this interaction. Our work provides adaptive assistance to users who become frustrated during their interaction with 3D user interfaces in virtual environments. The obtrusiveness of physiological measurements for detecting frustration inspired us to investigate the pressure patterns exerted on a 3D input device for this purpose. The experiment presented in this paper shows great potential for using finger pressure measures as an alternative to physiological measures to indicate user frustration during interaction. Furthermore, the findings in this particular context show that adapting the haptic interaction was effective in increasing users' performance and making them feel less frustrated while performing their tasks in the 3D environment.
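The core mechanism can be illustrated with a small sketch. This is our simplification, not the paper's algorithm, and the baseline, factor and window values are assumed: grip pressure on the input device is compared against a calibrated relaxed baseline, and a sustained excess triggers an adaptation such as enabling haptic assistance.

```python
# Illustrative sketch (not the paper's detection algorithm): flag likely
# frustration when grip pressure stays well above a calibrated baseline
# for a sustained window of samples.
from statistics import mean

def detect_frustration(samples, baseline, factor=1.5, window=5):
    """Return True when the mean of the last `window` pressure samples
    exceeds `factor` times the user's relaxed baseline."""
    if len(samples) < window:
        return False
    return mean(samples[-window:]) > factor * baseline

baseline = 0.2  # calibrated relaxed grip (normalized units, assumed)
relaxed = [0.21, 0.19, 0.22, 0.20, 0.21]
squeezing = [0.45, 0.50, 0.48, 0.52, 0.47]

# When this returns True, the application could adapt, e.g. by
# switching on haptic guidance for the current 3D task.
assistance_on = detect_frustration(squeezing, baseline)
```

Averaging over a window rather than reacting to single samples avoids adapting on momentary grip changes that are part of normal 3D manipulation.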

Read more →

GRIP: Get better results from interactive prototypes

Read more →

CAP3: Context-sensitive abstract user interface specification

Although many proposals have been made for abstract user interface models, a detailed context in which such a model should or could be used in a user-centered design process has been lacking. This paper presents a clear role for the abstract user interface model in user-centered and model-based development, gives an overview of the stakeholders who may create and/or use abstract user interface models, and presents a modular abstract user interface modeling language, CAP3, that makes relations with other models explicit and builds on the foundation of existing abstract user interface models. The proposed modeling notation is supported by a tool, applied to case studies from the literature and in several projects, and validated against state-of-the-art knowledge on domain-specific modeling languages and visual notations.
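What "making relations with other models explicit" can look like is sketched below. The structure and identifiers are illustrative, not CAP3's actual notation: each abstract component carries an explicit reference into another model (here, a task model), so the link can be traversed and checked by tools rather than existing only in the designer's head.

```python
# Rough sketch of an abstract UI model with explicit cross-model
# references (illustrative names, not CAP3's notation).
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AbstractComponent:
    name: str
    kind: str                          # e.g. "container", "input", "action"
    task_ref: Optional[str] = None     # explicit link into a task model
    children: list = field(default_factory=list)

# An abstract "book flight" interface whose parts point to task-model nodes.
root = AbstractComponent("BookFlight", "container", children=[
    AbstractComponent("Destination", "input", task_ref="task:enter_destination"),
    AbstractComponent("Search", "action", task_ref="task:search_flights"),
])

def linked_tasks(component):
    """Collect every task-model reference reachable from `component`."""
    refs = [component.task_ref] if component.task_ref else []
    for child in component.children:
        refs.extend(linked_tasks(child))
    return refs
```

Because the references are first-class data, a tool can validate them against the task model, which is the kind of inter-model consistency an explicit notation enables.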

Read more →

A unified scalable model of user localisation with uncertainty awareness for large-scale pervasive environments (best paper)

Localisation has become a standard feature in many mobile applications. Numerous techniques for both indoor and outdoor location tracking are available today, providing a diversity of ways in which positioning information can be delivered to a mobile application (e.g., a location-based service). Factors such as the variation of precision over time and across covered areas, or differences in quality and reliability, make adopting several techniques in one application cumbersome. This work presents an approach that models the capabilities of localisation systems and then uses this model to build a unified view on localisation, with special attention paid to the uncertainty arising from different localisation conditions and to its presentation to the user. We discuss the technical considerations, challenges and issues of the approach and report on a user study of users' acceptance of the suggested application behaviour. The results of the study showed the feasibility of the approach and revealed users' preference for automatic yet informed changes while using the application.
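The unified view can be sketched as follows. This is our hedged simplification, not the paper's model: every localisation technique is described by the same capability record (a position estimate plus an uncertainty radius and availability), the application selects the currently most precise available source, and the chosen uncertainty is surfaced to the user instead of being hidden.

```python
# Illustrative sketch of a unified, uncertainty-aware localisation view
# (field names and values are assumptions, not the paper's model).
def best_fix(fixes):
    """Choose the available fix with the smallest uncertainty radius."""
    available = [f for f in fixes if f["available"]]
    if not available:
        return None
    return min(available, key=lambda f: f["radius_m"])

fixes = [
    {"source": "GPS",   "pos": (50.93, 5.34), "radius_m": 8.0,  "available": True},
    {"source": "Wi-Fi", "pos": (50.93, 5.34), "radius_m": 25.0, "available": True},
    {"source": "UWB",   "pos": (50.93, 5.34), "radius_m": 0.3,  "available": False},
]
chosen = best_fix(fixes)
# The UI can render e.g. a circle of `chosen["radius_m"]` metres around the
# position, so switching between techniques is automatic but still visible.
```

Keeping the uncertainty radius in the unified record is what allows the "automatic yet informed" behaviour the study participants preferred: the source changes silently, but its quality is always on display.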

Read more →

PervasiveCrystal: Asking and answering why and why not questions about pervasive computing applications

Users often become frustrated when they are unable to understand and control a pervasive computing environment. Previous studies have shown that allowing users to pose why and why not questions about context-aware applications resulted in better understanding and stronger feelings of trust. Although why and why not questions have been used before to aid in debugging and to clarify graphical user interfaces, it is currently not clear how they can be integrated into pervasive computing systems. We explain in detail how we have extended an existing pervasive computing framework with support for why and why not questions. This resulted in PervasiveCrystal, a system for asking and answering why and why not questions in pervasive computing environments.
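A minimal sketch of how why and why not questions can be answered from a rule trace is shown below. The names (`evaluate`, `why`, `why_not`) are ours, not PervasiveCrystal's API: each context rule records which of its conditions held, so "why did X happen?" can cite the satisfied conditions and "why didn't Y happen?" can cite the unmet ones.

```python
# Illustrative sketch (our names, not PervasiveCrystal's API): rules keep
# a trace of unmet conditions, from which why / why not answers are built.
def evaluate(rules, context):
    trace = {}
    for rule in rules:
        unmet = [c for c in rule["conditions"] if not context.get(c, False)]
        trace[rule["action"]] = unmet  # empty list means the rule fired
    return trace

def why(trace, action):
    if trace.get(action) == []:
        return f"{action} happened because all of its conditions held."
    return None

def why_not(trace, action):
    unmet = trace.get(action)
    if unmet:
        return f"{action} did not happen because {', '.join(unmet)} was not satisfied."
    return None

rules = [
    {"action": "lights_dimmed", "conditions": ["movie_playing", "user_present"]},
    {"action": "music_paused",  "conditions": ["phone_ringing"]},
]
trace = evaluate(rules, {"movie_playing": True, "user_present": True})
```

The key point is that the answers are derived from the same rule evaluation that drove the environment's behaviour, so the explanation cannot drift from what the system actually did.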

Read more →

Pervasive maps: Explore and interact with pervasive environments

Efficient discovery of nearby devices and services is one of the preconditions for a usable pervasive environment. Typical user interfaces in these environments hide the heterogeneity of the environment from end-users, which often makes it hard to perceive the provided functionality. We present Pervasive Maps, an approach and tool for creating an intuitive user interface to explore and control the environment. Pervasive Maps offers user-oriented views of the user's environment based on pictures of that environment. We show how users can model, explore and finally interact with complex pervasive environments using Pervasive Maps.

Read more →

Jelly: A multi-device design environment for managing consistency across devices

When creating applications that should be available on multiple computing platforms, designers have to cope with different design tools and user interface toolkits. Incompatibilities between these design tools and toolkits make it hard to keep multi-device user interfaces consistent. This paper presents Jelly, a flexible design environment that can target a broad set of computing devices and toolkits. Jelly enables designers to copy parts of a user interface from one device to another and to maintain the different user interfaces in concert using linked editing. Our approach lowers the burden of designing multi-device user interfaces by eliminating the need to switch between different design tools and by providing tool support for keeping the user interfaces consistent across different platforms and toolkits.
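The linked-editing idea can be sketched in a few lines. This is our simplification, not Jelly's implementation: copying a widget to another device establishes a link, and a subsequent property change on either copy is propagated to its linked counterparts so the designs stay consistent.

```python
# Illustrative sketch of linked editing across devices (our
# simplification, not Jelly's implementation).
class Widget:
    def __init__(self, device, props):
        self.device = device
        self.props = dict(props)
        self.linked = []  # copies of this widget on other devices

    def copy_to(self, device):
        """Copy the widget to another device and link the two copies."""
        clone = Widget(device, self.props)
        self.linked.append(clone)
        clone.linked.append(self)
        return clone

    def set_prop(self, key, value):
        """Change a property and push the change to linked copies."""
        self.props[key] = value
        for other in self.linked:
            other.props[key] = value

desktop = Widget("desktop/Swing", {"label": "Submit", "enabled": True})
mobile = desktop.copy_to("mobile/Android")
desktop.set_prop("label", "Send")  # propagates to the mobile copy
```

In a real environment the propagated change would still be rendered with each platform's native toolkit; only the shared properties travel across the link, which is what lets one design environment span otherwise incompatible toolkits.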

Read more →