Posts tagged: HCI
CAP3: Context-sensitive abstract user interface specification
Although many proposals for abstract user interface models have been made, a detailed context in which such a model should or could be used in a user-centered design process has been lacking. This paper presents a clear role for the abstract user interface model in user-centered and model-based development, provides an overview of the stakeholders that may create and/or use abstract user interface models, and presents a modular abstract user interface modeling language, CAP3, that makes relations with other models explicit and builds on the foundation of existing abstract user interface models. The proposed modeling notation is supported by a tool and applied to several case studies from the literature and in several projects. It is also validated against state-of-the-art knowledge on domain-specific modeling languages and visual notations, as well as through case studies.
A unified scalable model of user localisation with uncertainty awareness for large-scale pervasive environments (best paper)
Localisation has become a standard feature in many mobile applications. Numerous techniques for both indoor and outdoor location tracking are available today, providing a diversity of ways in which positioning information can be delivered to a mobile application (e.g., a location-based service). Factors such as the variation of precision over time and covered area, or differences in quality and reliability, make adopting several techniques within one application cumbersome. This work presents an approach that models the capabilities of localisation systems and then uses this model to build a unified view on localisation, with special attention paid to the uncertainty arising from different localisation conditions and its presentation to the user. We discuss technical considerations, challenges and issues of the approach, and report on a user study of users' acceptance of the suggested behaviour of an application based on the approach. The results of the study showed the feasibility of the approach and revealed users' preference for automatic yet informed changes they experienced while using the application.
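The core idea of the abstract, exposing each localisation system's capabilities together with an uncertainty estimate so an application can pick between them in an informed way, can be illustrated with a minimal sketch. The `PositionFix` record and accuracy-based selection below are illustrative assumptions, not the paper's actual model:

```python
from dataclasses import dataclass

@dataclass
class PositionFix:
    """A position report from one localisation system (hypothetical record)."""
    provider: str
    lat: float
    lon: float
    accuracy_m: float  # estimated uncertainty radius in metres

def best_fix(fixes):
    """Unified view: pick the fix with the smallest uncertainty radius."""
    return min(fixes, key=lambda f: f.accuracy_m)

# Two concurrent fixes for the same user; GPS is currently more precise.
fixes = [
    PositionFix("gps", 50.93, 5.34, 8.0),
    PositionFix("wifi", 50.93, 5.34, 25.0),
]
chosen = best_fix(fixes)
print(chosen.provider, chosen.accuracy_m)
```

In the paper's spirit, the selected provider and its uncertainty would also be surfaced to the user, so that switches between techniques are automatic yet informed.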
PervasiveCrystal: Asking and answering why and why not questions about pervasive computing applications
Users often become frustrated when they are unable to understand and control a pervasive computing environment. Previous studies have shown that allowing users to pose why and why not questions about context-aware applications resulted in better understanding and stronger feelings of trust. Although why and why not questions have been used before to aid in debugging and to clarify graphical user interfaces, it is currently not clear how they can be integrated into pervasive computing systems. We explain in detail how we have extended an existing pervasive computing framework with support for why and why not questions. This resulted in PervasiveCrystal, a system for asking and answering why and why not questions in pervasive computing environments.
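As a rough illustration of the question-answering mechanism the abstract describes, consider a toy rule engine that records just enough context to answer why and why not questions about its actions. The class and rule format are hypothetical, not PervasiveCrystal's actual API:

```python
# Toy context-aware rule engine that keeps enough information to
# answer "why" and "why not" questions about its actions.
# Hypothetical sketch -- not PervasiveCrystal's actual API.

class Explainer:
    def __init__(self, rules):
        # rules: action -> (human-readable condition, predicate over context)
        self.rules = rules
        self.context = {}

    def update(self, key, value):
        self.context[key] = value

    def fired(self, action):
        _, predicate = self.rules[action]
        return predicate(self.context)

    def why(self, action):
        name, _ = self.rules[action]
        if self.fired(action):
            return f"'{action}' happened because the condition '{name}' holds"
        return f"'{action}' did not happen"

    def why_not(self, action):
        name, _ = self.rules[action]
        if not self.fired(action):
            return f"'{action}' did not happen because the condition '{name}' does not hold"
        return f"'{action}' happened"

rules = {"lights on": ("presence detected",
                       lambda ctx: ctx.get("presence", False))}
engine = Explainer(rules)
engine.update("presence", False)
print(engine.why_not("lights on"))
```

A real system would trace chains of context events and rules rather than a single predicate, but the principle is the same: explanations are generated from the same conditions that drive the behaviour.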
On stories, models and notations: Storyboard creation as an entry point for model-based interface development with UsiXML
Storyboards are excellent tools for creating a high-level specification of an interactive system. Because of their emphasis on graphical depiction, they are an accessible means for communicating the requirements and properties of an interactive system, and they allow the specification of complex context-aware systems while avoiding the need for technical details. We present a storyboard meta-model that captures the high-level information from a storyboard and allows relating this information to other models that are common in engineering interactive systems. We show that a storyboard can be used as an entry point for using UsiXML models. Finally, this approach is accompanied by a tool set that makes the connection between the storyboard model, UsiXML models and the program code required for maintaining these connections throughout the engineering process.
Engineering patterns for multi-touch interfaces
UIML based design of multimodal interactive applications with strict synchronization requirements
As the variety of network service platforms and end-user devices grows rapidly, content providers must constantly adapt their production systems to support these new technologies. In this paper, we present a middleware platform for deploying highly interactive (television) applications over a diverse collection of networks and end-user devices. As the user interface of such interactive applications may vary depending on the capabilities of the different target devices, our middleware uses UIML for the description of generic user interfaces. Our middleware platform also provides pluggable support for new networks. A factor that greatly complicates the design is the need for strict synchronization between an interactive application and broadcast video or audio data. To support maximum functionality, downloadable application logic is used to provide the interactive services. As a test case, an evaluation setup was built, targeting both set-top boxes and mobile phones.
The design of context-specific educational mobile games
Shortening user interface design iterations through realtime visualisation of design actions on the target device
Plug-and-design: Embracing mobile devices as part of the design environment
Given the large number of mobile devices that continue to appear on the consumer market, mobile user interface design is becoming increasingly important. The major issue with many existing mobile user interface design approaches is the time and effort needed to deploy a user interface design to the target device. To address this issue, we propose the plug-and-design tool, which relies on a continuous multi-device mouse pointer to design user interfaces directly on the mobile target device. This shortens iteration time, since designers can continuously test and validate each design action they take. Using our approach, designers can empirically learn the particularities of a target device, which helps them create user interfaces for devices they are not familiar with.