Interactions with computing systems and conversational services such as ChatGPT have become an inherent part of our daily lives. It is surprising that user interfaces, the gateways through which we communicate with an interactive intelligent system, are still predominantly devoid of hedonic aspects. There is little attempt to make communication through user interfaces intentionally more like communication with humans. Anthropomorphic user interfaces expose human-like attributes that enable people to perceive, connect, and interact with the interfaces as social actors, and can thereby transform interactions with intelligent software into more pleasant experiences. This integration of human-like aspects not only enhances user experience but also holds the potential to make interfaces more sustainable: because they rely on familiar human interaction patterns, they can reduce the learning curve and increase user adoption rates. However, there is little consensus on how to build these anthropomorphic user interfaces. We conducted an extensive literature review on existing anthropomorphic user interfaces for software systems (past), in order to map and connect existing definitions and interpretations in an overarching taxonomy (present). The taxonomy is used to organize and structure examples of anthropomorphic user interfaces into an accessible collection. The taxonomy and an accompanying web tool provide designers with a reference framework for analyzing and dissecting existing anthropomorphic user interfaces, and for designing new ones (future).
Posts tagged: Accessibility
Impact of situational impairment on interaction with wearable displays
The number of wearable devices we carry keeps increasing, with smaller companion devices such as smartwatches providing quick access for simple tasks. These devices are, however, not necessarily in the user's direct sight, and during everyday activities it is unlikely, even undesirable, that the user constantly focuses on or interacts with these screens. Furthermore, interaction is often limited because our hands are occupied carrying or holding items such as bags, papers, boxes, or tools. In this paper, we evaluate how encumbrance affects, among other factors, the time it takes to perceive and react to a notification depending on the placement of the companion device. Our experimental results can assist designers in choosing the right device for the task.
Runtime personalization of multi-device user interfaces: Enhanced accessibility for media consumption in heterogeneous environments by user interface adaptation
The diversity of end-user devices, combined with a growing user base, poses important challenges for providing easy access to the huge amount of content and services currently available. Each device has its typical set of capabilities and characteristics that must be taken into account to create an appropriate user interface providing interactive access to multimedia data and services. Furthermore, end-users also have their own specific requirements that influence the accessibility of data and services for individual access. The approach we present in this paper is geared towards the idea of universal access to interactive multimedia data and services for everyone, independent of user characteristics or end-user device capabilities. For this purpose, we combine user and device models with high-level user interface description languages in order to decouple the interface presentation from its platform, and to generate the most suitable interface on a per-user, per-device basis, making use of the semantics provided by the user and device profiles.
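The per-user, per-device generation step described above can be illustrated with a minimal sketch. The profile fields below (screen width, touch support, font scale, audio preference) are hypothetical stand-ins for the paper's user and device models, not its actual schema:

```python
# Illustrative sketch of per-user, per-device UI generation from profiles.
# Field names are assumptions for illustration, not the paper's model.

from dataclasses import dataclass

@dataclass
class DeviceProfile:
    screen_width_px: int
    has_touch: bool

@dataclass
class UserProfile:
    font_scale: float      # e.g. 1.5 for a low-vision user
    prefers_audio: bool

def generate_ui(device: DeviceProfile, user: UserProfile) -> dict:
    """Map abstract UI intent onto concrete presentation choices."""
    ui = {
        "layout": "single-column" if device.screen_width_px < 600 else "grid",
        "font_size_pt": round(12 * user.font_scale),
        "input": "touch" if device.has_touch else "keys",
    }
    # Add a text-to-speech rendering when the user profile asks for audio.
    ui["output"] = "text+tts" if user.prefers_audio else "text"
    return ui

phone = DeviceProfile(screen_width_px=390, has_touch=True)
low_vision_user = UserProfile(font_scale=1.5, prefers_audio=True)
print(generate_ui(phone, low_vision_user))
```

The key design point is that the same abstract interface description is kept separate from the concrete presentation, which is derived only at generation time from the two profiles.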
Profile-aware multi-device interfaces: An MPEG-21-based approach for accessible user interfaces
The wide diversity of consumer devices has led to new methodologies and techniques to make digital content available on a broad range of devices with minimal effort. In particular, the design of the interactive parts of a system has been the subject of much research, because these parts are the most visible and are critical for the usability (and thus use) of a system. What is missing in many current approaches is the ability to combine these new methodologies and techniques with a user-centric approach, ensuring that the preferences and requirements of a specific user are taken into account in addition to the device adaptations. In this paper we analysed the applicability of MPEG-21, Part 7: Digital Item Adaptation, for adapting a user interface to user characteristics. We show how the high-level XML-based user interface description language UIML, in combination with an MPEG-21-based user profile, enables designers to create accessible and personalised multi-device user interfaces. This combination results in user interfaces that can be deployed on a broad range of devices while taking user preferences into account with minimal effort. The approach enhances accessibility to digital items on various platforms, since all interactions with digital items should be supported by an appropriate user interface.
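The profile-driven adaptation idea can be sketched as reading user characteristics from an XML profile and turning them into adaptation hints for the interface. The element names below are simplified, illustrative stand-ins inspired by MPEG-21 DIA usage-environment concepts, not the exact DIA schema:

```python
# Sketch: derive UI adaptation hints from an XML user profile.
# Element names are simplified assumptions, not the real MPEG-21 DIA schema.

import xml.etree.ElementTree as ET

PROFILE_XML = """
<UserCharacteristics>
  <Display><ColorPreference>high-contrast</ColorPreference></Display>
  <Auditory><VolumeBoost>true</VolumeBoost></Auditory>
</UserCharacteristics>
"""

def adaptation_hints(xml_text: str) -> dict:
    """Extract presentation hints that a UI generator could apply."""
    root = ET.fromstring(xml_text)
    hints = {}
    color = root.findtext("Display/ColorPreference")
    if color:
        hints["theme"] = color
    if root.findtext("Auditory/VolumeBoost") == "true":
        hints["audio_gain_db"] = 6   # assumed boost level for illustration
    return hints

print(adaptation_hints(PROFILE_XML))
```

In the approach described above, such hints would then parameterize the rendering of a UIML interface description, so the same abstract interface yields different concrete presentations for different users.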