Posts tagged: HCI

Study and analysis of collaborative design practices

Current, highly connected digital design tools offer a wide range of possibilities for both co-located and remote collaborative design activities. However, measured against the conventional collaborative design practices we identified together with practitioners and design companies, these tools lack integrated and comprehensive support for the ideation phase. We therefore propose a reference framework with solutions for supporting collaboration among professional designers with digital tools in the early stages of design.
Proxemic flow: Dynamic peripheral floor visualizations for revealing and mediating large surface interactions
PaperPulse: An integrated approach to fabricating interactive paper
PaperPulse: An integrated approach for embedding electronics in paper designs
We present PaperPulse, a design and fabrication approach that enables designers without a technical background to produce standalone interactive paper artifacts by augmenting them with electronics. With PaperPulse, designers overlay pre-designed visual elements with widgets available in our design tool. PaperPulse provides designers with three families of widgets designed for smooth integration with paper, for a total of 20 different interactive components. We also contribute a logic demonstration and recording approach, Pulsation, that allows designers to specify functional relationships between widgets. Using the final design and the recorded Pulsation logic, PaperPulse generates layered electronic circuit designs, along with code that can be deployed on a microcontroller. By following automatically generated assembly instructions, designers can seamlessly integrate the microcontroller and widgets into the final paper artifact.
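To illustrate the demonstrate-and-record idea behind Pulsation, here is a minimal sketch of how recorded input-to-output relationships between widgets might be stored and replayed. All class, widget, and action names are hypothetical; the actual PaperPulse tool generates circuit designs and microcontroller code, which this Python sketch does not attempt to reproduce.

```python
class PulsationLogic:
    """Hypothetical record-and-replay store: each demonstrated input
    event maps to the output actions demonstrated right after it."""

    def __init__(self):
        self._rules = {}        # (widget, event) -> list of (widget, action)
        self._recording = None  # input event currently being demonstrated

    def demonstrate_input(self, widget, event):
        # The designer triggers an input widget while recording.
        self._recording = (widget, event)
        self._rules.setdefault(self._recording, [])

    def demonstrate_output(self, widget, action):
        # The designer then activates the output that should respond.
        if self._recording is None:
            raise RuntimeError("demonstrate an input first")
        self._rules[self._recording].append((widget, action))

    def fire(self, widget, event):
        # At runtime, an incoming input event replays the recorded outputs.
        return self._rules.get((widget, event), [])


logic = PulsationLogic()
logic.demonstrate_input("button_1", "press")
logic.demonstrate_output("led_3", "blink")
logic.demonstrate_output("buzzer_1", "beep")

print(logic.fire("button_1", "press"))
# -> [('led_3', 'blink'), ('buzzer_1', 'beep')]
```

In the real system, a table like this would be compiled into microcontroller code rather than interpreted at runtime.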
Helaba: A system to highlight design rationale in collaborative design processes
Design activities associated with the ideation phase of design processes require mutual understanding and clear communication grounded in artefacts. However, this is often a challenge for remote and multidisciplinary teams due to the lack of ad hoc tools for this purpose. Our approach addresses these limitations by explicitly connecting pieces of information related to design rationale, feedback, and evolution with the artefacts that are the subject of communication. We propose Helaba, a system that creates a shared workspace to support communication revolving around design artefacts and activities within multidisciplinary teams. Helaba supports design communication and rationale, and potentially leads to more satisfying outcomes from the design process.
Hasselt UIMS: A tool for describing multimodal interactions with composite events
Gestu-Wan: An intelligible mid-air gesture guidance system for walk-up-and-use displays
We present Gestu-Wan, an intelligible gesture guidance system designed to support mid-air gesture-based interaction for walk-up-and-use displays. Although gesture-based interfaces have become more prevalent, there is currently very little uniformity with regard to gesture sets and the way gestures can be executed. This leads to confusion, poor user experiences, and users who avoid rather than engage in interaction using mid-air gestures. Our approach improves the visibility of gesture-based interfaces and facilitates the execution of mid-air gestures without prior training. We compare Gestu-Wan with a static gesture guide, showing that it helps users both perform complex gestures and understand how the gesture recognizer works.
Empirical study: Comparing Hasselt with C# to describe multimodal dialogs
Previous research has proposed guidelines for creating domain-specific languages for modeling human-machine multimodal dialogs. One of these guidelines suggests the use of multiple levels of abstraction so that the descriptions of multimodal events can be separated from the human-machine dialog model. In line with this guideline, we implemented Hasselt, a domain-specific language that combines textual and visual models, each aimed at describing a different aspect of the intended dialog system. We conducted a user study to measure whether the proposed language provides benefits over equivalent event-callback code. During the user study, participants had to modify the Hasselt models and the equivalent C# code. The completion times obtained for C# were on average shorter, although the difference was not statistically significant. Subjective responses were collected using standardized questionnaires and an interview, both of which indicated that participants saw value in the proposed models. We provide possible explanations for the results and discuss some lessons learned regarding the design of the empirical study.
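To make the comparison concrete, here is an illustrative sketch of the event-callback style the study used as a baseline: a composite "put that there" command that fuses one speech event with two pointing events. The study's baseline was written in C#; this Python sketch only mirrors the pattern, and all names are hypothetical. The point is that in callback style, the dialog state (which events have arrived so far) must be tracked by hand across handlers, which is exactly what a multi-level DSL like Hasselt aims to factor out.

```python
class PutThatThereDialog:
    """Hand-rolled callback fusion of one speech event and two pointing
    events into a single composite 'move' command."""

    def __init__(self):
        self.pending_speech = None  # dialog state tracked manually
        self.points = []
        self.executed = []

    def on_speech(self, utterance):
        # Speech callback: arm the composite command.
        if utterance == "put that there":
            self.pending_speech = utterance
            self.points = []

    def on_point(self, x, y):
        # Pointing callback: only meaningful while a command is pending.
        if self.pending_speech is None:
            return
        self.points.append((x, y))
        if len(self.points) == 2:  # composite event is now complete
            src, dst = self.points
            self.executed.append(("move", src, dst))
            self.pending_speech = None


dialog = PutThatThereDialog()
dialog.on_speech("put that there")
dialog.on_point(1, 2)   # "that"
dialog.on_point(5, 6)   # "there"
print(dialog.executed)
# -> [('move', (1, 2), (5, 6))]
```

Even this small example shows how the event-fusion logic and the dialog model end up interleaved in callback code, which motivates separating them into distinct levels of abstraction.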