Hidden in plain sight: An exploration of a visual language for near-eye out-of-focus displays in the peripheral view, in Jofish Kaye, Allison Druin, Cliff Lampe, Dan Morris, & Juan Pablo Hourcade (eds.), Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, May 7-12, 2016, 487-497 (ACM).


In this paper, we set out to determine what constitutes an appropriate visual language for information presented on near-eye out-of-focus displays. These displays are positioned in a user's peripheral view, very close to the eyes, for example on the inside of the temples of a pair of glasses. We explored the usable display area, the role of spatial and retinal variables, and the influence of motion and interaction on such a language. Our findings show that a usable visual language can be achieved by limiting the set of possible shapes and by making clever use of orientation and meaningful motion. We found that motion in particular is very important for improving the perception and comprehension of what is displayed on near-eye out-of-focus displays, and that perception improves further when direct interaction with the content is allowed.
