Discreet Computing

Speaker:  Aaron Quigley – Sydney, Australia
Topic(s):  Human Computer Interaction


Computing and interaction are changing the nature of humanity. As individuals our capabilities can be extended, our memories augmented and our senses attuned. Societies are being reshaped by our ability to interconnect and harness the abilities of millions. Interaction is all around us and this talk offers a new vision of computing called Discreet Computing. 

Discreet Computing is intentionally unobtrusive in its design, development and use. Aspects of wearable, invisible, ambient and ubiquitous computing are key, as discreet computing is woven into the literal or figurative fabric of day-to-day life. This talk presents eight dimensions of discreet computing along with real research examples. 

Today, the nature of mobile technology leads people to hide it, make it invisible, camouflage it or demonstrate polite and discreet use (e.g. placing it face down when with others). However, commodity devices aren't well equipped to support such use, as they require obvious interaction through touch, movement or speech. Haptic and audio signals may provide subtle outputs, but inputs aren't so subtle. SpeCam supports discreet micro-interactions to avoid the many micro-distractions we face with today's mobile devices. It leverages a natural use of mobile devices today, namely placing them face down on flat surfaces. People may place their phones face down on a surface to signal their intent to engage socially with those around them, by limiting their access to distractions, external entertainment and self-gratification. Others will keep such devices fully hidden from view in a bag or pocket. Here the material sensing technique uses only the front-facing camera, with the display acting as a multi-spectral light source. Studies show that it can recognise colours 10 degrees apart in HSB space, and 30 types of surface material, with 99% accuracy. 
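The idea can be sketched as follows: the display flashes a sequence of colours, the camera records the reflected intensity for each flash, and the resulting response vector is matched against known material profiles. This is an illustrative sketch only; the profile values, colour sequence and nearest-neighbour matching below are assumptions, not SpeCam's published implementation.

```python
import math

# Illustrative per-material reflectance profiles: average reflected
# intensity under (red, green, blue, white) display flashes.
# These numbers are invented for the sketch, not measured data.
MATERIAL_PROFILES = {
    "wood":   [0.62, 0.48, 0.30, 0.55],
    "fabric": [0.35, 0.36, 0.34, 0.40],
    "metal":  [0.80, 0.82, 0.85, 0.90],
}

def classify_material(response, profiles=MATERIAL_PROFILES):
    """Return the material whose profile is nearest (Euclidean) to response."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(profiles, key=lambda m: dist(response, profiles[m]))

# A shiny, uniformly bright response vector matches the metal profile.
print(classify_material([0.78, 0.80, 0.88, 0.92]))
```

In practice the published work uses a trained classifier over richer camera features; the nearest-neighbour matcher here simply makes the sensing principle concrete.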

Or we might consider discreet gesture interaction using EOG sensors in smart eyewear. This sensing technique can detect finger movements on the nose using EOG sensors embedded in the frame of a pair of eyeglasses. Eyeglass wearers can use their fingers to exert different types of movement on the nose, such as flicking, pushing or rubbing. These subtle gestures can be used to control a wearable computer without calling attention to the user in public. 
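To make the distinction between such gestures concrete, here is a toy sketch: a flick tends to produce a single sharp transient in the EOG signal, while rubbing produces a sustained oscillation. The features and thresholds below are invented for illustration and are not the method used in the actual eyewear research.

```python
# Toy classifier for nose gestures from a single EOG channel.
# Assumption: a flick shows one sharp peak; a rub shows repeated
# sign changes. Thresholds are illustrative, not calibrated.

def sign_changes(samples):
    """Count how often consecutive samples cross zero."""
    return sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)

def classify_gesture(samples, peak_thresh=0.5, osc_thresh=4):
    peak = max(abs(s) for s in samples)
    if peak < peak_thresh:
        return "none"          # signal too weak to be a deliberate gesture
    if sign_changes(samples) >= osc_thresh:
        return "rub"           # oscillatory: finger rubbing back and forth
    return "flick"             # single transient

flick = [0.0, 0.1, 0.9, 0.2, 0.0, 0.0]
rub = [0.0, 0.6, -0.5, 0.7, -0.6, 0.5, -0.4, 0.0]
print(classify_gesture(flick), classify_gesture(rub))
```

A real system would use a trained classifier over windowed multi-channel features, but the sketch captures why these gestures are separable at all.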

This talk considers the question of what "discreet computing" is, and what research and development challenges in context awareness will allow us to afford subtle, discreet, unobtrusive and seamless interactions. 

About this Lecture

Number of Slides:  35
Duration:  40 minutes
Languages Available:  English
Last Updated: 

Request this Lecture

To request this particular lecture, please complete this online form.

Request a Tour

To request a tour with this speaker, please complete this online form.

All requests will be sent to ACM headquarters for review.