Trajectories Through The User Experience

Speaker:  Steve D Benford – Nottingham, United Kingdom
Topic(s):  Human Computer Interaction


The idea of designing the ‘user journey’ is familiar to User Experience designers. However, the emergence of mobile interaction, followed by augmented reality and now the Internet of Things, challenges us to extend our view of such journeys. As the digital becomes ever more closely interleaved with the physical, UX has to consider the design of the physical aspects of experience and how the digital becomes embedded within them.

This lecture introduces a new approach to thinking about the design of extended user experiences that weave digital media into physical objects and places. At its heart is the idea of designing experiences in terms of three kinds of trajectory – intended trajectories that express the designer’s plan; actual trajectories that describe how participants’ experiences may diverge from and reconverge with this plan; and historic trajectories that consider how participants reflect on and recount their experiences afterwards. The lecture introduces each of these concepts in turn and shows how it can help in the design of transitions between physical and digital experiences, managing the gaps or ‘seams’ in wireless and positioning technologies, and shaping social encounters and moments of isolation.

This work on trajectories first emerged from a series of projects in which artists designed unusual games and performances that took place between players on the city streets and those online and that drew physical props and locations into a digital narrative. The lecture will draw on video documentation of these works to motivate and illustrate trajectory concepts. However, trajectories have since been adopted by more mainstream companies, including the BBC which has been using them to shape the design of future learning experiences. The lecture will also draw on these examples to illustrate the wider application of the approach.

This work was first published in a series of papers at the ACM’s annual CHI conference, two of which won best paper awards (top 1% of all submissions). A book-length treatment was published under the title ‘Performing Mixed Reality’ by MIT Press.

Versions of this lecture have previously been given at Microsoft, Google, Disney, Yahoo, and Stanford, Washington and Fudan Universities, among others, as well as through invited keynotes at the Computer Aided Learning (CAL) and Collaborative Technologies and Systems (CTS) conferences. Video documentation of the Stanford lecture is available at:

About this Lecture

Number of Slides:  30
Duration:  20 - 50 minutes
Languages Available:  English
Last Updated: 

Request this Lecture

To request this particular lecture, please complete this online form.

Request a Tour

To request a tour with this speaker, please complete this online form.

All requests will be sent to ACM headquarters for review.