incidental interaction           

where actions performed for some other purpose
or unconscious signs
are interpreted in order to influence/improve/facilitate
the actors' future interaction or day-to-day life
 
Alan Dix
Lancaster University, vfridge and aQtive
alan@hcibook.com
www.hcibook.com/alan/


Traditional human-computer interfaces are designed to be purposeful - the user wants to do something and uses the computer to do it. This is even true of games - the player wants to fly the aircraft or make Lara jump and so presses the relevant buttons.

In many experimental and proposed systems the interaction is far less direct. A person walks into a room and the smart house, sensing her presence, adjusts the air conditioning and lighting to her normal preference. The person doesn't intend to dim the light or turn up the heat; it just happens because she entered the room.

One aspect of this is the richer set of sensors in ubiquitous computing and mobile applications: sensing where objects and people are, perhaps what they say, maybe even their physiological state. But that is not the whole story.

Tangible computing also makes use of the sensors in the environment, allowing the user to control virtual entities as they manipulate physical artefacts. However, the focus in tangible computing is again purposeful - the user moves the block representing a house because he wants the house in the virtual plan to move also.

In the smart house this is different - the occupant's purpose is to go into the room and incidentally, as a side-effect, the light and heating change ... this is incidental interaction.

everywhere ...

When we look for incidental interaction we begin to see examples everywhere, both in existing systems and in proposed or experimental systems:

  • car lights that go on when the door is opened
  • room lights that go on and stay on so long as there is movement
  • auto-flush toilets
  • mediacup as a sensor
  • bio-sensors used for dynamic function allocation
  • active-badges (e.g. Xerox Pepys project's automatic diaries)

We can also see examples that are purely within the computer domain (that is, where the sensed activity is purely electronic):

  • adaptive interfaces
  • automatic 'help' systems such as the Microsoft paper clip!
  • other forms of 'auto-completion' or automatic macro creation interfaces (e.g. Eager)
  • e-shopping systems that recommend alternative purchases based on your previous shopping basket (e.g. Amazon)

Having looked at these I also realised that onCue, on which I worked for some time, takes advantage of exactly this kind of incidental interaction - it watches the clipboard and, when the user cuts or copies anything, it analyses the type of data and adjusts its toolbar to suggest potential things to do with the copied data.
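
To make this concrete, a rough sketch of this kind of clipboard watching might look like the following (Java, with a polling loop and made-up recogniser rules and actions - an illustration of the idea rather than onCue's actual implementation):

    import java.awt.Toolkit;
    import java.awt.datatransfer.Clipboard;
    import java.awt.datatransfer.DataFlavor;
    import java.util.List;

    // Hypothetical onCue-style watcher: poll the system clipboard, classify any
    // newly copied text and print the actions an intelligent toolbar might offer.
    public class ClipboardWatcher {
        public static void main(String[] args) throws Exception {
            Clipboard clipboard = Toolkit.getDefaultToolkit().getSystemClipboard();
            String last = null;
            while (true) {
                Thread.sleep(1000);                              // poll once a second
                if (!clipboard.isDataFlavorAvailable(DataFlavor.stringFlavor)) continue;
                String text = (String) clipboard.getData(DataFlavor.stringFlavor);
                if (text == null || text.equals(last)) continue; // nothing new was copied
                last = text;
                System.out.println("suggested actions: " + suggest(text.trim()));
            }
        }

        // Crude recognisers: a UK-style post code, a table of numbers, or plain text.
        static List<String> suggest(String text) {
            if (text.matches("[A-Z]{1,2}[0-9][A-Z0-9]? ?[0-9][A-Z]{2}"))
                return List.of("show on map", "find nearby services");
            if (text.matches("[-0-9.,\\s]+"))
                return List.of("graph it", "open in spreadsheet");
            return List.of("web search", "dictionary lookup");
        }
    }

The user's purposeful act is the copy; the watcher's response is entirely incidental to it.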

a definition ...

incidental interaction
where actions performed for some other purpose or unconscious signs
are interpreted in order to influence/improve/facilitate
the actors' future interaction or day-to-day life

Note the parts of this definition.

First "for some other purpose" distinguishing incidental interaction from purposeful interaction such as switching on a light, or selecting an option from an on-screen menu. This does not mean that the actor is unaware that the action may have secondary effects, but it is not why the action is performed.

Second "or unconscious signs" is to include physiological signs such as body temperature, unconscious reactions such as blink rate, or unconscious aspects of activities such as typing rate, vocabulary shifts (e.g. modal verbs). The word 'unconscious' emphasises it is still a (human) actor or group of actors who are being directly or indirectly sensed, in contrast, say, to a thermostat which is responding to purely environmental conditions (although using room temperature to detect number of participants would be included).

Note also the use of the plural for both 'actions' and 'signs'. This is because many forms of incidental interaction will involve time series and sensor fusion - using many actions or environmental readings over a period of time to generate a model of user activity.
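
For illustration, a sketch of such a fused, time-series model (the window length and threshold are arbitrary tuning constants, not values from any real system):

    import java.time.Duration;
    import java.time.Instant;
    import java.util.ArrayDeque;
    import java.util.Deque;

    // Sliding-window activity model: many small sensed events over time are
    // fused into one judgement ("is the user active?") rather than reacting
    // to any single reading.
    class ActivityModel {
        private static final Duration WINDOW = Duration.ofMinutes(5);  // illustrative
        private static final int THRESHOLD = 3;                        // illustrative

        private final Deque<Instant> events = new ArrayDeque<>();

        // Called whenever any sensor fires: movement, a keystroke, a cup being lifted...
        void sense(Instant when) {
            events.addLast(when);
            Instant cutoff = when.minus(WINDOW);
            while (!events.isEmpty() && events.peekFirst().isBefore(cutoff))
                events.removeFirst();            // forget readings outside the window
        }

        // The fused, time-smoothed judgement of user activity.
        boolean active() {
            return events.size() >= THRESHOLD;
        }
    }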

Third "are interpreted in order to ..." distinguishing incidental interaction from undesigned influences. For example, if you repeatedly take the same route through a wood, you will wear down a path through the undergrowth, which will then make the journey easier. Of course, noticing how the environment unintentionally and accidentally moulds itself to us can be and has been a fruitful inspiration for designed systems. Similarly the computer environment may have unintentional interactions, for example, in a small network you may be aware if one of your colleagues is transferring a large file as it slows down the network.

The fourth part, "to influence/improve/facilitate", is a little problematic. For example, a criminal under a curfew order may have a sensor fitted to a bracelet that administers a small electric shock if it detects infringements. Hence the use of the inclusive word 'influence'. However, on the assumption that most uses will not be so coercive, the definition also includes alternative, more benign wording!

Fifth "the actors' ..." again focusing on human actors and the effects on them. Note that this is phrased in the plural so as to allow either a single actor, or where, say, a system that notices that a group of people are doing something (perhaps just being together) and reacts accordingly. However, very significant is that it is the actors' own lives/interactions that are being affected. This is to distinguish incidental interaction from surveillance or other forms of monitoring (e.g. security video, recording transaction details for data mining).

    [figure: intention (purposeful/accidental) x who it affects (actor/someone else)]

                                intention
                          purposeful         accidental

    who it     actor      purposeful         incidental
    affects               interaction        interaction

               someone    control /          monitoring /
               else       messaging          surveillance

In the case of a group, the effect may be on the group as a whole (e.g. the system detects a particular project group in the meeting room and the wall screen opens the group workspace) or may be more inter-personal (e.g. one member's mediacup movements mean that another member of the group gets a 'person X is in the office' message). There is of course no hard boundary between these inter-personal incidental interactions and surveillance, except that the former tend to be symmetric and indirectly impinge back on the primary actor through the enhanced group interaction. We will return later to the relationship between incidental interaction and awareness.

Finally, the last clause says "future interaction or day-to-day life". This is to include things like menus that change their defaults depending on assumed tasks (future interaction) and physical things such as the room temperature control (day-to-day life). Although not emphasised earlier, this is also true of the actions and signs being sensed and interpreted. Valid sensors for incidental interaction may include:

  • watching the user's own computer interactions (e.g. screen saver cutting in after inactivity),
  • watching the system state which has been affected by the user (e.g. keeping a diary of altered files),
  • watching the user's own body (e.g. bio-sensors or recognition of body gestures), or
  • watching the environment that has been affected by the user (e.g. fridge light comes on when the door is opened).

awareness and ambience ... incidental perception

Issues of awareness and also ambient interfaces obviously share a lot with incidental interaction. Whereas incidental interaction is about things happening to you incidental to your main locus of activity, awareness is about what you perceive of others incidental to your main focus of perception. Of course, gathering the information to give awareness may be explicit (e.g. launching an instant messaging window), but just as often may be implicit (e.g. mediacup), so awareness may be a result of incidental interaction.

    [figure: intention (purposeful/accidental) x modality (sensing the actor/influencing the actor)]

                                    intention
                              purposeful         accidental

    modality    sensing       conventional       incidental
                actor         input              interaction

                influencing   conventional       awareness /
                actor         output             ambient

A lot of the awareness literature is focused on other people's actions whereas ambient interfaces are typically focused on giving awareness of things in the environment (e.g. network traffic). In incidental interaction a sensor may detect the effects of your actions on the physical environment and reflect this in some way in the computer system. In ambient interfaces the system reflects some aspect of the computer system in the physical environment.

    ambient interface:       computer system state  →  physical environment of user
    incidental interaction:  physical environment of user  →  computer system state

incidental human-human interaction ...

Of course incidental interaction is not just a feature of human-computer interaction. Again and again ethnographic studies have exposed the importance of subtle unintentional interactions within workgroups. For example, in the studies of the London Underground control centre, the large size of the main display screen (showing the locations of trains) means that controllers have some awareness of where other controllers are looking on the screen. The same studies also showed the importance of overhearing. As one controller is talking on the telephone and discovering a problem, the neighbouring controller starts to act in response even before the first controller has passed on the information.

Of course, the controllers are at some level aware of this potential and may emphasise their actions, speak louder, etc. to facilitate third-party 'overhearing'. So these third-party incidental interactions have been co-opted and become, at least subconsciously (if this is not an oxymoron), intentional.

co-opted interactions ... stage whispers

This management of incidental interaction is not confined to human-human interactions. In a hotel room the guest is sitting quietly in bed reading a book - the lights go out - "not again" she thinks and waves her arms - the sensor detects the movement and the lights come back on. Similarly auto-flush toilets are designed to detect a person sitting and then standing up and moving away. If you want to flush the toilet deliberately, it is possible to 'fool' the sensor by moving a hand back and forth in front of it. Automatic interior car courtesy lights can also be controlled in this way. You have just got into the car and are checking your route on the map when the light goes out. You could switch the interior light on, but might simply partly open and close the car door, re-triggering the courtesy light.

We can do these things because human beings are 'natural' scientists, constantly analysing, theorising and modelling our environment, then using this knowledge to predict and control (and natural engineers as well).

In onCue we also noticed users co-opting the incidental behaviour for purposeful activity. In the early versions of onCue there was no way to address it explicitly: all interaction was incidental, through the clipboard. Enthusiastic users who wanted onCue to do something for them could not simply type a request into it. Instead they would deliberately type a word or phrase into a word-processor window - text that was not intended to be part of the document - and then copy it, knowing that onCue would react to it. In later versions of onCue we added a type-in box that would appear if someone selected the onCue window, allowing users to address it directly.

failures of co-option ...

Of course, part of our design of incidental interaction must be a recognition of model making and co-option by our users. In onCue we got this wrong initially.

If the rules are complex or non-deterministic (perhaps relying on statistical algorithms or spare resources) then there are many possibilities for confusion. At one extreme the user may not be able to understand the relationship between what they do and the effects they cause. This magic model of the interface may be the safest form of failure.

More problematic are times when the user comes to rely on incidental behaviour, either not realising it is probabilistic or unreliable, or not completely understanding the circumstances in which it operates. For example, in a Java development environment I use, the system notices when you are about to type a method name and suggests possibilities based on the class of the variable. This is very useful and I have come to rely on it, but occasionally, and annoyingly, it doesn't work. Over time I've come to understand some of the reasons, but still sometimes find myself waiting for a prompt that never appears.

Even more confusing are coincidental interactions, where two potential causes are often coincident and the user infers the wrong relationship. For example, you might assume that the lobby lights turn on because of some movement sensor when in fact they are on a timer triggered by opening office doors. This may only become apparent if you stood talking in the corridor for a long time.

These issues of magic, unreliability and coincidence are potential problems in all interfaces, but perhaps especially so in incidental interaction, because the algorithms used are more likely to be non-deterministic and also because, being incidental, the interactions are more likely to be undocumented.

building incidental interactions ...

Most ubiquitous systems, wearables and bio-sensors involve bespoke architectures. For smart homes there are bus standards, but these are still quite diverse. For closed systems involving incidental interaction, such as the hotel room lights, standardisation doesn't matter. However, if we want one application to eavesdrop on sensors intended for some other application, it is essential that architectures are open both in terms of the ability to listen in on events and also in the interpretation of events. This is true of both electronic domains and physical sensors.

Let's look first at onCue as an example in the electronic domain. When designing onCue one of the few events that we were able to reliably listen in to was copy/cut to the clipboard. This was partly because we were intending eventually to target multiple platforms and were looking for commonality across them. If we had been targeting MacOS we would have found this easier, as 'good' MacOS applications are supposed to be factored so that the user interface and back-end communicate via AppleEvents. This is partly to allow external scripting, but has the effect of making incidental interaction easier to implement. On Windows it was possible to get some interactions on an application-by-application basis using COM, but not at this level of generality.

Architectures for ubiquitous applications are still in their infancy and the need for extensible mechanisms for incidental interaction should be one of the drivers. Happily, the implementation mechanisms being used often do involve fairly open eventing mechanisms.

Of course, having an open event infrastructure within a computer is one thing, but having this within the home poses new problems of privacy and security. What if the new net-enabled toaster you have recently bought from a door-to-door salesman is surreptitiously using the open event architecture of your in-house domus-net to monitor your behaviour and transmit your comings-and-goings to the salesman's burglar brother? If that sounds far-fetched, how about the information your internet fridge sends to the supermarket about your eating preferences?

As well as security issues, open architectures could cause performance problems. Some eventing systems force all event listeners to act synchronously and serially - the model is that an event listener is given the event and can either do something or pass the event on for further processing.

In the design of onCue the Windows clipboard listener interface was particularly problematic in that the individual applications were given the link to the next application's handler and expected to pass it on. Badly written applications loaded after onCue could unintentionally 'consume' the event. The underlying model for this form of event chain comes from GUI component chains where different levels get a chance to 'consume' events before passing them on to other levels either top down (application → window → widget) or bottom up (widget → window → application).
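
The fragility of this chained model can be seen in a small sketch (an illustration of the general pattern, not Windows' actual clipboard-viewer API): each listener is handed a link to the next and trusted to forward the event, so one badly written handler silences everything registered before it.

    // Each listener holds the link to the next handler in the chain and is
    // trusted to pass events along; nothing enforces that it actually does.
    abstract class ChainedListener {
        protected final ChainedListener next;
        ChainedListener(ChainedListener next) { this.next = next; }
        abstract void handle(String event);
    }

    class WellBehaved extends ChainedListener {
        WellBehaved(ChainedListener next) { super(next); }
        void handle(String event) {
            System.out.println("saw: " + event);
            if (next != null) next.handle(event);   // forwards the event as expected
        }
    }

    class BadlyWritten extends ChainedListener {
        BadlyWritten(ChainedListener next) { super(next); }
        void handle(String event) {
            // forgets to call next.handle(event): every listener registered earlier
            // in the chain (such as onCue) silently stops receiving events
        }
    }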

This sounds as though it would only be an issue for within-PC event notification systems, but in fact distributed event systems sometimes 'inherit' this model. This means that before an event can produce its intended effect the system has to pass the event to any remote agents that have registered an interest in it.

It is clear that event architectures allowing incidental interaction should always allow asynchronous event listening - "tell me sometime after". Furthermore, the rate of event notification should ideally be more flexible. In GtK (Getting-to-Know), an experimental notification server, Devina Ramduny-Ellis and I have experimented with pace impedance - allowing applications to select rates of event notification appropriate for the user's tasks. For example, a background window may elect to be sent events bundled every 15 seconds rather than exactly as they happen. However, in the existing GtK, this rate is determined by the application registering interest.

For incidental interaction it would be ideal if the listening application could also register weak interests - "tell me when you have time" - allowing the notification server to prioritise event notification central to the users' purposeful interaction, queuing up incidental event notifications until a gap in activity. At the extreme, listeners should be able to register "tell me if you have time", allowing the notification service to drop events altogether or flush pending event queues when they get too full.
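
A rough sketch of such an interface (an assumed design in the spirit of these ideas, not GtK's actual API): listeners register one of three strengths of interest, weak notifications are queued until a lull in purposeful activity, and the weakest may simply be dropped when the queue grows too long.

    import java.util.ArrayDeque;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Queue;
    import java.util.function.Consumer;

    // Notification service with graded interests: IMMEDIATE listeners are told at
    // once, WHEN_IDLE listeners are queued until flush() (standing in for a gap in
    // the user's purposeful activity), and IF_TIME notifications are dropped
    // whenever the pending queue is already full.
    class NotificationService {
        enum Interest { IMMEDIATE, WHEN_IDLE, IF_TIME }

        private static final int MAX_PENDING = 100;   // illustrative limit

        private static class Registration {
            final Interest interest;
            final Consumer<String> listener;
            Registration(Interest interest, Consumer<String> listener) {
                this.interest = interest;
                this.listener = listener;
            }
        }

        private final List<Registration> registrations = new ArrayList<>();
        private final Queue<Runnable> pending = new ArrayDeque<>();

        void register(Interest interest, Consumer<String> listener) {
            registrations.add(new Registration(interest, listener));
        }

        void publish(String event) {
            for (Registration r : registrations) {
                switch (r.interest) {
                    case IMMEDIATE:
                        r.listener.accept(event);                    // central to the user's task
                        break;
                    case WHEN_IDLE:
                        pending.add(() -> r.listener.accept(event)); // "tell me when you have time"
                        break;
                    case IF_TIME:
                        if (pending.size() < MAX_PENDING)            // "tell me if you have time"
                            pending.add(() -> r.listener.accept(event));
                        break;
                }
            }
        }

        // Called when the system detects a lull in purposeful interaction.
        void flush() {
            while (!pending.isEmpty()) pending.remove().run();
        }
    }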

Writing on incidental interaction:

A. Dix (2002). incidental interaction
working paper (PDF, 328K)

G. Baxter, A. Dix, A. Monk, A. Schmidt and N. Streitz (2005).
Trust and Incidental Interaction: Would You Let a Talking Paper Clip Run YOUR Home? In Proceedings of the INTERACT '05: Communicating Naturally through Computers (Adjunct Proceedings), Bueno, F., Constabile, M.F., Paterno, F. & Santoro, C. (eds.), pp. 73-74
abstract and Alan's slides

A. Dix (2002). beyond intention - pushing boundaries with incidental interaction. Proceedings of Building Bridges: Interdisciplinary Context-Sensitive Computing, Glasgow University, 9 Sept 2002
abstract

A. Dix (2002). Managing the Ecology of Interaction. Proceedings of Tamodia 2002 - First International Workshop on Task Models and User Interface Design, Bucharest, Romania, 18-19 July 2002
abstract, paper and slides

Also discussed in Chapter 18 of A. Dix, J. Finlay, G. D. Abowd and R. Beale (2004) Human-Computer Interaction, third edition. Prentice Hall.

examples ...

courtesy lights

In many cars the interior light turns on triggered by various events: the opening of the doors, unlocking the car, stopping the engine. These car-related events are chosen to correspond with the times the interior light is likely to be required - whilst getting in or out of the car. Other car-related events may turn the lights off: locking the car, starting the engine.

The lights are also typically turned off by some timer - I think to stop the battery running down. Note that this is because the sensors that are available (doors opening) are only loosely correlated with the actual desired status (people in car) - it would be possible to open the door, then not get in, and the lights would stay on overnight.

A more sophisticated sensor, perhaps tied into the car alarm's infra-red sensor, could turn the lights on only when, and for as long as, people are in the car and it is not actually in motion.
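
Something like the following captures the rule set (idealised logic, not any particular car's): door and engine events switch the light, and a timeout compensates for the loose coupling between the available sensors and the state we actually care about.

    // Idealised courtesy-light controller: car events turn the light on or off,
    // and a timeout covers the case where a door event was only loosely
    // correlated with anyone actually being in the car.
    class CourtesyLight {
        private static final long TIMEOUT_MS = 60_000;   // illustrative: one minute
        private boolean on = false;
        private long onSince = 0;

        void doorOpened(long now)    { switchOn(now); }
        void carUnlocked(long now)   { switchOn(now); }
        void engineStopped(long now) { switchOn(now); }

        void engineStarted() { on = false; }
        void carLocked()     { on = false; }

        // Called periodically: turn the light off again if nothing has re-triggered it.
        void tick(long now) {
            if (on && now - onSince > TIMEOUT_MS) on = false;
        }

        private void switchOn(long now) { on = true; onSince = now; }

        boolean isOn() { return on; }
    }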

The driver's purpose in opening the door is to get in the car; incidentally the lights go on making it easier to get settled.

onCue

aQtive onCue is a sort of 'intelligent toolbar': it sits at the side of the screen and 'watches' the clipboard. Whenever anything is cut or copied to the clipboard onCue looks at it and tries to recognise what kind of thing it is: ordinary text, a table of numbers, a post code (zip code), a person's name. Depending on the type of the object onCue adds icons to its toolbar window that suggest things you can do with the text in the clipboard: for example, search engines for plain text, graphing tools or a spreadsheet for a table, mapping tools for post codes, directory services for names.

The user's purpose in copying the data is to paste it somewhere else; incidentally onCue offers alternative things to do with it.

onCue how-it-works:
http://www.aqtive.net/community/developers/developers-pack/onCue-hiw/

aQtive: http://www.aqtive.net/

mediacup

The mediacup is an ordinary coffee mug, but with an added electronic base unit. In the base unit are temperature sensors, switches to detect tilt and movement, and a small infra-red unit broadcasting the cup's state every few seconds. Infra-red receivers around the office building pick up the signals and then interpret the measurements as indications of user activity and location. This can then be used to give both explicit information and general awareness information to colleagues.
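
For illustration, the cup's periodic broadcast and its interpretation might look something like the sketch below (an assumed message shape, not the real mediacup protocol):

    // Assumed shape of the cup's periodic state broadcast and one possible
    // interpretation of it as an awareness cue for colleagues.
    class CupState {
        final String owner;          // whose cup this is
        final double temperature;    // degrees Celsius, from the base-unit sensor
        final boolean tilted;        // cup currently being drunk from
        final boolean moving;        // cup currently being carried
        final String receiverRoom;   // which infra-red receiver heard the broadcast

        CupState(String owner, double temperature, boolean tilted,
                 boolean moving, String receiverRoom) {
            this.owner = owner;
            this.temperature = temperature;
            this.tilted = tilted;
            this.moving = moving;
            this.receiverRoom = receiverRoom;
        }

        // Turn the raw readings into the kind of cue a colleague might see.
        String interpret() {
            if (moving) return owner + " is on the move near " + receiverRoom;
            if (tilted) return owner + " is having a drink in " + receiverRoom;
            if (temperature > 40) return owner + " has a fresh coffee in " + receiverRoom;
            return owner + "'s cup is idle in " + receiverRoom;
        }
    }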

Hans's purpose in filling and lifting the cup is to drink some coffee; incidentally his colleagues become aware that he is taking a mid-morning break.

mediacup@teco - http://mediacup.teco.edu/

shopping cart

As you move around the Amazon site and look at books, perhaps choosing to buy some, the web site records your choices and behaviour. This is used partly to build a model of overall user behaviour and preferences (people who like book X also like book Y). It is also used to suggest other books you might like based partly on special promotions and partly on previous users' observed behaviour.
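
The "people who like book X also like book Y" part can be sketched as simple co-occurrence counting over past baskets (a toy version of the idea, not Amazon's actual algorithm):

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Toy co-occurrence recommender: count how often pairs of items appear in the
    // same basket, then suggest the items most often bought alongside a given one.
    class Recommender {
        private final Map<String, Map<String, Integer>> coCounts = new HashMap<>();

        // Incidentally observed: the contents of one completed shopping basket.
        void recordBasket(List<String> basket) {
            for (String a : basket)
                for (String b : basket)
                    if (!a.equals(b))
                        coCounts.computeIfAbsent(a, k -> new HashMap<>())
                                .merge(b, 1, Integer::sum);
        }

        // Items most often bought together with 'item', most frequent first.
        List<String> recommend(String item, int howMany) {
            List<Map.Entry<String, Integer>> entries =
                new ArrayList<>(coCounts.getOrDefault(item, Map.of()).entrySet());
            entries.sort((a, b) -> b.getValue() - a.getValue());
            List<String> result = new ArrayList<>();
            for (Map.Entry<String, Integer> e : entries) {
                if (result.size() >= howMany) break;
                result.add(e.getKey());
            }
            return result;
        }
    }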

My purpose in navigating the site is to find Gregory's 'Geographical Imaginations'; incidentally Soja's 'Post-modern Geography' is suggested to me (and I buy it!).

Xerox Pepys

In Xerox's Cambridge laboratories a few years ago, everyone was issued with 'active badges'. These used small infra-red transmitters to broadcast their location to receivers throughout the office building. At the end of each day the location data was analysed to produce personalised diaries for each person. The system knew about the office layout so it could say "went to Paul's office", but could also use the fact that, say, several people were in a room together to say "had meeting with Allan and Victoria".
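
A toy version of that diary generation (assumed data and wording, not the Pepys system itself): badge sightings are grouped by room and hour, and a slot becomes a 'meeting' entry when several people overlap in the same room.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Toy diary builder: each sighting says which badge was seen in which room
    // during which hour; co-located badges in the same hour become a meeting entry.
    class DiaryBuilder {
        record Sighting(String person, String room, int hour) {}

        static List<String> diaryFor(String person, List<Sighting> sightings) {
            // Group everyone seen in each (room, hour) slot.
            Map<String, List<String>> slots = new HashMap<>();
            for (Sighting s : sightings)
                slots.computeIfAbsent(s.room() + " at " + s.hour() + ":00", k -> new ArrayList<>())
                     .add(s.person());

            List<String> diary = new ArrayList<>();              // entries in no particular order
            for (Map.Entry<String, List<String>> slot : slots.entrySet()) {
                List<String> people = slot.getValue();
                if (!people.contains(person)) continue;          // not this person's diary
                List<String> others = new ArrayList<>(people);
                others.remove(person);
                if (others.isEmpty())
                    diary.add("worked alone in " + slot.getKey());
                else if (others.size() == 1)
                    diary.add("went to see " + others.get(0) + ", " + slot.getKey());
                else
                    diary.add("had meeting with " + String.join(" and ", others) + ", " + slot.getKey());
            }
            return diary;
        }
    }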

Victoria's purpose in walking round the building is to visit Paul's office; incidentally a diary is produced for them both, recording the meeting.

active badge - http://www.uk.research.att.com/ab.html

Pepys project XRCE - http://www.xrce.xerox.com/programs/mds/past-projects/pepys.html


http://www.hcibook.com/alan/topics/incidental/ Alan Dix 20/3/2002