1.1 The scope of interaction design
A staggering variety of interactive products – devices, software and services that support user activities with computing technology – have become embedded in everyday life, enabling all kinds of activities and experiences anywhere and anytime. These interactive products might include computer applications, websites, heating controllers, smart watches, bio-sensing garments, satellite navigation systems, interactive books, social media, computer games, digital hearing aids, advanced driver assistance systems, healthcare technology such as drug delivery systems, mobile applications, web services and many more.
As computing technology has developed, the nature of users’ interaction with technology has changed – and the role of interaction design has expanded accordingly. When computers became small and cheap enough to enter the general market, they became ‘personal computers’, such as the Apple II (Figure 1). Because of their size and weight, these computers sat on desktops. Users interacted with them via a keyboard and a small, low-resolution display that showed only text and primitive graphic symbols. Data was stored on magnetic ‘floppy disks’, and any operations carried out on the data were slow by today’s standards. As a result, these computers could comfortably be used by only one person at a time. All this meant that the experience of the user was relatively simple and straightforward, as well as constrained.
Forty years later, computers don’t just sit on desktops but have also become embedded in interactive products all around us, on both small and large scales – in our workplaces, homes, cities, transportation and clothes – even in our bodies. Computers such as smartphones and tablets are now so small and light that we can carry them around with us and use them almost everywhere. We no longer rely solely on a keyboard and mouse to communicate with these computers: we can interact with them through touchscreens or by voice, so our use can be more spontaneous, and we can do many tasks while on the move.
The number of activities for which we use interactive devices has also increased – because the integration of capabilities such as wireless connectivity, high-speed data processing, high-definition graphics, video and sound means that we can use a single device to carry out a range of activities and enjoy a variety of experiences, such as listening to music, watching movies, messaging friends, calling or video conferencing with colleagues, reading books, browsing the internet, taking photos or drawing pictures.
These capabilities mean that computers and the range of applications available for them have become an integral part of our daily lives, often changing the way we do things – for example, the way we access information (see Figure 2), entertain ourselves or socialise. Furthermore, they have expanded the way we explore and experience the world around us.
Consider how the interactive maps available on smartphones have changed people’s relationship with and experience of their surroundings. Do you remember what it was like trying to find your way around a new city before these came along – having to unfold a large paper map (Figure 3a), find the relevant section, identify your correct position on it, etc.? Now we only have to get our phones out, and with a few taps we get to an interactive map (Figure 3b) that shows us exactly where we are, which direction we are moving in, how far we are from where we want to be, what routes we can follow to get there, what other places of interest we could find along the way and even which friends and family might be nearby. Such applications make the world around us available to us in new ways that augment the reality that surrounds us and our experience of it.
It is not just small, personal devices that have entered our daily experience, though. Large, high-quality displays such as multitouch tabletops are enabling people to play and learn together cooperatively. Just as one might exchange real objects, such as documents or photos, digital tabletops make it possible to use similar gestures to manipulate and exchange virtual representations (Figure 4).
Tabletop applications may also enable the use of real, rather than virtual, objects to produce visual or acoustic effects on the table’s surface (Figure 5).
Using one’s hands or other body parts is no longer the only way of interacting with computers. Headsets that read our brain activity when we think of certain actions enable players of computer games to interact with the game by simply using their minds (Figure 6). These products have not only changed the way in which we can entertain ourselves, but have also enabled people with physical disabilities to control aspects of the world around them, bypassing physical limitations. For example, prototype appliances such as blinds and lights have appeared that can be operated by the wearer’s thoughts alone.
Products designed to monitor and interpret what our bodies do have also revolutionised other aspects of human life, such as healthcare. In the past, if you wanted to check your heart, you would need to visit a doctor, who would use a stethoscope to listen to it (Figure 7a); these days, a range of wearable products can monitor vital signs while we go about our daily activities. For example, biometric shirts designed to monitor the vital signs of sports players (Figure 7b) seamlessly embed sensor technology that can measure heart rate, respiration and motion patterns in real time.
Similarly, many wearable products exist that enable us to keep track of aspects of our behaviour and health, such as how much we exercise, how much energy we consume, how well we sleep at night, and so on. All these products have made visible – and so allowed us to better manage – aspects of our lives that might otherwise escape our attention even though they are important for our well-being. Importantly, because they blend in with the clothes we wear or the objects we use daily, these technologies have allowed us to monitor these aspects in real-life contexts.
The capability of wearable products has been used in other ways as well. It has enabled fashion designers to create clothes that can detect the wearer’s inner moods in different situations and represent them through changes in the fabric (Figure 8). Imagine wearing one such garment and meeting someone you like: your heart accelerates and your garment lights up in response to your heart rate. Representing our emotional responses so directly and explicitly can change the way in which we interact with others, automatically sharing with them inner states – however they might be interpreted by others – that otherwise would not be so obviously perceivable.
Now, imagine if whole rooms or even buildings could be designed to respond to the emotions or behaviours of those who occupy them. In many modern buildings, aspects such as lighting or heating are already capable of adapting to people and their activities, for example, by detecting their presence and switching on or off accordingly. In more experimental buildings, even features such as the shape or decor of rooms can change in response to how they are being used (Figure 9).
We are not used to thinking of buildings as dynamic, so living in buildings that adapt to their inhabitants, instead of inhabitants having to adapt to buildings, significantly changes the experience of what it means to inhabit a place.
These examples should give you an idea of how interactive products, both large and small, are changing our experience of and relation to the world around us, to others and even to ourselves. The breadth of interactions is staggering:
- giving us access to everything from tiny individual devices to sensor and information networks that span the globe
- supporting activities indoors or outside
- ranging from devices that resemble ‘computers’ (such as tablets or smartphones) to objects that we don’t traditionally think of as computational (such as houses or watches)
- supporting explicit interaction through interfaces we notice (such as touchscreens), or implicit interaction through interfaces we’re meant to ignore (such as biometric garments)
- connecting us to objects ranging from those we hold, carry or wear, to those embedded in our homes, our vehicles, our cities.
Interactive products can enable us to experience our world in new ways, augmenting our senses, our attention, and our experience of the world.