‘Monitoring vital signs or mediating loving care and connection: the social codes of new care technologies’

Ingunn Moser and Jeanette Pols – University of Oslo/University of Amsterdam:

Download PowerPoint presentation: Telecare_IMJP

[Image: the iCat by Philips interacting with a user/master]

Notes written by students of Theo Vurdubakis – Department of Organisation, Work and Technology, Lancaster University:
Ingunn’s presentation focused on the growing recognition of the need for a deeper understanding of care technologies and of the complexities of the material, social, cultural, emotional and aesthetic relations involved in them. The aim of the presentation was to examine how specific new technologies of care are constructed and designed, and how this affects relations of care and what it means to be cared for in contemporary times.

The first example of a ‘care technology’ introduced was the HealthBuddy, designed by IDEO in 1999. This was outlined as a ‘typical’ technological care system, one to which those being cared for can become socially and emotionally attached. The system is designed to monitor and educate those being cared for by asking them a series of questions each day relating to their symptoms, behaviour and knowledge. For example: ‘Did you weigh yourself today?’

[Image: the HealthBuddy device]

The answers to these questions are sent to a call centre once a day and encoded by computer. If a problem is detected, a call is made to the home of the individual needing care (the assumption being that people have poor relations and that the ‘Health Buddy’ can help to change this). Because of the assumed increases in efficiency and improvements to the care-receivers’ quality of life, this technological system is envisaged as reducing costs in the long term.

In practice, the system was found to have value for patients: it provided them with links to services that they had not had previously, and it made them feel safer and looked after. These values came in addition to ‘normal’ care services and were seen as indicating the success of such technologies in increasing independence by limiting social and affective relations. This suggested changes to what it means to be a ‘good patient’ and, paradoxically, created even more attachments, as these technologies are bound up with an array of relations and scripts to which the cared-for become attached.

It was discussed how these scripts can be seen as the social programmes of technologies, constructed around a configuration of imagined users, and how some social technologies attempt to build relations and emotions into these scripts. Two examples were given. The first was Aibo, a robot dog designed to generate new relations and to bring value to the user by giving them something to care for, configuring users as companions and carers.

The second example was the iCat. This technology of care was designed as an assistant-servant, and the user in this case was configured as master and dependent care receiver. This, it was suggested, reinforces the patient’s dependent position.

It was suggested that technologies of care such as the examples above raise critical and ethical questions when thinking about technology versus care: What needs and desires are taken into account when designing these technologies?

How are agency positions afforded through these technologies and in their design?

What norms are created through these types of technologies?

For Ingunn, therefore, code offers possibilities of connection to care, but these connections are tied to the designers of these technologies and to the scripts they use to make the carer fit a culturally positive ideal.

 

Notes written by Adrian Mackenzie – CESAGen, Lancaster University

How are care technologies entangled in people’s lives? What are the social codes and relations in care technologies? How should we think about and design these relations? Their success is explained by unexpected relations that entangle with health care. The first example is the ‘HealthBuddy’, designed by IDEO in 1999.

In theory, it is typical of the monitoring and educating approach. It monitors health indicators and trains patients to monitor and work on their own health indicators. Once a day the patient has to answer questions on the HealthBuddy, and the box reacts. A heart failure patient would be asked ‘did you weigh yourself today?’, followed by information that is meant to educate; patients get feedback on their answers. The questions and answers are sent daily to a call centre, where the information is encoded by computer. When the values diverge from the expected values, an alarm is set off at the call centre. The machine will therefore help people to live better, prevent exacerbations and hospital admissions, and reduce medical consultations. This will cut costs. Once people have learned how to run their lives, they can give the device back.

Patients valued the HealthBuddy and became attached to it. They knew strange answers would bring the nurses to see them. They didn’t want to give it back after three months. It made people feel safe, and it was being used. But it didn’t make them self-monitoring and independent. Instead it created greater reliance on nurses and health care services. A new norm for what it is to be a good patient or elderly person emerged.

More generally: technologies don’t work unless bound up with material, social, cultural, affective, aesthetic relations. The script or code of these relations is usually not made explicit. It ascribes needs, norms, and relations around potential users.

What happens if you explicitly try to design social and affective relations? For example, Aibo, the Sony robot dog, has been used as a telecare technology.

[Image: the Aibo robot dog]
Aibo offers people in a nursing home an entrée for communicating with each other. It increases socialisation and attention that would otherwise be lacking.

The iCat by Philips is a contrasting example:

[Image: the Philips iCat’s expressions]

The cat offers to set an alarm, plays music, and provides a weather report. It isn’t able to supply a swimming pool. The iCat is designed as a servant: it offers tasks and services. It refuses to use a first name, only a last name, and it reinforces the user as patient. Aibo, on the other hand, is meant to generate new attachments and affections; it invites a wide variety of relations.

The dog offers something of value to the user by being programmed to have a mind of its own; the cat offers tasks; the HealthBuddy offers independence or safety. Aibo does not structure interactions very much, since it behaves like a dog-machine. The iCat, however, does not have cat-like interactions: it functions as an information service, and its cognitive processes often break down, even as it tries to take care of the client.

What needs and desires are taken into account here? What positions, identities and relations are afforded? What attachments and detachments does the technology open up?
