Mediating Loving Care and Vital Signs: The Sociality of Care Technologies
Utrecht, September 20-21, 2007
Jeannette Pols* & Ingunn Moser**
*Academic Medical Centre, Amsterdam
**Centre for Technology, Innovation and Culture, Oslo
Structure of talk
- Aim: contribute to the understanding of the social relations of care technologies
- Examples of new care technologies
- Some tools to think with
- Two examples of explicit social and affective design
- Concluding argument: technologies do not work unless they are entangled in relations that are at once material, social and emotional
Homely technology: the example of the HealthBuddy
The HealthBuddy – in theory
- Typical telecare system: medical approach, monitoring and education
- Aim and intention: to monitor health indicators of people with chronic conditions, and to educate and train them to monitor and manage their own health
- Improve self-management of the patient
- Prevent exacerbations
- Prevent hospital admissions (costs, quality of life)
- Original time plan: 3 months
The HealthBuddy – in practice
- Patients valued the HealthBuddy: it links them to the health care apparatus, to expertise and to nurses
- Patients became attached to/dependent upon the HealthBuddy because it makes them feel safe and looked after. They want to keep it!
- A success in some respects, but not by the predefined intentions and criteria
So how to think about these findings? How to explain the success?
- The HealthBuddy (HB) as a rational medical technology vs. social and affective relations
- Intention: to limit social and affective relations in favour of more independent, self-managed patients (who take on more care work and responsibility)
- A new norm for what it takes/means to be a good elderly person and/or patient
- The HB still allowed for and created new attachments, even though the intention was the opposite (to limit relations)
Tools to think with: technologies in relation and technologies with scripts
- Technologies do not work by themselves but only in relations: material, social, cultural, affective, aesthetic…
- Technologies come with a social programme or 'script' that outlines and defines needs, norms, relations and potential users
- What social and emotional relations do technologies offer? What scripts do they come with, and which should they come with?
- Question: to what extent do these 'built-in' users fit with real users? How do real users find ways of creatively negotiating the script to meet their wishes?
Designing for and building in relations and emotions: the example of Aibo the robodog
- Aibo.com
Designing for and building in relations and emotions: the example of I-cat
- Philips.com
To start: stroke the cat
What does I-cat enable, what is its script and how does it configure its users?
- I-cat is designed as an assistant-servant: a (cat-human) butler
- Restricted relation with its user, who is configured as a mixture of master and dependent care receiver
- Allows for the enactment of tasks or services for the user/patient
- Limited interactions, relations, attachments and affections with the cat
- Tries to add emotionality onto functionality
- User identity: dependence and patienthood
- msnbcmedia1.msn.com/j/msnbc/Components/Photos...
What does Aibo enable, what is its script and how does it configure e-care?
- Aibo is designed to generate new relations, attachments, and also affection
- Affords or invites a wide variety of relations: interaction, communication, affection, compassion, humour, play, liveliness, care…
- Configures its users as companions and carers
- Opens up positions/identities other than patient/dependent
- The dog allows for and facilitates attention and communication
Cats and dogs: differences
- In order to be used, a technology should bring something of value to the user
- Dog: dog-friendship, fun, play, talk, resistance, a mind of its own, interaction, affection
- I-cat: enacts tasks; its value is comfort, maybe 'health'
- HealthBuddy: independence, or safety?
Differences: individuality
- The robot dog is 'individual' (programmed to learn and adapt to its user), is unpredictable and shows a variety of emotions. It sometimes refuses to do something (it has a mind of its own).
- I-cat is basically polite and neutral, and its emotions serve to underline this politeness and neutrality.
Differences: structure of interaction
- Aibo: structures, but not too much. It does not talk. It allows for dog-like interactions. No failures. People use Aibo as both dog and machine.
- I-cat: particular and functional/rational interactions. Limited use. No cat-like interactions. Cognitive characteristics, communication breakdowns.
- I-cat remains a machine: no functions apart from the programmed ones.
- HealthBuddy: structures, but allowed for an alternative norm/ideal and for negotiation.
Care
- The user has to care for Aibo: help yourself by helping 'others'
- I-cat cares for the user: reinforcement of the patient position
- The HealthBuddy cares for the user at a distance and helps the user care for her- or himself
Critical and ethical questions
- What needs and desires are taken into account?
- How are agency, work and responsibility shifted and distributed?
- What positions, identities and relations are afforded?
- What attachments and detachments does the technology open up for?
- What norms for being human, elderly, patient, care receiver and carer does the technology enact?
Conclusions
- Take technologies' social relations, emotional bonds, scripts and configuring of users into account
- Take technologies' prescriptions/norms for what it takes to be human, patient, elderly, care receiver and carer into account
- Take the multifaceted needs and desires of potential users into account
- Technology brings values and allows for stronger or weaker social and emotional connections