Chambers, A.C. and Elizabeth, H.J., 2017.
Inhuman care-giving, emotional labour, and the dehumanised public health service in Humans
Output Type: Conference paper
Presented at: Care + Machines
Venue: Lincoln Theological Institute, University of Manchester
Dates: 20/10/2017 - 21/10/2017
Humans (2015-, Channel 4) is set in a parallel present-day Britain and imagines a world where Synths (anthropomorphically framed robots) are part of everyday life. No longer merely employed as physical labour-saving devices, they act as caregivers; privately purchased as housekeepers and nannies, or publicly funded through the NHS to provide live-in care. Humans examines the affective experience of synthetic 'care' by exploring several close interpersonal relationships between Synths and their human patients. The Synths perform controlled emotional responses in the face of repetitive and extremely intimate care actions, and through these intimate relationships Humans poses questions about the value of emotion, intent, and agency.
George, a retired robotics engineer, forms a father-son relationship with his early care model Odi and goes to great lengths to protect him from being 'recycled' and replaced by Vera, a new and apparently improved NHS elder-care Synth. Odi shows vulnerability (mechanical failure and a need for care), and his knowledge of George is framed as familial love, whereas Vera's strict adherence to her programming is interpreted as uncaring. George refers to Vera as his 'jailer' and sees her clinical lack of emotion as incompatible with providing care. Even though the majority of Synths are unable to show 'real' emotion, many of the human characters develop an emotional attachment to their Synths, actively blurring the distinction between authentic and automatic relationships.
The Synths, unable to tire or experience the highs and lows associated with intimate care, act as emotional-labour-saving devices: efficient and incorruptible. Their agency is limited by their programming, and they seemingly provide an infallible solution to costly social care. However, because the Synths lack the agency to consent to giving care, the value of the care they provide is seemingly diminished. Can care given without consent be considered care? Does their synthetic nature make Synths poor or incomplete caregivers, or would the development of emotionless AI live-in care be a form of liberation for anyone with complex, prolonged, and costly care needs?