Could robots in care homes solve the UK's social care crisis?

Meet Pepper, a robot friend for the elderly that could be a vital helping hand for Britain’s social care crisis.

The glossy white, streamlined machine, reminiscent of the robots in Will Smith’s film I, Robot, is ‘culturally competent’: it engages in unscripted conversations, reminds residents to take their medication, and plays music on demand.

Although it cannot perform physical tasks or prepare medication, the robot is fully autonomous, operating without human control.

Pepper is part of the international Caresses project, which was trialled with care home residents across the UK and Japan over the course of a year; each resident was given the robot for up to 18 hours over two weeks.

Researchers reported a ‘significant improvement’ in the mental health of residents who interacted with Pepper, and a slight improvement in loneliness.

The lead researcher behind the project, Dr Chris Papadopoulos from the University of Bedfordshire, said residents were initially sceptical about the robot, fearing it would replace human care.

However, once residents realised Pepper was a ‘supplementary tool’ to support existing care, their attitudes towards the robot softened.

He said: ‘This robot couldn’t possibly replace human care. It’s clearly got limitations and is no way as capable as a human carer.

‘There are so many periods of time during the day where residents in care homes don’t have anyone to talk with.

‘They’re isolated. They’re on their own, that is the reality for so much of existing social care because there’s so many care assistants we don’t have.’

According to Age UK, 1.4 million elderly people in England are chronically lonely.

Since 2010, the government has cut spending on adult social care by £86 million, and 1.5 million people aged 65 and over do not receive the care and support they need with essential day-to-day activities.

Arzoo Ahmed, a researcher at the Nuffield Council on Bioethics, said the use of AI in healthcare and research raised a ‘range of ethical issues’, particularly around racial prejudice and inherent biases in data.

She said: ‘The inability of AI to display human characteristics, such as compassion, and an increased use of these technologies could lead to greater social isolation, if they result in further loss of human contact. 

‘The data used to train AI systems may reflect and reinforce inherent biases and wider prejudices, especially when lack of diversity among developers is coupled with an underrepresentation of racially diverse populations in research.’
