The sudden, urgent discussions around Artificial Intelligence (AI) flooding into every aspect of our lives have sparked a resurgence of interest in the topic of robot caregivers. In Italy, Europe’s oldest country, some think robots will be able to fill the gap left by shrinking families and concerns about the sustainability of its system of live-in immigrant caregivers. Like Italy, Japan has projected an image of strong family bonds and respect for elders, but rapid social and demographic changes over the last half-century have put increasing pressure on a much smaller pool of available carers. In 2022, Japan’s fertility rate once again dropped to 1.26, its lowest level ever. Over the last decade, I’ve been looking at how this changing landscape of care has affected the millions of unpaid family carers in Japan, many of whom are providing 24/7 in-home care for relatives living with advanced frailty or dementia, for years on end. Discussions of robots or other AI technology never arose in that research (in contrast to the more mundane and messy technologies like PEG feeding tubes and colostomy bags), but you’d never know it from the news coverage (when was the last time you read a front-page article about someone cleaning up their father’s poo?).

But while it is little surprise that robot care makes the headlines of popular media, it also seems harder and harder to have conversations with gerontological researchers, whether in medicine, business, or architecture, without some excitement about the newest care gadgets. Empirical data on real-world use of robots in care contexts, however, is still patchy, and work by anthropologists of Japan has tended to be open, if not optimistic, about the potential benefits for older people as well as for carers. I still had my doubts, but I had to wonder whether my personal feelings also came from ignorance about the real capabilities of these technologies and the roles they’ll play in the future.
In the fall of 2022, I attended a seminar presentation by Dr. James Wright, Research Associate at The Alan Turing Institute, whose new book, Robots Won’t Save Japan, presents the culmination of one of the first ethnographic accounts of the development and use of robotic technology in elder care. Just a few days earlier I had received the first copies of my own ethnography on caring for older family members in Japan. This serendipity was too perfect to ignore, so I suggested that the two of us meet up in a few months for a conversation about relatives, robots and the future of elder care in Japan. After a few edits here, we’re happy to share this conversation, and invite you to leave comments and questions below.
Jason Danely: Nice to see you again, James. It was great to see you speak about your book recently in Oxford – I just found myself nodding the whole way through the talk! I think you have a perspective on care and ageing in Japan that is so valuable, but that I haven’t heard much from others who write about social robots and the automation of elder care. Before I get into my questions, for those who haven’t read your book yet, could you tell me a little bit more about Robots Won’t Save Japan?
James Wright: Sure! The book is about recent efforts by the Japanese government and Japanese robotics industry to develop a range of different robotic devices to try to automate a broad swathe of different elder care tasks, particularly in care homes, in order to solve Japan’s care crisis. I write about what I found during my fieldwork, where I spent three months with robotics engineers at the National Institute of Advanced Industrial Science and Technology who were administering the world’s biggest care robot project to date, which was a really ambitious project aiming to develop and deploy all sorts of robots: lifting mobility aids, monitoring devices, communication robots, and toilet and bathing aids. Then I wanted to see how those robots would actually fare in the real-life setting of the care home. So I spent seven months at an elder care home to see how care was done before robots were introduced, and then to see how three different types of care robots – a lifting robot (“Hug”), a seal-shaped communication robot (“Paro”), and a humanoid robot (“Pepper”) – were implemented, and what happened after they later left the care home. I also tried to engage with some of the narratives and stereotypes about robot care in Japan and beyond. Finally, in the last part of my book, I look at what emerges through the attempts (and in the case of the care home I was looking at, failed attempts) to implement these robots, and what alternative approaches might be possible.
Jason: What motivated you to write it?
James: My interest actually started back in 2007 when I was at the University of Oxford doing my Masters in Modern Japanese Studies, and at that time, I was interested in robots. There were a number of high-profile robotics projects in Japan being reported on in the media, and it seemed as if we were just about to see this explosion of different robots entering people’s everyday lives. By the time I started my Ph.D. in 2014, there was a lot more interest, and some of those initial conceptual, imaginary scenarios that had been published about how robots and other technologies could be used in Japan seemed like they were actually being funded and becoming concrete policies to develop these devices. The reason that Japan specifically appealed was that I was interested in how service robots might be introduced in everyday life, and it seemed that Japan was the place where that was most likely to happen, for a number of reasons that I’m sure we’ll get into. And it seemed that elder care was the area in Japanese society where robots were actually starting to be implemented. I wanted to write the book because I wanted to explore and ultimately push back against some of these very dominant narratives in Japan and beyond about how successful care robots are and how they’re going to solve all of Japan’s problems. The title of the book, “Robots Won’t Save Japan”, is a response to two earlier books in Japanese that have the title “Robots Will Save Japan.”

Jason: Absolutely brilliant. Wow, I didn’t realize this project started so far back! This has been a long time in gestation for you. I think what’s really fascinating about the project is the scale – you’re not only following the development of the technology, in an engineering sense, and the policies and implementation guidelines, but you’re also following its social path. All of these things are developing around these dreams about “this is how robots will be in our future.” Some might come to fruition, but some don’t. I remember seeing an article that was probably from the early 2000s, and it said, “in 10 years’ time, everyone’s going to have robot nurses in Japan!” And, you know, that seems ambitious, but you were able to follow not only the technology, but also these dreams that are inherent in this technology, in a way.
James: Absolutely. A great example is “Innovation 25”, which Jennifer Robertson has written about. Innovation 25 was a vision statement produced by the Japanese Government in 2007, imagining what Japan would look like in 2025 with advanced technologies, including robots, fully integrated into daily life. These dreams have not yet come to fruition, but this mixture of government and industry visions combined with appropriations from sci-fi and pop culture to create an optimistic narrative continues today in the government’s official policy vision of Society 5.0 and state discourses about AI. However, one of the things that surprised me during the fieldwork was that the robotics engineers involved in these projects didn’t necessarily share the dreams depicted in these grand sci-fi visions of the future. They just wanted to do engineering and work on their personal projects, within the big umbrella project under which they got funding, such as the care robot project that I talk about in the book. I found that the grand dreams of robot care were not widely shared, even less so by care workers or care recipients.
Jason: That’s fascinating. I think care robots are still propelled by the idea that, even though they are not sufficient now, they will be in the future. It has a futuristic feeling, and the idea that the aging society and care needs can be solved in a more futuristic way is a kind of fantasy. However, if I can turn to my book for a moment, what I found was that for most people, elder care in Japan is still centered around the family, rather than the futuristic approach emphasized by robot advocates. It’s still a fantasy, but maybe a more conservative or even past-facing fantasy.
This is where I think we have an interesting difference in how we approach the issue, but these two fantasies really do come together. In Fragile Resonance, I chose to focus on unpaid carers of older family members in Japan because most older people’s care needs are still met by family members, and most older care recipients are living with the person who is their primary carer. This setup is really different from that of the UK or the US, or other places with aging populations. What is similar is that with demographic changes and low fertility rates, there are fewer children to care for a growing number of older adults, and this traditional family caring model doesn’t fit as well as it once did. Yet, it is still propelled by these assumptions around the traditional family unit – these sorts of fantasies of a past that’s no longer with us, if it ever was – and I think that’s also quite fascinating.
I wanted to understand what these caregivers were going through – not only the ones who were struggling with care, but also the ones who were perhaps enriched by the experience of caregiving. In the end, I was really interested in the question ‘how can we improve the experience for family members who are caring?’ rather than ‘how can we replace them with something else, whether that means paid care or robots?’ I wasn’t necessarily coming at this from the perspective that family care is exploitative and these caregivers need to be relieved of all responsibility, but I also wasn’t assuming that all caregivers are really enthusiastic about caregiving all the time. I was trying to get a more balanced picture of their lives. As this developed, I realized that caring is not unique to Japan, so I brought in another case to highlight the cultural and historical specificity of Japan. So it turned into a comparative study, where I developed the notion of resonance. At first, I interviewed people in England and realized they were telling me the same stories as people in Japan. Eventually, I realized there are differences between the two countries, each situated in a particular historical and cultural context. So when we talk about care, we have to talk about that context as well.
James: I found it really interesting. Even though I’ve looked at technology in the context of English adult social care, I’d never considered it through the lens of “charity” that you develop, although it made sense as you described it.
Jason: I’m glad you found that interesting. I describe these two orientations towards care as charity, in England, and compassion in the Japanese case. Both of those have historical and religious roots, and so on. But there’s a way in which they’re diffused throughout culture in a more subtle way. And when people start to do care, they find they want to draw on some kind of cultural model, some kind of narrative of what care is and how to do it, and that’s where I saw paths diverge. That doesn’t mean that English carers couldn’t be compassionate or that Japanese carers couldn’t have an orientation that’s more in line with this charity model, but among the carers I spoke with, the tendency was to emphasize one over the other. I’m glad that it made a little sense to you.
James: I found a lot of similarities in how you describe carers you knew in Japan and the care workers in the institute where I did my research. I think you made the point that the barriers between informal care and formal or paid care work are quite porous. That was definitely my experience with care workers at Sakura, the care home in my research, because a lot of them had cared for a family member themselves or had wanted to do so but hadn’t been able to due to their circumstances. They came to care through that path of family care. I think that experience had transformed them, to some extent, in the way that you talk about, and changed their orientation towards wanting to care for somebody. For many of them, that’s how they’d ended up working in this care home.

Jason: I came away from writing this book feeling that family carers – they’re unpaid, they’re putting so much work into this, emotionally and in so many other ways – are such a vital part of the overall landscape of caring for older people. But they need support too, and professional care workers and others who are experts in this area, some of whom were family carers themselves, could be such an important support for them. But I didn’t encounter any robots!
James: Right, and did robots or technologies in general even come up in conversations?
Jason: They didn’t. As far as technology goes, the most I saw were those call button things, you know, like if there’s an emergency so you can call an ambulance. That was the technology that I saw. Instead, people in Japan were talking about touch and the body and close physical proximity. And that was the key thing for so many people: to have that emotional and empathetic awareness of the other, to be able to respond to their feelings and their rhythms, and to be very close to them. And that was the thing that people talked about the most, and I would think it would be very difficult for a lot of these carers to imagine how technology could support that kind of relationship.
James: That’s really interesting. There’s a point of connection there with the way that the lifting robot Hug was used at the care home and what it revealed about care workers. The surprising thing with Hug was that most of the care workers reported back pain – over 80% – which is similar to the level of back pain among the wider caregiver population. In Japan, as you know, it’s a very common problem in professional and institutional care because staff still lift residents manually, and so lifting robots were presented as a kind of archetypally helpful robot that would prevent back pain. The expectation was that care workers would jump at the opportunity to be able to use a lifting robot that would help them in that work and reduce the physical burden of care. But what I found was that actually care workers rejected Hug, perhaps more than any of the other robots, and part of the reason they gave was that they felt they wanted to care with their own hands, and found it disrespectful to older adults in the care home to use a robot to move them around “like a piece of luggage” as one care worker put it. It’s not as simple as just implementing a technological device that’s going to solve a particular problem. How care is thought about and the value and meaning of care tends to be flattened through the view of engineers and roboticists, in a way that often reduces care workers and care recipients to two-dimensional characters.
Jason: Yeah, and you see that I think a lot, not just in Japan, but in other places where there are care institutions or even home care these days. Care gets rationalized in such a way that it becomes reduced to a set of discrete tasks. And each task is not seen as a very skilled kind of task, right? So by deskilling the labor, care becomes something a robot could do. Maybe it’s because I’ve spent so much time seeing what family carers deal with, but I feel like this is a kind of violence against what care really is, which is a much more holistic kind of thing that involves a real awareness of the other person as a whole person!
James: Absolutely. In Robots Won’t Save Japan, I call that perspective “algorithmic care”, insofar as care is viewed as a kind of algorithm – a series of discrete physical and verbal tasks that are done step by step. Understood in this way, care becomes a kind of rationalized logistical exercise where you’re just moving a body through space. You’re feeding a body. You’re cleaning a body. But then, when you try to translate that view into robot care, what I found was that, of course, it wasn’t as simple as that. There are all these additional invisible tasks that care workers then have to do to clean, store, maintain, update, and move around these robots. So, like you said, it’s a kind of deskilling, because the skilled labor of care that involves communication and building relationships with the care recipient tended to be displaced by more manual tasks of moving the robot around and “caring” for the robot rather than the care home residents. That created this kind of distance, and I was really struck by the physical metaphors that you wrote about, which were also often used by the care workers at Sakura – like yorisou, which I think you translated really nicely as “snuggling up.” I also found care workers talked a lot about proximity, and the need to reduce the distance between themselves and the residents, while the introduction of these robotic devices actually increased this distance rather than bringing them closer together.

Jason: But do you also see a potential role for technologies in the future of elder care in Japan? There are real dangers involved in care, and I try not to romanticize the situation of families in my book. Because again, even when family care and professional care are excellent – and they can be, when carers are treated well – caring is exhausting. I wonder if some of that could be mitigated in some cases by technology in some form?
James: I don’t want to present myself as being completely anti-technology or anti-robots. But I think it’s very much a question of the specificities and the contexts in which the technology is developed and deployed. For instance, there was resistance from care workers towards these robotic devices to varying degrees, but it wasn’t that the care workers were totally anti-tech, because, for example, they’d introduced iPads for electronic record keeping and note-taking. And eventually, the care home adopted some robotic vacuum cleaners, because they were doing a job that the care workers didn’t want to do – hoovering and mopping the floors. So I think it’s really a question of how these technologies are being developed and what they actually do. I found there was very little connection between the robotics engineers who administered this big robot care project and the end users. I think if technology is going to play a really important role, those users have to be centered and engaged at every stage of the technology’s life cycle: design, development, testing, deployment, maintenance, and disposal. Otherwise robots end up as solutions in search of problems. So, I think a key issue is, how are these technologies being developed? And that’s something that I’ve tried to explore, and I want to explore more.
Dr James Wright is Research Associate at The Alan Turing Institute, the UK’s national institute for data science and AI, and Visiting Lecturer at Queen Mary University of London. He received his Ph.D. in anthropology and science and technology studies from the University of Hong Kong in 2018. His research interests include the development and use of robots, AI, and other digital technologies for elder care, and his current project, PATH-AI, focuses on intercultural AI ethics and governance in the UK and Japan. His first book, entitled Robots Won’t Save Japan: An Ethnography of Eldercare Automation, was published in early 2023 by Cornell University Press.
Dr. Jason Danely is Reader in Anthropology and Chair of the Healthy Ageing and Care Research Network at Oxford Brookes University. Jason is currently Chair of the IUAES Commission on Aging and the Life Course and is past President of AAGE and Convenor of EASA AGENET. Fragile Resonance: Caring for Older Family Members in Japan and England is Jason’s fourth book, and was published in October 2022 by Cornell University Press. He is currently working on research with formerly incarcerated older adults in Japan and England.
1 thought on “Relatives and Robots: A Conversation on Elder Care in Japan with James Wright and Jason Danely”
My strongly positive review of Robots Won’t Save Japan will soon appear in Aging and Anthropology. Here my concern is with the ancient but still contemporary and widespread fantasy, utterly odious, entirely execrable, of the strong, loving, devoted, capable person without a will of their own at one’s attendance, that person who is not a thing but does one’s bidding willingly, imaginatively, foresightfully, the faithful servant.
Perhaps I just got up on the wrong side of the bed this morning, but this foul fantasy, that if we cannot allow ourselves to treat people as if they were things, then we can find or create a thing that we can treat as such a person, seems to fill and distort our every thought about what is required of what have now come to be called robots. And if this is not what every human wants of every robot, what do we want?