Mr. Roboto, LMSW

By Elliott Golden

Volume 24, Number 2, Don’t Be Evil

Art by Alice Mao
Since the COVID-19 pandemic pushed his services remote, Henry Cole,1 Licensed Master Social Worker (LMSW), has had to remake his clinical practice for under-resourced and high-risk patients. As a therapist at two different community clinics in New York City, certain things that were once central to his everyday—opening the door and greeting his patient; checking in with their body language and nonverbal cues; providing a welcoming space to decompress; to “just be,” as he calls it—are not possible in the virtual world. Instead, digital sessions often take place in environments overtly antagonistic to his clients: overcrowded apartments and unsafe living conditions, with sessions frequently interrupted by shoddy internet connections. In some cases, ambiguous silences on the other end of the line make it difficult to know if a session has even started. “My role changes, I’m no longer the therapist, I’m something different,” he remarks. Without the infrastructure of his clinic team, of a safe therapy room, or of knowing that he can connect a person in crisis to a support system at the end of a session, Cole has struggled to treat his clients’ many needs. “Look, this work is meant to be face-to-face, to be in the same room,” he declared to me over another phone call. “In order to care, you need to be in as much of a sanctuary—without interruption, private—as possible. You just can’t provide that level of care when you can’t see the other person, when you can’t control their environment.”

These clinical realities have been obscured by a popular discourse valorizing the possibilities of telehealth and digital care tech. As much as the CDC and private practice start-ups venerate our contemporary technological reformulation of care as an opportunity to expand access and personalize services, in reality digitally-mediated relationships of care threaten to exacerbate inequalities for those in need of support. By confining individuals to their everyday environments and commodifying all services, this emerging system has ensured that the quality of a person’s care depends entirely on their socioeconomic background and the contours of their daily living arrangements. “In the world of telehealth, if a person had a backyard and pool, one could have some privacy and there would be some replication of services,” noted Cole. “But the young person I was working with lives in an overcrowded railroad apartment with four siblings. Their room was Grand Central Station.” Care tech’s utopian vision of support beyond the constraints of space and time demands a client untethered by the world of poverty. For those already disenfranchised by racial capitalism’s social order, the transition to virtual care is just another feature of the community disinvestment that has produced isolation and social exclusion in the first place. Is this the expanded access and personalization care tech promised us?

And yet, as COVID-19 restrictions soften and the world returns to a more in-person existence, there is little reason to suspect that these digital services will disappear. The market for telepsychiatry and telehomecare, already a multi-billion dollar sector, is projected to grow at over 20 percent annually through 2028.2 This has less to do with providing support for those in need than it does with the technologies’ extractive potential in a new market. As demographic aging accelerates and trauma abounds, the societal need for care services will only grow; tech is well poised to capitalize on the growing demand. Consulting company Grand View Research, Inc. notes, “enhancing internet application, virtual medicine and rising demand for centralization of healthcare are expected to save on cost incurred, which is one of the critical success factors attributing to market growth.” In other words, virtual platforms remove pesky overhead like coordinating human beings, paying employees, or maintaining spaces for care, making it easier to provide services at much lower costs. They are profitable because they are efficient.

One question, particularly in the face of Cole’s daily challenges with digital platforms, is whether effective caregiving is compatible with a profiteering pursuit of efficiency. As Emma Dowling notes in her recent account of capitalism’s remaking of care, The Care Crisis:

Care work is a relational affective kind of personal service requiring the intensive deployment of mind and body, for the most part in the presence of the person who is being cared for. . . . The time needed is not easily reduced, lest the quality of care be compromised.3

Feeding and washing, therapy sessions, maintaining social and emotional functions for differently-abled persons: these are not races to the finish. They are long hauls requiring patience and compassion, solidarity and concern. For those like Dowling who consider care a practical and ethical question, care worthy of the name does not concern itself with the speed and sparsity of resource allotment, but with the attention and attunement of the caregiving relationship. Care tech is valuable insofar as it enhances the depth and texture of this connection by alleviating strain for caregivers and amplifying dignity for care recipients.

Institutions and agencies, however, beholden more to their bottom lines than to any particular ethos of human social ties, take a different perspective. For them, the quality of care is directly dependent upon the speed, cost, and scope of its distribution—the cheapest care is necessarily the best. In this sense, technologies that can minimize both the labor costs of providing care and the geographic constraints that might otherwise curtail market transactions are coveted not because they might enhance human caregiving, but because they can remake or altogether replace it at a profit (hence the projected market growth). It is a crucial distinction: tools designed to reduce the “contact time” between caregivers and care recipients are fundamentally different from tools designed to reduce the burdens of a caregiving relationship.

There is serious danger when the former confronts the physical and emotional limits of caregivers and their clients. “The current arrangement is literally unsustainable,” Cole says without reservation. While he concedes that remote services are conducive to higher “attendance,” and agency pocket-books remain flush, the lived realities of care are impossible to uphold. “It’s a crash and burn situation. We can’t keep this pace. I think that patient lives are in danger and that definitely provider lives, clinician lives, are too.” Saddled by untenable caseloads and little continuity in their client relationships, Cole and his colleagues, like so many in the care economy, are fighting an uphill battle against the new realities of care technologies.

Care work has always been exploited under capitalism. Feminist and anti-colonial scholars have long understood that child care, domestic work, elder support, cooking, and cleaning have all historically been either un- or undercompensated.4 Since capitalism could only maintain profitability if it extracted such labors free of charge, racialized and feminized persons have been compelled, by slavery or by social order, to perform what Silvia Federici has termed tasks of “social reproduction.” Federici explains the contours of this logic in Caliban and the Witch:

Capitalism must justify and mystify the contradictions built into its social relations … by denigrating the “nature” of those it exploits: women, colonial subjects, the descendants of African slaves, the immigrants displaced by globalization.5

As the tools and relations of production have shifted, so too has the nature of the exploitation of care work. If in the monopoly capitalist era of the Global North, heteronormative structures associated the tasks of social reproduction with “womanhood” and the role of the housewife, in the transition to neoliberalism, care work has become a private, personal, and economic responsibility.6 What was once delegated to the housewife has now been offloaded to underpaid domestic workers, primarily migrant women of color.7 These jobs, physically and emotionally draining, are criminally undercompensated, rarely unionized or regulated, and precariously contracted, a direct consequence of the racist exploitation of those who perform this labor.8

Care technology only emerges in this context of familial disinvestment from kin-based care. Employing a rhetoric that emphasizes autonomy and choice, these tools rely on a logic that disparages dependency on others—including family members—while valorizing self-reliance even in persons requiring care. Take the artificial companion Care.Coach, one example from the growing array of “agetech,” tools that track, monitor, or comfort older persons remotely. Care.Coach is a digital tablet-based pet avatar designed to provide company to older persons with cognitive impairment in their own homes, replacing or supplementing home or residential care. One user testimonial underscores the liberty provided by the device, saying, “Without you, my dad would definitely be in a nursing home…you have allowed me to keep him safer and happier than he would be anywhere else.”9 Unable to take care of their father, this family member opted for the use of a digital companion in place of a more labor-intensive and spatially constrained nursing home. For this family, and for so many others, Care.Coach is a relatively inexpensive tool that ensures safety and support for their loved one. Its sophistication and playful design promise improved social-emotional outcomes for recipients and peace of mind for families.

But underneath its shiny surface, what really is Care.Coach and what are its implications at the population and community level? Snappily self-described as “a real-time fusion of human and software intelligence through a 24×7 companion that develops a trusted relationship and empowers individuals ​to actively improve their health,”10 Care.Coach promises emotional, instrumental, and informational support to its users around the clock. Despite the company’s techno-utopian rhetoric, this “real-time fusion of human and software intelligence” is really more like an international call center than anything else. University of California–San Francisco medical sociologist Dr. Elena Portacolone explains:

The user has the impression of conversing with the pet-avatar because the artificial companion is remotely guided by a technician who can hear users’ words and then replies by typing answers on a keyboard. . . . Specifically, behind the eyes of the pet-avatar are actual human eyes of one technician who is observing a specific user in real time through the camera of the tablet . . . To ensure financial feasibility, these technicians are hired in Latin America and the Philippines. Since they directly type their responses to the user, distinctive accents that may be foreign to users are avoided [my emphasis].11

Care.Coach’s logistics demonstrate a true assimilation of caring into the vertiginous dynamics of transnational business.12 In fact, Care.Coach’s founder, Victor Wang, came up with the idea after developing a service that contracted employees in India to remotely operate, at a reduced cost, the buffers that clean US factory floors. A Wired profile explains his thought process: “If [Wang] could tap remote labor to sweep far-off floors, why not use it to comfort [his grandmother] and others like her?”13 The equivalency between sweeping floors and caring for older persons suggests a complete instrumentalization of caring as such. Insofar as care can be provided by digital avatars controlled by people halfway around the world, it is reduced to a set of tasks, either done or not.

This is precisely what Dr. Maja Mataric, a professor of computer science and neuroscience at the University of Southern California and a pioneer in socially-assistive robotics, argues. “It isn’t actually very hard to project empathy,” Dr. Mataric is quoted as saying in a 2018 Wall Street Journal article. “Empathy is what you do, not what you feel.”14 For Mataric, as technologies of care achieve the necessary sophistication to perform caring, they can be thought of as human caregivers immune to empathy fatigue. Haunting as her comments may be to the tender-hearted amongst us, it’s a hard ontological proposition to counter, and the growing array of tech-directed companions indicates its resonance with the wider community. Artificial pets, like Sony’s Aibo, which can cuddle, wag its tail, and play with its owner, and the humanoid Zora, which specializes in entertaining, teaching, and organizing classes for aging persons, are all part of this care revolution. In mental health, apps like Real and Sanvello offer self-guided exercises and programming as well as group engagement forums like discussion boards or virtual events, while AI counselor programs such as Youper and Woebot prompt users to text their feelings and chart their emotional states.

These tools comfort and soothe, listen and attend; in their own ways, they do support and care. Still, their support and care is for individuals, and more precisely, individuals who can afford it. Like so many other services of the neoliberal order, care technology tasks each person or family who needs support with the provisioning of their own care. Those who cannot afford the tools (or happen to be the technicians behind the tools) are left, like Henry Cole’s clients, in constrained and challenging circumstances with little hope for empathetic support, robotic or otherwise.



  1. This clinician has been pseudonymized to preserve privacy.
  2. Grand View Research, “Telemedicine Market Size Worth $298.9 Billion by 2028: CAGR: 22.4%,” March 9, 2021.
  3. Emma Dowling, The Care Crisis: What Caused It and How Can We End It? (London: Verso Books, 2021), 137.
  4. Silvia Federici, Revolution at Point Zero: Housework, Reproduction, and Feminist Struggle (Oakland, CA: PM Press, 2012); see also Patricia Hill-Collins, Black Feminist Thought (London: Routledge, 2008).
  5. Silvia Federici, Caliban and the Witch: Women, the Body and Primitive Accumulation (New York: Penguin Books, 2021).
  6. Dowling, The Care Crisis, 32.
  7. This is not to say heteronormative structures of power simply vanished with the elections of Reagan and Thatcher—heteropatriarchal roles still very much enforce who cares for whom and how it is economically valued. Consider what feminized labor is still worth: “If American women made minimum wage for the work they did around the house and caring for relatives, they’d have earned $1.5 trillion in 2019. Globally, the value of that unpaid labor would have been almost $11 trillion,” writes Jordan Kisner of The New York Times. All the same, as more women have entered the labor force outside of the home, there has been sharp growth in the low-wage professions of the care economy. According to sociologist Rachel Dwyer, care work made up more than half of low-wage job growth of the 1980s, and nearly 75 percent of its increase in the 2000s.
  8. Echoing Federici, Ai-jen Poo, director of the National Domestic Workers Alliance, notes that “the cultural devaluing of domestic work is a reflection of a hierarchy of human value that . . . values the lives and contributions of some groups of people over others, based on race, gender, class, immigration status.” For an incisive analysis of the challenges facing contemporary care-workers, see Ann Neumann, “Family Care for All,” The Baffler, April 2020.
  9. Care.Coach, “Engage Your Most Complex Patients: Enhance Care, Improve Health, Reduce Costs,” accessed July 6, 2021.
  10. Care.Coach, “Engage Your Most Complex Patients.”
  11. Elena Portacolone et al., “Ethical Issues Raised by the Introduction of Artificial Companions to Older Adults with Cognitive Impairment: A Call for Interdisciplinary Collaborations,” Journal of Alzheimer’s Disease 76, no. 2 (2020): 445–55.
  12. For the sake of the present argument, I have ignored the ethical issues of 24×7 monitoring of adults with cognitive impairment, many of whom may forget or never fully understand that a real life person is watching them at all hours.
  13. Lauren Smiley, “What Happens When We Let Tech Care for Our Aging Parents,” Wired, December 19, 2017.
  14. Imani Moise, “For the Elderly Who Are Lonely, Robots Offer Companionship,” Wall Street Journal, May 29, 2018.