Writers (and journalists, critics, activists and novelists) Laurie Penny, Leigh Alexander, RJ Barker, and Mazin Saleem got together with articles editor Eli Lee to discuss emotional labour—a subject that's becoming more and more important—and how it's explored in SFF...
EL: “Emotional labour” might be defined as going above and beyond to look after or protect people from things that might hurt or be difficult for them, to the detriment of your own time, health, or interests. And I think the reasons why it happens are many and complex. Leigh describes it here as “efforts that involve giving care, carrying a burden, and taking responsibility for the well-being of others.” Jess Zimmerman describes it here as “dispensing care and attention.” It’s often unpaid/poorly paid, and unrecognised—the unseen work required so that others can function.
What SFF (novels, TV shows, games, films, etc.) from the past stands out to you—and why—as being useful in terms of the way it has explored emotional labour, and taking care of people's emotional life? I've been thinking about Aldous Huxley's novel Island (1962), which has some pretty progressive ideas on this—but I wonder if such visions are few and far between?
RJ Barker: I think the first time I remember being aware of something “caring” in an SFF situation would be the Robot in Lost in Space, and it always struck me as a bit weird. Here they have (what's meant to be) this amazingly capable and technologically sophisticated machine and they basically use it as a nanny for their kid. Although, to be fair, the kid was lucky to have the robot, as the parents were terrible, terrible parents. Space social services definitely slipped up there.
Mazin Saleem: I liked Quantum Leap as a kid and, being a kid, didn’t appreciate the significance of the ending till later. While a lot of Quantum Leap was Sam Beckett stopping murders or exposing miscarriages of justice or, later on and less seriously, thwarting Lee Harvey Oswald, some episodes were less dramatic and more therapeutic. In those, Sam was more like a Spacetime Amelie, either nudging others towards emotional closure or, by taking over their lives, doing it for them.
By its whole set-up of a middle-class white guy leaping into other, often oppressed people’s lives, the show might now seem like the epitome of appropriation, ventriloquism, voyeurism, fiction-as-tourism, the saviour complex, etc. But the ethic behind it was as decent and “progressive” as, say, Star Trek ever aspired to. The show literalised basic homilies about empathy, like “try and see things from their point of view,” or “take a walk in someone else’s shoes.” All of which was the necessary starting point for any work righting what once went wrong.
What stood out, though, was how by being an ongoing serial narrative it also literalised the cost of caring for other people’s emotional life—Sam has to keep putting right what once went wrong, instead of managing to get home. The show made this explicit in its premature finale, where God the Bartender gave Sam a choice between finally leaping home, or leaping into his best friend Al's past, in order to save his marriage and always leaping from that point on. It's so embedded, this idea that to care is to be the Giving Tree, sacrificing everything you want for others’ well-being. But maybe we need to ask: who’s demanding the sacrifice?
Leigh Alexander: I think that the question of social care and how those interrelationships will evolve is actually core to the thematic appeal of maybe all science fiction. I believe at heart we love the genre because it helps us imagine we are preparing for all possible futures. It’s about reassuring ourselves that there are circumstances under which our better ideals, whether on an individual or a group level, can prevail.
Although Star Trek: The Next Generation had its two most significant female characters, Counselor Troi and Dr. Crusher, in care roles, it was pretty progressive, relatively speaking. I see the series as basically about this wonderful commitment to act on the best of human nature, when constantly surrounded by aliens with negotiation issues and an android who requires responsible human educators. When you think about it, “Trek” isn’t a very inspiring word. TNG in particular is less about celestial adventures, and more about the careful roads we follow to find ethical social outcomes.
In a sense the whole series is about emotional labour. Patrick Stewart is an actor’s actor, of course, but his sincere devotion to the role of Captain Picard provokes me to love him irrationally. He consistently demonstrates the most ethical qualities of human beings while faced with greasepaint-and-prosthetic extraterrestrials, and he takes that work seriously because he understands, I think, what science fiction is “for,” for people.
It’s Picard who performs the emotional labour of Star Trek. When Troi is not being awkwardly sexualized, her work as counselor, enhanced by her alien “empath” skill, is mostly limited to obvious inferences (“he’s lying,” she’ll warn of someone who is evidently ruddy-fisted and evasive; “he’s afraid,” she’ll say of one who has just fled).
Sometimes it seems like on the Enterprise, basic intuitive empathy is somehow a secret power, and a strange compartmentalization occurs: there is the noble, genderless, futurist idea of social care on a grand scale represented by Picard and the ship itself, and the dubiously-useful “empathy” represented by the sexy chocolate-eating female counselor. I wonder if science fiction’s notions of care aren’t usually structural first and emotional later?
Laurie Penny: Octavia Butler’s Earthseed series—Parable of the Sower and Parable of the Talents—has to be the ur-text here. It’s the collapse-of-civilisation story done with a difference—a story that isn’t your standard dystopian narrative where one white guy and maybe his family battle against the odds to survive. It’s a story about how human beings survive together, especially when the social currency of violence changes. It’s also disturbingly prescient—the chief villain in Talents is an extremist right-wing politician who wins the presidency under the slogan “Make America Great Again.”
Then there’s Ursula Le Guin’s The Dispossessed, another iconic and still relatively overlooked work by a female author, which is about—among other things—the practical application of anarchism as a social philosophy. What matters here is that the methods of care and self-care are part of the narrative, not just background—as well as the ways in which those systems of care calcify and fail. Neither of these works is strictly utopian or dystopian—they’re more complex than that. They’re about people trying to find out what works, slowly and painfully.
There are lots of others, and almost every one that comes to mind is female-authored. Marge Piercy’s Woman on the Edge of Time; Lois McMaster Bujold’s brilliant Vorkosigan Saga series, which was the first sci-fi I read that handled mental illness in anything like a respectful and interesting way. The first few books have their share of swashbuckling space adventure, but the sixth or seventh book, Memory—depending on how you count it—is almost entirely the protagonist sitting in his house having a major depressive episode, and his friends and family trying to work out the best way to take care of him and each other. And it’s the best book in the series.
That’s a lot of books—so let’s have a shout-out for the entire extended X-Men universe. For so many reasons, but particularly for putting the work of building community and collectivising trauma at the centre of the story—with two eyebrows raised in permanent disapproval, of course, at the way Jean Grey is pathologised in almost every incarnation.
EL: What contemporary SF&F has the same effect for you, and why?
RJ Barker: Iain M. Banks' Minds in the Culture books. I kind of fell out of love with SFF for a while and it was Banks that brought me back in, partly because of the size of his imagination and partly because it chimed with a lot I believe in. There's this streak of humanity running through Banks's SF, this idea of “be kind to each other.”
In a lot of ways, the Minds have basically become parents. And what he does is quite fun, because to the vast majority of humans the Minds are these benevolent carers, making sure no one plays too roughly or gets hurt, while at the same time, as a reader, you see them on their own level: they're so much more complex, and they have lives outside that role that most never see. They have jokes humanity will never get because it's not sophisticated enough, and they know there are dangers out there, and they do their best to protect their charges without anyone ever really knowing.
Mazin Saleem: One story that’s had the same effect is Will Self’s “Caring, Sharing,” from Tough, Tough Toys for Tough, Tough Boys. In it, wealthy Manhattanites own (?) genetically engineered giants, “Emotos,” that say supportive things and do supportive things: carry and cuddle the humans when they’re anxious or stressed. The Emotos are dressed like their owners—the ego-ideal for these humans is to imagine yourself as caring, if not actually to be caring. The humans are narcissists, in the strong sense of the word; one artist has plans to “present a presentation” but doesn’t even do that; he daren’t test himself against the world outside his head. In this world, being procreative (“pro-cro”) is seen as an immature activity to be grown out of, while the Emotos themselves are treated as big kids for liking sugar and saying “Wow!” But at the end of the story, the Emotos are glad when finally unburdened of their neurotic humans. Not on caring duty for the night, they reveal themselves as the real grown-ups: self-possessed, beautiful, and seducing each other to bang on the Persian carpet.
You find the need for care, or the inability to care, as a sign of immaturity in more stories closer to home. In the previous question RJ happens to describe a similarly damning example of caring tech in Silicon Valley. (At a push, you could call the show SF&F, in a “two seconds into the future” way?) For those who don’t watch, one of its characters is Russ Hanneman, Napster’s Sean Parker as played by Adam Buxton’s Famous Guy. He has a Smart Home, but with the added feature of a Siri-style voice that coaxes his child back to bed while he is busy ranting about his bank balance being downgraded from a three-comma to a two-comma figure.
Which, to combine the two threads, leads me to Spielbrick’s A.I. Artificial Intelligence. In that film, it’s less the case that by adopting robot children, humans are giving themselves the chance to do emotional labour, to be carers (though there is that); what in fact it reveals is the two-way labour of the parent-child relationship, the labour the infant also does for the adult. (Chief Brody to his young son in Jaws: “Give us a kiss / Why? / Coz I need it”). When David the robot child cannot give that labour adequately (when it is superseded by the earlier but updated organic model, i.e. David’s no longer comatose stepbrother) then it’s out of a job.
Imagine a corporate retelling of the A.I. story, set in a fully-automated future, where a boss gets some humanoid staff anyway so he can still go to meetings.
Leigh Alexander: Perhaps it’s a bit obvious, but I really liked Her, the Spike Jonze film—and not necessarily because I was convinced by the love story between a lonely man and the intelligent operating system he carried around with him, but more because I wasn’t. I think that film is very conscientiously an interrogation of what we—not “we,” really, but your average privileged male consumers—think about the role of both AI and women.
The Joaquin Phoenix character is intentionally creepy, with One of Those Moustaches and a perfectly-pitched smile that’s equal parts childlike and leering as his fingers fiddle hungrily over all his devices and games. A lot’s been written on the visual aesthetic of the film, which is purposefully timeless, a kind of pastel-futurist 1985—the teen boy in his wood-panelled basement entertainment room has grown up in appearances, but that stilted emotional architecture is still obvious.
Her is loosely a warning about underestimating the potential of superintelligent AI, sure. But the more explicit warning is against treating AIs in this particular way—as full-service female labour solutions for lonely men. The female-voiced AI is the protagonist’s smart home, his secretary, his music player, his computer and phone, and also his sexual partner and his girlfriend and the person to whom he confides his deepest heart. She, the superintelligent AI on which this friendless loser depends completely, is satisfied with this arrangement because he is helping her learn something or other about humanity. It is not meant to be romantic; it’s meant to be grotesque. And the best thing about the film is that it’s made that distinction so subtle, it’s made so many obeisances to the male point of view, that you really believe that the age of our robot overlords could very well be brought about by some sorry man who was too horny for his Siri.
Scarlett Johansson was absolutely the perfect choice to voice the AI, incidentally. She’s one of the most idealized actresses in the world, and yet her voice has a slight “flaw”— a touch of fry, the sort of vulnerable note that dorky men climb all over themselves to be the first to exploit.
Laurie Penny: There are two standouts for me—the first being N.K. Jemisin's iconic Broken Earth trilogy, which is all about what my friend, the scholar Willow Brugh, calls “apocalyptic civics.” It's a story, in part, about how we survive social collapse as part of a community—and its flashback sequences show us that the tools we use to build resilient emotional infrastructure in times of crisis emerge from what came before. The books are about a planet shaking itself apart and the people whose superpowers could doom or save it—but they are also as much about parenthood, about community building, about exploitation, and about self-care as a survival strategy.
The second is Becky Chambers' Wayfarers series, which is a delicious, oozing retake of the classic “band of renegades on a spaceship” trope that actually deals in an exciting way both with how the crew take care of one another and with how different alien and robot societies manage what we’d think of as emotional and domestic labour. I squeed everywhere at the polyamorous collective-parenting lizard people. More of that, please.
Emotional labour, of course, is a new name for an old phenomenon that has been as invisible in fiction as it has been in our real lives—there have always been characters whose job it was to soothe and support the hero, to run his or her (usually his) household, to provide the social and emotional infrastructure that allowed the hero to keep journeying without any pesky domestic or emotional arrangements impeding the flow of the plot. Often, all of this happened seamlessly, in the background of the main story. These days, it’s very slightly more likely that those arrangements are actually part of the plot. “Mainstream” literary fiction has the edge on SFF in terms of novels and stories that foreground the experience of caregivers and home-makers—from Jane Austen to Liane Moriarty.
One thing that's really interesting to me is that until very recently, the ideal “strong” or exciting female protagonist was someone who did all of the things male heroes were expected to do in stories, and only those things, albeit with substantially less character development. Now we're seeing stories where people of all genders do the traditionally “feminized” work of survival and building that's just as important as the swashbuckling and derring-do, if not more so. Emotional labour has become part of the adventure, and that's really thrilling to me. Heroes don't simply have to be atomised individuals with a sidekick and a phaser taking on the whole universe—there are different kinds of heroism.
EL: How do you think emotional labour might be depicted in the future—are we just going to write a lot more stories about robots doing it? And do you think there are current forms of emotional labour that should be explored far more in SF&F?
RJ Barker: I think, as the elderly population grows, we're going to see a lot more about caring for older people—especially if governments continue to cut back. It's already being explored in films like Robot and Frank, and I think that idea—of kids who pass on the burden of caring to machines or others—though not new, might start coming up a lot more.
Mazin Saleem: Now that we have robot nurses and thousands of de-stress, mindfulness, and other therapeutic apps, the future feels like it must be moving on. A next development from stories about the outsourcing of human care to the non-human could be stories that explore where the need for so much care comes from, and/or what it'll do to humans when they no longer “have to care.”
Adding to Jess and Leigh’s definitions, Laura Anne Robertson writes about how the concept of “emotional labour” is all about capitalism and gender: in a “post-revolutionary, transformed society, caring labour would no longer be primarily the domain of women. Freed from wage slavery, men and women would both have time to care for the young, old and sick. The collapse of the patriarchal family, a cornerstone of capitalist society, would engender the development of communized care institutions: People would continue to express their love for each other through caring work but this love would no longer be confined within exploitative interpersonal relationships or waged employment (i.e. families or existing social service and health care structures).”
So one way emotional labour could be explored in SF&F is imagining what worlds might look like where people have neither incentive nor disincentive to care, where the need for care from others is lessened in step with the economic coercion to provide it. What happens to a story where none of the characters have mothers or fathers, daughters or sons, as we’d understand them? Eli mentioned Huxley's Island earlier; it’s about as sane a future society along these lines as you could imagine.
On the other hand, it's a lack of imagination to conceive of emotional labour as necessarily exploitative, not to mention a lack of historical perspective; it wasn’t the case that love came about because of capitalism; love was different, and will be different again. Still, a kind of establishment cynicism says that the future will be heartless, either because of capitalism's cyclical dismantling and co-opting of care, or because of some presumed cold communist self-sufficiency (see The Year 200 by Agustín de Rojas). More optimistically, an equal and just world might not be one where we can hand off the burden of emotions à la the Emotos, but one where we free them—the emotions—as with much else (Badiou: “love between two is the minimal communism”; Cornel West: “justice is the public side of love”).
As for what forms of emotional labour we should see in SF&F: though it's true we can only get as well as the system allows, and though it's true we’re worn down at every pass, this only heightens the urgency to try to salvage some of your self; to make systems that do not replicate the unequal and ever-switching power dynamics of carer and cared-for, but instead give you the tools with which to dig yourself out of the trap of your life—dig out by digging down, stupid. Even in these purportedly pro-feels times, there should be more stories about the care you have to take with yourself, although “labour” is a much better word than “care,” because it is work; “self-care” might be kinder, but it lets you off the hook for the effort it has to take. None of this will be easy.
SF&F, especially at its more radical, psychoanalytical, and even mystical end, might have once provided tools for this work. But Ballardian psyche-explorers have stopped going inwards, or, under the guise of going inwards, are in fact escaping outwards, from themselves, to those gardens and their machine elves. Maybe more stories should tackle the suspicion one has that if primitive people from thousands of years ago were brought to the sophisticated present and saw us and our “inexplicable” bouts of anxiety and rage, they'd have to ask us: “If you’re so smart, how come you’re not happy?” What’s needed are stories like a positive spin on what happens to Eleanor in Shirley Jackson's The Haunting of Hill House (the haunted house being a good synecdoche for human civilisation): your historical trauma and the house’s historical trauma and their link must be confronted, so that you can finally be at home.
Leigh Alexander: I think it’s fair to project that robots will, in real life, be doing more of our emotional labour in the near future—but only by a very specific definition: taking on the jobs in technology that are destructive for humans to perform, like content moderation, or triaging users who are in a bad place. You can have bots reply to your harassers for you, or automatically serve 101-level answers to people who tweet things like “why don’t we say ALL lives matter?” because those things are exhausting and take bites out of people.
These developments in tech are a response to vastly increased degrees of connectivity and exposure to one another at high user volumes, and I wonder about the ways that science fiction will respond to this condition of the present. One interesting question a lot of AI professionals are discussing that I’d like to see contemporary fiction explore—especially because the industry’s voiced assistants are still pretty much all soothing, obedient women—is this: if we can use bots as “nodes” for our most undesirable interactions, do we risk degrading our empathy? Do we start to see people as functional, disposable relief nodes, too?
The notion of empathy in the social media environment is among the most popular conversations going on in tech right now, as it concerns not only what we’re learning about society from, for example, Twitter, but also the way we share and process news, the way we shape our beliefs, etc. The emotional labour we do when we politely attempt to convince our Trump-supporting relative that they are spreading non-facts—that experience is both increasingly common AND unique to the current techno-political environment. Can AI argue with our racist relatives? Will your “rational” sexist classmate be more convinced by an argument with a bot than with you?
The concept of robot servants humming ineptly around the kitchen, offering us beverages from the doors in their tummies, makes me profoundly sad. The true future of AI servitude is, luckily, just chilling: everything we want is now at our fingertips; soon we’ll only have to say it out loud, and a woman’s voice will reply “okay.” We’ll need to be prepared for who we will be in such a world.
Laurie Penny: My friend and colleague Willow Brugh coined the phrase “apocalyptic civics” to describe the growing field of thought and study around how we deal with any number of disaster scenarios as a series of interconnected human tribes rather than infinite striving individuals trying to decide whether and how to eat each other’s children. I’d like to see more fiction that works on that—as well as more fiction focusing on the different kinds of relationships and family arrangements that people can have. We’re doing reasonably well, or at least better than before, when it comes to talking about queer and gay relationships—but someone needs to write more about multi-person love and domestic relationships so we can finally, finally, please please stop talking about Robert Heinlein as if he’s some sort of polyamorous prophet. Stranger in a Strange Land is one of the few books I’ve actually put down because the misogyny was too much.
EL: What role does emotional labour have—or might it have in the future—in your own fiction?
RJ Barker: Age of Assassins (and the two follow-ups) is based around the relationship between Girton and his master, Merela, which is basically a parent/child relationship (probably why it's so up in my mind when doing this roundtable), and I've thought a lot about how the way you bring up a child influences who they become. And as Girton grows up (the books encompass a period of about twenty years) you get the gradual reversal of that dynamic: you move from Girton's simplistic, hero-worshipping view of his master to a more complex, though sympathetic, and I think stronger, view of her. I don't think I could have written this series the way it is had I not been a parent. Moving on, I'm not really sure. I tend to discover what I'm writing as I write it, but I've always been a bit of an outsider, and I kind of like the idea of groups and how they bond, just because I've always felt a bit like I'm on the outside looking in, and it might give me an interesting view. Or it may not.
Mazin Saleem: I’m interested in stories where you might not expect or realise that emotional labour is at work, haha, good old puns. In Do Androids Dream of Electric Sheep?, the human race does not realise its emotional connection to the animal kingdom until most other life has died out, after which every human is depressed to varying degrees. The more we discover or admit animal minds and animal emotional life, the starker our exploitation becomes; on top of all the things we do to animals, we suck their emotions like marrow, without them, let alone us, even realising. Maybe a story where, as penance for our past crimes, humans give therapy to oblivious animals, who, only after generations, en masse, and at a subliminal level, start to feel no longer petted or “humanely reared,” but cared for. What would that even look like? In Always Coming Home, all living things are called people, and to us this seems like hippy nonsense. But to the future people in the book, our time was so wrong-headed about everything that they imagine us as having our heads literally screwed on back to front. I’m writing a story called “The Three Versions of Eden” that’s about something like that.
Leigh Alexander: Every bit of fiction I attempt turns out to be in some way about this, I suppose because my fiction is partially a way of expressing the ideas that are important to me in my technology journalism. I got to write a little book for the universe of Netrunner, which is a cyberpunk card game, and my brief was basically to contribute to the world and the lore of the game in ways that would deepen the meaning of the interactions between competing players. Netrunner is about our bleak collapse under the things we’re discussing here—capitalism, corporations, invasive technology—and I wanted to write about the universe’s evil Facebook-Google-Comcast-inspired corporation.
Digital society just didn’t turn out like we hoped. We thought cool hackers would be committing open digital warfare to protect our rights, and instead we just get “mobs of spambots posting about ethics in video games and how feminists should die.” Present tech is always about these grotesque disappointments, probably particularly now that the investor economy for the industry has almost literally escaped into virtual worlds.
For example, when we heard that “Russia may have hacked the US election,” we thought, oh! International digital criminals, able to influence the numbers at the ballot box! But no—that kind of digital interference turns out just to be spambots spraying low-quality news everywhere. We thought, oh! Advertisers will soon be able to look directly into your brain and read your mind, but no, they just spy on your Facebook messages looking for keywords. When I was targeted by a hate mob because of some of my writing online, I was not locking wits with sophisticated hackers. I was dealing with hundreds of Twitter posts all just devoted to mirroring my own personal information back at me, so I would know they had it. It wasn’t any single terrible thing that was done to me. It’s that there was a critical mass of access to me enabled by default, and I couldn’t turn it off.
In the Netrunner book I ended up writing, the corporation simply took advantage of what people were inclined to do with its tools, and that was important to me to communicate. The possibilities of tech are often ground up in the boring and horrible things that people do with it, and the industry isn’t terribly interested in learning from anything. And all my fiction since then, which includes an upcoming short story and a few narrative design projects, seems to naturally orbit around the idea that these episodes of women’s harassment, these servile talking female-voiced AI, and these algorithms that perpetuate the racial biases of their own data, are crucial problems for the future of progress.
Yet just as Deanna Troi is relegated to optional “empath” work, the men who control the tech industry are more interested in holodecks and spaceships. If fiction can do some of the work of supporting that perspective shift, then I want to try.
Laurie Penny: Oh, any fiction I might or might not be writing is absolutely going to focus on the drama of emotional labour and its various fail modes. But I’d prefer not to talk too much about that—I don’t want to jinx it.