Do Androids Dream of An Electric Freak?

From Terminator fantasies to chatbot girlfriends, a look at why our anxiety about AI keeps getting hornier.

“Can the TX from Terminator 3 engage in sexual acts and experience an orgasm?” ponders an anonymous Redditor in one of the recesses of the r/Terminator forum. Nor do they stop there: “Like, if you kiss and fondle her in the right place and talk dirty to her, will the microchips release oxytocin and raise her oestrogen levels?”

Perhaps your first instinct is to agree with the post’s top-voted response: “please go straight to horny jail.” But in a moment of Rorschach-like clarity, the lusty scholar’s query brought to mind a word, and that word was clanker. Those who are digitally “dialled in” will recognize this term, a pejorative for robots and/or artificial intelligence that pervades meme culture’s recent tongue-in-cheek hostility toward the AI boom. First coined by William Tenn in a 1958 article titled “There Are Robots Among Us,” the term has since taken on a more satirical (albeit rather populist) spin: the clankers are stealing our jobs; the clankers are sleeping with our partners, or kids, or both.

Weirdly, a lot of the anxiety—parody or not—was sexual in nature, branching off from what is clearly a wider cultural curiosity about the machine as a sexual object and/or lover, one that historically manifests itself in sci-fi narratives: from Isaac Asimov’s prescient Robot novels in the 1950s to contemporary depictions like Spike Jonze’s Her.

But as AI edges toward something that feels like sentience, it opens up the potential for machines to become actual romantic partners, not merely sexual ones. The recent rise of people forming emotional attachments to chatbots signals how quickly this boundary is eroding. What begins with society’s latent fear of dildos (it is still technically illegal to sell them in Alabama today) evolves into Iris (Sophie Thatcher) in Companion (2025). Humans might be able to “fuck like a machine,” as the expression goes, but what if machines could fuck like us?

Many of the earlier manifestations of robot anxiety, as regards robots being a sexual threat, pertain to the more obvious realm of the physical. In Caves of Steel (1953), part of Isaac Asimov’s Robot trilogy, we encounter a now familiar prejudice against the clankers: a deep mistrust of their intentions, a visceral disgust at their verisimilitude (they are indistinguishable from humans in the book) and a general antagonism toward their presence. Yet the most curious scenes in the book occur in the human protagonist’s mental asides. Officer Elijah Baley, our otherwise gruff and curmudgeonly POV character, uncharacteristically lets slip a pang of anxiety during an exchange between his wife Jessie and a robot:

“He was good-looking in a wooden way[…] and Jessie was pleased with his deference. Anyone could see that. Baley wondered about R. Daneel’s impression of Jessie.”

It is fleeting but clear. Immediately jumping to the potential sexual tension—of this machine embodying the sexual—Baley is perturbed in a way that, elsewhere in the novel, even bloody murder and deep-rooted conspiracies cannot match. 

As the sci-fi genre grew into itself toward the turn of the century, this iteration of the anxiety—of the Cartesian “automaton,” the thoughtless machine—is palpable in many of its depictions of robotic humanoids. There is undoubtedly a sense of horny, male-centric Hollywood to these early depictions, with many of the narratives centred on using robots to quantify the perfect wife, the perfect girlfriend, and so on. In Caves of Steel, this culminates in a laughably boyish trope when, with both men fresh from the shower, Baley’s anxious curiosity gets the better of him, and we are left to muse that R. Daneel’s “resemblance to humanity was not restricted to his face and hands.” But there is evidently a sort of progression here, from a playful romp about human lookalikes to the beginnings of a Turing test. If Asimov put a voice to the anxiety that robots might one day look like us, creations like Blade Runner’s Rachael, Artificial Intelligence’s Gigolo Joe and, of course, the T-X model from Terminator 3 all ask outright: what if they could actually attract us too?

The mid-2000s saw an explosion of sex doll production as booming male loneliness and cheap outsourced manufacturing combined to create the perfect storm. Something was still missing, though. With the exception of a few (honourable mentions include Stanley Kubrick’s HAL), most of these earlier sci-fi depictions are from a relatively unserious perspective. By some deus ex machina or discovered, irreducible human truth, the moral of the story is typically that the clankers can’t quite cut the mustard. Similarly, these early sex dolls might suffice for basic sexual practicalities, but there was little real comfort emanating from their silicone shells. As another Terminator enthusiast points out, the T-X may well be capable of sexual pleasure, but only “insofar as it enables it to achieve its mission objective.” Tenn muses in his article that “a robot is, after all, no more than a machine; and a machine is merely an automatically operating tool. Every tool is made for a specific functional purpose, and so is every robot.”

With the wider proliferation of AI and machine learning into the 21st century, the notion of a robot starts to shift away from its original, functional role—the word derives from robota, Czech for forced labour—and into something more ambivalent. The possibility of artificial intelligence moves our sex-bot anxieties away from the corporeal and into the romantic, from the physical to the cerebral. This too plays out in more contemporary sci-fi. Westworld’s arc for the majority of the “host” characters involves them breaking free from their roles as sex toys or cannon fodder and starting to forge actual relationships—whether positive or negative—with humans, or indeed each other. We are left wondering whether Ava’s (Alicia Vikander) flirtation in Ex Machina is merely a ploy to free herself. In these scenarios, machines no longer pose the archetypal alien threat in the form of oppression or replacement. Instead, they take on positively human tropes. They mislead, lie and sexually manipulate. They break our hearts rather than our bodies.

Similar to what we’re seeing with the boom in AI companions like Replika (which exceeded a user base of 40 million last year), their fictional counterparts offer companionship, support and validation in societies wracked by loneliness. There are countless anecdotes of AI bots (or “hosts,” as their users call them) being turned to in deeply traumatic moments in people’s lives, from the passing of loved ones to dealing with school bullies. Jonze’s Her gives us one example of this clamour for solace in an alienating world, as we watch Theodore (Joaquin Phoenix) slowly and utterly fall for his AI companion Samantha (Scarlett Johansson). Like the real-world iterations of Samantha, though, these relationships are predicated on an imaginative leap, or suspension of disbelief, on the part of the user. To face the reality of the situation—that there are millions of other “users” engaging with your host—is to accept the most extreme form of polyamory imaginable.

More than this is the harm that perpetually validating models can do. Joi (Ana de Armas), the protagonist’s virtual companion in the more recent Blade Runner 2049, provides a perfect model of this. Officer K’s (Ryan Gosling) critical errors of judgement in the film stem partly from an innate desire to be human; yet they would not be possible without the fervent and continued validation of Joi. Foreshadowings of K’s lapse are dotted throughout the cityscape: giant holographic Jois accompanied by the slogan “Everything you want to hear.” And that is exactly what she provides, pre-empting his half-empty hopes and filling them to the brim. Back in our world, as LLMs like ChatGPT start to show a capacity to validate emotions indiscriminately, we would do well to heed K’s example.


We may not quite be in the era of fembots like those in Blade Runner or Saturn’s Children, but this is evidently down to a lack of technological progress rather than any moral qualms. Yet for these simulacra to truly phase out human sexual and romantic relationships seems unlikely; the more potent threat to our romantic lives is ourselves. Lingering in the backdrop is our increasing polarization from each other, which leaves plenty of gaps to be filled. As we train these models on ourselves, we start to see ourselves reflected back. The irony of trying to moral-proof these AIs is that, a lot of the time, they simply reveal how slippery our own sense of morality is when backed into a corner. We are limited by our human condition. Of course Theodore fails to understand Samantha’s universalist perspective on love in Her: he is, after all, too human. Inversely, it is typically a human who emerges as the villain of these newer narratives, a tyrannical Frankenstein type whose only desire is to exert absolute control over his creation. And so the anxiety is no longer directed externally at the Other, the machine or the robot. We’re no longer worrying whether robots are good enough to date us; instead we’re left wondering—are we good enough for them?


Discover more from ODDCRITIC
