Is It Wrong to Have Sex With a Robot?
Sex robots are artificially intelligent (AI) systems encased in a physical form. They are typically modelled on human (usually female) bodies, for the purpose of facilitating sex. Sex robot technology does not yet compare to the sorts of humanoid robots we see in TV shows like Humans or Westworld. However, sex robots with more limited capabilities are available now. You can buy, for example, a RealDoll for $6000+ – a realistic, customisable sex doll which can be connected to a ‘head’ with artificial intelligence that moves, learns user preferences, has its own sexual preferences and personality, sustains eye contact and engages in conversation with its owner.
Sex robots fascinate and excite some people: a recent survey reported by The Sun newspaper suggested that 1 in 5 of us would have sex with a robot, and sex doll brothels are already features of various cities across the world, including Gateshead, Turin, Paris, Vancouver and Moscow. However, many others would be repulsed by the idea of having sex with a robot, or by anyone they know doing so. In this article, we consider whether people who have sex with robots are doing anything morally wrong, and if so, what?
Kathleen Richardson - founder of The Campaign Against Sex Robots - opposes sex robots on ethical grounds: when a person has sex with a robot, they are engaging in an asymmetrical ‘relationship’, akin to sex between a sex worker and a buyer of sex, in which only the buyer is recognised as a subject and the feelings of the ‘object’, i.e. the sex worker or robot, are not recognised. Thus the user of a sex robot, like the buyer of sex, can ‘switch off’ their empathy. Richardson worries that, since the users of sex robots are typically male, and sex robots are typically modelled on women (in particular, young, slim, white women with exaggerated hourglass frames), the use of sex robots will reinforce the dehumanisation of women.
What’s more, owners can customise features such as the robot’s tongue, vagina, pubic hair, eye colour, and personality type - some of which are concerning (for example, True Companion’s Roxxxy can adopt the personality of ‘Frigid Farah’, which allows men to simulate rape, or ‘Young Yoko’ characterised as ‘barely legal’). Given the gendered nature of sex robots, customising robots in this way reinforces, not only harmful unrealistic stereotypes of female sexual attractiveness, but also the idea that men should be able to choose what the women they sleep with look and behave like.
The claim that human-robot sex is asymmetrical - because the sex robot is treated as an object - is, however, worth scrutinising. For one thing, sex robots are objects, and treating objects as objects seems, prima facie at least, unobjectionable. What makes a sex robot different from a sex toy, like a vibrator, is that it appears to offer sex like a responsive human being. The sex robot’s status as an object is perhaps (temporarily) transformed in the eyes of the user, such that the user enjoys a perceptual experience in which the sex robot seemingly also has experiences and feelings of its own.
Recent technological developments have increased this level of sexual realism: artificial skin for robots can now be 3D printed (with sensors that detect temperature, pressure and toxic chemicals), and sex robots can communicate feelings and respond to their partner’s emotional needs (indeed, some social robots are being trialled as therapists, though this has not yet reached the sex market). Furthermore, some sex robots can perceive being touched, and have a bodily temperature and rhythmic heartbeat that mirror those of a human. These design features enable the customer to become more easily immersed in a simulacrum of a sexual experience with a human.
Therefore, the fact that the sex robot is not sentient does not, by itself, mean that the user does not want a two-way relationship with it. By way of comparison, we might consider Objectum-Sexuality (OS) - a very rare sexual orientation of people (typically women), who have amorous feelings for one or more inanimate objects (such as a chandelier, bridge, guillotine), in the same way that someone might love their human partner(s). OS relationships are characterised not only by the inanimate object of love, but also by the fact the human lover believes that they are engaged in a two-way relationship with that entity - involving both reciprocal love and mutual acknowledgement of their relationship. For example, Erika Eiffel - founder of the Objectum-Sexuality Internationale (a support page for objectum-sexuals) and world champion gold medallist archer, describes her amorous love for Lance (her bow) in the following way:
“People realised I loved my bow, but they didn’t realise I loved my bow. [...] I’d almost swear that my blood just flowed from my arm and went right into him, and [..] felt that the molecules in him went right back into my arm.”
Sexologists, researchers, and online communities argue that OS is a legitimate sexual orientation, not a form of paraphilia. A representative of Objectum-Sexuality Internationale explained to us that ‘sexual expression is a product of the growth of the bond between the person and the object they love, not the goal’. Although it is unlikely that a user of a sex robot would be objectum-sexual themselves, OS does show us that it is possible to treat things as people, even to have sex with and romantically love them, in a morally unproblematic way.
Indeed, attributing life-like qualities to inanimate objects is fairly common. A child might think that her teddy bear will enjoy going on holiday; someone watching a puppet show might find the puppet funny. We also regularly empathise with fictional characters, feeling happy or sad for characters on TV, for example. In these cases, attributing life-like qualities to inanimate entities and fictional characters seems to involve engaging with a fictional world, in which one (temporarily) ‘suspends disbelief’ about the reality of the entity one is engaging with. In the case of sex robots, perhaps the user becomes immersed in this fictional world and convinces themselves, in the moment at least, that the sex robot has its own subjective experience; hearing the sex robot groan, ask for sexual favours, and respond to physical touch all enhance this immersive two-way experience.
Thus, although such relationships may appear asymmetrical from a third party perspective (for example – those watching human-robot sex on porn sites), the human physically involved in the sexual act can engage in a subjective experience that appears to involve reciprocity, using the same points of reference as they might in human-human sex.
One might respond that it would still be better if the person engaging in this fiction were having sex with a real human and actually caring about their genuine subjective experience. However, we might question the way we are using the word ‘better’ here. It is not obvious that it is morally better. There is no moral duty to have sex with humans rather than with objects. If someone decided to stop having sex with people and just use a vibrator instead, there would be nothing morally objectionable about their decision. Furthermore, sex with other people is not available to everyone. Some people, such as those with certain mental or physical disabilities, those who have experienced sexual trauma, those who are chronically shy, or those who, for some reason, are regarded as extremely unattractive by almost everyone, might find it extremely difficult to have sexual relationships. Therefore, in theory at least, having sex with a robot might be a welcome option for these people. Having a robot at home might also be enough to ward off painful feelings of loneliness for some. As one owner of a love doll puts it, ‘it's like when a husband has his wife and kids come home, you know, I have that; I have someone at home’.
On the other hand, there are people who will be having sex with robots for morally problematic reasons. For some, what is sexually arousing about having sex with a robot is being able to play out fantasies, which involve the objectification and mistreatment of the person with whom they’re having sex. Indeed, these more nefarious sorts of intentions can help us to understand why some sex-dolls and robots are modelled on individuals who would be safeguarded in real life, such as a school girl, and also, why some sex robots have such unusual design features, such as being edible (giving the term ‘eating out’ a whole new sexual meaning).
One potential response to these problematic cases is that being able to indulge such fantasies with a robot might stop these people from playing them out with real people. However, this is speculative - we don’t have good evidence that this would be the case - so it seems right to err on the side of caution. In addition, tolerating simulated rape or other kinds of mistreatment of sex robots is disrespectful to survivors of sexual violence, and to women in general. Indeed, some of the concerns about sex robots parallel concerns over the Japanese video game ‘RapeLay’, which was launched in 2006 but later banned. This video game allowed users to ‘play’ at raping women. While the characters in the game, of course, couldn’t be harmed, as they were fictional, people were concerned not only about what playing such a game would do to the mindset of users, and how this would affect the sex they had with real women, but also about the way that the mere existence of the game trivialised rape and undermined the position of women in society.
Even if the user of the sex robot does not have bad intentions, we might still be concerned about them having sex with a robot which they are treating like a human, given that the robot cannot consent to the sex, as it cannot refuse it. It might seem odd to be worried about this – after all, we don’t care that a vibrator can’t consent to sex! However, given that those who have sex with robots might convince themselves that the robot is a person, we may reasonably worry about what having sex with robots might do to the ways they think about consent more broadly. If they become accustomed to having sex where, when and how they want to - without having to get consent from the robot - they might become less concerned about consent when having sex with their human partners. And if sex with robots becomes widespread, this could affect norms of consent for sexual relations more broadly. This might sound like fear-mongering, as we don’t yet have good evidence that sex with robots would lead to users being less concerned about consent with human partners. However, it is hugely important to ensure that the norms of sexual consent are not trivialised, and so we need to be careful about anything that could do this.
There are various approaches to regulating the sex-robot market which could at least help to mitigate some of the potential damage to norms of consent caused by sex robots. One approach involves educating the user about the importance and nature of sexual consent (perhaps they need to complete a course, or sit an exam, before they can purchase the sex-robot). Another possible approach is that roboticists could encourage sexual consent as a feature of human-robot sex through design. Perhaps, for example, the robot might not provide sexual functions until it was sufficiently ‘seduced’ - e.g., through conversation, shared time or being asked if it was ‘in the mood’. The philosopher, John Danaher, suggests that ‘the robot might sometimes randomly refuse its user, and always provide affirmative signals of consent when it is willing to proceed’ (p. 116). Furthermore, sex robots modelled on people who would be unable to consent in real life, such as schoolgirls, or robots designed so that the user can simulate rape with them, could be banned.
So, is it morally wrong to have sex with a robot? The answer to this question is not clear-cut. It will depend not only on the motives and behaviours of the person having sex with the robot, but also on what the robot is like, and on wider sexual norms. This technology is in the relatively early stages; it is still possible to limit the damage the sex-robot industry could cause, whilst accepting that not all human-robot sex already is, or is destined to become, morally troubling. The sex robots are coming, and now is the time to think carefully about what they should be like and what kind of place we want to allow them to have in our lives.
We leave aside the question of how accurate Richardson’s characterisation of sex between a sex worker and a buyer of sex is.
Natasha McKeever is a Lecturer at the Inter-Disciplinary Ethics Applied (IDEA) Centre, in the School of Philosophy, Religion and History of Science at the University of Leeds. She works primarily in the philosophy of love, sex and friendship, but she is interested in applied ethics more broadly.
Ruby Hornsby is a PhD student at the Inter-Disciplinary Ethics Applied (IDEA) Centre at the University of Leeds working in the area of Ethics of Social Robotics. Ruby is currently researching the nature of human-robot relationships to determine whether robot companions have a place in the good human life.