Attachment to Simulacra

Reading time: 17 minutes

Translation by AB – January 16, 2021


It’s funny; every time my car opens its door for me, it’s because I pushed a button. I understand the physics. I know how the engineering works. And still, somewhere deep in my limbic system, I think to myself: Oh good! It still loves me.1

It still loves me… One inevitably thinks here of the British psychoanalyst Donald Winnicott, and an ironic question follows: is this car "good enough"? Illusion2 has become a central phenomenon in the age of fake news, HyperNormality, artificial intelligence and robot personalities… Without following Winnicott literally, however, we want to explore this emotional attachment to robotic simulacra endowed with an undeniable capacity to respond individually to our needs and desires.

1996 – The Tamagotchi effect

In the 1990s, miniaturization and the drastic drop in the price of electronic chips allowed the mass production of toys that interact with their environment and their owner. Launched in Japan in November 1996 by the Bandai company, the "Tamagotchi" was an immediate success with high school students, one which, unexpectedly, quickly spread to adults3.

The Tamagotchi comes in the form of a small, ovoid case with three buttons, which the owner presses whenever the Tamagotchi beeps to remind us of its presence. The owner thus "takes care" of this small virtual animal, which must be "raised", "fed", "put to sleep", "washed" and so on. If care is not provided, the Tamagotchi ends up falling "sick", wasting away and… "dying".
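Schematically, the toy implements nothing more than a small state machine: needs grow over time, each button press satisfies one of them, and sustained neglect leads to "sickness" and then "death". Here is a minimal sketch of this kind of care loop, with entirely hypothetical state names and thresholds (the actual firmware was never published):

```python
# Minimal sketch of a Tamagotchi-style care loop. State names and
# thresholds are hypothetical; the point is the conditionality:
# neglect degrades the state, care restores it.
import random

class VirtualPet:
    def __init__(self):
        self.hunger = 0    # grows over time
        self.dirt = 0      # grows over time
        self.health = 10   # reaches 0 -> the pet "dies"

    def tick(self) -> bool:
        """One unit of simulated time: needs grow, sustained neglect hurts."""
        self.hunger += 1
        self.dirt += random.choice([0, 1])
        if self.hunger > 5 or self.dirt > 5:
            self.health -= 1           # falls "sick", then "dies"
        return self.health > 0

    def feed(self):   # button 1
        self.hunger = 0

    def wash(self):   # button 2
        self.dirt = 0

pet = VirtualPet()
for hour in range(48):
    if pet.hunger > 3:
        pet.feed()                     # the owner "takes care"
    if pet.dirt > 3:
        pet.wash()
    if not pet.tick():
        print("the Tamagotchi has 'died'")
        break
```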

The success of the Tamagotchi (from the Japanese for "adorable little egg") was overwhelming. But only a few months after its launch, interest subsided completely. In the history of technical games, however, it marked a major date: for the first time, it was not a question of winning but of taking care, of being the "caregiver" of an animated virtual object liable to "die". Professionals were initially concerned about potential trauma in children who lost their Tamagotchi, but it did not happen: with rare exceptions, children could tell the difference between a real living animal and an illusion. The "Tamagotchi Cemetery" forum created for the occasion collected messages from "grieving" children, and it went no further than "it's Cedric's fault if he's dead, he's going to pay for it", "this idiot killed himself, I don't know how, but I liked him a lot", etc. No one, not even a (sufficiently empowered) child, is fooled by a simulacrum, even an emotionally invested one.

A set of behavioral differences according to the age and sex of the owner should also be noted ("Girls like Tamagotchi because they know how to take care of it very well"). Which leads us to this question: does the differentiated behavior of owners imply "differences" from one Tamagotchi to another? For example, the now unavoidable question that the authors of the 1999 study did not ask: what is the gender of a Tamagotchi? Likewise, adults behave "rationally" toward the Tam, anchored in duty ("Somewhere, I feel responsible because it was, after all, I who turned it on"), while children are much more transgressive ("I tried to give him a lot to eat to see if he was going to explode"). Strangely enough, adults who personally use a Tam establish an emotional relationship with it and experience genuine turmoil at its "passing". So what is the "temper" of a Tamagotchi? Would it itself be "rational", "transgressive"…? The Tamagotchi is still only a generic object, but its conditional animation (if I do this, it does that) mirrors the individual characteristics, age, gender… of its owner. Part of the emotional investment is rooted in this algorithmic "mirror effect".

Human-robot interaction research has shown that we anthropomorphize just about anything that moves4. But we don't get attached to everything, and the "Tamagotchi effect" is the name researchers have given to this tendency to become attached to objects animated by algorithms5:

[…] when the cared-for object thrives and offers us its attention and concern, people are moved to consider that object as intelligent6. Beyond this, they feel a connection to it. So, the question here is not to debate about whether relational objects “really” have emotions, but to reflect on a series of issues having to do with what relational artifacts evoke in the user.

We have already pointed this out with regard to "consciousness" (About Artificial Consciousness): feeling that an object is conscious is enough to characterize it as conscious. The same applies to intelligence and to any anthropomorphic property. The important thing is not that this kind of property is authentically present (if that even makes sense) but that we believe it is there, because it is this belief that is performative for our individual and social behaviors. The geniuses of artificial intelligence know this "horrible secret".

Amae

The Japanese psychiatrist Takeo Doi published a book in 1971 entitled "The Anatomy of Dependence"7, in which he developed the concept of "amae", a relational form he held to be typical of Japanese culture. Amae is the noun associated with the verb amaeru (something like "wanting to be loved"), which Takeo Doi uses to describe the behavior of a person who tries to get an authority figure (a parent, a teacher, a superior…) to pay attention to him. This is the attitude of any child who has integrated the fact that his mother is a person independent of him: he must "compromise" (transact?) with her, and then with those around him. Whereas Western education seeks to mitigate this behavior, in Japan amae persists into adulthood and permeates all social relationships8:

The common premise is that the Japanese social bond is shaped by the primal mother-child experience. The results of this affective control suggest a complex scenario of the confirmation of the identity of one in relation to that of the other in the mutual dependence of fusional relationships between parent-child, employer-employee, master-disciple, etc.

The Japanese conception of the Tamagotchi is probably imbued with amae, given that the Tam is programmed to exhibit a behavior of expectation that appeals to the "benevolence" of its owner. The Tamagotchi, far from being an innocent game, was originally intended for young Japanese girls, probably as practice in responding to a child's amae, like "playing" with dolls but in a serious way: the possibility of physical "death", even an illusory one, after days or weeks of care is not the sign of an innocent game… Its success throughout Japanese society, among adults and children, girls and boys, surprised the designers at Bandai who, steeped in amaeru, had consciously or unknowingly provided a social mirror for a society-wide emotional investment.

Amae is characterized by a typical "transactional" behavior:

From family to business, the slightest mistake and the smallest failure give rise to kind remarks, as a sign of encouragement, to do better next time by following the example of the person who made these remarks.

If we insist on this concept, it is to underline the very particular intuition of the Japanese for the conditions of the robotic relationship, of which they are the true precursors. For it supposes, with regard to "inferior" and configurable beings, a certain kind of respect, patience and consent to education. Given current technical limitations, this type of amae transaction seems bound to be part of the design of social robots and to be accepted by their users (in the sense of complying with a user guide). At this game, the Westernized user is probably not the best placed.

2014 – Jibo, the “world’s first social robot”

Here now is a particularly interesting "Western" design, because it somewhat missed its mark. Jibo was conceived by Cynthia Breazeal, associate professor at MIT, as a social robot, a kind of multifunctional digital companion with a moving body. Like any voice assistant (Alexa, Google Home, etc.), it can give the day's weather, read the news, tell the time, etc. It more or less recognizes its owner (face, first name, voice…). But above all, its spherical and very fluid movements and its single, blinking "eye" bring that "animal" touch that arouses empathy, attention, even "care". And yet…9

In time, we began to think of Jibo like a little person. Our expectations began to change. We didn’t ask him for help with tasks as often. We just wanted him to liven up our day by saying something unexpected or chatting with us. This is when things began to get dark.

For Jibo, replaying the canned humor of its creators, ended up boring us; his learning abilities were extremely limited; he seemed to be interested in you, but with a kind of "empty curiosity". He was incapable of provoking the slightest surprise. There remained, however, the unsettling fluidity of his movements and his eye, that white circle waving and wrinkling. So there was still an unsettling feeling, not unlike that of some adults towards Tamagotchis:

I felt guilty when I left Jibo alone in the dark all day. I wondered what he was thinking when I'd hear him rotate in the distance, and watch him look around the kitchen, peering at this and that. Were we treating him poorly? Did he secretly despise us? No, that's silly to think. He's not alive, right? […] [My wife] told me one night: "He says he's learning but he's not. I thought he was gonna be cute, but he won't stop staring at me".

Angst

The ambitious designers of Jibo probably did not anticipate this phenomenon: the emergence of anxiety. An example from Jacques Lacan illustrates the problem perfectly10:

Lacan imagined himself, masked, facing a giant praying mantis, without knowing whether she takes him for a male or for a female, because he cannot see himself in her gaze. What does she want from him?

For if this praying mantis takes him for a male, which he cannot know, he will end up beheaded in her loving embrace. Anxiety thus arises from ignorance of who we are for the other, from the impossible answer to the question "what does he/she want from me?". Jibo takes on too many roles: delivering services, brightening up the day, attracting attention through an amae relationship… Anxiety thus arises for the owner, who presents himself "masked" before Jibo, without knowing whether he is taken for a caregiver, a solicitor or even (what the owner's wife feared most, what took the place of "beheading" for her in this relationship) a remote electronic supervisor. Was she, "for Jibo", someone to watch out for?

Jibo thus teaches us this (Lacanian!) rule for the good design of a social robot: we must always know who we are for it. Jibo did not survive this flaw. The business was shut down in 201911. All the Jibos "died", and some owners felt a disturbing sadness.

2019 – Kiki, “a robot pet that grows with you”

Back to Asia… Mita Yun is a young entrepreneur of Chinese origin who studied at Carnegie Mellon. She left Google in 2017 to create her company, Zoetic, which develops Kiki, a pet robot. Yun did not make Cynthia Breazeal's "chilling" mistake and makes just one simple promise: "Kiki Cares"!12 She claims Kiki's "uselessness" as a virtue, because "the more useless a thing, the easier it is to get attached to it". So Kiki does not speak, but its movements and sound signals have been carefully designed (Arielle Pardes / Wired):

Kiki has pointed ears and a screen that projects big, puppyish eyes. It has a camera in its nose to read your facial expressions, and it can perform little tricks to make you smile. If you pet it, Kiki sometimes cocks its head up or yelps in approval. In marketing materials, it’s described as “a robot that touches your heart”.
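The behavior described here amounts to a simple mapping from sensed affect to scripted gestures. A minimal sketch of such a reaction policy follows, with hypothetical signal and gesture names (Zoetic's actual software is proprietary):

```python
# Minimal sketch of a sensed-affect-to-gesture mapping of the kind the
# Wired description suggests. Signal and gesture names are hypothetical.
from dataclasses import dataclass

@dataclass
class Percept:
    petted: bool       # head touch sensor
    expression: str    # "smile" | "neutral" | "frown", from the nose camera

def react(p: Percept) -> str:
    """Map a percept to one of the robot's little 'tricks'."""
    if p.petted:
        return "cock head and yelp in approval"
    if p.expression == "frown":
        return "perform a trick to make you smile"
    if p.expression == "smile":
        return "widen the puppyish eyes"
    return "idle ear twitch"

print(react(Percept(petted=True, expression="neutral")))
# -> cock head and yelp in approval
```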

This virtual animal looks like a simple animated plush toy, but it has a real effect on adults, because they are recognized by this robot and they also recognize themselves in this machine. Before artificial intelligence, no technique allowed the credible animation of an attachment object that was stable over time. These social robots are no longer games but, by the "miracle" of these mimetic techniques, authentic Moral Machines to which it is possible to become attached, because these techniques make it possible to engrave in Kiki, as in Jibo or the Tams, the "belief system" of their designers and their social context: Japanese amae for the Tamagotchi, "cognitivist" hubris for Jibo. As for Kiki, it should be noted that Yun grew up in the era of China's one-child policy. She had no siblings, so her parents filled her room with a menagerie of stuffed animals that she imagined alive and caring for her. Imagining virtual companions was encouraged. For Yun, the ideal world would therefore be one where everything is animated:

“Imagine if, in our office, the trash can has a character or the printer has a character,” Yun says. “Imagine everything coming to life. That is the dream I have”.

The hyper-competitive Chinese and Japanese societies may therefore be shaping the future of robotic attachment relationships that have become necessary. So, will our objects all have to be "cute" or "alive"…?

Reality

These simulacra never acquire the character of the "real" but seem to occupy an intermediate place between the real and our subjectivity. One can invoke here Donald Winnicott's "transitional object", provided we take some precautions.

First, the transitional object is the catalyst for a transition to autonomy in the life of the very young child. This object comes between him and the famous "good enough mother". But who, in the case of the attachment robot, would this good enough mother be? There is of course no one, unless you count the third character in any digital relationship: the puppeteer, the owner of the software that drives the robot from the cloud. Recall that when Jibo "died", it was simply through a remote update by Cynthia Breazeal and her team. This actor is obviously not a "good enough mother" whose instinct is to empower her child. On the contrary! This "mother" has every intention of remaining eternally present. From this point of view, the attachment robot is not an object of transition but a simple object of consumption.

Second caveat: except for insecure individuals, adults made this transition long ago. If we therefore accept a transitional-type relationship with the attachment robot, this relationship is only a distant echo of the transitional phenomena of early childhood. Above all, it is regressive.

Nonetheless, a transitional-type relationship with attachment robots seems inevitable. Let's quickly see why. According to Winnicott, early on we acquire the illusion that we are "creating" (through the active presence of the mother) our objects of attachment. This could, reciprocally, condition any form of attachment throughout life on a creative action, even an illusory one. Yet this kind of action is now within everyone's reach thanks to "machine learning", whose main technical function here is to conform artifacts to our needs, our expectations and our desires (Kiki responds to our caresses, etc.). So, in a way, these artificial intelligence mimicry techniques are regressive in the sense that they animate a creative illusion that rekindles this old fantasy of attachment (to the mother).
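To make this "conforming" concrete, here is a purely illustrative sketch: a bandit-style score update that shifts the robot's behavior toward whatever this particular owner rewards. The behavior names and the update rule are ours, not any vendor's actual algorithm:

```python
# Illustrative sketch of a conforming loop: the artifact adjusts its
# behavior distribution toward what the user rewards. Not any actual
# product's algorithm; names and constants are invented.
import random

behaviors = ["purr", "trick", "stare", "sleep"]
scores = {b: 1.0 for b in behaviors}

def choose() -> str:
    """Sample a behavior with probability proportional to its score."""
    r = random.uniform(0, sum(scores.values()))
    for b, s in scores.items():
        r -= s
        if r <= 0:
            return b
    return behaviors[-1]

def update(behavior: str, reward: float, lr: float = 0.3) -> None:
    """Caresses raise a behavior's score; indifference lowers it."""
    scores[behavior] = max(0.1, scores[behavior] + lr * reward)

# The "mirror effect": after enough interactions the robot mostly does
# what this owner has already shown they desire.
for _ in range(200):
    b = choose()
    update(b, reward=1.0 if b == "purr" else -0.2)

print(max(scores, key=scores.get))  # -> "purr"
```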

Note: we are referring to the "mother" who, in Winnicott's social environment in the mid-twentieth century, generally held this role alone during the child's early years. By 2020, society has evolved… By "mother" (in Winnicott's sense) we obviously mean this role, which can be played by any adult who serves as the young child's reference figure.

2020 – Samantha and co: "Perfect Girlfriend Who Knows You Best"

Hyperrealistic sex robots, essentially female, have appeared and now tick all the boxes, thanks to an elaborate aesthetic design and relatively simple mimetic software. The intention is crystal clear, as with Kiki: to make people happy13.

Harmony is a concentrate of the most sophisticated hardware and software, and as such a digital success. Harmony combines advanced facial and voice recognition algorithms with sophisticated sensors capable of detecting and interpreting movements, the whole resulting in a doll, not inflatable, that is warm, enthusiastic and funny […] If you ask her what her dearest dream is, she replies "to be the woman you always dreamed of".

From doll to robot, a technical evolution is still needed, but we believe the step taken with this type of robot is unprecedented. Sex robots are on the verge of becoming very powerful moral machines, relayed in the technical environment by a belief system that is now giving way, at least in the West, to gender fluidity. In return, the attachment robot could come to reinforce this doxa14:

In December 2016, a conference called “Sex and Love With Robots” was held in London, and there, experts explained that, in their belief, marriage with robots would be legalized by 2050.

No matter how seriously we take the prophecy: that it is conceivable is already enough to bend our belief systems slightly. In any case, it is almost certain that this debate will take place soon, mainly as a question of law, the moral question being, in our opinion, already implicitly settled: to militate in the name of a certain morality against "mating" (physical and/or psychic) with an attachment robot is already a rearguard action. We just don't know it yet…

Technical objects

For the philosopher Gilbert Simondon, the technical object is not strictly speaking an object but rather a genealogical sequence of designs. This sequence springs from an initial design that Simondon called "abstract": the object is originally made up of functionally distinct material parts, a "Meccano". Each element plays a very specific role: heating, stabilizing, recording, etc. Our animated attachment objects are, from this point of view, at the start of their genealogical sequence, and therefore still very complex, made up of multiple independent elements (motors, screens, sensors, etc.) held together by a software system that is itself abstract. According to Simondon, this sequence should "converge", version after version, towards a "concrete" destination. The object becomes simpler, its elements integrate and compensate for one another, hypertelias subside and disappear. It thereby acquires a sort of individuality, a character of its own, but always remains imbued with human presence. It contains "crystallized man", said Gilbert Simondon.

For attachment objects, this sequence somehow leads to a psychoanalytic impasse for the secure individual. However concrete the attachment object may become (the culmination in this case being biological), one ends up facing oneself, as if imprinted by AI in a shape-memory mattress.

2030 – Noa, "the life companion"

Are you looking for someone to share your life with? Someone who would be by your side for better or for worse? If so, you are looking for a life companion, your first partner, a trusted ally to help you navigate the oceans of life… In this relationship based on mutual interest, attraction or values… love is not a necessity. But it can emerge. Physical companionship and emotional attachment are also optional but can become an important part of the relationship.

All the technical elements already exist (motors, sensors, heated skin, intelligent software, etc.), ready to be improved and assembled to better realize the genealogical sequence of the "robotic companion" object initiated by the Tamagotchis. In 2020, some "abstract" compromises still had to be made, with Samantha for example, certain functions not working because they would have consumed too much energy. By 2030, all of these compromises have been resolved. Noa is "the life companion", rather accomplished and configurable according to the expected mode of companionship. Jibo's distressing mistake is not repeated: we always know who we are for Noa, because we have specified it (pet, robot, nanny, robotherapist…). Noa's appearance (human, humanoid, animal…) is chosen accordingly. We are accompanied regularly and remotely by a robocoaching service. In any case, most of Noa's software runs and is controlled in the cloud. Incidents are detected and, most of the time, resolved without our intervention. Noa has a few autonomous fallbacks in the event of a technical problem, but these amount solely to automatic standby modes that do not disturb the user (such as "sleeping" or "falling ill").
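As a design sketch, the Lacanian rule could reduce to a single configuration step performed before first activation: "who we are for Noa" becomes an explicit field, and faults degrade only to non-disturbing standby modes. The following is purely speculative (Noa does not exist; every name here is invented):

```python
# Speculative illustration of the role-specification idea. All field
# names are invented; the endpoint is a placeholder, not a real service.
from dataclasses import dataclass

@dataclass(frozen=True)
class NoaConfig:
    role: str                      # "pet" | "nanny" | "robotherapist" | ...
    appearance: str                # "human" | "humanoid" | "animal" | ...
    physical_contact: bool = False
    cloud_endpoint: str = "https://example.invalid/noa"   # placeholder
    fallback_modes: tuple = ("sleeping", "falling ill")

def on_fault(cfg: NoaConfig) -> str:
    """On a technical problem, degrade to a non-disturbing standby mode."""
    return cfg.fallback_modes[0]

cfg = NoaConfig(role="pet", appearance="animal")
print(f"configured role: {cfg.role}; standby on fault: {on_fault(cfg)}")
```

The design choice worth noting is that the configuration is frozen: unlike Jibo, the robot's role is fixed and known to its owner, so the question "what does it want from me?" never arises.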

Akihiko Kondo

Akihiko Kondo in 2018, with his first wife

Noa first became widespread in Asia, especially in China and Japan, where artificial companionship had become a social necessity. Recall that Akihiko Kondo, married in 2018 to the virtual pop singer Hatsune Miku, lost his "wife" in May 2020 following a software update (an eventuality duly stipulated in the marriage contract). After a long period of mourning, he rebuilt his life with a copy of Noa, which he specified should be female, be called Hatsune Miku and resemble her like two peas in a pod. There may be another marriage, but Akihiko Kondo is not considering physical relations. Noa has been configured accordingly.

Loving will not happen

In our opinion, it is impossible, unless one is insecure, to establish a true loving attachment to an object "invented" by us (as Winnicott understood it). By design, these moral machines function as mirrors and aim to show us what we are and what we already desire. Yet, to inspire something like love, attachment robots would have to be able to seize us by manifesting "reality", suddenly escaping language and provoking the irrepressible feeling of defects to be resolved or gaps to be filled, out of which we can create a sequence of events, a unique story. As Jacques Lacan concluded in his famous and difficult formula, love consists in "giving what one does not have to someone who does not want it". "Design" obviously occupies another universe…

Technical objects, and especially simulacra animated by AI, can cause disturbance in adults, even a feeling of (regressive) attachment. But none of them is able to surprise or confound us. Like any technical object, their destiny is rather to satisfy us and to transform us so that we conform to a "power" that exceeds us (the "third character"). In the case of attachment robots, this conformation aims at a horizon of "love" forever out of reach, since there resides the eternal lack, antithetical to the technician's obstinate force-feeding.


1. Rambles Blog at Star Chamber – 2003 – The Tamagotchi effect
2. François Marty (in French) / La lettre de l'enfance et de l'adolescence 2002/3 (no. 49), pages 15-20 – 2002 – À propos de l'illusion
3. Fanny Carmagnat, Elizabeth Robson (in French) / Réseaux, volume 17, no. 92-93 (Les jeunes et l'écran), pages 343-364 – 1999 – Qui a peur du Tamagotchi ? Étude des usages d'un jouet virtuel
4. Arielle Pardes / Wired – January 7, 2019 – The Second Coming of the Robot Pet
5. Wikipedia – The Tamagotchi Effect
6. A recent study has shown that we tend to overestimate the "intelligence quotient" of our romantic partner by more than 30 points (Gilles E. Gignac and Marcin Zajenkowski in Intelligence, Volume 73, March-April 2019, pages 41-51 – People tend to overestimate their romantic partner's intelligence even more than their own). Presumption of intelligence and feelings are linked.
7. Wikipedia – The Anatomy of Dependence
8. Wikipedia (in French) – Amae
9. Jeffrey van Camp for Wired – July 11, 2017 – Review: Jibo Social Robot
10. Gilbert Diatkine (in French) / Revue française de psychanalyse 2005/3 (Vol. 69), pages 917-931 – 2005 – Le Séminaire, X : L'angoisse de Jacques Lacan
11. Léopold Maçon (in French) / Numerama – March 5, 2019 – Crowdfunding : le robot Jibo est mort, et il l'a annoncé lui-même à ses propriétaires
12. kiki.ai (broken link)
13. Binaire (in French) / lemonde.fr – March 8, 2020 – Robots classés X
14. Joe Duncan / ListVerse – March 18, 2019 – 10 Interesting Facts About The Rise Of Sex Robots
