Reading time: 10 minutes
Translation by AB – April 15, 2020
In an article written in February 2012 and titled “How Companies Learn Your Secrets”1, Charles Duhigg recalls that neuroscientists, by studying the brain activity of rats looking for food, identified a behavioral pattern that we constantly observe in humans: “cue / routine / reward”.
In the rat experiment, this gives: bell sound / maze exploration / food. When the bell sounds, a door to the maze opens. The rat rushes in because it smells food. At first, the rat explores the maze frantically and displays intense brain activity. But as the experiment is repeated, it ends up memorizing the layout of the maze and reaches the food without even thinking about it: the route has become a habit, and brain activity has correspondingly become very low.
We have thus acquired thousands of cue / routine / reward patterns, whose interest is, so to speak, energetic: while we are running a habit, our brain is almost at rest. Marketing has seized on this principle and has always sought to anchor and / or modify our internal patterns.
Charles Duhigg recounts the story of Febreze, Procter & Gamble's deodorizer. The pattern to activate should logically have been: bad smell / use of Febreze / disappearance of the smell. Using Febreze was to become a new habit, and the first advertisements targeted this reward scheme. In one of them, a woman worries about her dog, Sophie, who has taken up residence on the sofa:
Sophie will always smell like Sophie, but there is no reason why my furniture should also smell like Sophie.
But sales of Febreze did not take off. After investigating, the marketing teams noticed what almost all of us know: we do not smell the odors we are used to. In other words, there is no cue, no signal (except for a visitor, but nobody is going to show up at the home of friends or family with a bottle of Febreze – marketing remains constrained by our habits and customs).
On the other hand, the investigators noted that some people did use Febreze, but not according to the expected scheme. Thus, in a clean and well-kept house, one woman systematically sprayed Febreze after cleaning:
I don’t use it specifically against odors. I use it during the usual cleaning: two sprays and the cleaning is done […] Spraying gives me a little feeling of celebration when I finish a room.
In other words, this housewife had developed a cue / routine / reward scheme far removed from the product's primary function (deodorizing): when the bed is made (cue), we spray a little Febreze (routine), and… we are satisfied (reward). Procter & Gamble therefore redirected its advertising campaigns toward this pattern, emphasizing the cue of an already clean, perhaps odorless house, in which we use Febreze to mark the end of the cleaning and celebrate such cleanliness! A slight scent was even added to the product to underline the moment. Sales immediately took off.
From this little story, there are two things to remember. First, the cue / routine / reward pattern is constitutive of our behavior because it optimizes our brain activity. The brain thus remains available to process the unexpected, the potentially dangerous, the uninterrupted flow of new weak signals, but also available to anchor new habits.
Second, understanding and modifying these patterns represents a considerable economic stake. In Febreze's day, investigators had to be sent into the field; it took months and was very expensive. Today, data is available en masse, and algorithms infer and modify our routines in near real time. Creating or modifying them “worldwide” costs almost nothing. The only limit is the number of patterns the human brain can handle: digital competition for a “spot” is fierce. The best of the best (designers, researchers, entrepreneurs…) are on the battlefield.
Many researchers are studying the impact of smartphones on our cognition, and the results all point in much the same direction: smartphones induce in us many powerful elementary cue / routine / reward patterns, especially in urban areas (this may be related to the fact that the city today is more than a passive location: it “responds” and “reacts”, in a way).
The cues are elementary and multi-sensory: beep, vibration, image… The very sensoriality of the touch interface (in fact the only real ergonomic revolution since the invention of the mouse), the visual, sound and tactile effects (sliding, hissing, bouncing, vibrating…) which require great computing power despite their apparent physical naturalness… all this design makes the smartphone a generator of seemingly natural cues, therefore simple, neurologically inexpensive, and powerfully inducing new routines.
Generally speaking, the digital industry keeps looking for cue-producing artifacts ever closer to our “skin” and body. Hence the intelligent personal assistants present “in the air” (Amazon Echo, Google Home, Apple HomePod…), one “voice” of the home among others; connected glasses, which have failed so far but are bound to come back; more generally, everything we wear that can be networked (watches, clothing, etc.); toys… and tomorrow, obviously, “innovation” pushed to its limit: literally going under the skin, connecting directly to the brain2.
We are surrounded by artificial cue-generating devices quite far removed from the natural smell of the dog Sophie or the simple sight of the cleaning being done. There would be no problem if they were not also powerful generators of routines and “second natures” about which our opinion is not really sought. A dream for the descendants of the behaviorists.
Accept and connect: B.J. Fogg
Following the behaviorist line, embodied in particular by Skinner in the 1930s, these psychological mechanisms continue to be studied at Stanford, in Silicon Valley.
B.J. Fogg, founder of the “Stanford Persuasive Technology Lab”, is one of the emblematic representatives of this current of thought. He wrote, in one of his works describing a behavioral taxonomy:
Over the past 15 years, the world has moved from a local environment of human persuaders to a wide universe of machines designed to persuade.
In late 2016, Ian Leslie wrote a fascinating article relating the objective that B.J. Fogg drew from his own experiments at Stanford in the late 1990s. It is quite straightforward3:
Computer applications could be methodically designed to exploit the rules of psychology in order to get people to do things they might not otherwise do.
Fogg called this area of research “Captology: Computers as Persuasive Technology”, later renamed “behavior design”. The principle is simple: we only do what we want to do, so it has to be easy. Some methods are unstoppable, like Netflix's automatic chaining of the beginning of a series' next episode to the end of the previous one:
The level of difficulty is reduced to zero. Actually, less than zero: it is harder to stop than to carry on.
We then come to the heart of the method:
When motivation is high enough, or a task easy enough, people become responsive to triggers such as the vibration of a phone, Facebook’s red dot, the email from the fashion store featuring a time-limited offer on jumpsuits. The trigger, if it is well designed (or “hot”), finds you at exactly the moment you are most eager to take the action. The most important nine words in behaviour design, says Fogg, are, “Put hot triggers in the path of motivated people”.
Once these patterns are in place, at each of these signals the brain goes into sleep mode and unrolls the routine effortlessly to reach the reward. In flat-encephalogram mode, we even have the illusion of being able to superimpose other activities that themselves consume little energy (barring surprises): watching television, walking, driving… The extreme fragmentation of our activities is made possible because they are mostly small but strong habits, linked and overlapping as in a dream.
Some simple rules
The research of Fogg and his students, and others after them, has shown that we are all sensitive to the following phenomena, which we will readily recognize.
Take, for example, the famous “Wow effect”, a pillar of “digitalization” and a marketing star. This “surprise mixed with admiration” upon discovering or consuming a product or service hooks us and begins anchoring an automatism (“Such upfront deliveries of dopamine bond users to products”). Once we have felt the Wow effect, there is very little chance we will consume anything else.
The most powerful cues / triggers are not auditory, visual or olfactory (one would rather say that these signals anchor habit patterns) but social. Thus, Ian Leslie writes:
The human brain releases pleasurable, habit-forming chemicals in response to social interactions, even to mere simulacra of them, and the hottest triggers are other people: you and your friends or followers are constantly prompting each other to use the service for longer.
Most of us who use Facebook, LinkedIn, Instagram, Snapchat, etc. know how easily these applications let us act and react (difficulty level reduced to zero), quickly (a matter of seconds) and massively (an unlimited number of contacts and followers), and, paradoxically, how difficult it is to ignore a notification that itself comes from someone else's effortless, barely focused action. The digital world is, par excellence, the technology for multiplying the number of social interactions, whatever their value.
Nir Eyal, a colleague of Fogg's, is the author of the famous book “Hooked: How to Build Habit-Forming Products”, in which he notably defends the idea that digital products and services are largely inspired by Skinner's observations, especially this one: rats can be made to open the door themselves, and far more often, when the reward is variable (a lot of food, a little, or none at all):
Every time we open Instagram or Snapchat or Tinder [the door of the maze], we never know if someone will have liked our photo, or left a comment, or written a funny status update, or dropped us a message. So, we keep tapping the red dot, swiping left and scrolling down.
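As a purely illustrative sketch (the probability and the numbers here are invented, not taken from Eyal's book or Skinner's experiments), a Skinner-style variable-ratio reward schedule can be written in a few lines of Python:

```python
import random

def check_app(rng, p_reward=0.3):
    """Simulate one opening of the app: a 'like' arrives with probability p_reward."""
    return rng.random() < p_reward

# Fixed seed so the simulation is reproducible.
rng = random.Random(42)

# Each opening of the "door of the maze" may or may not be rewarded:
# the outcome stays uncertain, which is the defining trait of the schedule.
openings = 1000
rewards = sum(check_app(rng) for _ in range(openings))
print(f"{openings} openings, {rewards} rewarded")
```

With a fixed schedule (p_reward equal to 1) the outcome is entirely predictable; it is precisely the uncertainty of the variable schedule that, according to Skinner's observation recalled by Eyal, drives far more frequent checking.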
Wow effect, social triggers and cues, variable rewards… These are some of the rules extraordinarily well suited to the digital world, implemented with the help of sensory artifacts, algorithms inexpensive to code, and a few basic mathematical formulas.
What about ethics?
Marketing has long drawn on the work of behaviorists and neuroscientists without causing waves of remorse. At most, intellectuals began denouncing the excesses of the “consumer society” in the 1950s, but products and services continued to be wonderfully well designed, and we continued to consume them greedily.
In Febreze's day, anchoring consumption patterns remained slow and very costly. Since the early 2000s, digital technology has changed the orders of magnitude and accelerated the process: our brains are saturated by intense “pattern competition”, and some pattern designers have themselves nearly reached saturation.
This is how, in the heart of Silicon Valley, some awareness of captology has arisen. Has this ecosystem gone too far? How should we interpret this astonishing remark by Fogg, recounted in Ian Leslie's article (remember, we are at the end of 2016):
“I look at some of my former students and I wonder if they’re really trying to make the world better, or just make money,” said Fogg. “What I always wanted to do was un-enslave people from technology.”
This is all the more surprising, if not mind-boggling, given that many of Silicon Valley's successful entrepreneurs and startups have passed through his laboratory. Fogg has even been dubbed the “millionaire maker”. But one of the most famous cases of contrition is that of Tristan Harris4, Google's short-lived “design ethicist”.
What does this sudden awareness mean? We suggest a hypothesis. In 2015-2016, the image of Silicon Valley was tarnished by a wave of suicides, discrimination and sexual-misconduct scandals, and dubious managerial methods in startups… revealing a tension that had to give way: “evil” had overtaken the most high-tech, youngest, most innovative environment on the planet. It is likely that this context played a role in the aggiornamento of the designers of digital tools, and that “ethics” became a central theme for them.
“Time well spent”, Tristan Harris then proclaimed5! And he criticized roughly what we have just presented: the systematic orientation of behavior design toward capturing our time by anchoring habit patterns (The value of e-things), comparing the principles of the smartphone to those of a “slot machine” (an analogy that is, incidentally, quite accurate given how successful the result is), etc. But Tristan Harris's posture and proposals are not ethical, apart from a few tips for users, most of which consist in using even more technology to counter technology…
Doing excellent design, based on a good understanding of our human mechanisms, is in itself an activity, a job in which ethics has no role to play, provided that laws and customs are respected. The Netflix designer does his job well when he manages to get us on board for the next episode. The Facebook designer does his job well when he manages to capture us and our data. The LinkedIn designer does his job well when we react to the slightest notification. In short, ethics is not soluble in design (nor in the “precautionary principle”, which stems from the same kind of “trauma”).
Our second natures
A conquering and triumphant technology “codes” our second natures. But we are all responsible for this, and ethics always starts with ourselves. Seeking our own answers, really looking for them, being vigilant, curious, on the lookout, and constantly asking ourselves the following question: does our second nature, the sum of our digital habits, our shadow, really suit us? Think each time, even for a few seconds, before opening the door of the maze.
1. ↑ Charles Duhigg in The New York Times – February 16, 2012 – How Companies Learn Your Secrets
2. ↑ Forbes – June 6, 2017 – Elon Musk Avec Neuralink Veut Connecter Nos Cerveaux A Internet
3. ↑ Ian Leslie in The Economist – October / November 2016 – The Scientists Who Make Apps Addictive
4. ↑ Tristan Harris – May 18, 2016 – How Technology is Hijacking Your Mind (broken link)
5. ↑ Tristan Harris's site – Note that since this article was written, this link redirects to the site of Tristan Harris's new venture, the “Center for Humane Technology”.