Chapter 4. Tricks
In north London, on Walthamstow High Street, there’s a jeweller and pawnbroker. It’s a couple of minutes’ walk from the Tube station, sandwiched between a Chinese take-away and a chemist’s shop. It’s not a wealthy part of town; the shop-fronts are a bit threadbare; the jeweller advertises prominently that it also cashes cheques. The door is locked, so Tracey rings the doorbell and waits to be buzzed in on the electric latch. She’s tall and elegant, blonde highlights contrasting with her smart black coat and high heels. Inside, a shop assistant presses the button to open the door and Tracey comes in. Let’s watch a robbery go down.
Tracey smiles and says “Hiya!”, shivering and rubbing her hands together. “God, it’s so cold!” Tracey’s accent reveals that she’s not quite as posh as her clothes.
“Yes, I know, it’s horrible today,” says the shop assistant.
“I’m looking for a necklace,” explains Tracey. She gestures round her neck to show how long, and takes off her black leather gloves.
“Gold or silver?” asks the assistant.
“Silver,” she says, as the assistant turns away to let in another customer. It’s a man, bundled up in a winter jacket over his business suit. The shop assistant turns to greet him while her older colleague starts to show some necklaces to Tracey. The man just wants to look at rings in the display cabinet. He doesn’t seem in a hurry to buy anything, so they leave him to browse and concentrate on Tracey, who looks eager to spend money on the right necklace.
“Got a nice one here,” says the older assistant.
“Yeah, that’s the sort of length,” says Tracey, as they cluster around. “How much is this one?”
“It’s quite expensive, actually,” says the older assistant.
“Whoo!” says the first assistant, catching sight of the price tag. Tracey smiles and laughs.
“Six hundred and twenty pounds,” says the older assistant.
But Tracey is happy; this is the one for her. She says she’ll have it, and takes out a stack of twenty-pound notes from her handbag. She counts them out, one hundred at a time, onto the counter, but just as she finishes, it happens. The browsing man turns and quickly grabs Tracey by the arm.
“Okay. Stop right there. Police!” he says loudly.
She turns to him, mouth open in surprise. He flourishes a police warrant card in his other hand.
“Walthamstow Police!” he says.
“What?” says Tracey and tries to pull away.
“Tracey, I’m placing you under arrest for deception and fraud.” He turns to the shocked assistants and says “Leave that money on the counter. Please let my colleague in.”
There’s another plainclothes policeman outside, pressing his warrant card to the glass of the door. They let him in. A bit older and balding, he’s clearly number two in this operation. He takes Tracey by her other arm and leads her away from the counter. The assistants look at her with tight lips. The first policeman starts dealing with the evidence.
“We’ve been following her all day,” he explains. “There’s been counterfeit cash. She’s been passing it off at jewellers.”
The assistants eye the stack of cash dubiously. They nearly accepted it.
“Can I have a look?” one asks.
“Yeah, go ahead,” says the policeman, as he starts to stack it up and put it into evidence bags. “They’ll all go in for evidence, for her.”
The first shop assistant is now turning the silver necklace over in her hands, looking at it. The policeman looks at her regretfully.
“I’m afraid that’s part of the evidence as well, now.” Then he smiles. “You will get it back, obviously!” He continues to bag up the cash. When he’s finished, he picks up another evidence bag and holds it open.
“I’d ask you just to pop that in here,” he says.
The assistant holds the necklace between finger and thumb, as though it’s dirty, and drops it into the bag.
“Thank you,” he says.
The assistant looks down, then over at Tracey, who’s now in handcuffs. The second policeman’s ready to take her back to the police station. He has the appearance of a man who’s looking forward to a cup of tea after standing around in the cold all day. Tracey looks glum.
“You bitch,” says the assistant. “You could have cost me my job, you know that?”
The second policeman doesn’t want a big scene, so he takes Tracey outside and they wait in the doorway while policeman number one finishes up the paperwork with the assistant. She writes the shop’s address on the form attached to the evidence bag and hands it back to him.
“Okay,” he says. “I’ll be back in one hour. I’m going to take a full statement from you, because obviously, it’ll go down as evidence for her, yeah?”
The policeman leaves, stuffing the evidence bags into his coat pocket as he goes out of the door. The assistants start to relax. Excitement over. Maybe it’s time to put the kettle on and have a cup of tea.
Did you spot the robbery? All of this really happened, filmed through hidden cameras for the BBC TV programme The Real Hustle. The programme shows real-life scams, executed on unprepared “marks” like these shop assistants. In “The jewellery shop scam,” Tracey is actually presenter Jessica-Jane Clement and the policemen are co-presenters Alexis Conran and Paul Wilson. The money was real, not counterfeit; it was just a prop for the scam. When did the robbery happen? Conran, playing policeman number one, walked out with the necklace in his “evidence bag” right at the end.
In this chapter I’d like to explore the general principles behind scams like this. Scams are built out of a number of carefully chosen little tricks. There must be hundreds or thousands of these tricks, but they all fall into only a few categories. Similar tricks are used by other, more legitimate, persuaders like car salesmen, marketers, lobbyists and advertisers. Their tricks fall into the same categories. When you know about the categories, you will be able to see how a scam worked after it’s over. There’s no fool-proof way to defend yourself against these tricks, but understanding the categories will make you somewhat more resistant. And of course, knowing the categories will help you make up new tricks of your own. Dangerous knowledge.
I’m going to rely heavily on the work of experimental psychologist Robert Cialdini. He started his academic career in the 1970s with a three-year stint working alongside and observing “compliance professionals”: salespeople, fund-raisers, advertisers and so on. His textbook Influence, now in its fifth edition, summarises decades of work and shows how their myriad tricks fall into only six basic categories. These are the principles of reciprocation, scarcity, authority, commitment, liking and social proof. As we’ll see, they are closely related to the moral arguments that we met in chapter two. note 41
Let’s start with reciprocation. This is the principle that people are more likely to take an action in return for something they previously received. Or to put it another way, people like to say “yes” to those they owe. I think you’ll find it easy to agree with me that this principle exactly corresponds to the moral value reciprocity/fairness. A reciprocation trick plays on our innate sense of fairness by using a small “gift” to extract a bigger pay-back.
It’s important to note the way that this works: first the gift, then the obligation, then the pay-back. It works better if the gift is unexpected. For example, suppose someone approaches you, offers you a flower, and without thinking you take it. You don’t want it, but they aren’t taking it back. They say “It is our gift to you.” It seems rude to drop it on the floor. The flower-wielder’s accomplice then asks you to make a donation to their charity. Probably you do so. Then probably when they aren’t looking you put the flower in a bin. (And when you aren’t looking they take the flower out of the bin and use it again on their next mark.)
You are put in a similar predicament when a charity sends you some cheap gift like a pen or some address labels and asks for a donation. You don’t want the gift, but you don’t feel you can just throw it away. Perhaps you send them a donation to resolve your feeling of obligation. After that, you feel happy to throw the gift away or forget about it.
The principle says to do unto others as you would have them do unto you, but do it first. Asking for a donation first and giving the flower in exchange afterwards simply wouldn’t work. That is just selling flowers, a purely commercial transaction. The gift has to come first. A small gift can be surprisingly effective. Cialdini notes that mailing out a $5 cheque with an insurance survey prompted completion rates far higher than a mail-shot that promised $50 afterwards in return for sending back a completed survey. This technique was even more cost-effective than you might think, because the people who decided not to fill in the survey almost never cashed the cheque. It wouldn’t feel fair.
Larger-scale donations and pay-backs can be seen throughout politics. I don’t mean outright bribery and corruption, where crooked politicians consciously sell their influence, but rather a more subtle feeling of obligation. Whether they notice it or not, the recipient of a political donation tends to feel beholden to their donor. Although politicians might insist that they take the money then vote the way they please, the statistics show otherwise. (The fact that so many businesses in the USA contribute equally to rival candidates also demonstrates that they are not attempting to get one elected rather than the other. Instead they are stockpiling obligations for the future regardless of who wins.) Even scientists, who like to think that they are unbiased, are influenced by financial contributions which apparently have “no strings attached.” Their findings tend to be much more supportive of the interests of their sponsors. No one is immune from reciprocity. (Except probably psychopaths. Do any of these techniques work on psychopaths? It’s not clear.)
The gift doesn’t even have to be a tangible item; it can be a concession. Cialdini calls this variant “rejection-then-retreat.” Suppose I ask you for a big favour, but you turn me down. What happens if I now come back and ask you for a smaller favour? Surprisingly, you are more likely to agree to the smaller favour than if I’d asked you for that the first time round. It’s as though by asking for less than I need, I’m making a concession to you, and that concession feels to you like a gift, a gift that you pay back by doing me the small favour.
When we look at the jewellery shop scam, can we find a trick there based on reciprocation? At first sight no — but think about the reaction of the assistant to Tracey near the end. She says, “You bitch! You could have cost me my job!” It looks as though the policemen have given the assistant a tremendous gift. They have saved her job. Obviously, she’s going to be more cooperative after that, at the point in the scam when the robbery actually happens.
How can you defend yourself against this technique? The defence against all these techniques is first to engage your conscious brain, to stop and think. The techniques work on the adaptive unconscious and they work best on distracted people under time pressure. So the first defensive step is always to stop and think. In this case, when you have stopped, ask yourself this question: is it actually a gift, or is it a trick? When you see it for what it really is, then it loses its power.
Cialdini’s second category is scarcity. This is the principle that we tend to over-value things that seem scarce and things that are likely to be unavailable in the future. I think this principle corresponds to the moral value of harm/care. This time the link isn’t so clear, but it seems appropriate to me because all the scarcity tricks rely on forming a feeling of loss or potential loss in the target.
For example, suppose a supermarket has a few remaining TVs on a “special offer” display. You’ve been looking for a new TV but haven’t made your mind up. Seeing the few left on the display gives you a slight pit-of-the-stomach feeling. There are so few left, they must be a bargain. You should go away to think about it, but maybe they will all be gone before you come back. It’s hard to resist.
At Christmas there is often a “must-have” children’s toy, and the thing that makes it most valuable to children and their parents is that it is in short supply. Once the toy is a “must-have,” parents will go from shop to shop looking for it and they will pay inflated prices when they finally find one.
Even something that we can have but others can’t seems more valuable. Classified intelligence reports carry far more weight with their readers than is warranted simply because they are secret. Often their core information is gleaned from two disreputable men chatting in a bar. You would have no confidence in them if you met these men in person, but once their words are typed up in a report, the restricted circulation gives it an added aura of credibility.
The root of all these effects is the possibility of loss. Decades of psychological experiments have shown that we do not weigh gains and losses equally: losses loom much larger than gains. Psychologists say that we show “loss aversion.” At first this sounds completely facile: obviously anyone would rather have a gain than a loss. But that’s not what they mean. What they mean is that we value something that we have, but might lose, more highly than something that we don’t have, but might gain. Even if the things are absolutely identical.
This effect is closely related to the “sunk cost fallacy” and is also known as the “endowment effect.” It explains why people in auctions can end up paying far more for something than they intended. They make a bid and are winning the auction. Someone bids against them. They counter-bid. At some point they are close to their “fair price” and their bid is the last. They feel that they now have the item. It’s theirs. They own it. Then someone else tops their bid. They have lost the item. It’s just as if the counter-bidder had walked into their home and taken it off a shelf. They feel the loss. And in that very moment, it feels more valuable. They make a slightly higher bid, in line with the new value. And so does the counter-bidder. And again. And again. Eventually someone drops out, and often the loser is left with the thought “Thank heavens I didn’t win! What was I thinking?” note 42
Rather disturbingly, we can often frame the same situation in terms of a gain or a loss. We can then be prompted into making a different choice depending on how it is framed. Cialdini describes the example of an energy-saving “home efficiency audit.” After the audit, one group of householders were told how they could gain, saving say 50 cents a day, by installing insulation. In contrast, the other group were told that they were currently losing 50 cents a day, but they could stop that by installing insulation. Over twice as many people from the second group decided to install insulation. Obviously, the situation was exactly the same; the only difference was how it was framed. note 43
Even more strangely, the risks that people are willing to take also depend on whether the situation is framed in terms of gains or losses. With a gain, “A bird in the hand is worth two in the bush,” and people prefer a sure thing rather than gamble for more. But with a loss, people are inclined to play “double or quits,” and take a risk rather than swallow a sure loss. Since the same situation can be framed as a gain or a loss, we can be driven to take different risks depending on how a situation is presented to us.
A particularly easy way to generate scarcity in a scam is to impose an artificial time limit. “For one day only,” or “Deal only valid until you leave the premises,” or “My ride’s going to leave. Do you want to buy it or not?” All scams work better when the victim is distracted and under time pressure, but this is doing something extra. It creates scarcity from nothing, making the deal much harder to resist.
Do we see scarcity in the jewellery shop scam? It’s not so obvious as in other scams. There isn’t a time limit or scarce item, but we certainly see loss aversion used as a trick. The scam-within-the-scam, where Tracey seems to pass off counterfeit cash, generates a sense of loss in the shop assistants. They are made to think about the loss they nearly suffered, maybe even losing their jobs. It’s such a relief to be spared that loss that they don’t really notice the actual loss when it happens — after all, the policeman says with a smile “You will get it back, obviously!” Not really a loss at all.
How can you defend yourself against scarcity tricks? First, notice that you are feeling more agitated. If you are feeling agitated, you know that you are not thinking straight. Calm down and take your time. Yes, you will feel a pit-of-the-stomach feeling of “I could lose this.” But ask yourself why do you really want this one, right now? Just because it looks scarce? That doesn’t actually make it better. And maybe it’s not really even scarce. Maybe it’s just a trick.
The next category is authority. This is the principle that we tend to obey people who look like they are in charge, and to defer to people who seem to be expert. I think that you’ll find it easy to agree that this corresponds to the moral value authority/respect. The classic example used to demonstrate the power of this principle is of course Stanley Milgram’s infamous “electrocution” experiments from the 1960s.
In the good old days before ethics committees, research psychologist Stanley Milgram wanted to investigate how the terrible things done by the Nazis in the Second World War could have happened. Were the people who did these things devoid of morals? Or were they ordinary people who knew that they did wrong, but they did it anyway because they were told to by the authorities? To what extent do people just follow orders?
Milgram put an advert in the local paper, asking for participants in a “study of memory and learning” and offering $4 for the one-hour experiment. The experiment itself was really a carefully designed scam. When the unwitting subjects arrived at Yale University, they found Milgram dressed in a white lab-coat and carrying a clip-board, the very image of a respectable scientist. Milgram explained that one of the two subjects in that session would be the “Teacher” and the other the “Learner.” These roles would be chosen by lot. The Teacher’s job was to test the Learner’s memory and to deliver increasing electric shocks for each mistake.
In fact, the other subject was an actor, and Milgram arranged that he always played the Learner. Milgram took him to a nearby room, and came back to direct the Teacher, always played by the true subject of the experiment. As the experiment progressed the Learner made mistakes, and each time Milgram directed the Teacher to follow the planned experiment, increasing the voltage and delivering another electric shock. The settings increased in 15 volt increments up to 450 volts. Of course, the experimental setup was fake too. There were no electric shocks.
Before the first experiment, Milgram surveyed colleagues at Yale to see how far they thought the subjects would go. How many would go all the way to 450 volts? Professional opinion was that maybe 1 in 100 or 1 in 1000 people would go that far. They were wrong. Around 2 out of 3 subjects followed through right to the highest settings, despite hearing first cries, then pleas to stop, screams, and finally an ominous silence from the Learner in the nearby room. They made it clear to Milgram that they were very unhappy to keep going, but Milgram told them that they must, and they did. They bit their lips, trembled and stammered, pulled their ears and clawed at their own flesh. But they did what they were told.
After a whole series of these experiments in the 1960s, Milgram concluded that the primary lesson was that adults were very willing to go to almost any lengths at the command of an authority. The Nazis were not different people, they were not less moral. No, for the most part they were just people following orders, and ordinary Americans in Milgram’s experiments would have done just the same. He tried many variations on the experiment, with essentially the same results. When he made the authority figure more disreputable, the subjects were only slightly less willing to obey. The trappings of authority send a powerful message.
Scam artists can exploit this by giving themselves titles and introducing themselves as Doctor or Professor. Diplomas on the wall are a sign of authority, as is an ID badge or a business card for a claimed identity. Clothing has a big effect, from the uniform of a security-guard, through the semi-uniform of the lab-coated scientist to the suit and tie of the businessman. They are all trappings of authority. Tools of the trade, such as a clip-board, can give further reassurance, and even an accent or manner of speaking can give an aura of authority.
So, we are very likely to obey the orders of an apparent authority. We are also very likely to take the advice of an expert who appears to be a “credible authority.” What makes them a credible authority? Well, first they have to look like an authority — with appropriate trappings — but they also need to appear trustworthy. Experts can build trust over time, working with the same people again and again, but Cialdini describes a trick which works straight away. This is to confess a small but relevant weakness just before giving the expert advice. Never make your best point then follow up with a minor caveat. Do it the other way around. Say “there’s this small related problem you need to know about,” and tell people the caveat. Then say “but,” and go on to lay out your strong point. The word “but” indicates to your audience that they should put aside what you just said: the real message is coming next.
The jewellery shop scam is, of course, absolutely built around authority. The fake warrant cards, handcuffs, evidence bags, the unfashionable winter coat over suit-and-tie, the air of command, everything the “policemen” do just shouts authority. So the shop assistants do exactly as they are told.
How can we defend ourselves against authority tricks? As always, time to think helps a lot. Ask yourself: is this person really an authority? How do I know? And if they really are an authority, should they be telling me to do this? Is that really what an authority would do?
If they claim to be an “expert,” ask yourself firstly whether they are an expert on this particular thing. Even genuine experts on one thing can falsely think they are experts on another thing. And if they really are an expert, can you expect them to be truthful? Or do they have vested interests? Did they carefully give you a piece of negative information about themselves just before they offered you their advice?
Let’s next go on to Cialdini’s fourth category which is commitment. This is the principle that people prefer to be consistent: they prefer to do what they said they would do; they prefer to do things in line with their past actions. Of all Cialdini’s categories, this seems to have least to do with moral arguments. It seems much closer to Festinger’s idea of “cognitive dissonance” that we met in the previous chapter. (A point that Cialdini himself emphasises too.)
Cialdini gives the amusing example of a researcher who posed as a volunteer worker going door-to-door in California, asking residents if they would allow a “public-service” billboard to be installed on their front lawns. To give the residents an idea what it would look like, the researcher showed them a photograph of a very large and poorly lettered sign saying “DRIVE SAFELY.” In the photograph the sign more-or-less hid the house behind. Not surprisingly, less than 2 out of 10 residents agreed. However, one particular group of residents did agree in much greater numbers: over 7 out of 10 of them said “yes.” What was different about that group? note 44
Two weeks earlier, a different researcher, also posing as a volunteer worker, had asked that group to display a little sign, only 3 inches square, which read “BE A SAFE DRIVER.” The influence of saying yes to this seemingly innocuous request was enormous, even weeks later. This is what salesmen call a foot-in-the-door technique. First ask for a very small concession, then later ask for a much bigger related concession.
How does this work? It appears that our adaptive unconscious maintains an assessment of what kind of person we are, and we tend to act consistently with this assessment. When we are influenced to act somewhat differently, our adaptive unconscious re-evaluates what kind of person we are, and we tend to act consistently with that new assessment. As Cialdini notes:
You can use small commitments to manipulate a person’s self-image; you can use them to turn citizens into “public servants,” prospects into “customers,” prisoners into “collaborators.” Once you’ve got a person’s self-image where you want it, that person should comply naturally with a whole range of requests that are consistent with this new self-view. note 45
Be careful what small things you agree to do. The same researchers tried a related experiment where instead of first displaying the little sign, they asked people to sign a petition that supported “keeping California beautiful.” Two weeks later they were asked about the big “DRIVE SAFELY” sign, and half of them said yes! The experimenters were at first at a loss to explain this, but then they realised that these people’s self-image had been changed, into the kind of person “who does this sort of thing, who agrees to requests made by strangers, who takes action on things he believes in, who cooperates with good causes.” Cialdini says that he rarely signs any petitions any more, even when he already supports them. He thinks it’s too dangerous for his self-image.
However, not all commitments change self-image. To have this kind of influence, a commitment must be a deliberate choice, made in public, and it must feel like a free choice. When we feel coerced, this induces a backlash. When we feel forced into something, our self-image doesn’t change. So, badgering someone to do something, or even giving them several strong arguments for something, is counterproductive. One good argument is more convincing than several great arguments. Several arguments, even several great arguments, can be confusing and feel like badgering. In contrast, people who are only barely convinced feel that they made a free choice for themselves. Their new self-image will drive them to fill in the rest without effort. Stand back and let them do it.
Explicit public commitments are more effective than private or implicit commitments. Cialdini gives an example which reduced the rate of no-shows for bookings at a restaurant. Previously, the booking-taker at the restaurant had said “Please call us if you change your plans.” The restaurant changed this script very slightly to “Will you please call us if you change your plans?” The booking-taker was instructed to then pause, waiting for a reply. The customer filled the pause by replying, “Yes.” No-shows dropped from 3 in 10 bookings to 1 in 10 bookings.
Written commitments are even more effective. Companies selling door-to-door found that sales went down when new laws setting a “cooling-off” period made their previous scarcity-based tricks ineffective. Many people who had agreed to the pressure-selling tactics on the day cancelled their agreements after they had time to think. However, the companies discovered that they could counteract this by getting the customers to fill out the sales agreement themselves in their own handwriting. Cancellations fell dramatically. People want to deliver on commitments they have written down, particularly where these commitments are witnessed by other people. It seems that the more effort that goes into a commitment, the more it changes the self-image of the person who made it.
Do we see commitment tricks in the jewellery shop scam? It’s quite subtle, but when you look carefully, commitment explains lots of little points in the script. The policeman tells the assistants to “Leave that money on the counter.” They show no sign of doing anything else, but of course they obey this trivial request. Then he asks them to let in his colleague, which of course they do. It all looks very official. One little request leads to another, a cascade of compliance. At the very end we see the assistant filling in the form on the evidence bag in her own handwriting. She is completely committed to believing the scam.
How can you defend yourself against commitment tricks? That pit-of-the-stomach feeling can give you a clue. If you get that queasy feeling that you have been set up, that you are being led a step too far, you can just stop and back out. Once the trick is visible to your conscious mind, it’s been defeated. The worst the scammer can do is to complain at you for not being consistent, which you can probably brush off as mere whinging.
If you don’t have that pit-of-the-stomach feeling, but you still have suspicions, you could try the “heart-of-hearts” technique: ask yourself, if you could go back in time knowing what you now know, whether you would make the same choice again. Say you decide to buy a particular car, but after making your choice the salesman says “I’m very sorry, but that special deal isn’t available any more, I’ll have to charge you the regular price.” (This technique of deliberately reneging on their own commitments is known by salesmen as the “low-ball” technique.) Would you still buy it? Ask yourself, in your heart-of-hearts, what you would do if you knew at the very start that this would be the price. Probably you will walk away.
Cialdini’s last two categories are liking and social proof. They are related but different. Liking is the principle that people tend to say “yes” to those they like. Social proof is the principle that people tend to follow the lead of others, to look around and see what’s normal and then do the same. I think that liking corresponds quite strongly to the moral value ingroup/loyalty, but social proof appears not to be based on moral values. Let’s look at liking next.
We find our friends more convincing than strangers, and not just because they have built up a trustworthy reputation over time. In general we believe people who we like, and we believe people more if they seem similar to us or attractive in some way. Physical attractiveness has a very large effect on how we treat other people, though we don’t realise that it does. Cialdini gives many examples where researchers have found that attractive politicians get more votes, attractive job applicants get hired more and paid more, attractive criminals get lighter sentences, and so on. The only exception is when an attractive person is seen as a romantic rival. Other than that, physical attractiveness has a kind of “halo” effect, translating in other people’s eyes into perceived intelligence, good-will and trustworthiness. note 46
Even if they are not especially attractive, people who look and act the same as you also seem more convincing. We like people who are like us, for example if they dress the same, or have the same interests. In other words, people who seem to our adaptive unconscious part of the same “ingroup.” Successful salesmen are adept at finding similarities with customers and pointing them out. These really do make a difference. Even a trivial similarity like a similar sounding name can make a large difference to success rates.
We also like people when they like us, so we tend to like people who smile at us. However, to be convincing, you need a true “Duchenne” smile, which is hard to fake. We also unconsciously tend to mimic people who we like, adopting a similar posture, speaking in a similar way and mirroring their gestures. A salesman can consciously and deliberately “mirror and match” their customer to exploit this effect. But there’s a risk in doing this that it won’t quite ring true. If the customer consciously notices the trick, it will just seem creepy and dishonest, scuppering any deal. It’s more reliable to just boldly say “I like you” or something similar, because we seem to be suckers for praise, even when we realise that it’s insincere.
Even mere familiarity can breed liking. The key here is that it must be in the context of a situation which is cooperative or at least neutral. In that setting, the more we see someone, the more we like them. However, when we are forced into contact with people under unpleasant conditions, where there is frustration, competition and conflict, then the more we see them, the less we like them. This is exactly as you would predict from the moral value ingroup/loyalty: people whose status is initially unclear will be gradually sorted by our adaptive unconscious into two categories: “ingroup,” who we like, and “stranger,” who we have contempt for. (It’s possible to revise this classification by distracting people and getting them to engage in a common cooperative task, but it’s hard work.)
Cialdini gives another example, which we might call “shoot the messenger,” where events themselves rub off and influence whether we like someone. He was once called by a distraught weatherman from the local TV station, who wanted to know why people hated him so much. The weatherman was getting hate mail:
“One guy threatened to shoot me if it didn’t stop raining,” he said. “Christ, I’m still looking over my shoulder from that one.”
Cialdini explained that it was just human nature to associate good or bad news with the person carrying it. In ancient times a messenger carrying good news could expect to be lauded as a hero, while a messenger carrying bad news would get a terminally frosty reception. The weatherman was just an innocent messenger, blamed in the same way. When he understood this, the weatherman saw that his situation literally had a bright side. As he said:
“I’m in Phoenix where the sun shines 300 days a year, right? Thank God I don’t do weather in Buffalo.”
Turning back to the jewelry shop scam, can we see any tricks based on liking? Tracey is obviously a very likable woman, attractive and well-dressed. But from her accent, she is closer socially to the shop assistants than you might assume from her clothes. She would be the perfect person to operate the pretend scam-within-the-scam of passing counterfeit cash. This is what makes the subsequent turn of events so convincing. The division of roles between the first policeman and his backup was also well chosen: the first policeman is clean-cut and handsome, while his backup is heavier, balding and less attractive.
Can we defend ourselves against tricks based on liking? Cialdini notes that there are so many potential tricks that it’s pointless trying to look out for them. Instead, he recommends that we should be on our guard for their effects. Stop and think: do you feel an unusually strong rapport with the salesman who you only met 20 minutes ago? If so, spend a moment to separate the person from the deal. After all, this person is not going to be your new best buddy. You are only going to get the deal. Is it a good deal in itself? Would it still seem like a good deal if someone else was offering it?
Let’s turn now to Cialdini’s last category, social proof. This is the principle that we look at the people around us and we tend to act the same way. This seems to be based on the “availability heuristic,” which says that we consider things more believable when they come easily to mind. Social proof is related to that effect: when people do something right in front of us it’s easier to believe that it’s normal. When a lot of people do something, it’s easier to believe that it’s normal. But most people don’t realise how influenced they are, how hard it is to go against this apparent normality.
For example, when my son was at infant school I would sometimes pick him up at the end of the day. The parents would gradually gather next to the playground gate, but we were not supposed to go through until five minutes before the home-time bell rang. With only a few people by the gate, someone would notice the time and open the gate. But there was a critical number of waiting people. When the group was larger than this, no-one was willing to make the first move. Numbers built up further outside the gate, people looked at their watches, but no-one went in. Often it was only when the school bell rang that someone at the front of the throng of parents felt that they now had permission to open the gate.
Think of this as a kind of herd instinct. When we see people walking past a prone body in the street, we are inclined to keep walking too. When depressed people read about a suicide in the paper, they are more inclined to kill themselves too. When people see TV adverts for fast-food restaurants, they are more inclined to eat there, not because their attitude to the product itself changes, but because they regard eating there as more normal. As you might expect from the phenomenal sums spent on advertising, seeing people on TV is just as effective as seeing them in real life.
People can be prompted into desirable behaviour by making it seem normal. They can also be accidentally prompted into undesirable behaviour by chastising messages saying how bad it is. “Look at all the people who are doing this bad thing,” says the message. But what people hear is “This is normal. You can do it too.” Cialdini gives an example of one of his graduate students who stopped at the Petrified Forest National Park in Arizona with his fiancée. Because so many visitors had been stealing pieces of petrified wood, the park had put up a large sign by the entrance, which said “Your heritage is being vandalised every day by theft losses of petrified wood of 14 tons a year, mostly a small piece at a time.”
The graduate student recounted how he was shocked when they read the sign at the entrance and his fiancée — who he described as the most honest person he had ever known — nudged him with her elbow and whispered in his ear “We’d better get ours now.” By making it clear that the thefts were frequent, the sign had inadvertently made it seem that they were normal.
Subsequently, Cialdini organised experiments which showed that bad signs like the one by the entrance really do increase theft (by a factor of 3 in their study). They also tried signs that marginalised theft, saying “If even one person steals, it undermines the integrity of the forest.” Those signs halved the rate of theft. The moral of this story is to be careful when expressing concern about a problem. Don’t do it in a way that makes it seem frequent, common and normal.
Two factors make social proof especially powerful. Firstly, when the situation is uncertain and people are confused about how to act, they rely on others for reassurance. Is the man on the pavement drunk, or is he having a heart-attack? What do other people think? In a clear-cut emergency, people are eager to help, even to risk their lives. If they are not sure, and they see others doing nothing, they will hang back. The second factor is similarity: people are most influenced by social proof from people similar to themselves. (A clear link here to the previous category, liking.)
Looking at the jewelry shop scam, we can see some clear examples of social proof. When Tracey is arrested, she is open-mouthed with surprise. The assistants are very surprised and the situation is very uncertain. However, Tracey has already established that she is likable, so when she accepts the policeman’s story, this is strong social proof that the assistants should accept it too. She lets the second policeman take her away from the counter and put her in handcuffs. She looks glum and resigned. Her role now is to offer social proof that the story told by the policeman at the counter really is true.
Social proof is often used in scams. The people obviously working the scam are often surrounded by a supporting cast of “shills” who appear to be innocent members of the public. Their role is initially to draw in the “mark” using social proof by showing interest and enthusiasm for the scam. After the central part of the scam is over, and the mark has been separated from their money, the shills perform damage-control by “cooling off the mark.” They offer social proof that the mark shouldn’t attempt to recover their money, that they shouldn’t call the police, that they should just let it go. The scam is often bigger than it seems, and social proof is a vital part of it.
How can we defend ourselves against social proof tricks? When you know what to look for, you can easily see attempts at social proof in advertising. They are as easy to spot as canned-laughter backing-tracks to cheap sitcoms. Once we notice, they still have some influence on us, but it’s as much annoying as convincing. However, with well prepared scams like the jewelry shop scam, you are unlikely to notice the shills until the scam is over, and maybe not even then.
Those are Cialdini’s six categories of influence: reciprocation, scarcity, authority, commitment, liking and social proof. But if you have been keeping count of the corresponding moral values, you’ll have noticed that there should be at least one more category, corresponding to purity/sanctity. Did Cialdini miss one? I think he did, because he was concentrating exclusively on legitimate influence techniques, not illegal scams.
I think we should call this new category contamination. This is the principle that people tend to avoid things that make them feel contaminated, and if they do feel contaminated they want to conceal it from others. I’m reassured that this really is a category of tricks by recent work from security expert Frank Stajano and Paul Wilson, co-presenter of The Real Hustle. (Wilson plays the second policeman in the jewelry shop scam.) They have constructed another taxonomy of scams, related to but different from Cialdini’s and not based on moral values. They call one of their categories “the dishonesty principle,” but it’s essentially what I am calling contamination. note 47
It’s a routine part of many scams that the mark is tempted into doing something which is dishonest. For example, they might be lured into buying something from a man in a pub which “fell off the back of a lorry.” There’s a tacit understanding that really it’s been stolen, but out of politeness, nobody says that explicitly. It’s too good a deal to resist, so the cash changes hands. The operator of the scam leaves and when the mark gets a chance to look at his purchase, it’s not what he saw earlier. He’s been duped. He feels angry, but also embarrassed, dirty. He knew that he was being dishonest. Is he going to tell the police? No.
We see a contamination trick being used in the jewelry shop scam. Tracey counts out her money onto the counter in a rather unusual way, in several separate piles of notes. When she is finished the money is all over the counter. The policeman then pounces and announces that this is counterfeit money — even though it looks real, it’s contaminated and it’s all over the counter. But now it’s evidence, and the policeman starts to gather it up. Unfortunately, the necklace is also evidence — so it’s contaminated too. The assistant holds it gingerly between finger and thumb, dropping it into the evidence bag at arm’s length. In a way, she’s relieved that the nice policeman is decontaminating her shop.
So, contamination is an important category of tricks. Are there any more? I suggested in an earlier chapter that by analogy with the corresponding emotions there could be two more moral values, which I called wonder/curiosity and loving-kindness. Are there tricks based on these other moral values?
I think that tricks based on loving-kindness are used every day by beggars and sometimes by salesmen. They are so obvious that Cialdini didn’t think it worth mentioning them. Let’s call this category distress. You see a beggar with a moth-eaten dog, you put your hand in your pocket and leave them some change, or you feel guilty that you should have done so. You meet a salesman who seems down on their luck, and you buy something you don’t really want. But were they really needy, or just faking it? This fraudulent appeal to charity is so commonplace that we have to take a moment to realise that just asking can still be a trick. (Is it really based on loving-kindness? That makes my classification look neater, but you could argue that maybe it has more to do with harm/care.)
I’m also happy to suggest that there should be another category which I will call mystery, based on wonder/curiosity. This is the principle that people just can’t resist looking in a forbidden place to see what’s there. This is over and above feelings of greed and envy. After all, greed is just a misplaced sense of fairness/reciprocity — in a fair world I would have more stuff. Similarly with envy, in a fair world I would have that stuff, not you. With wonder/curiosity we are driven to look inside the box without any expectation of profit. We don’t know whether it’s good or bad. Yet it’s hard to resist, isn’t it?
How many stories revolve around forbidden secrets? The heroine is told, “You can go anywhere else, but don’t open that locked door.” Or an old guy in a beard says to the hero, “Look at this garden. I’ve planted all kinds of trees. You can eat the fruit of all of them except that one there. Don’t eat that.” Or someone gives our heroine a box and says, “Under no circumstances are you to open this.” We know what happens.
A report by the British Office of Fair Trading gives an example of someone tempted into an e-mail scam by curiosity:
“Every other day, I got it through and I used to delete it and then I was just sitting there and I thought ‘Oh, I’ll just do it this time and I’ll see what it’s all about and...’ ”
Researcher: “So almost like curiosity.”
Interviewee: “It probably was curiosity, just to see what actually happened, because I kept getting it through and through, for ages and ages and ages.” note 48
Cialdini explains the value given to censored or banned material in terms of the scarcity principle. And yet, as he says:
The intriguing finding about the effects of censored information on an audience is not that audience members want to have the information more than before; that seems natural. Rather it is that they come to believe in the information more, even though they haven’t received it. note 49
In some ways this sounds more like mystery than scarcity. To use mystery as part of a scam, the operator could deliberately arrange for something to be forbidden so as to tempt the mark. Cialdini lends some support to this idea, saying:
The worrisome possibility is that especially clever individuals holding a weak or unpopular position on an issue can get us to agree with that position by arranging to have their message restricted. The irony is that for such people — members of fringe political groups, for example — the most effective strategy may not be to publicise their unpopular views but to get those views officially censored and then to publicise the censorship.
So, I think there are tricks based on mystery, but they are much rarer in practice than the others.
The key lesson from all these tricks is to notice that they work at all levels from small-scale scams to national governments, from door-to-door salesmen to international advertising campaigns. It seems to be an open question whether they work on psychopaths. From first principles we would expect that only commitment and social proof would have a significant effect on them, because those two categories of tricks are not founded on moral values and emotion in the same way as the others. However, I’m not aware of any research which would resolve the question.
|< Chapter 3. Truth||Contents||Chapter 5. Science >|
Version: DRAFT Beta 3. Copyright © Stuart Wray, 29 December 2011.