Does Anchoring Work In The Courtroom?

May 07, 2020

Anchoring is one of the most prevalent and enduring heuristics that decision-makers encounter in their daily lives and is particularly powerful when making decisions under uncertainty. An anchor is an initial reference point that has an outsized impact on how decision-makers interpret and encode subsequent information on the topic. And most strikingly, we struggle to overcome the anchor’s effect even when given incentives to do so or when made conscious of the resulting bias.

Anchoring: A Simple Example

Take, for example, a study by Ariely, Loewenstein, and Prelec (2003) in which participants were asked to recall the last two digits of their social security number and then to price a bottle of wine. You can already guess the outcome: those whose social security numbers ended in high digits (think 70s, 80s, or 90s) were willing to pay more for the wine than those whose numbers ended in lower digits.
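To make the logic of such findings concrete, here is a toy simulation (not from the study itself) of the standard anchoring-and-adjustment account: each simulated participant starts from an irrelevant number and only partially adjusts toward their own private valuation. The anchor weight of 0.3 and the valuation distribution are arbitrary assumptions chosen purely for illustration.

```python
import random
import statistics

# Toy sketch of anchoring-and-adjustment (illustrative assumptions only):
# a bid is a weighted blend of an irrelevant anchor (last two SSN digits,
# read as dollars) and the participant's own "unbiased" valuation.

random.seed(42)

def simulated_bid(anchor_dollars, anchor_weight=0.3):
    unbiased_value = random.gauss(25, 5)  # hypothetical private valuation ($)
    return anchor_weight * anchor_dollars + (1 - anchor_weight) * unbiased_value

low_anchor_bids = [simulated_bid(random.randint(0, 19)) for _ in range(1000)]
high_anchor_bids = [simulated_bid(random.randint(80, 99)) for _ in range(1000)]

print(f"Mean bid, low-digit SSNs:  ${statistics.mean(low_anchor_bids):.2f}")
print(f"Mean bid, high-digit SSNs: ${statistics.mean(high_anchor_bids):.2f}")
```

Even though the anchor carries no information about the wine, the partial adjustment guarantees that the high-anchor group ends up bidding more on average, which is the pattern the study observed.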

[Image taken from https://blog.kameleoon.com/en/cognitive-biases/]

But does anchoring work in the courtroom too?

Anchoring Effect & Juries

Juries are made up of regular people without any specialized legal expertise, so you'd expect that, just as regular people are influenced by anchors when pricing a bottle of wine, they might also be influenced by anchors in the courtroom.

And you would be right. In one study (Chapman & Bornstein, 1996), 56 mock jurors were presented with a hypothetical case in which the plaintiff argued that her birth control pill had caused her ovarian cancer. She was suing the Health Maintenance Organization (HMO) for prescribing her the pill. In the high anchor group, the plaintiff asked for $5 million in damages; in the low anchor group, she asked for only $20,000. The question was: would this anchor affect the jurors' perception of causation?

The answer, predictably, was yes. Jurors in the low anchor condition were, on average, 26.4% confident that the HMO caused the injury, whereas jurors in the high anchor condition were 43.9% confident.

[image from https://www.thelawproject.com.au/insights/anchoring-bias-in-the-courtroom]

Anchoring Effect & Judges

Judges, unlike juries, are subject-matter experts. Can they really be susceptible to anchoring as well?

According to some preliminary research, the answer is probably yes. Once an anchor is set, research suggests that a judge is more likely to interpret subsequent information in relation to that anchor, even if the anchor is totally irrelevant.

In one study, judges were presented with a hypothetical case involving a shoplifter who had just been caught for the 12th time. The judges were asked to sentence the shoplifter, but only after the prosecutor made a sentencing demand. And here's the twist: the judges were told ahead of time that the prosecutor's demand was totally arbitrary and random, and therefore contained no useful information.

Even so, the judges who received the low anchor (i.e., the prosecutor demanding a shorter sentence) landed on a shorter average sentence than the judges in the high anchor condition.

[image from https://www.thelawproject.com.au/insights/anchoring-bias-in-the-courtroom]

Ok, so the judges assigned weight to a prosecutor’s (random) sentencing demand. But does this really prove the anchoring effect in judges?

So, to quash any doubts about the anchoring effect, the same group of researchers set about designing the most absurd scenario possible. Here, a group of judges were given a hypothetical case in which prosecutors were charging a defendant with theft. Instead of being given the prosecutor's sentencing demand, the judges were told to determine the sentencing demand by rolling dice. (Yes, really.) The dice were rigged to land on high numbers for one group of judges and on low numbers for the rest.

And … somewhat unbelievably, the outcome of the dice-rolling exercise influenced the judges’ sentencing decisions.

[image from https://www.thelawproject.com.au/insights/anchoring-bias-in-the-courtroom]

Ok, so even if the results of this study hold, judges don't roll dice before they make sentencing decisions. And prosecutors don't make random sentencing requests. So does anchoring really affect judges' decision-making on the bench?

Well, it might. Englich et al. explain:

“Even though judges typically do not throw dice before making sentencing decisions, they are still constantly exposed to potential sentences and anchors during sentencing decisions. The mass media, visitors to the court hearings, the private opinion of the judge’s partner, family, or neighbors are all possible sources of sentencing demands that should not influence a given sentencing decision.”
[Playing Dice With Criminal Sentences: The Influence of Irrelevant Anchors on Experts’ Judicial Decision Making (2006) by Birte Englich, Thomas Mussweiler, & Fritz Strack]

References

“Coherent Arbitrariness”: Stable Demand Curves without Stable Preferences (2003) by Dan Ariely, George Loewenstein and Drazen Prelec

The More You Ask for, the More You Get: Anchoring in Personal Injury Verdicts (1996) by Gretchen B. Chapman & Brian H. Bornstein

Playing Dice With Criminal Sentences: The Influence of Irrelevant Anchors on Experts’ Judicial Decision Making (2006) by Birte Englich, Thomas Mussweiler, & Fritz Strack

The Anchoring Bias and Its Effect on Judges by Rod Hollier, https://www.thelawproject.com.au/insights/anchoring-bias-in-the-courtroom

About the Author

Tom Spiegler

Georgetown

Tom is a Co-Founder and Managing Director at The Decision Lab. He is interested in the intersection of decision science and the law, with a focus on leveraging behavioral research to shape more effective public and legal policy. Tom graduated from Georgetown Law with honors. Prior to law school, Tom attended McGill University, where he graduated with First Class Honors with majors in Philosophy and Psychology.
