Curse of Knowledge

The Basic Idea

People often say “knowledge is power.” But what happens when we all have different levels of knowledge?

The “curse of knowledge,” or “the curse of expertise,” is a cognitive bias where we incorrectly assume that everyone knows as much as we do on a given topic.1 When we know something, it can be hard to imagine what it would be like not knowing that piece of information. In turn, this makes it difficult to share our knowledge, because we struggle to understand the other party’s state of mind.

This is a facet of learning that The Decision Lab hopes to address. People interested in behavioral science come from many different backgrounds and bring different levels of familiarity with the field. Our goal is to make information as digestible as possible: to do that, we need to avoid the curse of knowledge!

Lots of us have expertise in particular areas. Becoming an expert in something means that we become more and more fascinated by nuance and complexity. That’s when the curse of knowledge kicks in, and we start to forget what it’s like not to know what we know.


– Chip and Dan Heath, authors of Made to Stick: Why Some Ideas Survive and Others Die

History

In 1975, American psychologist Baruch Fischhoff published “Hindsight ≠ foresight: The effect of outcome knowledge on judgement under uncertainty.”2 Fischhoff developed a method to examine hindsight bias, a cognitive bias where an event’s outcome seems more predictable after we know what happened.

Fischhoff’s method consisted of presenting participants with four possible outcomes of a short story.2 Some participants were told which one of the four outcomes was true; other participants were not given any information. Then, all participants were asked to estimate the likelihood of each outcome. Fischhoff found that when a participant was told an outcome was true, they frequently assigned a higher probability to that outcome. On top of overestimating the probability of outcomes for which they had extra information, participants also failed to reconstruct their prior, less knowledgeable states of mind.

Stemming from Fischhoff’s work on hindsight bias, the term “curse of knowledge” was first used in the 1989 article “The curse of knowledge in economic settings: An experimental analysis” by economists Colin Camerer, George Loewenstein, and Martin Weber.3 They credited British-American psychologist Robin Hogarth with coining the term, and they explored the curse of knowledge in the context of economic transactions.

Their study observed that different economic agents have different amounts of knowledge.3 Sellers tend to know more about the value of their products than prospective buyers; workers tend to know more about their skills than prospective employers. Crucially, Camerer, Loewenstein, and Weber argued that the curse of knowledge preserves this informational imbalance even when an agent wants to convey what they know, because the better-informed party struggles to imagine the less-informed party’s state of mind. They argued that this had two consequences:

  1. Better-informed agents might suffer losses: having more information can actually hurt us!
  2. The curse of knowledge can mitigate the market consequences of information asymmetry. A fruit seller who knows about defects that buyers cannot see, for example, may lower prices anyway, because they cannot fully ignore what they know.

Following Camerer, Loewenstein, and Weber’s work, Elizabeth Newton, a graduate student in psychology at Stanford in 1990, developed an experiment that is now a classic example of the curse of knowledge.1 4 She asked one group of participants, the “tappers,” to tap out popular songs with their fingers, while a second group, the “listeners,” tried to name the songs from the taps alone. Before the listeners guessed, the tappers predicted how many of the tapped melodies would be recognized.

In a sample of 120 melodies, listeners got it right only 2.5% of the time, correctly naming about 3 of the 120 songs.1 The tappers had predicted a 50% success rate: they grossly overestimated how well the listeners could guess, due to the curse of knowledge. Once they were given a song to tap, they couldn’t help but hear the melody their tapping was based on, so they assumed the listeners would hear it too.4 In reality, all the listeners heard was a seemingly random series of taps.1

The curse of knowledge was popularized in the 2007 book Made to Stick: Why Some Ideas Survive and Others Die.10 In it, brothers Chip and Dan Heath explore the concept of “stickiness”: making ideas memorable and interesting. Their claim is that making ideas sticky helps us avoid the curse of knowledge: packaging what we know into simple, concrete, memorable messages forces us to consider what our audience does not yet know, rather than assuming they already know it. A sticky idea also keeps its original shape in memory, so we are less likely to reinterpret it later in light of what we know now.

Consequences

Camerer, Loewenstein, and Weber discussed the implications of the curse of knowledge for economics.3 The first situation where the curse of knowledge might be important is when information is asymmetric. When some people have private information that others lack, the traditional economic assumption is that the better-informed, behaving as homo economicus, will optimally exploit their informational advantage.

However, the study found the opposite. A homo economicus with an information advantage would make the same offer regardless of the monetary amount to be divided, since the uninformed party cannot tell the difference; that way, they could maximize their profits. Yet Camerer, Loewenstein, and Weber found that people make larger offers when the amount to be divided is larger. Well-informed people simply act as if everyone has the same information, so they make a fairer offer because they implicitly assume everyone knows what they know. In this situation of informational imbalance, the curse of knowledge yields a fairer deal.

The second situation where the curse of knowledge might be important is when people learn more over time and have to reconstruct their earlier perspectives.3 Here, the curse of knowledge interacts with another bias: hindsight bias. When we evaluate our past decisions, we tend to mistakenly assume that we knew then what we know now. So we think that an outcome that turned out badly was foreseeably bad, and that an outcome that turned out well was obviously going to be favorable. The curse of knowledge, along with hindsight bias, leads us to judge previous choices according to our current knowledge, even though we didn’t have that information at the time.

A third situation where the curse of knowledge is at play is teaching, especially in engineering and science in higher education.4 While instructors have extensive expertise in their fields, there tends to be a disconnect between what they understand and what their students understand. Some attribute this disconnect to instructors’ lack of training and experience in teaching, to their being more interested in research, or to simple indifference to their students’ understanding.5

The curse of knowledge offers a more straightforward (and more charitable) explanation. Expertise in a field can increase instructors’ confidence in their ability to teach it, yet they still struggle to deliver the material in a way that fits what their students already know.6 Cognitive scientist Steven Pinker identified the following issues in how instructors in higher education deliver content:7

  • Using abstract language for already complex topics;
  • Making clumsy transitions between related topics, leaving the connection between them opaque;
  • Relying on so-called zombie nouns (nominalizations such as “operationalization”) instead of verbs or adjectives; and
  • Interpreting external sources inadequately, further confusing students.

The curse of knowledge also has implications for our everyday lives.9 Many people write to-do lists to keep themselves organized, but sometimes those lists look less like a comprehensive plan and more like a collection of scattered sticky notes. This is because, when we write an item down, the curse of knowledge leads us to assume our future self will know exactly what we meant; if we lose the note and reencounter it months later, we can no longer remember why that task needed to be done, and the to-do item becomes a pointless series of words scrawled on a page.

Case Study

Beating the curse of knowledge

Who doesn’t love online shopping? A few clicks here and there, and voila! A pretty package shows up at your door with everything you need, and you didn’t even have to change out of your pajamas. One caveat to the convenience of online shopping, however, is the delivery time. If you’re unlucky, you might wait three weeks for a package that was promised in one week, or your package might get lost completely. And by that point, the curse of knowledge might kick in: employees and managers might forget which packages went where, because they can no longer remember what it was like to have that old information.

FedEx recognizes these struggles and avoids the curse of knowledge by telling a convincing story with concrete language.1 The executives who developed the FedEx vision could easily have used ambiguous language with no motivational component to reach their goals, but they didn’t. Instead, FedEx’s leaders translated their strategy into actionable language: packages are promised to “absolutely, positively” arrive overnight. The promised deadline becomes “sticky,” and sticky words turn into actions. FedEx also steers employees away from the curse of knowledge with its Purple Promise award, given to employees who uphold the company’s guaranteed delivery times.

Faculty development strategies

In light of the discussion around the curse of knowledge in higher education, researchers have explored strategies to increase awareness of this cognitive bias among instructors and close the teacher-student gap.4 In line with the Heath brothers’ suggestion that sticky messages can combat the curse of knowledge,10 the following principles have been proposed for instructors:4

  • Simplicity: Section out content and distill it into concise, accessible messages.
  • Unexpectedness: Step beyond the mindset that since you are curious about your content, your students will be too. Instead, actively share opportunities for curiosity!
  • Concreteness: Provide and ask students to provide concrete, specific examples.
  • Credibility: Use problem- or project-based learning to engage students with issues related to the content. They may want to learn more about a topic once they’ve engaged with it.
  • Emotions: Allow room to cope with the emotional frustration that can be part of learning.
  • Stories: Use stories to make content more personal and accessible to students. Stories are also great memory aids!

On top of these six principles, researchers suggest that instructors evaluate students’ mindsets at the beginning of a course, to better understand what students already know.4 They also recommend collecting feedback throughout the course, so that students can convey their feelings, experiences, and stories. Gathering this feedback can inform successful teaching practice, fostering student-teacher connections that make it easier for teachers to understand their students’ perspective. It might even lead to better grades!

Related TDL Content

How working from home can amp up your team’s communication and creativity

You’d be hard-pressed to have a conversation about 2020 and 2021 without mentioning COVID-19, as the pandemic has touched many areas of our lives. One of these areas is the transition to remote work. The curse of knowledge can be difficult enough to navigate during in-person interactions, but it can be an entirely different story in the online world. Take a look at this article for insights on how to capitalize on remote work and how to beat the curse of knowledge.

Zooming Out: The Impact of Distance on our Decisions

As we saw earlier, the curse of knowledge arises when there is distance between what we know now and what we knew before. This effect of temporal distance on our decision-making, however, is only one way in which distance affects our thinking. In this article, our staff writer Kaylee Somerville walks through how other kinds of distance shape what we decide.

Sources

  1. Heath, C., & Heath, D. (2006, December). The curse of knowledge. Harvard Business Review. https://hbr.org/2006/12/the-curse-of-knowledge
  2. Fischhoff, B. (1975). Hindsight ≠ foresight: The effect of outcome knowledge on judgement under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1(3), 288-299.
  3. Camerer, C., Loewenstein, G., & Weber, M. (1989). The curse of knowledge in economic settings: An experimental analysis. Journal of Political Economy, 97(5), 1232-1254.
  4. Froyd, J., & Layne, J. (2008, October). Faculty development strategies for overcoming the “curse of knowledge”. In 2008 38th Annual Frontiers in Education Conference (pp. S4D-13). IEEE.
  5. Wieman, C. E. (2007). The “curse of knowledge,” or why intuition about teaching often fails. APS News: The Back Page, 16(10).
  6. Fisher, M., & Keil, F. C. (2016). The curse of expertise: When more knowledge leads to miscalibrated explanatory insight. Cognitive Science, 40(5), 1251-1269.
  7. Leddy, C. (2012, November 8). Exorcising the curse of knowledge. The Harvard Gazette. https://news.harvard.edu/gazette/story/2012/11/exorcising-the-curse-of-knowledge/
  8. Birch, S. A. J., & Bloom, P. (2007). The curse of knowledge in reasoning about false beliefs. Psychological Science, 18(5), 382-386.
  9. Berg, A. (2021). 33 Key Principles for Success. Kindle.
  10. Made to Stick: Why Some Ideas Survive and Others Die. (n.d.). Heath Brothers. https://heathbrothers.com/books/made-to-stick/