As the Webster University community worked to wrap up another academic year this past May, I, like many of my colleagues, found myself talking with students about their plans. For some, graduate school lay in their future. Others were excited about internships or jobs waiting for them after graduation. Still others seemed happy simply to be finished, with little idea of what they would do next. The one thing I noticed these students had in common was their optimism about the future.

However, given today’s difficult economic climate, I couldn’t help but wonder, “How could the majority of these graduates muster such confidence? Are they not aware of the crippling student loan debt crisis, with the average college graduate now owing nearly $40,000? Did they not read the stories in their social media feeds claiming that Millennials were lazy and would never find meaningful careers?”

Nearly 40 years ago, psychologist Neil Weinstein (1980) coined the term optimistic bias to refer to a “mistaken belief that one’s chances of experiencing a negative event are lower (or a positive event higher) than that of one’s peers” (Klein, n.d.). In short, people tend to underestimate their own risk, or think that their own odds are better when compared to their peers. When researchers study optimistic bias, they typically ask participants to rate the extent to which they believe the chances of some event will occur for other people, then to rate the chances of the same event happening to them. Quantifying the difference between these perceptions yields a measure of optimistic bias.
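
For readers curious how that difference is actually quantified, here is a minimal sketch, purely illustrative and not taken from Weinstein’s study or any particular survey: hypothetical participants rate the likelihood (0–100) of a negative event for “other people” and for “me,” and the bias score is simply the gap between the two ratings.

```python
# Illustrative sketch only: made-up ratings, not data from any study.
# Each hypothetical participant rates the likelihood (0-100) that a
# negative event will happen to "other people" and to "me".
ratings = [
    {"others": 70, "self": 40},
    {"others": 55, "self": 50},
    {"others": 80, "self": 30},
]

# For a negative event, optimistic bias is the amount by which perceived
# risk to others exceeds perceived risk to oneself; positive scores
# reflect the "it won't happen to me" pattern.
bias_scores = [r["others"] - r["self"] for r in ratings]
mean_bias = sum(bias_scores) / len(bias_scores)

print("Individual bias scores:", bias_scores)
print(f"Mean optimistic bias: {mean_bias:.1f} points")
```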

I’ll bet that if you are being honest, you can think of a number of instances in your own life when you have exhibited the optimistic bias. Have you ever texted while driving? You know it’s dangerous, but you believe yourself to be a skilled driver and a fast texter, so it’ll be fine. Of course, the friends and families of the 20% of teens involved in fatal accidents who were distracted by their phones would beg to differ.

In 9th grade, I tried cigarettes for the first time. Now, given that I had parents who smoked (and I grew up when parents widely smoked with their kids in the car!), and that their smoking really bothered me, I had no illusions that smoking was harmless. Further, I had seen my parents’ unsuccessful attempts to quit smoking, so I knew it was highly addictive. Did that stop me from trying that first cigarette? Unfortunately, it did not. I knew it was harmful and I knew it was addictive, but I didn’t think it would be harmful or addictive to me. Given that I went on to smoke one to one-and-a-half packs of cigarettes per day for roughly 10 years, I was obviously wrong.

As an adolescent, I was especially susceptible to optimistic bias, which is an offshoot of the personal fable phenomenon characteristic of adolescent cognition. Unfortunately, I was not an outlier: adolescents and emerging adults as a group are particularly prone to this sort of thinking. Health and developmental psychologists have referred to these periods of the lifespan as windows of vulnerability because of the intersection of the drive for independence, increased opportunity, decreased supervision, and, of course, clumsy cognition.

Indeed, research has demonstrated the optimistic bias in a variety of health-related domains. In fact, student researchers within my own lab at Webster University have investigated optimistic bias in order to better understand emerging adults’ beliefs about and reasons for using electronic cigarettes, as well as parents’ beliefs about their adolescents’ substance use.

Interestingly, however, optimistic bias is not confined to the health domain. Recent research by the Yale Program on Climate Change Communication illustrates how optimistic bias influences our beliefs about what is perhaps the most critical challenge facing our world today: climate change. In their survey, the Yale researchers found that the vast majority of Americans agreed that “global warming will harm people in the United States.” However, significantly fewer respondents agreed with the statement that “global warming will harm me, personally.” Other research has shown similar discrepancies in individuals’ beliefs about their financial security, with respondents believing that their investment and savings strategies will somehow be stronger than those of the general population.

Unfortunately, this way of thinking is not something that we simply outgrow. In fact, since Weinstein first demonstrated this phenomenon, subsequent research has shown that individuals of all ages are likely to fall victim to optimistic bias in a variety of domains. Teens use illicit substances and often fail to consider the consequences of their actions because they believe that bad things simply won’t happen to them. Upon entering the workforce, many college graduates show a discrepancy between how prepared they believe themselves to be versus how prepared prospective employers find them to be. And if we believe the research, those who are fortunate enough to have satisfying, decent-paying careers are unlikely to be saving enough for retirement, putting their future well-being at risk.

As demonstrated, adults of all ages regularly and consistently engage in poor decision-making because they believe themselves to be immune to what they know to be true for most of the population. So, what can be done about this?

Understanding how cognition works, and how it can often fool us, is an important first step in making better decisions. Thus, people need to know that the optimistic bias exists. As my colleague Julie Smith argued in a previous Glimpse article, literacy is key. Improved health literacy would perhaps reduce the discrepancy in risk perception for oneself versus others. Greater financial literacy may help individuals better plan their savings strategies. Improved awareness of employers’ desired skillsets, and of how employers perceive prospective employees, may help graduates and other job seekers more accurately assess their own skillsets.

A second step, of course, is to be honest and humble about our own behaviors, then to do something about those areas over which we wish to exert greater control. After all, we can’t all be better than average, can we?