I'd like to begin this video with an item from the Snapshot Quiz: the question about four playing cards. So, if you haven't completed the quiz yet, please stop this video. Truly, stop the video. Stop! Stop the video! It's very important that you complete the quiz because it sets up not just the rest of this video, but the rest of the course. The Snapshot Quiz is designed to be fun and interesting, and it's there for you, not for me. I'm not selling anything, and I'm not going to share anybody's answers or identity or anything of that nature. It's there to make the course interactive, and engaging, and educational. So, my advice, friendly advice: Stop, stop, stop, stop, and complete the quiz if you want the full experience of this course. I really think you'll get more out of it.

Anyway, here is the quiz item that I wanted to consider. Suppose that each of these playing cards, showing A, D, 4, and 7, has a letter on one side and a number on the other, and someone tells you, "If a card has a vowel on one side, it has an even number on the other side." Question: Which of the cards would you need to flip over if you wanted to find out whether the person's lying? Seems like a fairly easy question, but for most of us, it's not. If you completed the Snapshot Quiz, you should now be able to see your answer on the following screen.

The correct answer is A and 7, because the only way to falsify a statement of the form "If X, then Y" (if vowel, then even number) is to find an instance of X and not Y (that is, vowel and odd number). Cards without either a vowel or an odd number aren't relevant. For example, the card with a D can't tell you anything about cards with a vowel, so there's no need to turn it over. If this seems a little confusing, you're not alone.
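If you like to think in code, the falsification logic can be sketched in a few lines of Python. The card faces and the rule come straight from the quiz item; everything else (function and variable names) is just illustrative. A card is worth flipping only if its visible face could be half of a counterexample: a vowel (whose hidden number might be odd) or an odd number (whose hidden letter might be a vowel).

```python
# Which cards could falsify "if a card has a vowel on one side,
# it has an even number on the other"?

VOWELS = set("AEIOU")

def could_falsify(face: str) -> bool:
    """True if flipping this card could reveal a counterexample."""
    if face.isalpha():
        # A visible vowel might hide an odd number.
        return face.upper() in VOWELS
    # A visible odd number might hide a vowel.
    return int(face) % 2 == 1

cards = ["A", "D", "4", "7"]
print([c for c in cards if could_falsify(c)])  # ['A', '7']
```

Notice that D and 4 drop out for exactly the reason given above: whatever is on their hidden sides, neither card can produce a vowel paired with an odd number.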
When Peter Wason and Phil Johnson-Laird put this type of question to 128 university students, they found that answers like A and 4, or even just A, were the most common responses, given by 59 students and 42 students, respectively. In other words, cards capable of confirming the rule were those most often seen as the cards needed to test the rule, which isn't the same thing, and which shows what psychologists call a "confirmation bias." A confirmation bias is a preference for information that's consistent with a preconception, rather than information that challenges it.

So let me give you another example. This is a problem that I adapted from the book Human Inference, by Dick Nisbett and Lee Ross. Suppose that some researchers are interested in whether a particular symptom happens to be associated with a very common disease. So they do a year-long study of 150 people and they find the following results. Two questions: First, in a table like this, which cells do you need to examine in order to tell whether the symptom is associated with the disease? And second, in this particular case, is the symptom positively associated with the disease?

Let's pause to see how you answered these questions in the Snapshot Quiz. Many people answer that the present-present cell in the upper left area of the table, or maybe that cell combined with the absent-absent cell in the lower right, is what you would need to examine, and they say that in this particular instance, the symptom is positively associated with the disease because in most cases (80 out of 150 times) the symptom and disease occur together, and in another 10 cases, the disease is absent when the symptom's absent. So, the presence or absence of the symptom usually matches the presence or absence of the disease. All told, there's a match 60% of the time (90 out of 150 cases). So it seems like the disease and symptom are related. But that's not actually a dependable way to solve a problem like this.
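The intuitive "matching" tally can be written out directly, using the counts quoted above (80 cases where symptom and disease occur together, 10 where both are absent, 150 people in all). The variable names are mine; the numbers are from the example.

```python
# The naive "matching" heuristic: count cases where the symptom and
# disease simply co-occur or are co-absent, out of all 150 people.
both_present = 80   # symptom present, disease present
both_absent  = 10   # symptom absent, disease absent
total        = 150

match_rate = (both_present + both_absent) / total
print(match_rate)  # 0.6
```

That 60% figure is the one that makes the association feel real, and it is exactly the figure that, as the next part of the lecture shows, doesn't actually test the association at all.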
In truth, all four cells of the table are needed to know whether the symptom and disease are related, even though it feels like the present-present cell is what's most important. To tell whether the symptom and disease are associated, you need to compare the chances of people having the disease when the symptom's present (which is 80 out of 120 cases, or 67%) and the chances of people having the disease when the symptom is absent (which is 20 out of 30 cases—still 67%). And of course, this gives us the answer to the second question. In this particular case, the disease and the symptom are entirely unrelated because the chances of having the disease are the same whether you've got the symptom or not. It's 67% in both cases.

This sort of thinking does not come naturally to most people. For instance, consider another example adapted from the book by Dick Nisbett and Lee Ross: the question of whether God answers prayers. Many people would say yes, because they've prayed for something (for example, the survival of a loved one who's on the edge of death), and sure enough, it's come about. But that's only the present-present cell of the table. To fully answer the question of whether people are more likely to survive when they're prayed for, you would need to know how often people on the edge of death die despite being prayed for, how often people on the edge of death survive when no one prays for them, and believe it or not, how often people on the edge of death die when no one prays for them (controlling, of course, for all sorts of other factors that might have an effect). Obviously, this kind of approach is very different than how most of us operate in daily life. And please understand this is not a case against the value of prayer. The same principle would operate if somebody were to lose faith in prayer because they prayed for something that didn't come about. That would also be just one cell in the table.
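The correct comparison can also be sketched in Python. Three of the four cell counts (80, 20, and 10) are quoted directly above; the fourth, symptom-present/disease-absent, is inferred here as 40, since the transcript says 120 people have the symptom and 80 of those have the disease. The table layout and function name are mine.

```python
# The full 2x2 table: counts of (symptom, disease) combinations.
table = {
    ("present", "present"): 80,
    ("present", "absent"):  40,   # inferred: 120 with symptom - 80 with disease
    ("absent",  "present"): 20,
    ("absent",  "absent"):  10,
}

def p_disease_given(symptom: str) -> float:
    """Chance of having the disease, given the symptom's status."""
    with_disease = table[(symptom, "present")]
    total = with_disease + table[(symptom, "absent")]
    return with_disease / total

print(round(p_disease_given("present"), 2))  # 0.67
print(round(p_disease_given("absent"), 2))   # 0.67
```

Because the two conditional chances are equal, the symptom tells you nothing about the disease, even though the naive matching tally made them look related.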
The point is that all four cells are relevant, and yet, when we have a theory about something, and we get some supporting evidence, we typically conclude that the theory was correct. We don't go out of our way to seek disconfirming evidence and examine all four cells of the table. So let's pause for a quick check to make sure you can tell whether two variables, like the presence of a symptom and disease, are statistically related to each other.

Now that we've talked about confirmation biases, you might be wondering, "Why are they of concern to social psychologists?" Well, there are many reasons, but here's just one: When individuals and groups interact with each other, they usually have expectations, and there's plenty of research that suggests that those expectations do not always receive a fair test. Instead, people tend to seek out evidence that confirms their expectations, and they give greater weight to that evidence than to evidence that would disconfirm their expectations. Counter-evidence, if it's even noticed at all, is usually very easy to explain away. So the very same mechanism that operates in Wason's research also operates when people have stereotypes about women or about racial minorities or about professors. People focus mainly on confirming evidence and end up perpetuating the stereotypes, or the preconceptions, or social expectations that they have, especially when they're not highly motivated to question those beliefs.

So, confirmation biases can have important consequences, but they're only half the equation. Social expectations not only lead us to seek out confirming evidence; they can also have an effect on the person about whom we hold the expectation. In other words, social expectations affect not only the person who holds them, but the other side as well. And it's that side of the equation that we'll focus on in the next video.