Psychology for Socialists, Part 1

By Jonathan K.

“Psychology for Socialists” is a three-part series designed to introduce people to findings and theories in psychology that are relevant to socialism and activism. The things I will be presenting aren’t exclusively relevant to those topics; in fact, they apply to almost every facet of our lives. What I will be doing is presenting them in relation to the work we do as socialists.

Let me start with a couple of disclaimers. The first disclaimer is that findings in psychology are (almost) never absolute. We can capture general patterns or describe the most likely behaviors or reactions, but there will always be exceptions. So, for everything I’m about to describe, remember that it doesn’t apply to everyone or every situation. The second disclaimer is that psychology is an imperfect science. Like many sciences right now, it is struggling with a replicability crisis. The findings I will present will be ones I have confidence in, or I will be clear that they are still unsettled. However, even the ones I have confidence in could be overturned at some point in the future.

Psychology is imperfect in another sense because, like many sciences, it has suffered from a lack of diverse perspectives, and more than other sciences it has suffered from a lack of diverse data. Many of the findings I will discuss are based on studies of mostly upper-middle-class and mostly white college students, and conducted by mostly white researchers (though somewhat less overwhelmingly cis-male than other fields). In the last two decades the field has become more aware of this and made efforts to self-correct, but it will take some time for us to be confident that these findings apply to all of humanity.

Part 1: Know yourself

The goal of this series is to help us be aware of things in ourselves and in others that affect our behavior, our interactions, and our work. The intuitive starting point is an introduction to yourself: features of your own mind that you might not be aware of. I’m specifically going to focus on issues about what we think we know, what we think other people know, and how wrong we can be.

I. Illusions of knowledge

Off the top of your head, without looking anything up, how well do you understand how a toaster works? Just give yourself a quick rating on a 1-7 scale, where 1 is “not at all” and 7 is “completely.”

Now try to explain to someone else how a toaster works. Include things like: How does it “pop” at the right time? How does the knob control how toasted your toast gets? I’ll wait.

Odds are good that you just discovered that you overestimated your knowledge. This is called the “Illusion of Explanatory Depth” (IOED)[1] and is one of several related examples of ways in which we overestimate what we know. You may have heard of the “Dunning-Kruger effect,” which is a more general version of the same idea: the less you know about something, the harder it is to see how much you don’t know, so you tend to overestimate your knowledge. It often gets used in classist or ableist arguments, but the underlying idea is neither of those things. It is extremely difficult to measure the depth of our own ignorance.

As socialists, we are often required to explain complex concepts like capitalism, the difference between socialism and communism, the carceral state, and more. We also have to advocate for complex social support systems, like single-payer healthcare, to say nothing of ideas that are excluded from mainstream political outlets, like prison abolition. Going into a political discussion, you may feel like you deeply understand these issues, that you have comprehensive arguments to make, and that you are ready for the most common rejoinders. Unless you’ve actually tried to explain these concepts to someone, you might not understand them as well as you think.

These effects have been the focus of a fair amount of research, and so we know a few things about how they work. As with everything in psychology, there are multiple things going on. One major piece is that we confuse knowing where to find information with knowing the content of the information. There have been some very good studies showing that, for information that we can look up, we will remember how to find that information, but we won’t remember much of the information itself.[2] Now that we can look up everything on our pocket-sized internet machines, that’s probably even more true (though I don’t know of any studies looking at smartphone usage specifically).

Second, there is the difference between “abstract” and “concrete” information. Think again of the toaster. There are some things you really do know about it — it uses electricity to create heat, and there’s some kind of spring for pop-up toasters. That’s “abstract” knowledge. There are no real details there, just general principles or descriptions of behavior. If you found you had trouble explaining things about how a toaster works, it probably wasn’t those things. The difficult pieces are the “concrete” details — things like how electrical resistance in the material of the heating coils creates heat, how thermocouples control the temperature of the toaster and when it pops, or how the latch on the pop-up mechanism works. One evidence-backed account of the IOED and other, similar effects is that we recognize that we have abstract knowledge, and we mistake that for having concrete knowledge.[3] So, when you feel like you know something, you should ask yourself whether you know only the abstract part or really have the concrete details. For example, when we talk about single-payer, how would it handle funding for medical education? How would we deal with existing medical debt? How would we deal with malpractice insurance? People have offered answers to all these questions (which I personally don’t know off the top of my head, but I know where to find them), but when you say you understand single-payer, you should make sure you know what you think you know.

Finally, these effects persist because of a strong desire to “save face” (something I will talk about a lot more in a later post), combined with a negative cultural attitude toward ignorance. In mainstream U.S. culture, there are few more humiliating things than “looking stupid” by being ignorant. Our self-image and self-esteem suffer if we demonstrate ignorance. So, we are motivated to avoid that. One way to avoid that is to simply ignore our own ignorance, perhaps because we so rarely get called on it. You probably spent most of your life until today thinking you knew how a toaster works, because you were never really challenged on it. We can get very far with very little knowledge, as it turns out, and so we can safely assume (most of the time) that we know things we really don’t. (There are no studies of this, by the way. It is just one theory as to how we manage with such frequent ignorance.)

As socialists and individuals, all we can do about the first two issues is be aware of them. They are internal to our own minds, and we must simply be vigilant about monitoring our own knowledge. However, tying ignorance to self-esteem and social scorn is something we can, and should, attempt to combat. As socialists, we should value the act of learning, and be clear that learning starts with admitting ignorance. I like to use the XKCD example of the “lucky 10,000,” which makes the point that for something that “everyone knows” by age 30, there are (if you assume a constant rate of learning) 10,000 people learning it every day. An expression of ignorance should not be a source of shame, but a source of excitement. It is an opportunity to learn, and for others, an opportunity to teach. Indeed, some studies have reported that, in Japanese culture, that is exactly how ignorance is treated[4], and (by some measures) it makes for a much better educational experience.
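A quick back-of-the-envelope check of that figure (these are the comic’s rough U.S. numbers as I remember them, not precise statistics): roughly 4,000,000 people are born each year, and 4,000,000 ÷ 365 ≈ 11,000 people per day, which rounds to the comic’s 10,000. So, for any fact that “everyone” eventually learns, thousands of people are learning it for the first time today.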

In thinking about this, we should be careful not to fall prey to a different individualist attitude that ultimately leaves us with the same problems. Socrates is famously quoted as saying, “I am wiser than he is to this small extent, that I do not think I know what I do not know”[5]. This is an expression of scorn for overconfidence, but it does not welcome expressions of ignorance. At the same time, we should not start treating ignorance as good in and of itself. Remaining ignorant by choice is something we should not accept. To accept ignorance without elevating it, we must value the act of learning, and be explicit that the first step in learning something is admitting ignorance. We must all become comfortable with saying “I don’t know this; can you teach me?” When we say “there are no stupid questions,” we have to learn to mean it.

II. The “Curse of Knowledge”

Not only do we think we know more than we actually know, we also have trouble figuring out what other people don’t know. An expert in any field will have years of experience and accumulated knowledge, but to be a good teacher, they have to recognize how much of their knowledge is due to experiences that their students have not had yet. How often have you been in a class where a teacher talked about something for three seconds as if you already knew what it meant, and you felt completely lost? The teacher, most likely, assumed you already knew that information because they already knew that information, and forgot it had to be taught to them.

This is called the “curse of knowledge.” There are many examples of it throughout psychology. The classic example is a study in which participants were asked to tap out the tunes of popular songs (e.g., “Happy Birthday”) and then estimate how easily someone listening to their tapping could figure out which song it was. The tappers, who of course knew which song they were tapping, estimated that listeners would be able to recognize the song from the taps alone about 50% of the time. In reality, listeners identified the song successfully only 2-3% of the time. There are many other examples of this kind of effect. It starts early, too, peaking in young (3-5-year-old) children, who assume that other people know everything that they do[6].

This is a big problem for teaching. It’s a big problem for me, right now, as I’m writing this. I’ve read over a thousand research articles in psychology over the course of the last ten years (according to my reference library), and I have to try to put myself in the shoes of a reader who might not even have taken an introductory psychology course. If there’s something in here that I talk about like it’s obvious when it really isn’t, it’s because I failed at understanding what you do or don’t already know. It takes constant, conscious effort to avoid making those mistakes.

The implications for advocacy should be clear… or perhaps they are not. In any case, there are two contexts where the curse of knowledge can be a huge problem for us and our work. The first is in internal political education. It is the flip side of being willing to admit our own ignorance: Avoid stating things as if they are obvious, though it can be difficult to strike a balance between that and being patronizing. Communication is key. “Do you know what X is?” is a valid and useful question to ask when engaging in political education.

The second context is when advocating for our ideas in the public sphere. Here is where the curse of knowledge can truly bite us in the ass. We might know that the weekend exists because unions fought and died for it, that single-payer is cheaper and more efficient than private insurance, that health-care reform requires as much work on cutting costs as it does on providing access, but the people we are talking to may not. They may even have been actively misinformed. We typically call this “meeting people where they are,” but the curse of knowledge can make it harder than you might expect. Be sensitive to the fact that you might need to re-evaluate what is or is not “obvious,” simply because you learned it so long ago.

III. So what can we do about it?

So, our own minds are working hard to sabotage us. What has the science of psychology given us to counter these bad habits?

I have good news and bad news.

Let me start from the top. Consider the following math problem:

“A ball and bat cost $1.10 together. The bat costs $1 more than the ball. How much does the ball cost?”

For most people, the immediate, intuitive answer is $0.10. This is incorrect, and a little arithmetic will show you why: if the ball cost $0.10, then the bat would cost $1.10, and the total would be $1.20. The correct answer is $0.05, but the question is worded in a way that’s designed to lead you to a different answer at first.
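For anyone who wants the algebra spelled out (this is just my own restatement of the puzzle as an equation, not part of the test itself): call the price of the ball x. The bat then costs x + $1.00, and together they cost $1.10, so x + (x + 1.00) = 1.10, which gives 2x = 0.10, or x = 0.05. The ball costs $0.05 and the bat costs $1.05.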

This is a question from an early version of the “Cognitive Reflection Test” or CRT. It measures a cognitive “style,” for lack of a better term. As with many things in psychology, people were able to describe this before we were able to measure it. I’m particularly fond of Terry Pratchett’s description of it in the Tiffany Aching books: “First sight and second thoughts.” First sight is the ability to see the world as it is, without letting your expectations get in the way. Second thoughts is the ability to look at your own thinking and ask yourself, “Is this right?” In psychology, we call these second thoughts “cognitive reflection.” The more “reflective” you are, the more you check your own thinking for errors, and the more likely you are to catch them before you act.

The good news about cognitive reflection is that it can be learned and practiced. The easiest way to develop it is to start by slowing yourself down. Researchers have designed training tasks for preschoolers that increase cognitive reflection. The task is very simple: the children are given a difficult (for them) question, but after seeing the question there is an enforced two-second delay before they can answer. That alone makes them much better at other, unrelated tasks that benefit from cognitive reflection. As adults, we can do the same thing to ourselves. Whenever you are about to make a decision or answer a tricky problem like the one above, deliberately stop and re-examine your answer before committing to it. Do this enough and it can become a habit.

The other piece of good news is that cognitive reflection does seem to affect the IOED. People who score higher on the CRT are less prone to the illusion of explanatory depth. However, that is a correlation, not a demonstrated causal link. There are no training studies yet showing that increasing reflectiveness makes people less susceptible to the IOED, but in principle it could help quite a bit. Sadly, nobody has looked at cognitive reflection and the curse of knowledge, but based on our best understanding of them, the worst it can do is nothing.

The bad news is that the only training studies I’ve found don’t look at long-term benefits or real-world applicability. How well you can reflect on your thinking in a psychology lab after an intensive training could be very different from how well you can do so when you’re about to run a political education session coming off a long work day. My completely intuitive guess is that it’ll be very difficult to actually apply outside the lab. But it’s still one of the better solutions we have.

Ultimately, the most we can say about the IOED and the curse of knowledge is that you need to know about them to be able to counteract them. The best I can do is introduce them to you. For some readers, this might feel like old news. For others, it might be a revelation. I don’t know, but that’s fine. I can find out by sharing this with all of you.

Notes
[1] Rozenblit, L., & Keil, F. (2002). The misunderstood limits of folk science: An illusion of explanatory depth. Cognitive Science, 26(5), 521-562.
[2] Sparrow, B., Liu, J., & Wegner, D. M. (2011). Google effects on memory: Cognitive consequences of having information at our fingertips. Science, 333(6043), 776.
[3] Alter, A. L., Oppenheimer, D. M., & Zemla, J. C. (2010). Missing the trees for the forest: A construal level account of the illusion of explanatory depth. Journal of Personality and Social Psychology, 99(3), 436-451.
[4] Heine, S. J., et al. (2001). Divergent consequences of success and failure in Japan and North America: An investigation of self-improving motivations and malleable selves. Journal of Personality and Social Psychology, 81(4), 599-615.
[5] Plato, Apology 21d (trans. Tredennick, 1954).
[6] Birch, S. A. J., & Bloom, P. (2004). Understanding children’s and adults’ limitations in mental state reasoning. Trends in Cognitive Sciences, 8(6), 255-260.
