The Importance of Critical Thinking

Studies have shown that the average investor tends to underperform the market. Why does this happen? It all boils down to the way we make decisions.

In this episode of Industry Focus: Financials, The Motley Fool's Gaby Lapera and John Maxfield dig into how investors, and people more generally, can improve their critical-thinking skills and thereby boost the performance of their portfolios.

A full transcript follows the video.

This podcast was recorded on Sept. 21, 2016.

Gaby Lapera: Hello, everyone! Welcome to Industry Focus, the podcast that dives into a different sector of the stock market every day. You're listening to the Financials edition, taped on Wednesday, September 21, 2016. My name is Gaby Lapera, and joining me on Skype is John Maxfield, one of our top financial analysts at The Motley Fool. How's it going, John?

John Maxfield: It's going great. Thanks for having me, Gaby!

Lapera: Awesome. I'm definitely excited to have you on the show, because today, we thought we'd take a little bit of a risk. Normally, we talk about the mechanics of investing or news stories that are relevant to investing. Today, we wanted to try and talk about a concept that's core to good investing, but it is something that can be applied to all aspects of your life. Today, we're going to talk about critical thinking. This can get a little bit weeds-y, but I think we're going to make it really interesting, and you should definitely stick with us. Let's dive right in, because we have a lot of content to cover.

Let's start with some kind of background: why we decided to do this in the first place. People are bad at making decisions. John and I were talking before the show started about something called the Dalbar studies. Do you want to expound on that, John?

Maxfield: Critical thinking really impacts all aspects of our life, but what's so interesting about investing is that you can actually study the quantifiable impact of how good your thought process is. That's what the Dalbar study, among other studies, allows you to do.

What the Dalbar study does is use a proxy for the individual investor's performance and compare it to the broader market, the S&P 500. What they have found over the years -- they've been doing the study for multiple decades -- is that the typical investor underperforms the S&P 500 by about half. So, the S&P 500 over the last couple of decades has returned about 8% a year, while the average investor has returned about 4%. And what they have found is that the reason for this all boils down to the way investors make decisions.
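To make that gap concrete, here is a minimal Python sketch of the arithmetic. The 8% and 4% figures are the round numbers cited above; the $10,000 starting balance and 20-year horizon are illustrative assumptions, not part of the Dalbar study.

```python
# Minimal sketch: compound a hypothetical $10,000 for 20 years at the
# market-like 8% annual return cited above vs. the 4% "average investor"
# return. The starting balance and horizon are illustrative assumptions.

def compound(balance, annual_return, years):
    """Grow `balance` at `annual_return` (e.g. 0.08 for 8%) for `years` years."""
    for _ in range(years):
        balance *= 1 + annual_return
    return balance

start, years = 10_000, 20
market = compound(start, 0.08, years)    # ~ $46,610
investor = compound(start, 0.04, years)  # ~ $21,911

print(f"8% for {years} years: ${market:,.0f}")
print(f"4% for {years} years: ${investor:,.0f}")
print(f"Shortfall: ${market - investor:,.0f}")
```

Even though 4% is half of 8%, compounding means the ending balance is less than half, which is the quantifiable cost of poor decision-making that the Dalbar study tries to capture.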

Lapera: The problem with the way a lot of investors make decisions is that they're making decisions based on emotions, rather than rational, evidence-based, thought-out ideas. This is something that Warren Buffett harps on all the time.

Maxfield: Yeah. What scientists have found -- and behavioral finance has really become a popular field of study over the past couple of decades, as the Efficient Market Hypothesis has left the scene -- is that the human brain is designed to make emotional decisions. This goes all the way back to when we were cavemen and cavewomen. If you were scared of something, you didn't sit down and analyze whether it was rational to be scared of, say, I don't know, a sabertooth lion?

Lapera: Those are sabertooth tigers.

Maxfield: Yeah, sabertooth tigers, whatever they were! (laughs) You acted instinctively, you were scared and you immediately ran. And that whole process, that didn't just stop when we were cavemen. It still impacts how the human brain works today.

Lapera: Correct, it's your fight-or-flight response. And not just that, but humans in general -- because we, or rather our ancestors, had to make snap judgments in order to keep themselves alive -- tend to take a lot of shortcuts in our thinking. So, for example, I was at the barn the other day, and I wasn't really paying attention, but out of the corner of my eye, I saw something sliding across the ground. And without even thinking about it, I jumped back into the stall with the horse, and the horse was staring at me like I was crazy. And I looked down, and it was just a hose that someone was dragging farther down the hallway. I thought it was a snake. My mind took the circumstances and thought, "Green sliding thing on ground = snake." It didn't think, "It could be a hose, it could be any number of things." It was that instinct of taking very little information, fitting it into a mold that already exists, and spitting something out without critically evaluating what was going on. Like John said, that was a great survival mechanism back in the day, but it's not a great way to invest.

Maxfield: That's such a great example, because it really boils down to what you're talking about: shortcuts. In the behavioral finance literature, they call these behavioral biases. There are things like authority bias: let's say you read an article by somebody you consider to be an authority; instead of going out and doing your own independent research, you just rely on them because you think they're an authority. Well, there could be a problem in their thought process. But we can't sit down and analyze every single decision in our lives to death, so we have to take shortcuts. And that's fine -- 80% of the time, that's going to be fine, because you're just deciding whether to stop at a red light, who should go first at a four-way stop, whatever it is. Those are instinctive; you don't have to analyze them. But there are certain decisions where you have to slow down your process, think things through logically, and come to a more reasoned opinion. That's really where critical thinking comes in. And when you do that, you are in a better position to avoid the behavioral biases, or shortcuts, that could otherwise lead to a bad decision.

Lapera: Right. So, we talked a lot about critical thinking without actually ever defining what "critical thinking" is, which is something that, when you think critically, you should always do -- define your terms. So, let's talk a little bit about what critical thinking actually is.

In my view, critical thinking is an active process where you come up with a question, gather information, and then use it to reach a conclusion. What I'm about to say is a definition I got off the internet, but one that I wholly agree with: Critical thinking should be "clear, rational, open-minded, and informed by evidence." This part is my own: Critical thinking is inherently a self-reflective process. I also think -- although this isn't mentioned in any of the definitions -- that critical thinking can be difficult, and it's something that you have to actively pursue. That's why we decided to do this show today: both to talk about critical thinking in general, and also to think about how we can use it in our investing lives.

John, let's move on to the next portion of our show: What can critical thinkers do? What would a critical thinker look like to you?

Maxfield: I would say that you pointed out the key quality of critical thinking, and that is thinking rationally. When you think about thinking rationally, there are so many different elements to it, right? But it is dedicating yourself to following logic, and dedicating yourself to trying to sort out -- there's a famous book called The Signal and the Noise about this -- what to use to make a decision when you have so much information in front of you. So critical thinking is also about picking appropriate sources for your information and fitting them into your rational thought process. That's how I would think of critical thinking -- very similar, probably exactly the same way that you do, Gaby.

Lapera: Yeah. I do have this quote I'm going to read to you guys; it's by Dr. Richard Paul and Dr. Linda Elder. It says, "Critical thinking is, in short, self-directed, self-disciplined, self-monitored, and self-correcting. It requires rigorous standards of excellence and mindful command of their use. It entails effective communication and problem-solving abilities and a commitment to overcoming our native egocentrism and sociocentrism."

I think the thing -- when I'm unpacking that quote -- that stands out most to me is how many times they say "self." This is because critical thinkers are independent thinkers who do their own analysis. This gets to what you were talking about earlier, with the authority bias. They're not relying on other people to spoon-feed them what they should think.

But the other thing that comes out in this quote is that critical thinkers are people who can admit when they're wrong, and accept information that challenges their worldview. They don't try to massage the sources to try and get everything to support what they already think. They are willing to accept new information and change their minds.

Maxfield: To that point, Gaby, when you're thinking about taking that new information in, you have to not only take the right information and exclude the bad information, but also process it appropriately.

I'm a lawyer -- I don't practice; I write for The Motley Fool. But one of the things that you learn in law school, and one of the things that underlies the legal profession -- which, when you think about what law school and law really are, is just the study and application of logic and persuasion -- is the idea of a hierarchy of precedent. Not everything you read deserves the same amount of authority. In the legal context, the Supreme Court gets more authority than appellate courts, which get more authority than district courts, which get more authority than state courts, and then you have the hierarchy within state courts.

So, it's not only what information you're using, but how you organize and process it, that really differentiates somebody like Warren Buffett -- who has made massive amounts of money really by just thinking extremely rationally -- from the typical investors who tend to underperform the market.

Lapera: Yes. And this definitely gets into our next topic, which is how to get started with critical thinking. The first, most important, most basic step is: Ask a question. It can't be any old question, and you can't just ask it any old way. There's this pragmatist philosopher named John Dewey who said, "A problem well put is half-solved." That means, when you ask a question, it should be very clear what you're asking. Otherwise, you'll have no idea when you've found the answer.

Maxfield: Yeah. Identifying the issue and asking the right question is so important. How I think about it is: if a rocket is tilted two inches one way, you may not be able to see that difference when it's sitting on the launch pad. But the farther it goes into space, the farther off its original trajectory that small error will take it. So identifying the appropriate question, to your point, is such a critical piece of this.

Lapera: Yes. One way to know that you're asking a good question is that it's specific. If it's too broad, it can become very difficult to find data to help you understand what you're looking for. And honestly, most really big questions are just a whole set of smaller questions that pile up together to create this huge, overwhelming question that you're trying to answer. If you can, start with the smaller questions and answer all of those, and build and build until you can answer the big question.

And the other thing I think is really important to realize is, it's OK to restate your question. Maybe once you start doing your research, you realize the question you started out with isn't actually the one you were trying to answer. That's OK. If there's one thing you take away from this podcast, it's that it's really important to ask questions of everything, including yourself.

Maxfield: Yeah. Your point about being precise, when you said that, it reminded me of this research by a guy named Philip Tetlock who studies forecasting, and the difference between people who are really good at forecasting and people who are really bad at forecasting. And one of the things that he found -- and he's written a number of books on this, he's really the leading authority on forecasting in the United States, maybe even in the world -- one of the things he found in his research was that the people who were better and produced more accurate forecasts were more precise in all aspects of their thinking.

Lapera: Which makes sense.

Maxfield: Yeah, right? It makes total sense. The tangible example he gives in his most recent book, Superforecasting, is a question like, "Oh, Gaby, what's the probability that interest rates will be 2% next year?" Let's say somebody gives 90% and another person gives 80%, but then there are other people who say 71% or 77% or 86%. The more precise those estimates are -- the more precisely you think -- the better the output of your decision-making process tends to be. There's a direct correlation.
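As an illustration of why precision in probability estimates matters, here is a small Python sketch using the Brier score, a standard way of scoring probability forecasts and the one Tetlock's book discusses. The forecasts and outcomes below are hypothetical, chosen only to show how rounding away precision can worsen a forecaster's measured accuracy.

```python
# Hypothetical illustration: score a set of probability forecasts with the
# Brier score (mean squared difference between forecast and outcome; lower
# is better), then round the same forecasts to the nearest 10% and rescore.

def brier_score(forecasts, outcomes):
    """Mean of (forecast - outcome)^2 over all questions; 0 is perfect."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Probabilities assigned to events that did (1) or did not (0) happen.
precise = [0.83, 0.07, 0.71, 0.94, 0.12]   # hypothetical precise forecasts
outcomes = [1, 0, 1, 1, 0]

rounded = [round(p, 1) for p in precise]   # same forecasts, rounded to 10% bins

print(f"Precise forecasts: {brier_score(precise, outcomes):.4f}")  # ~0.0272
print(f"Rounded forecasts: {brier_score(rounded, outcomes):.4f}")  # ~0.0320
```

In this hypothetical data set, throwing away the extra precision worsens the score, which is the direction of the relationship described above.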

Lapera: Yes. You are absolutely right. Precision is really important. Precision of language, precision of thought, is really important when you're formulating your question, when you're doing your research. Which brings me to actually doing your research. The first step of doing your research, gathering all your information in order to make your reasoned conclusion, is you have to pick your sources, and you have to pick good sources. By this I mean, you have to question your sources and data: Don't take anything for granted.

For example, say you're on Facebook, and you run across a meme or a picture, and it's one of the presidential candidates with some white text superimposed. That is not a good source. Anyone could have written that. In fact, I remember a couple years back, someone printed pictures of Taylor Swift and put quotes from Mein Kampf on top of it, and people were like, "Oh, she's so original and unique!" Those pictures that you find on the internet can be from anywhere. Good sources would be places like a university or a government study. You want sources that have been examined by multiple other people who are experts in the field.

Maxfield: Yeah. The one thing I would add to that is that you don't want only sources that confirm what you already believe. What's the point of doing a whole bunch of research if you're just adding more substance to what you already think or know? What that means is, you really need to eschew dogma and ideology, and in their place, put pragmatism.

Let me give you a really timely example of how this plays into the decision-making process. We're in the middle of a pretty contentious presidential election. One of the things researchers have found is that in certain areas, like politics, when a person hears or reads an opinion that is inconsistent with their own, the part of the brain that is responsible for critical thinking -- that really deep, logical thought process -- actually shuts down. They put people into brain scanners and asked them questions, or gave them information that was either consistent or inconsistent with what they already believed, and any time an inconsistent fact came up, that part of the brain shut down. So, you need to be cognizant of that bias, that confirmation bias, that is a part of your brain. When you're gathering evidence, and when you're processing evidence, you need to consciously try to combat it.

Lapera: Yes. You have to be open to information that contradicts your feelings, your expectations, and your worldview. In order to do this at all, you need to be able to acknowledge any bias you might hold. And it might be as simple as, growing up, your dad told you that X brand of cars was terrible, and you never even examined that belief, but you grew up your entire life thinking that. When you go to buy a car, and you run across a lot of articles that say X brand car is actually great, you have to be able to let go of that thing that's so deeply ingrained in your brain, and be willing to change your mind. That's hard, that's really hard. That's something we spent a lot of time talking about when I was getting my anthropology degree -- acknowledging that you have assumptions, cataloging them, and challenging each and every one of them.

So, you might be thinking to yourself, this process you just described -- finding all your data and challenging all of your assumptions -- that sounds like it's really hard, and it sounds like it's going to take a really long time. And yeah, sometimes it takes a really long time. Sometimes it might take your entire life to answer your question. That's OK, that's just part of critical thought.

So, say you have asked your question and you've gathered your data. Now it's time to write your conclusion. Everyone does this in different ways. For me, argument mapping helps a lot. You can make diagrams like flowcharts -- if you're really interested, you can go online and Google it -- but it's basically a visual representation of the information you have. If you have your question and your thesis statement clearly written, along with your supporting evidence and any dissenting evidence, having it all written out in front of you will help you come to a logical conclusion.
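For listeners who would rather type than draw, here is a minimal Python sketch of the same idea: a question and thesis with supporting and dissenting evidence kept in one structure so the whole argument can be read at a glance. The example content is hypothetical, not a recommendation.

```python
# Minimal, hypothetical argument map: the question, the thesis, and the
# evidence for and against it, kept in one structure and printed as an
# indented outline instead of a drawn flowchart.

argument = {
    "question": "Is Bank X a good investment at today's price?",
    "thesis": "Bank X is undervalued relative to peers.",
    "supporting": [
        "Trades below tangible book value while peers trade above it.",
        "Ten straight quarters of improving efficiency ratio.",
    ],
    "dissenting": [
        "Loan book is concentrated in one regional market.",
        "Management guided to lower net interest margin next year.",
    ],
}

def print_map(arg):
    print("Question:", arg["question"])
    print("Thesis:  ", arg["thesis"])
    for label in ("supporting", "dissenting"):
        print(f"{label.capitalize()} evidence:")
        for point in arg[label]:
            print("  -", point)

print_map(argument)
```

The point is not the tooling; it is that forcing the thesis and the dissenting evidence into the same view makes it harder to quietly ignore the evidence that cuts against you.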

Maxfield: Absolutely. Being disciplined, writing everything out, putting it in front of you so you can actually really analyze it...let me bring up one other point here. This is something I picked up in Professor Tetlock's books, as well. His most recent one, I mentioned it earlier, Superforecasting, is fantastic. It's kind of a cheesy title, but it's actually an excellent read.

What he found is -- let me give you a little bit of background. He has overseen these studies where he will go out and enlist a whole bunch of people to try to make predictions. There was one study in which he did this for the defense agencies. The question was -- they put a whole bunch of university teams of forecasters together -- would some of them be much better than others, and could that be repeated on a consistent basis, so that you could eliminate luck from the equation? And Tetlock's team outperformed the other teams by something like 20% or 30%, in terms of the accuracy of their forecasts in that whole process.

So then he breaks down in this book: "These are the different things that we found that made it possible to make better decisions that turned out to be more accurate in terms of forecasts, looking forward." The one thing he pointed out above all else is what he calls "perpetual beta." This sounds like one of those cheesy business-school book topics, but I think there's really some substance to this.

Let me read a selection from this book. He says: "The strongest predictor of rising into the ranks of superforecasters is perpetual beta, the degree to which one is committed to belief updating and self-improvement." So, really, if there's anything above all the other things that is important here, it is that commitment to getting better every day, whether that's picking better sources every day, whether that's thinking more logically every day, that is the foundation upon which great decisions are made on a systematic basis.

Lapera: Absolutely. That was a really good quote you pulled there. I think that actually gets to the other thing I was going to say: When you make your conclusions, you want to make sure they're really clear. If you can't express your conclusion clearly -- like Professor Tetlock apparently can; his book is very well-written -- then there's probably an error somewhere along the line of your reasoning, or there's a concept that you don't 100% understand, and you want to go back and examine that. If possible, you want your conclusion to be so clear that you can explain it to your five-year-old child. That's how clearly you want to understand the reasoning that led you to your conclusion.

I think that one of the most important things you should take away from this podcast -- besides to question everything -- is: Critical thinking takes practice. You aren't going to be good at it from day one. Even if you're really good at it in some areas, maybe you're not so good at it in others. But, with a mindful approach to it, you can improve yourself and your critical-thinking skills.

Maxfield: Yeah, you can absolutely improve. I know I keep coming back to Tetlock's book, but it's really right on point. He talks about that, that constant improvement, that practice makes perfect. I'm not a huge sports metaphor fan --

Lapera: Me neither.

Maxfield: But the quarterback of the Seattle Seahawks, he has this great saying: "Preparation creates separation." So, the more you do this, the better you get at it, and you can build on that. Let me throw in one final point about this. That is, anytime you're going to make a decision, there's always the possibility that you're going to be wrong. You have to factor that into the decision-making process.

We love Warren Buffett at The Motley Fool. How can you not love and respect that guy? He's amazing. But the term that he uses to describe this is "the margin of safety." What he's talking about there is, maybe you think the sun will come up tomorrow -- maybe that's not a good example, because we all know the sun will come up tomorrow. Maybe you think a stock will go up in five years, that there's a 60% chance it will go up. Well, there's also a 40% chance it will go down, so you have to factor that into your thought process. That's why, when you're buying stocks, buying them for cheap prices is so important. Then, it reduces the downside, the probability that it'll go down, as opposed to the probability that it will go up.

Lapera: You mean, it reduces the potential fallout from you being wrong, right?

Maxfield: Yes, you're exactly right, sorry. (laughs)

Lapera: No, no, I just wanted to be 100% sure that I was clear on what you were saying, so I asked a question, dear listeners.

Maxfield: Good critical thinking.
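To put rough numbers on the margin-of-safety exchange above, here is a small Python sketch. The 60%/40% odds come from Maxfield's example; the fair-value estimate, the payoffs, and the size of the discount are hypothetical. The point is simply that buying below your estimate of fair value shrinks the damage when the 40% case happens.

```python
# Hypothetical margin-of-safety arithmetic. Assume a stock you estimate is
# worth $100 today, a 60% chance your bullish case plays out (it trades at
# $130 in five years) and a 40% chance you're wrong (it trades at $70).

def expected_return(price_paid, up_value, down_value, p_up):
    """Expected five-year return given a purchase price and two outcomes."""
    expected_value = p_up * up_value + (1 - p_up) * down_value
    return expected_value / price_paid - 1

for price in (100, 80):  # pay full estimated value vs. buy at a 20% discount
    ret = expected_return(price, up_value=130, down_value=70, p_up=0.60)
    worst = 70 / price - 1
    print(f"Pay ${price}: expected return {ret:+.1%}, worst case {worst:+.1%}")
```

Paying $80 instead of $100 doesn't change the odds of being wrong, but the downside scenario now costs 12.5% instead of 30%, which is the "reduces the potential fallout from being wrong" point Lapera makes above.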

Lapera: (laughs) So, in conclusion, maybe you think this is all bull****, and that's OK, as long as you have good, well-thought-out reasons for thinking this podcast is no good. This whole episode is a celebration of independent thinking. And John and I are flawed individuals. We don't think we're the ultimate experts on critical thinking, it's just something that we both think about a lot, and we try to incorporate into our lives. We were hoping to do this show to create a starting point to encourage you to get out there and do your own thinking, and question our thinking, too. That's literally the whole point of this show. We want you to ask good questions, of yourselves and of us.

Thank you guys so much for joining us. I hope you enjoyed the show. I'm really interested to see if we get a lot of emails, angry emails or happy emails. I'll find out when I get back from vacation, so it'll be like a whole treasure trove of emails -- or none, that's the other option. Maybe you guys listen to this and you're like, "This is so crazy that I can't even email her about it."

Just to wrap up, as we usually do: People on the program may have interests in the stocks they talk about, and The Motley Fool may have recommendations for or against, so don't buy or sell stocks based solely on what you hear. Contact us at industryfocus@fool.com, or by tweeting us @MFIndustryFocus. Thank you to Austin Morgan for listening to Maxfield and me ramble on so much over the last two weeks -- you're awesome.

Maxfield: You're awesome, Austin.

Lapera: And thanks everyone for joining us. Everyone, have a great week!

Try any of our Foolish newsletter services free for 30 days. We Fools may not all hold the same opinions, but we all believe that considering a diverse range of insights makes us better investors. The Motley Fool has a disclosure policy.