Scientific thinking is difficult to teach, especially to people inexperienced in thinking like a scientist, because it differs from ‘normal’ thinking. It is not the kind of thinking a person naturally wants to do or ever feels inclined to do, and that makes teaching it extraordinarily difficult.
One way of looking at the problem is in terms of confirmation bias: people tend to pay close attention to evidence that matches their beliefs and to ignore evidence that conflicts with them.
This phenomenon shows up everywhere: in the TV shows a person watches, in beliefs about other people’s personalities, in gender and cultural stereotypes, and even in the books a person chooses to read.
Book sales offer an example: people who already supported Obama were the ones buying books that painted him in a positive light, while people who already disliked him were the ones buying books that painted him in a negative light.
We can see from a simple experiment in confirmation bias that folks do not come pre-wired from birth to be scientists:
A popular method for demonstrating confirmation bias, first introduced by P. C. Wason in 1960, is to show a classroom the following numbers: 2, 4, 6.
The teacher then asks the students to guess the teacher’s secret rule by offering three numbers of their own; the teacher answers “yes” or “no” depending on whether the sequence fits the rule. When students think they have it figured out, they write the rule down and turn it in.
Students typically offer sets like 10, 12, 14 or 22, 24, 26. The teacher says “yes” over and over, and the majority of students turn in the wrong rule.
To figure out the rule, students would have to offer sets like 2, 2, 2 or 9, 8, 7 – sets which, the teacher would say, do not fit the rule. Only by testing guesses that play against what they think the rule may be do students finally discover the original rule: any three numbers in ascending order.
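The 2, 4, 6 task is simple enough to sketch in a few lines of code. Here is an illustrative Python sketch (the function name and the sample guesses are my own, chosen to match the sets described above):

```python
def secret_rule(seq):
    """Wason's actual rule: any three numbers in strictly ascending order."""
    a, b, c = seq
    return a < b < c

# A student testing only a confirming hypothesis ("even numbers increasing by 2")
# hears "yes" every time and never learns the rule is broader.
confirming_guesses = [(10, 12, 14), (22, 24, 26), (100, 102, 104)]
print([secret_rule(g) for g in confirming_guesses])  # [True, True, True]

# A student trying to falsify that hypothesis learns far more: the "no" answers
# rule out candidate rules, and (1, 3, 7) shows evenness and fixed steps don't matter.
falsifying_guesses = [(2, 2, 2), (9, 8, 7), (1, 3, 7)]
print([secret_rule(g) for g in falsifying_guesses])  # [False, False, True]
```

The confirming guesses all return “yes,” which is exactly why they carry so little information: any number of rules would also say “yes” to them.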
The exercise is intended to show how we tend to form a hypothesis and then work to prove it right rather than to prove it wrong. Once satisfied, we stop searching.
The scientist knows that the right way to detect a pattern is to form a belief from the data, then try to falsify that belief, and form a new one as needed. Only when the scientist can no longer falsify a belief should he or she consider the pattern satisfactorily detected.
However, even highly trained scientists think like scientists only some of the time. If you have attended a college lecture, you have probably seen scientists who actively challenge their own beliefs about chemistry but hold superstitions about teaching that go unchallenged. Notably, many practicing scientists proudly carry beliefs rooted in faith, in defiance of the value they place on evidence and falsification in their working lives.
So it seems that even with scientific training, all we have accomplished is segmenting a person’s mind: some sections are ruled by evidence and critical thought, while others remain ruled by confirmation bias and unchallenged belief.
We see this phenomenon reflected in our society at large, as well. Questions, fields of research, and fields of study are divided into ‘scientific’ and ‘other’. Our thoughts about how nature works are to be challenged and falsified, but our thoughts about poetry and literature are to be felt in our guts. Our medicines are to be tested and recommended only when there is sufficient evidence, but our educational methods are judged by the amount of tradition supporting them rather than the amount of evidence.
We often hear the message that each person is entitled to their beliefs. This message is often interpreted to mean that each person’s ideas are just as good or as valuable as anyone else’s; some beliefs are simply not to be challenged. So our history of intolerance toward people with ideas different from ours has generated an unfortunate meme that poisons people against embracing science. The mixed message is that some beliefs are subject to scientific thinking and others are not.
To a student suffering from confirmation bias, the message is clear: you don’t have to practice critical thinking. After all, that is the message the brain prefers to hear.
From an educational point of view, we are making a huge mistake if we simply tell students the definition of a hypothesis, show them a flowchart of the scientific method, then dust off our hands and conclude that they understand how science works and why thinking like a scientist is valuable. The cards are stacked against students seeing the point of disciplined reasoning. Because science is not a normal way for the mind to work, critical thinking must be emphasized over and over throughout a person’s life. It requires frequent practice, and constant encouragement and motivation.
Are we doing a good job of teaching it? Are we sure?