The Scientific Perspective
The 3 Distinctive Aspects of Science by Nathan Nguyen
Facets of Science
It’s difficult to overstate how much more we now know about the natural world compared to people living just a few hundred years ago. Every day, at our fingertips, we unthinkingly harness what our not-too-distant ancestors would have considered extraordinary wizardry.
For example, if you gave a Renaissance-era engineer the complete schematics for an air conditioner, they would still be amazed at the result. The reason is that modern air conditioners rely on the fact that when a gas is compressed, its pressure and temperature rise, and when it’s allowed to expand, its pressure and temperature drop. By controlling where in the process gases are pressurized, air conditioners are able to blow cool air into your room. However, it was only in the late 18th century that we discovered this lawful relationship between pressure and temperature, which is now taken as a given in air conditioner manufacturing.
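The lawful relationship in question can be made concrete with a toy calculation. Below is a minimal sketch of Gay-Lussac’s law for an ideal gas held at constant volume, where pressure and absolute temperature are directly proportional; the function name and the numbers are illustrative only (real air conditioners use refrigerants that also change phase, which this sketch ignores):

```python
# Toy illustration of Gay-Lussac's law for an ideal gas at constant
# volume: P1 / T1 == P2 / T2, so temperature scales with pressure.
# The numbers below are illustrative, not real refrigerant values.

def temperature_after_pressure_change(t1_kelvin, p1, p2):
    """Return the new absolute temperature when pressure changes
    from p1 to p2 at constant volume (ideal-gas assumption)."""
    return t1_kelvin * (p2 / p1)

room_temp = 293.15  # about 20 degrees C, in kelvin
compressed = temperature_after_pressure_change(room_temp, 1.0, 2.0)
expanded = temperature_after_pressure_change(room_temp, 1.0, 0.5)

print(f"Doubling pressure: {compressed:.1f} K (hotter than {room_temp} K)")
print(f"Halving pressure: {expanded:.1f} K (cooler than {room_temp} K)")
```

The asymmetry this captures, hot where the gas is squeezed and cool where it expands, is what lets a compressor-and-expansion-valve loop dump heat outdoors while cooling the air indoors.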
This is but one of thousands of discoveries whose effects have filtered into our daily lives. We live as kings, not recognizing what precious luxuries we’ve been afforded. With phones, we can instantaneously communicate with people on the other side of the planet. With vaccines, we’re able to prevent illnesses that historically decimated entire societies. With lightbulbs, we can banish the darkness for less than one five thousandth of what it cost just 300 years ago.
How has all this been possible? In a word, science.
It was not beseeching divine revelation. Not politicking. Not war and conquest. Not shaking a magic eight ball. All this we’ve been doing, mostly to no avail, for hundreds of thousands of years. (Well, maybe not the last one.) No, the revolution in our understanding of the natural world comes down to something more recent. It was a change in how we ask and answer questions, what we now call “science.”
There is, of course, no consensus definition of “science,” but we can make headway in describing its contours nonetheless. There is a set of values that scientists collectively embody that helps to explain the remarkable strides we’ve made in revealing nature’s secrets. Here, I’ll describe three of these values: curiosity, skepticism of authority, and responsiveness to evidence. I will also describe examples of scientists failing to live up to these values and explain how they threaten to undermine the progress we’ve enjoyed so far.
1. Curiosity
Curiosity is the desire to better understand the world. Where there is a blank spot on our map, curiosity is what propels us to explore. When a gap in our understanding appears, the curious seek an explanation.
However, this is not the only way a person could arrive at their beliefs. We might, for example, be motivated to believe something simply because it makes us feel good. A person who’s just been given a grim medical prognosis might feel some resistance towards the results of their test and continue believing they’re perfectly healthy. Alternatively, we might be motivated to believe something merely because it paints the group we belong to in a positive light (or our enemies in a negative one).
A Case Study:
It’s commonly thought that liberals and conservatives each have One Big Thing from which they derive their political beliefs. Perhaps it’s equality vs hierarchy, change vs preservation, big vs small government, or idealism vs realism.
However, one group of psychologists argues that this is all misguided. The hodgepodge of views that political parties have held across cultures and across history is too heterogeneous to be explained by any single dimension. No, the content of any political group’s beliefs is highly contingent, and most individuals’ political beliefs are highly inconsistent. Moreover, psychological experiments find that individuals’ evaluations of social policies are highly malleable depending on the political group supporting them.
All this is because our political beliefs are best explained, not as deductions from abstract principles, but rather as a means of fitting in with and mobilizing support from potential allies. So, for example, feminists today often have views that are congenial to ethnic minorities, but there was nothing inevitable about this. Indeed, feminists of the early 20th century sometimes explicitly excluded African Americans from the suffrage movement.
So it is with political beliefs more broadly. On this Alliance Theory of political beliefs, we should expect people to indulge in biases that flatter their allies and belittle their opponents, the better to recruit people to their side and repel them from the other.
These tendencies to allow our personal and groupish biases to distort our thinking were what Francis Bacon called the “idol of the cave” and the “idol of the tribe,” respectively. As Bacon put it, the lies we come to as a result of these idols are like candlelight, warm and comforting, yet they pale as substitutes for the search for and knowledge of the truth, that is, the (sometimes unflattering) daylight of reality. When we engage our curiosity, that is what we are motivated by.
Science (when practiced well) channels this curiosity towards advancing our understanding of the natural world. It is in part what led Darwin to voyage on the HMS Beagle and, ultimately, to discover the process of evolution by natural selection. More recently, it was what led Nobel laureate Katalin Karikó, despite decades of obscurity and a threat of deportation, to pioneer the mRNA technology that allowed for the rapid development of COVID vaccines.
However, not everyone has such admirable courage to explore the unknown. Some, in fact, wish to actively discourage certain types of research. For example, on the subject of racial differences in intelligence, philosophers Ned Block and Gerald Dworkin have written:
What we are saying is that at this time, in this country, in this political climate, individual scientists should voluntarily refrain from the investigation of genotypic racial differences in performance on IQ tests.
Indeed, some scholars have been threatened or physically attacked for their views on this subject, and many take Block and Dworkin’s advice for fear of ostracism or retaliation.
Ostensibly, the rationale for these calls for censorship is a concern that the research might be used to justify the oppression of minority groups. However, as researcher Noah Carl argues, this neglects at least two problems.
One is that this censorship may inadvertently reinforce the odious moral views that the censors were fighting in the first place. Now, it’s true that those who wish to oppress minority groups do indeed wish to find genetic differences between the races. But to fight back against such views, it is not necessary to deny genetic differences or stifle research on these differences. Rather, one can instead argue that such differences are not relevant to how we ought to treat one another—the equality of persons and a respect for rights do not depend on people’s genetics. Indeed, those who fear research on genetic differences in abilities are, in effect, holding their morals hostage to the empirical facts. They are implicitly suggesting that oppression would be warranted if differences in mental abilities turned out to have a genetic origin, thus reinforcing the very views they were fighting against.
A second problem Carl notes is that stifling research may lead scholars to misunderstand salient phenomena, thereby leading to material harms. For example, it is now known that there are significant racial differences in the effectiveness and side-effect profiles of clinically important drugs. If, out of concern for minority groups, we deny that races exist or that there are genetic differences between them, we risk providing people with suboptimal medical care, potentially at the cost of people’s lives.
So it goes with censorship more broadly: when people seek to limit the frontiers of knowledge, they often fail to foresee the negative effects of their actions. They may unintentionally bring about the very harms they were concerned about, and they may engender further harms as a result of the ignorance they cultivated. And so even if we might find some line of research personally repugnant, we should be wary of those who would seek to curtail it. As the saying goes, “Let a thousand flowers bloom.” We may yet be surprised by what’s to be discovered by those with opposing views.
And what is it about curiosity that makes it an effective engine of progress? Why can’t we just believe what we (or our tribe) want to believe?
Simply put, we are more likely to find the truth when we actively search for it. When we set ourselves to some other goal, like satisfying the idols of the cave or the tribe, we follow a path that makes true discoveries merely a matter of happenstance, and so it should be no surprise when this path leads us astray. Even worse if we actively shun the truth or punish others who blaspheme our idols.
An analogy: To drive from NYC to LA, it is very useful to actually try to drive to LA. Driving to some other random spot on the map or closing our eyes entirely is unlikely to get us to LA, since the large majority of other paths do not pass through that destination.
So it is with scientific discoveries as well. To better understand the natural world, it is incredibly useful to actually try to understand it. You have the option of shaking a magic eight ball if it makes you feel good, but then you should not expect to be the next Darwin or Karikó. To create a good map of the world, you must explore the terrain, not draw whatever you wish in the blank spaces.
2. Skepticism of Authority
The motto for the Royal Society, the oldest continuously existing scientific academy in the world, is “Nullius in verba,” meaning “Take nobody’s word for it.” It embodies an attitude of skepticism towards authority that makes science distinct from other ways of learning about the world.
Of course, it does not mean that scientists never rely on others. Scientists are not solipsists. They collaborate with each other and cite each other’s work all the time. Science is, after all, fundamentally a social enterprise. As Isaac Newton put it, he was only able to see further than others “by standing on the shoulders of giants.” And so it is with every scientist.
Yet there is still an important sense in which scientists engage in a skepticism that goes beyond what might be ordinarily practiced. For example, for thousands of years it was common for scholars to cite Aristotle as the definitive authority on how the world works. He was so central a figure in the academy that the 13th century theologian Thomas Aquinas cited him simply as “The Philosopher.” It was precisely this attitude that Galileo mocked when he described one scholar as staying “so closely tied to every phrase of Aristotle’s as to hold it sacrilege to depart from a single one of them.”
More broadly, the impulse to sacralize certain ideas has been a common one throughout history, especially in religious contexts. Catholics, for example, trust the Pope to be infallible on matters of doctrine. Muslims take the Quran to be free of error, and Hindus treat the Vedas similarly. This is not a scientific attitude.
Science (when practiced well) treats all claims as worthy of critical scrutiny. No inquiry is to be taken as sacrilegious. No authority is to be taken as infallible. In many ways, skepticism is the norm:
Empirical testing: Scientists are not in the habit of merely making pronouncements about the world. In the ordinary course of their work, scientists must present data or arguments to support their claims, else they lose the esteem of their peers. And when a scientist sticks their neck out and conducts an experiment that has a high chance of ruling out false hypotheses, this is seen as especially valuable. To pass through a “severe” test with one’s neck unscathed is considered a great feat in science.
Peer review: Scientists submit their work to journals, and this work is then critically evaluated by their peers. In the most prestigious journals, it is extremely common for a paper to be rejected. For example, the rejection rate for Nature, one of the world’s leading science journals, is about 92%.
Replication: Even after a study has been conducted, passed through peer review, and published in a journal, scientists might nonetheless express skepticism. They might be so skeptical that they attempt to redo the entire study themselves just to make sure the results were correct. In science, this effort is to be welcomed, not scorned.
By way of contrast, consider an exchange between psychologists Scott Lilienfeld and Derald Wing Sue. In a 2017 article, Lilienfeld argued that research on “microaggressions” was beset with a number of conceptual and methodological problems, and as such its application in real-world contexts like diversity and cultural sensitivity trainings was premature. In response, Sue did not challenge the merits of any of Lilienfeld’s arguments. Rather, he challenged the value of science more broadly:
In essence, Lilienfeld is applying the accepted scientific principle of skepticism to the study of microaggressions, which may unintentionally dilute, dismiss, and negate the lived experience of marginalized groups in our society.
In effect, Sue was treating the testimony of these marginalized groups as beyond skepticism. To do otherwise would be “problematic.”
However, Sue’s response misses why skepticism of authority is such an important part of science.
It’s because humans—including members of marginalized groups—are fallible, finite beings!
When we attach our own beliefs so thoroughly to a particular individual or group, we risk adopting the exact same mistakes they made in the course of their investigations. And it is inevitable that they made mistakes; no one is so wise and so rational that they can avoid all humanity’s biases. Even the great Aristotle made mistakes, for example, believing that there was a hierarchy of nature with humans at the top and other life forms below. It was not until Darwin’s time that we learned to view life rather as a branching tree (or perhaps bush), with no species inhabiting a privileged position at the top.
Moreover, even the most perspicacious humans live finite lives. This fundamental limitation prevents any individual from discovering the secrets to everything; all investigations must at some point end. So it is upon us, the living, to take their work and build on it (or perhaps demolish it) with the new information we gather about the natural world. Collectively, humanity does not have to put down the scientific instruments after 60 or so years. We can continue to make progress in understanding the world long after the end of any individual’s career.
3. Responsiveness to Evidence
The final distinctive characteristic of science is responsiveness to evidence. Just as a good thermometer’s reading will rise or fall in response to the temperature around it, so too, a good scientist’s confidence in their hypothesis will rise or fall in response to the evidence they encounter. Two points are worth emphasizing here.
The first is responsiveness. Humans have a tendency to be defensive about their beliefs. When we encounter contradictory evidence, we flinch away as though it would hurt us. Scientists are trained (albeit imperfectly) to make the opposite motion, to embrace the results of their experiments whatever they may be. Charles Darwin neatly expressed this sentiment in his autobiography: “I have steadily endeavoured to keep my mind free, so as to give up any hypothesis, however much beloved (and I cannot resist forming one on every subject), as soon as facts are shown to be opposed to it.”
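This attitude has a simple formal counterpart in Bayes’ rule: confidence in a hypothesis should rise when the evidence is more likely under that hypothesis than under its negation, and fall when it is less likely. A minimal sketch, where the prior and the likelihoods are made-up illustrative numbers rather than anything from a real study:

```python
# Minimal Bayesian-updating sketch: confidence in a hypothesis H
# moves up or down with each observation, via Bayes' rule.
# All numbers below are illustrative assumptions.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | evidence) given P(H) and the two likelihoods."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

confidence = 0.5  # start out agnostic about H
# An observation more likely if H is true pushes confidence up...
confidence = bayes_update(confidence, 0.8, 0.2)
# ...and one more likely if H is false pushes it back down.
confidence = bayes_update(confidence, 0.1, 0.6)
print(f"Confidence in H after both observations: {confidence:.3f}")
```

The point of the sketch is just the thermometer-like behavior: the same rule that raises confidence on favorable evidence is obligated to lower it on unfavorable evidence, with no flinching allowed.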
Relinquishing one’s old beliefs, however, can be difficult. There is always the temptation to massage one’s data in a way that leads to congenial conclusions, which is why many scientists have sought a way of ensuring that they respond appropriately to the evidence. Thus, many scientific journals have begun accepting “registered reports.”
The idea is simple: Scientists submit their study rationale, design, and proposed analyses for peer review prior to conducting their study. Once approved, journals commit to publishing the study regardless of the results, so long as the scientists follow their submitted protocol. As a consequence, scientists can rest easy knowing that they’ll be recognized for their contributions, thus mitigating the incentive to engage in selective reporting of their data. Moreover, by requiring that scientists follow their protocols, registered reports act as a formal safeguard against this selective reporting. Scientists cannot fish around for the one analysis among dozens that favors their hypothesis; rather, they must call it as they see it (or as they saw it in their initial protocol).
The second point to emphasize is evidence. Evidence is not the only thing that might change a person’s beliefs. We might instead adhere to beliefs because we think they’re the kind of beliefs good people hold. Consider, for example, the case of Howard Gardner, the originator of the concept of multiple intelligences. In a lecture explaining his perspective Gardner says:
Even if...the bad guys [who emphasize the importance of IQ] turn out to be more correct scientifically than I am, life is short, and we have to make choices about how we spend our time. And that’s where I think the multiple intelligences way of thinking about things will continue to be useful even if the scientific evidence doesn’t support it.
He is, in essence, admitting that he will continue to champion multiple intelligences come what may, regardless of the scientific evidence. What’s more, he casts his colleagues, those who don’t believe in his theory, as “the bad guys.”
This is the voice of an ideologue, so committed to his theory that he’s given up on science. A good scientist, by contrast, follows the evidence wherever it leads. When the truth happens to contradict their beliefs, so much the worse for their beliefs; it’s time to change them.
An analogy: If you’re driving with a GPS and it tells you a bridge is closed, it behooves you to change directions. You should not continue on towards the bridge heedless of the new evidence you’ve received.
So, too, in science it is immensely important to respond to the evidence. To do otherwise risks driving off a metaphorical bridge, i.e., dedicating your life to championing a false theory. If we instead update our beliefs accordingly, we are much more likely to converge on theories that reliably stand up to scrutiny. And from this foundation, we might build vast edifices that ultimately benefit mankind. We can, for example, build literal bridges that span 100+ miles by combining our understanding of materials science, geology, hydrology, physics, and more.
A Caveat:
The above discussion presents a somewhat idealized view of science. It is true that scientists will tend to value curiosity, skepticism of authority, and responsiveness to evidence. In practice, however, scientists also frequently fail to live up to these values.
For example, in an infamous leaked draft of a 2016 essay, Susan Fiske—a former president of the Association for Psychological Science—described those who had criticized her friends’ research as “destructo-critics” and “methodological terrorists.” Needless to say, this is not a welcoming attitude towards those pointing out errors in science.
More broadly, many fields show signs of bias in their evaluation of research, for example, selectively publishing findings that seemingly support hypotheses while leaving unpublished those that falsify them. And there remain many ways scientists could improve at detecting and disincentivizing negligence, hype, and outright fraud.
Nonetheless, it remains the case that science, with all its warts, has been an amazing engine of human progress. And so the appropriate prescription when a scientist fails to live up to the scientific values is not to give up on the values, but rather to reaffirm them and find ways to make them an even more central part of the scientific enterprise.
Conclusion
It is said that philosophy in Europe was born when people eschewed the mytho-religious explanations they were taught. They instead noticed that there was order in the universe and that this order could be explained without recourse to a divinity. As one historian put it, this new beginning “consisted in the abandonment, at the level of conscious thought, of mythological solutions to problems concerning the origin and nature of the universe and the processes that go on within it.” The first generation of philosophers were those brave enough to leave the idols of the cave and tribe and seek the truth for themselves. With their curiosity and skepticism of received wisdom, we might say they were early harbingers of science as well.
It should be no surprise, then, that an earlier name for science was “natural philosophy.” The two fields, science and philosophy, to this day share many of the same values. Where they differ is more a matter of emphasis than substance. Scientists focus more on empirical observation than on the analysis of concepts. And whereas many philosophers can do their work from the armchair, many scientists use specialized equipment to take measurements of the natural world.
What’s essential to science, though, and what’s made it so beneficial to humanity, has been the values that scientists collectively embody—curiosity, skepticism of authority, and responsiveness to evidence. Curiosity has motivated scientists to explore the unknown, pushing the frontiers of knowledge into regions that would otherwise have been ignored. Skepticism of authority has led scientists to double-check those regions that were thought to be certain; often, that certainty was unearned, with authorities simply preaching what’s in their own interests, heedless of the truth. Where curiosity generates hypotheses, skepticism tests them. Finally, a good scientist takes the results of these tests to heart. When scientists are responsive to the evidence, they ensure that the map reflects the territory, so that the lines they draw are precise and undistorted by personal bias.
These are not the only values scientists hold, but together, they are instrumental to explaining the remarkable progress we’ve made as a species in understanding the natural world.