ACADEMIA’S ROYAL ORPHAN: THE CURIOUS HISTORY OF CRITICAL THINKING

When you hear the term ‘critical thinking,’ what’s the first thing that comes to mind? Probably something that is important, but vague and elusive. You might free-associate it with words such as logic, rationality, or rigor – as opposed to, say, wisdom, capitalism, or peanut butter. And you’d be on the right track – but it’s still a broad track. Like most general concepts, critical thinking isn’t reducible to a single narrow definition. Further understanding requires analysis – itself arguably a form of critical thinking.

We’ll address the problem of defining critical thinking (CT) in a future post. For now, I’ll equate it with rationality, loosely understood as the practice of following rules and reasons to explain actions or to advance claims, assertions, or arguments.

Despite its widespread usage and positive connotation, however, the term “critical thinking” (unlike “rationality” or “reasoning”) has a relatively brief and tortuous history – as well as a much longer pedigree. Examining its historical lineage reveals both the importance of the idea and its conceptual weaknesses, and also tells us something about the culture of learning.

Several curiosities should be noted at the outset. One is that critical thinking is a royal orphan. It’s an offspring of philosophy, the sovereign parent of learning; but philosophy has quietly disowned it. Philosophers seldom if ever refer to critical thinking, much less acknowledge it as their subject’s progeny. Yet CT clearly derives from the philosophical study of logic and rhetoric. It’s all about improving the quality of thinking, and that’s what philosophy is mainly about.

I believe there are several reasons for this orphaned status. One is that philosophy, by its very nature, takes critical thinking for granted and looks beyond it in mapping the wider reaches of the mind, being, and reality.  Thus, philosophers tend to regard critical thinking the way literary scholars view basic grammar, or mathematicians arithmetic: as something beneath their attention.

Another reason, and the other side of that coin, is that CT’s sheer breadth of application works against it. Its relevance across the liberal arts and beyond makes it seem extra-philosophical. This is ironic, if not perverse; as the attempt to apply philosophical rigor to all learning, CT is an export that philosophers should be proud to share and curate. Indeed, it highlights the essential if not dominant role that philosophy has played in the history of Western learning, going back to Ancient Greece.  But philosophers are reluctant to take credit for this legacy or to dirty their hands in the export business.

One of the key functions of philosophy, however, if not its most important one, is to be precisely that exporter of methods of reasoning: an arbiter of logical and conceptual rigor and a reminder to non-philosophers that all such rigor is essentially philosophical, based in logic and other forms of rationality. Without CT, intellectual excellence in any field, from classics to queer studies, is inconceivable.

Few students or scholars seem to recognize this.  They tend to have a misguided stereotype of philosophy as being exclusively the domain of the abstract and the abstruse. (Laypeople have an equally misguided stereotype of philosophy as answering questions about the meaning of life.) The upside, however, is that we don’t need to become philosophers in order to achieve such rigor. Philosophers do indeed address certain questions (some of them broad and abstract, others narrow and technical) that don’t, and needn’t, concern the rest of us. But we all need to be critical thinkers.

The aim of critical thinking isn’t to make us philosophers. The aim of critical thinking, rather, is to make us better philosophers: to broaden and deepen the community of rational thinkers.

Another curiosity is that, in name at least, this offspring of philosophy is barely a century old. The history of CT is a Cinderella story of a scorned and neglected princess.

Her exact date of birth is uncertain, but her official debut wasn’t until well into the 20th century. One early appearance is in the title A Test of Critical Thinking in the Social Studies by J.W. Wrightstone (1938). In the same year, the Educational Policies Commission of the National Education Association reported:

“Critical judgment is developed…by long and continuous practice… The child [sic] must learn to defer judgment, to consider motives, to appraise evidence, to classify it, to array it on one side or the other of his question, and to use it in drawing conclusions. This is not the result of a special course of study, or of a particular part of the educative procedure; it results from every phase of learning and characterizes every step of thinking.”[1]

[Image: Sphere of influence – a phrenological map of the human brain. Photograph: Classic Image/Alamy]

Three years later, Edward M. Glaser came out with An Experiment in the Development of Critical Thinking (1941).  Earlier works refer to “scientific thinking,” seeking to extend the rationality of scientific method into other fields, although scientific method is just one form of rationality, and not a paradigm for all of them. Indeed, critical thinking appears on the scene along with the rise of social science and the need for new models of rigor. It’s implicit in the writing of John Dewey, for example, who preferred the term “reflective thinking.”

CT, in fact, has always meant different things to different scholars. For many, it simply meant logical or rational thinking – essentially philosophical thinking. And that is a compelling general definition; more specifically, critical thinking is philosophical rigor applied in every field of learning and in the here-and-now: to your thinking or mine, or to this or that text.

Others have taken a narrower view, equating CT directly with informal logic, which is likewise difficult to formulate. Informal logic (also a relatively recent coinage) in turn has several subdivisions. It nominally deals with rigor in argumentation. That includes rhetoric in general (which, with logic and grammar, made up the original liberal arts Trivium) and the avoidance of fallacies that don’t involve logical contradiction.

There are scores of such fallacies, and they can be roughly divided into two categories. There are epistemic fallacies, errors in reasoning about knowledge and evidence (for example, the well-known causal fallacy of post hoc ergo propter hoc: the assumption that because A precedes B, A must have caused B); and there are psychological biases, blunders, and blind spots: ways in which we’re almost hardwired to misperceive the world around us or to reason badly about it.

But many definitions of CT go well beyond the avoidance of fallacies. They include guidelines for weighing factual evidence; distinguishing truth from fiction, reality from appearance, and facts from values; examining assumptions and justifying conclusions; standards for sound analytic thinking; and much else.

That still leaves a lot of room for debates, turf wars, intellectual snobbery, and laundry lists of mental skills. Yet it’s no accident that CT, however broad or malleable an idea, has an enduring place in our language and in academia. Logical thinking, clarity, factuality, fallacy avoidance, and reasonableness are unlikely to go out of style no matter how we group or label their components.

Despite the need for rigor across the disciplines (or perhaps because of it), the academic disputes about the definition of CT, and whether it should be taught as a stand-alone field or integrated into others, have left it marginalized within American education. It has become a kind of academic ghetto, largely cut off from the rest of higher learning – even though its subject matter is all subject matter.

That ghetto has produced some good work, especially in the 1980s and ‘90s, while at times bogging down in narrow pedagogical concerns. The CT literature from those decades, much of it driven by competing attempts to define the term once and for all, would terrify any college freshman. Yet useful books on critical thinking, from various angles and with different emphases, continue to appear, like so many blind men exploring the proverbial elephant.

Underlying this permanent identity crisis is the fact that the concept of CT is inherently elastic and contestable. Different stipulative definitions are equally valid and useful in different contexts. As Peter Wood has observed, “The term ‘critical thinking’ is a bit like the Euro: a form of currency that not long ago many were eager to adopt but that has proven troublesome to maintain. And in both cases, the Greeks bear an outsized portion of the blame.”[2]

My own preference is for a broad and flexible definition: one that embraces the full spectrum of rational skills from formal logic to the distinct senses of informal logic and analytic thinking. In short, it should be recognized as roughly synonymous with intellectual (as opposed to purely practical) rationality. Narrower definitions, it seems to me, exclude important skillsets that are integral to the mix.

CT, then, can usefully be seen as (in the best sense) philosophy-on-the-fly. It doesn’t threaten or usurp philosophy; instead, it embodies the crucial ways in which philosophy informs all of our thinking.

In general discourse, we tend to retreat to vague pronouncements that endorse CT but are imprecise about what it is or how it relates to other areas of learning, such as philosophy, psychology, rhetoric, logic, or the liberal arts overall. It remains in active use – but mostly in debased form, as a buzzword for general rigor or critical intelligence.

What’s the upshot of this complicated history? Attempts to re-brand CT, or to make it a stand-alone discipline, appear to have exhausted themselves. There’s only so much one can say about it without ultimately (in Simon Schama’s words) knowing “more and more about less and less.”  Yet no one argues that we shouldn’t be critical thinkers.

There may be no glass slipper awaiting our Cinderella. But we should acknowledge CT’s royal heritage and importance as a concept that embraces logic, clarity, breadth, depth, relevance, and the proper balancing of facts and values. These are the complementary, if at times competing, aims of all intellectual excellence. And we know this much: such rigor includes, but goes beyond, the laws of formal logic, and involves a set of common standards shared by all academic disciplines. Like many of those disciplines themselves, it is a direct offspring of philosophy. And rigor, however we may struggle to define it, is never optional and will never be out of date.



[1] Educational Policies Commission, quoted in Glaser, “Critical Thinking: Educating for Responsible Citizenship in a Democracy,” p. 24.


[2] Wood, “Some Critical Thoughts,” Chronicle of Higher Education (Jan. 6, 2012). Retrieved from chronicle.com/blogs/innovations/some-critical-thoughts/31252/.