A Brief History Of Critical Thinking: From Ancient Wisdom to AI Skill
- Bethan Winn

- Apr 8
- 9 min read

This is an extract from my book, The Human Edge: Critical Thinking in the Age of AI. I loved researching and writing this part especially, and wanted to share it beyond the pages of the book.
Pre-common era
Where to start? Ancient Greece is a pretty good place. Apart from maths, astronomy and alarm clocks (check out Ctesibius’ clever system), Ancient Greek philosophers are also credited with the beginnings of democracy, debate and critical thinking through conversations. Socrates (c. 470–399 BCE) wandered the streets of Athens, not giving lectures or writing books, but asking questions – lots of them. Together with a small circle of followers, he developed an approach that was more than just gossip; it was a carefully crafted method of inquiry that would shape thinking for centuries to come.
Socrates believed that wisdom begins with admitting what we don't know.
His student Plato captured these dialogues in writing, showing how a series of thoughtful questions could unravel someone's certainty about their beliefs. Imagine being asked "What is justice?" and thinking you have a simple answer, only to find yourself, an hour later, questioning everything you thought you knew. That's the Socratic method in action, and this dialogic (dialogue-based) approach to learning is still the basis for a lot of great classroom teaching today.
Socrates wasn't alone in his quest for wisdom through questioning. While the history of critical thinking often reads like an exclusive boys' club, women made significant contributions that have been systematically underrepresented. The Oracle at Delphi, whose proclamations were revered throughout Greece, played a pivotal role in Socrates' philosophical journey when she declared him the wisest of men – not for his knowledge, but for recognising how little he knew. This established intellectual humility as fundamental to critical inquiry. Hipparchia of Maroneia (active c. 325 BCE), a Cynic philosopher who rejected conventional social norms and used logical argument to challenge assumptions about women's proper roles, demonstrated critical thinking through both her philosophical positions and her lived example. These women's contributions remind us that critical thinking's development wasn't exclusively male, even though the constraints of their time and subsequent historical accounts have often pushed their work to the margins.
On the other side of the world, in China, Confucius (551–479 BCE) developed a different approach to critical thinking. While Socrates was questioning Athenians about truth and knowledge, Confucius emphasised the importance of reflection and learning from others. His famous saying, ‘By three methods we may learn wisdom: by reflection, which is noblest; by imitation, which is easiest; and by experience, which is the bitterest’, captures a sophisticated understanding of how we develop wisdom. The Confucian tradition valued questioning too, but within a framework that emphasised harmony and respect – a balance people are still trying to strike in our discussions today.
8th century–19th century
While Europe was in its Dark Ages, an interesting shift occurred. The Islamic world entered its ‘Golden Age’ (from the 8th to the 13th centuries), when philosophers like al-Fārābī, Avicenna and Averroes preserved and expanded upon Greek philosophical traditions. They developed sophisticated methods of logical analysis and argument, bridging ancient wisdom with emerging scientific thinking. Their work would later help spark the European Renaissance.
The medieval period also gave us new tools for critical thinking. Take Peter Abelard’s (1079–1142) Sic et Non (‘Yes and No’) – a text that presented contradictory arguments from various authorities on theological questions. Instead of telling students what to think, it taught them how to think by presenting opposing positions and complex problems without offering solutions. Although centred on theology, it's not so different from modern case study methods used in law and business schools, designed to get people thinking.
The Scientific Revolution (1500–1700) brought another major shift in critical thinking. The English statesman and scholar Francis Bacon (1561–1626) challenged the traditional reliance on Greek and Roman authorities, advocating for direct observation and systematic experimentation to build our own understanding of the world. His emphasis on empirical evidence and the elimination of biases (what he called ‘idols of the mind’) feels surprisingly modern. When Bacon talked about these mental biases in the 1600s, he was describing what we now know through cognitive psychology as confirmation bias and groupthink.
The Enlightenment period (1685–1815) gave us some heavy hitters in critical thinking. German philosopher Immanuel Kant's Critique of Pure Reason (1781) asked fundamental questions about how we know what we know. John Stuart Mill (1806–1873) championed the importance of challenging our beliefs and engaging with opposing viewpoints. It would be interesting to hear his thoughts on social media comments today!
1900 onwards
Fast forward to the 20th century, and critical thinking gets more systematic attention. John Dewey, who is credited with coining the term ‘critical thinking’ (and not to be confused with Melvil Dewey, of library decimal system fame), emphasised its practical application in education and democracy. He saw critical thinking not just as an academic exercise but as a vital tool for navigating life and participating in society.
British philosopher Mary Midgley also argued for this practical approach, often comparing philosophy to plumbing:
‘[It’s] not just grand and elegant and difficult, but is also needed’ – an activity for us all to engage in. ‘Nobody notices plumbing until it goes wrong,’ she wrote. ‘Then suddenly we become aware of some bad smells, and we have to take up the floorboards and look at the concepts of even the most ordinary piece of thinking’ (1992: 139).
This approach to critical thinking resonates strongly with our current needs, where everyday decisions benefit from thoughtful examination.
These approaches to critical thinking emerged at exactly the right time, because society was changing dramatically. For most of human history, people had looked to authority figures – religious leaders, monarchs, tribal elders – to tell them what to believe and how to live. It wasn't just about power; it was about survival.
In a world where information was scarce and life was uncertain, following trusted authorities was often the safest bet.
These leaders in society, the powerful or wealthy, were also historically the ones with the most access to information. Books were rare, literacy levels were much lower, knowledge was precious and critical thinking often meant making the most of limited data.
As societies modernised and democratised, bringing mass education and increased literacy, people began to think for themselves, having witnessed the disastrous consequences of unquestioningly ‘following the leader’. This shift was as revolutionary as the printing press – maybe even more so. Suddenly, individuals weren't just allowed but expected to form their own worldviews. It was exciting, empowering ... and more than a little overwhelming.
Consider that your great-grandparents’ generation might have inherited most of their beliefs from their parents, their community and maybe religious leaders. Your grandparents had a handful of trusted sources – newspapers, books, radio stations. Your parents had television added to the mix. And now?
We're all swimming in an ocean of information, trying to figure out what's worth our attention and worth believing.
The rise of marketing and mass media throughout the 20th century added another layer of complexity. We’ve needed to develop a form of ‘defensive thinking’ or at least a healthy scepticism and information literacy – the ability to question sources, recognise manipulation, interrogate claims and make independent judgements in the face of sophisticated persuasion techniques.
Imagine how much you’d spend (and how disappointed you’d be!) if you believed every claim you ever read in advertising. Critical thinking evolved from a philosophical practice to a vital life skill, as essential for deciding what toothpaste to buy as for forming political opinions or choosing a career path.
As society grappled with a rapidly evolving media landscape, educators worked to create better frameworks for developing thinking skills. In 1956, American educational psychologist Benjamin Bloom (1913–1999) and his colleagues introduced what we now know as Bloom's Taxonomy – a ladder for thinking skills that starts with basic tasks like remembering facts (which a parrot can do!) and climbs through increasingly complex cognitive challenges: understanding concepts, applying knowledge, analysing relationships, synthesising information, then evaluating ideas at the top. While the model has its flaws, and we now see these skills as interconnected rather than a linear progression, it’s useful to note how it has influenced pedagogy and evolved over time.
A group of educators revised the original taxonomy in 2001, making an important shift: they put 'Creating' at the top. This alteration reflects how what we value as a thinking skill has changed. While evaluating information is crucial (especially in our age of information overload), the ability to create new connections (an evolution of what was called ‘Synthesis’ in the original model), generate novel solutions and think beyond existing patterns might be even more valuable. It’s the top half of this pyramid that most university education focuses on. We can all quickly look up facts and ‘pub quiz trivia’ in seconds (dates, names, who won an Oscar in 1986) but it takes time to analyse, evaluate and create something new.
And ‘creating’ we are, at a staggering rate. Predictions for the end of 2024 were that 149 zettabytes of data would have been generated worldwide, with this figure forecast to increase to 394 zettabytes by the end of 2028 (Taylor 2023). From our daily step counts to our clicking patterns, from social media posts to streaming habits, everything is captured. Publishing content has never been easier or more accessible, but this creates a new challenge:
Attention has become the ultimate commodity.
Every platform, news outlet and social media channel competes to be noticed, deploying increasingly sophisticated tactics to keep us engaged. We've moved beyond simple clickbait to what some call ‘optimising for outrage’, with outliers and controversial opinions being given much more airtime and debate than ‘vanilla’, reasonable opinions and ideas. From celebrity gossip to scientific findings – new ideas, polarising information and provocative headlines are the name of the game. This has become a digital echo of Oscar Wilde's quip that
‘There is only one thing in the world worse than being talked about, and that is not being talked about’.

If you throw ‘the law of triviality’ into the mix (our attraction to trivial matters rather than the hard, uncomfortable topics), it may go some way to explaining why, with all the information at our fingertips, we've likely spent longer choosing a new household appliance than learning about climate science, and why we are more engaged by celebrity gossip than policy debates. It's also easier to come up with a ‘right’ answer to the simpler things, which is a much more comfortable place to be.
While our ancestors struggled to access information, we face the opposite challenge: an overwhelming abundance of it.
The critical thinking skills that we need have shifted accordingly – from making the most of limited data to identifying what truly deserves our attention.
Our information landscape tempts us to engage with whatever's most accessible rather than what's most important. ‘Doing your own research’ sounds empowering, but without strong critical thinking skills, we risk falling down rabbit holes of misinformation rather than finding genuine insight.
Online or in-person echo chambers can also emerge, where we are surrounded by people and information that simply reflect and reinforce our existing beliefs, creating an increasingly narrow view of the world and making it harder to understand or empathise with different perspectives. Daniel Kahneman (2011) refers to this as WYSIATI (‘what you see is all there is’), and it's an easy thinking trap to fall into.
Another challenge is what we ‘know’ to be true seems to have an increasingly short shelf life. While our ancestors might have lived their entire lives with relatively stable knowledge about how the world worked, we're confronted with new discoveries and perspectives almost daily. With unprecedented access to higher education and the exponential growth of scientific research, we are collectively learning at a pace that can feel dizzying. This rapid evolution of understanding isn't new, but the speed of the change is.
When Darwin first proposed his theory of evolution, he faced what we might now call the Victorian era's version of 'cancel culture'. Even our most firmly held beliefs might need updating as new evidence emerges. As a saying often attributed to the German philosopher Arthur Schopenhauer puts it, “All truth passes through three stages: First, it is ridiculed; second, it is violently opposed; and third, it is accepted as self-evident.”
The challenge isn't in acquiring new knowledge, but letting go of outdated beliefs when better information comes along.
It's not just in our personal lives and academia that critical thinking has become essential. The World Economic Forum has consistently listed critical thinking, analytical reasoning, creativity, curiosity and the ability to learn and teach oneself (of which reflection is a crucial part) in their 'Top Ten Skills for the Future' (2020). As AI handles more routine cognitive tasks, the ability to evaluate data, question accepted wisdom and engage with nuanced arguments has become the real differentiator in professional success.
This brings us full circle. The ancient philosophers understood something profound: critical thinking isn't just about processing information – it's about engaging with ideas in ways that transform both our understanding and ourselves. Whether we're having conversations around a dinner table or collaborating with AI assistants, the fundamental skills remain as relevant as ever.
