Getting Kids Wrong

The truth is not in the mirror, it is in the mirror of another.

The widest avenue in the world is located in Buenos Aires, Argentina. It is somewhat difficult to find online agreement among former tourists and other visitors as to just how wide it is, at least as measured in lanes—many say 10, some say 14, and a few go as high as 18. Guinness World Records, which awarded the avenue the distinction of 'widest' in the 1960s, when it was mostly complete, helpfully lists its width as a little more than the length of a football field, with 16 lanes of traffic plus landscaped medians [1], while Wikipedia rather unhelpfully provides only the length of the widest avenue: 2.1 miles. [2] The name of this street, La Avenida 9 de Julio, or July 9th Avenue, commemorates the signing of Argentina's declaration of independence on July 9, 1816, and is just one of many places in the city where Argentines annually celebrate—with parades and costumes, choirs and dancing, fireworks and speeches, and of course food, like locro and asado—their freedom from Spanish rule.

Over 5,300 miles northwest of La Avenida 9 de Julio is another street that can lay claim to being the world's widest. At just 60 feet wide—a New York City standard—West 120th Street holds its title symbolically, not literally; its degree in wideness is merely honorary. And rather than covering a great physical distance, West 120th Street spans an even greater conceptual distance: that between Teachers College on one side and the rest of Columbia University on the other.

History professor Jeffrey Mirel, referencing a May 1954 Time magazine article [3], noted in a 2011 piece for American Educator that:

For at least a half century, education reformers have quipped that 120th Street in New York City . . . 'is the widest street in the world.' Underlying this quip is the belief that Columbia's liberal arts faculty members regularly dismiss the child-centered educational methods promoted by their colleagues at Teachers College as at best misguided and at worst anti-intellectual. In turn, professors at Teachers College routinely denounce their liberal arts colleagues as musty traditionalists who fail to recognize that most elementary and secondary students in American schools find discipline-based education useless and irrelevant to their lives. As cartoon-like as this portrait is, it contains more than a kernel of truth . . . Indeed, for most of the 20th century, dialogues between 'ed school' faculty members and their liberal arts colleagues about how to train prospective teachers in such fields as English, history, mathematics, and science were scarce, with neither side respecting the expertise of the other. With few exceptions, this lack of [productive] dialogue and collaboration in teacher training continues to the present day. It is arguably one of the most important factors contributing to the poor quality of teacher education in this country. [4]

Geraldine Clifford and James Guthrie, in their book Ed School: A Brief for Professional Education, trace this antagonism back even further—to the beginning of the 20th century—although the fight is, in their telling, a bit more one-sided. As early as 1916, education reformers were mocking the courses taught and the dissertations produced at education schools as absurd and trivial. Harvard literary critic Irving Babbitt, in 1929, said of education professors that they "are held in almost universal suspicion in academic circles, and are not infrequently looked upon by their colleagues as downright charlatans."[5]

Up to the present, and clearly going back over 100 years, education schools have struggled to build a unique identity and professional status among their fellow university insiders—and, by extension, the public at large. Since the content of teaching is owned, so to speak, by other academic disciplines at the university, schools of education have turned to pedagogical method as their unique brand. The inevitable downplaying of the role of intellectual content and ongoing detachment from the rigorous intellectual life of the university, while having certainly helped to attract and train the 3.7 million people needed to teach in K-12 schools [6], have also had the effect of diminishing the status of education, as critics frequently rail against the schools' anti-intellectualism [7] (former Secretary of Education Arne Duncan numbers among these critics [8]), their aversion to good science and to scientific investigations into—and occasional debunkings of—their assumptions and assertions [9], their extremist political orientations and intolerances [10], and their presiding over the mediocre and declining academic performance of students internationally since at least the 1980s. [11] E. D. Hirsch Jr. writes, in The Schools We Need: And Why We Don't Have Them, that:

The plight of education schools in the universities is counterbalanced by their enormous importance in the sphere of teacher certification and by their huge ideological influence in the nation's schools. It is never a healthy circumstance when people who are held in low esteem exercise dominant influence in an important sphere. The conjunction of power with resentment is deadly. [12]

This is a bit much, sheesh, and education schools should not shoulder all of the blame for creating or sustaining this deep divide. It takes two to tango, after all. Liberal arts colleagues in more traditional disciplines tend to keep to themselves academically. Though this is slowly changing in some small ways, there are still many calls today for greater multidisciplinarity at the university. [13] Moreover, if you have ever been in a university or have seen a movie centered on one, you may know that university professors are not so keen on teaching—as opposed to researching and writing in their field. [14] Devaluing pedagogy as an onerous "service" has led not only to devaluing education scholars but also to creating one of the most boring and least effective pedagogical methods ever: the monologue lecture. If there is anything that educators are taking a swing at when they decry traditionalist attitudes, it is the university lecture. But there is plenty more to swing at. Many university professors look down on the "soft sciences," or the humanities, and they are especially condescending to those "sciences" which don't enjoy a large theoretical foundation assembled and refined over centuries. [15] On top of it all, education faces the realpolitik of funding inequities and political misperceptions. [16]

It is irrelevant to this argument whether education's ostracism is self-imposed, externally imposed, or—most likely—both. What is relevant is that it exists and has existed for a long time and that it can exert a powerful influence on thinking and behavior—enough so that one could reasonably argue, as I do, that education's orientation to its work, at least in key areas, was set up in opposition to so-called 'traditionalist' thinking at the university. That is, were it not for the distance and tension between the education school and the rest of the university, education simply would not believe much of what it does believe. The philosophy and developmental psychologies on which its orientation (don't be anything like traditionalists!) rests are, as we shall see, one-sided views, unjustifiably remote from and vulnerable to common-sense reasoning and questioning.

Such one-sidedness has been evident since at least 1925, the publication year of legendary educator William Heard Kilpatrick's Foundations of Method:

Starting with its title, which placed an emphasis on method, one can find most of the major themes of present-day pedagogical 'reform' in Kilpatrick's book: . . . the dubious claim that it was basing itself on the most advanced scientific research; the insistence upon the individuality of the child and the autonomy of the teacher; the disparagement of mere subject matter and of other nations' educational methods; the admonition to teach children rather than subjects; the claim that knowledge is changing so fast that no specific subject matter should be required in the curriculum; the attack on rote learning; the attack on tests and even report cards; the claim that following the project method would develop critical-thinking skills. Kilpatrick's book even celebrated the whole-language over the phonics approach to reading instruction. [17] Above all, what won general acceptance in the educational community was the idea that a new method, free from the trammels of traditional knowledge, could encompass the whole sphere of education . . . It was this process-over-all concept that helped give the newly emerging professoriate its claim to an independent and separate existence. [18]

This isolation on one side of 'the widest street in the world' has meant that education—doggedly determined to show that it is as much a profession as anything else taught at the university—must occasionally borrow reputation-enhancing terms, ideas, and concepts from the other side of the street, without any help in interpreting them. This is, I believe, what happened with the term we are investigating, constructivism, and the swirl of ideas around it. Education professors stood on their desks and, with high-powered binoculars, peered across 120th Street into the classrooms of the philosophy and psychology departments, where they caught sight of a sophisticated-sounding term that seemed to validate everything they already believed in—anti-traditionalism—yet they didn't and couldn't know what they were looking at.

The constructivism that education borrowed for its own purposes has its roots in the philosophical study of knowledge known as epistemology, which tries to determine, among other things, what knowledge is and how it is we come to possess it. It asks: How can we know that we have knowledge? And are there methods of collecting information that lead more reliably to what we would call knowledge than other methods?

These questions, taken together, represent an epistemological challenge called the problem of the criterion—a problem that divided Greek philosophers in the first century BCE and that has occupied centuries of epistemological work since. The modern problem of the criterion, as formulated by American philosopher Roderick Chisholm in the early 1970s, identifies two sets of questions that are central to epistemology. [19] In the first set, we ask what: What do we know? What is the extent of our knowledge? Which particular cases count as knowledge and which do not? In the second set of questions, we ask how: How do we know? How do we decide whether we have knowledge in any particular case? What methods do we use to determine what knowledge is? The problem that gives the problem of the criterion forty percent of its name is that it seems that answering either set of questions requires answering the other set first. Chisholm uses good apples and bad apples to make the point:

If you happen to know which are the good apples and which are the bad ones, then maybe you could explain to some other person how he could go about deciding whether or not he has a good apple or a bad one. But if you don't know the answer to the first of these pairs of questions—if you don't know what things you know or how far your knowledge extends—it is difficult to see how you could possibly figure out an answer to the second.

On the other hand, . . . if you happen to have a good set of directions for telling whether apples are good or bad, then maybe you can go about finding a good one—assuming, of course, that there are some good apples to be found. But if you don't know the answer to the second of these pairs of questions—if you don't know how to go about deciding whether or not you know, if you don't know what the criteria of knowing are—it is difficult to see how you could possibly figure out an answer to the first. [20]

To put it in other words: If you can't, in any way, reliably tell the difference between an apple and an orange, it will be nearly impossible to believe that you could construct a machine that would sort them. If, on the other hand, you couldn't begin work on even the crudest sorting device—a simple pipe with a diameter close to that of an average apple (and too narrow for the average orange)—if this were beyond your capabilities, it would be nearly impossible to believe that, were I to hand you an apple and an orange, you could tell them apart.

By Chisholm's reckoning, since neither set of questions can be answered without reference to the other, epistemologists have had to choose in what order they answer the questions. Thus, some epistemologists, past, present, and future, will choose to answer the second set of questions before the first (Chisholm called them 'methodists' or 'generalists'). That is, they will simply decide, or take for granted, that there is widespread agreement about the methods we use—say, via our senses—to gain knowledge. The remainder of their philosophy consists of investigating how well these methods work: how well is our general knowledge explained by, say, appeal to our five senses alone? Other epistemologists will choose to answer the first set of questions before the second (Chisholm called them particularists). That is, they will simply decide, or take for granted, that there is widespread agreement about what knowledge is and what it is that we can be said to know or not know. The remainder of their epistemological work consists of sorting through methodologies that try to explain all our sources of knowledge.

It is simultaneously unfortunate and unsurprising that education chose a 'methodist' epistemology to ground its thinking about knowledge. It is unsurprising, first of all, because, as we've seen, education has established itself as a discipline focused on pedagogical method (recall Kilpatrick's Foundations of Method), but also because the method in question, constructivism, is incredibly attractive to anti-traditionalist education professors, with their binoculars, spying across 120th Street.

Constructivism is the idea that every individual constructs their own understanding and knowledge of the world, based on their own unique experiences—close to the views of the 'five-senses' empiricists, but different in important ways that offer a lot to the anti-traditionalist educator. There is an emphasis on individuality; it seems also that traditional subject matter knowledge is at least devalued relative to the personal experiences of each child; indeed, children rather than subjects become the focus of educational applications of epistemological constructivism; rote learning is practically impossible in this regime—if students are constructing their own knowledge, they are presumably not memorizing it without meaning; constructivism makes standardized assessment problematic (how can we judge students according to one standard if everyone constructs knowledge individually?); and it is fully consistent with the whole-language approach to reading instruction, where students rely heavily on their own experiences and intuitions when constructing words and their meanings rather than on an 'imposed' tuition in phonics. Altogether, constructivism offers education a sophisticated-sounding philosophical agenda, capable of underwriting Kilpatrick's Foundations of Method, and thus capable of underwriting large swathes of the progressive educational 'reform' program.

It is unfortunate—this choice of 'methodist' epistemology, constructivism—primarily because there are sensible and more common-sensical 'particularist' epistemologies on offer as alternatives. Recall that particularist epistemologies start with the assumption that there is a general consensus about what counts as knowledge rather than a consensus about the methods we use to gain it. Thus, a particularist epistemologist, like Thomas Reid or G.E. Moore (or Chisholm himself), might suggest that just about everyone knows when they were born, for example—and that we simply start with the assumption that this is true knowledge. When these particularists then look around for a method that would bring us each this knowledge, they would be unlikely to land on "actively constructing it based on unique individual experiences." Who, speaking plainly, would say this about learning the date they were born? We were simply told it, and we accepted it. Many things we know are known in this way. Indeed, whatever else we can make of their work, both Reid and Moore make a robust place for testimony [21] as a valid source of knowledge in their epistemologies. 'Methodist' constructivism, on the other hand, at least on a reasonable reading of it, would not count the date you were born as something you know—you did not actively construct this knowledge based on experience—or it would twist itself into unnecessary knots trying to explain how you 'actively processed' the date when you were told it, an explanation usually amounting to nothing more than that your brain was working when you learned the date.

Education, thus, suffers from its choice of epistemologies by making itself unjustifiably counterintuitive (both Reid and Moore were known as "common sense" philosophers)—a move that can really only be explained in the context of education's fight for legitimacy. Without this context, the move only serves to alienate education from the rest of society and frustrate scientific efforts grounded in constructivism.

Two notable attempts to put constructivism on sound scientific footing—and explain all our sources of knowledge—were led, beginning in the 1920s, by developmental psychologists and philosophers Jean Piaget and Lev Vygotsky. Piaget studied how children might construct their knowledge using two processes that he called assimilation (integrating new experiences into existing mental frameworks) and accommodation (modifying those frameworks to fit novel information). The balancing of these two processes, called equilibration, is what drives knowledge construction in individual humans, all of whom pass through four biological and sequential stages of development: sensorimotor (learning through the senses and motor actions), preoperational (symbolic thinking and language development), concrete operational (logical reasoning about tangible objects and events), and formal operational (abstract, hypothetical, and scientific reasoning). [22]

Vygotsky's brand of constructivism is called social constructivism. While he, like Piaget, assumed that knowledge and understanding are constructed actively, through individual experience, for Vygotsky, the interactive social world—rather than the passive natural world studied by 'little scientists'—became the environment for this experience. Cognitive growth, still instigated by the individual learner, now happens whenever a learner stumbles across and struggles with a challenge just outside his abilities (he enters the 'zone of proximal development'). A more knowledgeable other responds from the social environment and provides a temporary support structure, a 'scaffold,' to bridge the zone of proximal development for the learner. The scaffolds are gradually dismantled as learners gain mastery. [23] It is important to note that even in its most 'social' form, Vygotsky's social constructivism still centers the learner as the builder of knowledge. Vygotsky's zone of proximal development doesn't relocate knowledge to a social space—it just widens the perimeter of what the individual might be able to construct, given enough support. In the end, it's still the learner doing the building. While Vygotsky rightly emphasized that development begins in social interaction, the endpoint remains an individual achievement.

Because these two scientific programs are based on 'methodist' constructivism, they require mechanisms to help them avoid nonsensical outcomes. For example, the four stages are crucial for Piaget's theory. Because knowledge construction happens individually, with humans exploring the world alone like 'little scientists,' Piaget's theory needs a mechanism by which billions of independent humans—billions of little scientists working alone—can all, as adults, have the capacity for logical abstract thought and many of the same foundational schemas that underpin common reasoning. The four stages of biological development attempt to supply this mechanism.

Although Vygotsky's theory does not have the same problem as Piaget's in attempting to explain shared social knowledge, both theories make a similar demand for the same underlying reason: each places a self-directed individual learner at its center, so some mechanism is required to explain the transmission of knowledge when the only thing moving, so to speak, is the learner. For Piaget, that mechanism is provided by a (presumably large) number of moments of individual exploration. For Vygotsky, it is provided by a (presumably large) number of learner struggles within a dynamically changing zone of proximal development.

Both Piaget's and Vygotsky's theories have been criticized for not including important elements from the other theory: Piaget overlooks the social world [24], and Vygotsky overlooks biological constraints. [25] More importantly, however, both scientists' methods drew heavy criticism. Piaget, for example, relied heavily on verbal tasks, potentially conflating linguistic competence with cognitive ability. [26] And Vygotsky used observational methods without standardized metrics, leading to vague, hard-to-measure constructs. [27]

Most damaging to Piaget's theory, however—at least from the perspective of other researchers—is that his universal stages of cognitive development turned out to be not so universal. [28] Cross-cultural studies have shown that children from different cultures do not all follow Piaget's stages. Unsurprisingly, examples that break Piaget's theory often involve cultural teachings. For example, Aboriginal children in Australia are trained to navigate using a deep knowledge of the land, relying on specific features of the landscape along with the positions of the Sun and other stars. As a consequence, Indigenous Australian children develop spatial awareness much earlier than Western European children [29]—something Piaget might have noticed if his research were not conducted on small samples of European children of high socioeconomic status, including his own children. Even within a culture, Piaget's stages often don't stand up to scrutiny. Research into the human brain's neuroplasticity shows that adults can regress to concrete thinking when they are stressed [30], and they are capable of developing new abstract skills that go beyond formal operations. Moreover, many children reach Piaget's stages earlier than predicted by his theory and do so in a domain-specific way—that is, they move back and forth through stages in different learning areas. A person may be at the concrete operational stage in math, for example, and yet be at the formal operational level in reading. [31] Piaget's notion of development was domain-general with abrupt stage transitions, whereas development is now viewed as domain-specific and continuous.

The modern consensus of learning researchers and theorists around the domain-specificity of learning also runs counter to Vygotsky's theory, which imagines scaffolding and the zone of proximal development to be unaffected by a jump from, say, math to reading, each of which has its own internal logic and sequencing. [32] This can't be true, of course. A child often requires a great deal of scaffolding in one subject area and virtually none in another. Vygotsky also conceived of scaffolding as a one-to-one affair between a learner and a more knowledgeable other, since his theory, recall, is as individualistic as Piaget's, despite its emphasis on the sociocultural 'environment.' Yet we know that establishing effective mental representations of independent performance is key to learning, and this may be accomplished in a variety of different ways and in all kinds of different settings. Finally, and most importantly, it is worth noting that Vygotsky's is a developmental theory, not a learning theory (as is Piaget's), and learning and development are not the same thing. Piaget's and Vygotsky's contributions are developmental (how thinking changes over time) rather than direct prescriptions for how to teach or how learning itself should progress in classrooms. [33] Dylan Wiliam, emeritus professor of educational assessment at University College London, writes:

The trouble is that most people who use the term 'zone of proximal development' apply it to learning, not development. In other words, they think that if a child cannot add fractions on their own, but can do so with the help of a higher achieving pair, they are somehow working in the zone of proximal development. They are not. If anything, they are working in their zone of proximal learning, which when you think about it, is a fairly vacuous notion, since the purpose of all education is to help children do things they couldn't do. And this is why I claim that almost everyone who uses the term 'zone of proximal development' is (a) using it in a different sense from Vygotsky, and (b) talking bollocks because they are making something simple sound complicated and academic. [34]

This notion of "making something simple sound complicated and academic" brings us back to the north side of West 120th Street, where educators long ago borrowed the term 'constructivism' from philosophy and psychology to try to make their pre-existing beliefs about learning sound complicated and academic—and respectable. The question we should ask, now, is whether the truth behind all of this is simpler or more straightforward than educators believe. So, let's take a step back and consider seriously education's philosophy of constructivism.

To do that, we will follow the critical analysis of Richard Fox for the remainder of this article. Writing in the Oxford Review of Education in 2001, Fox concluded that educational constructivism functions more as a 'guiding myth' than as a legitimate philosophical or scientific theory—more a symbolic identity than a theory. That these arguments are not more widely known may be less a judgment on their quality than on the kinds of ideas the education world is structured to hear. Still, my aim in following Fox's analysis is not to single out this contribution as uniquely important and valuable, but to draw attention to the quiet clarity of his implied question: "What if constructivism isn't a discovery, but a story we've been telling ourselves about what good teaching is supposed to feel like?"

Constructivism is built on two foundational assumptions. The first, and most important, is often phrased as follows: "learning is an active process." Proponents of both epistemological constructivism and the 'borrowed' version of constructivism deployed in education place a great deal of weight on the term 'active' when describing the constructivist method of knowledge acquisition. As we saw with Piaget and Vygotsky, the activity of the individual learner-child is the primary (or only) engine of knowledge gathering as she moves through and interacts with her natural and social environments. But of course this is a philosophical assumption, not a scientific claim. If we were to swap out the 'active' assumption for its opposite—that human learning is essentially reactive and passive—we would find ourselves on the wrong side of constructivism, certainly, but not necessarily on the wrong side of the scientific evidence. Work in neuroscience, for example [35], suggests that the near-universal concept of 'libertarian free will'—the idea that, were you to rewind the clock, you could have behaved differently or thought differently in any given situation—as an explanation for human behavior is incoherent and incompatible with the evidence. And if libertarian free will makes no sense—if our behavior and thinking are determined, at bottom, solely by randomness and prior causes—then the notion of a consciously 'active' learner as the driver of their own learning also makes no sense, at least scientifically. [36] The reason educational constructivism in particular needs the assumption of the active learner at its center is not so that it may enjoy a comfier fit with scientific reality; it is so that it may distinguish itself from the strawman thinking of those "musty traditionalists" [37]:

This, the most central and insistent claim of constructivism [that learning is an active process] seems, as it stands, to be either misleading or untrue. Human beings, and animals in general, certainly do acquire knowledge of their environments by acting upon the world about them (for example by investigating habitats and by eating things); however, they are also acted upon. We do things and have things done to us; we act and we react, and clearly we can learn from both types of experience . . . Why, then, should constructivism emphasise only one pole of human experience? I suggest that this follows from its own reactive origins, as a view of learning which was set up in opposition to . . . traditionalist views of education . . .

Traditionalists are said to believe that teaching consists of telling, or instructing, and that the learner is treated as 'an empty vessel' to be (inertly) filled with knowledge . . . But, does anyone actually hold the views ascribed to the 'traditionalist'? Few, surely, would seek to deny the importance of some form of dialogue, if only question and answer, to teaching. It is rather that traditionalists place a greater value on knowledge and its objective status, and on the teacher as knowledgeable expert, as against learners and their existing knowledge and immediate interests. [38]

It seems, then, that the insistence on the self-directed activity of the individual learner is (a) scientifically unnecessary, (b) problematically narrow-minded in and of itself, and (c) motivated, suspiciously, to a significant extent by a cantankerous resentment toward differently minded knowledge professionals. While it may be argued that constructivism supplies a necessary correction to the excesses of 'traditionalist' thinking, which in the extreme would ignore learners' prior knowledge and interests, and that all learning is active because the brain is always active in learning, we really don't need an entire theory to remind us about these things.

The second foundational assumption of constructivism is that knowledge is 'constructed' by the individual learner, not passively absorbed. One of the meekest interpretations of this assumption is that knowledge is never simply transplanted, as is, into learners' brains, either from the environment or from others' minds. Instead, information is a raw material that is shaped by learners' cognitive tools and influenced by their prior knowledge, interests, biases, and unique contexts. The resulting knowledge product is, at best, similar to what was intended to be 'taught' but different in important ways that are personal to the learner. On the other hand, one of the strongest interpretations of the assumption that knowledge is constructed by learners is provided by a subdiscipline of constructivism known as radical constructivism:

It is argued that knowledge is not a copy or a true reflection of some independent reality and that therefore we must adopt some more subjective, idealist or at least conceptually relative view of human knowledge and of the world we can know. Truth as objective correspondence to an independent reality simply does not exist, on this view; we cannot have a 'God's eye' view of the world, or a 'view from nowhere.' We always perceive and know the world from some sociocultural, and historically situated, point of view. Hence, human knowledge is always to be seen as a 'construct,' a product of the human mind. [39]

As far as the weak interpretation goes, here again there is good advice for the blinkered traditionalist—the enemy of educational constructivism—yet nothing we need an entire theory for. For the radical interpretation, Fox offers this excellent analogy:

We can accept that maps of the world, for example, are human constructions, which make different assumptions and simplifications in representing the globe. Different 2-D projections, for instance, notoriously produce different distortions of land areas in different latitudes, and so forth. But we do not have to conclude from this that there is no globe, no planet Earth, which is the subject of those imperfect human representations. Similarly, our conceptual viewpoints are indeed limited but, being views, they are precisely views of something, namely the world or some part of it. That we cannot know 'things in themselves' or 'reality as it is' does not mean that we have to give up our deep assumption of the existence of things in themselves, or of an external world independent of human minds. [40]

The damage done by education's isolation from the rest of the work of the university is evident at this point. Where we should expect a robust multidisciplinary orientation to teaching and learning, full of vibrant and productive debate, we get instead a narrow, one-sided view that is philosophically incoherent and incomplete, and scientifically problematic. Educational constructivism's stubborn devotion (and/or inescapable attachment) to the individual learner at its center—the lone epistemic agent constructing the world in his head—whether through a bespoke standalone epistemology or through the quasi-scientific programs of Piaget and Vygotsky, requires a sound justification precisely because it makes very little sense in light of our lived experience. Yet the only justification on offer seems to be that this attachment helps to protect and preserve education's professional identity, and that is not a good enough reason. Of course, we may like the idea and may, on occasion, even feel like this lone epistemic agent in the world—indeed, in times when identity politics are ascendant, constructivism may seem like just what the zeitgeist ordered—but we all know this does not correspond to reality. Much, if not most, of our knowledge comes from outside ourselves, via other epistemic agents, and we do, in fact, 'absorb' this knowledge mostly 'passively,' often without full understanding.

A potent philosophical example of the latter is provided by Putnam (1973), which I paraphrase here for clarity and to remove jargon:

Suppose you are like me and cannot tell an elm from a beech tree. We still say that my meaning of 'elm' is the same as anyone else's meaning of 'elm,' viz., the set of all elm trees, and that the set of all beech trees is what we both mean by 'beech.' Thus what I mean by 'elm' is different from what you mean by 'beech' (as it should be). Is it really credible that this difference in meaning between us is brought about by some difference in our [individual] concepts? For both of us, 'beech' and 'elm' call up the very same concept (some vague treelike object), because we can't tell the difference between them. Yet, when I say 'beech,' we both recognize that I mean something different from the object you refer to when you say 'elm' . . . What this indicates is that meanings are not inside individual heads. [We know the terms (by rote learning) but nothing else, because we offload the conceptual understanding to other people (e.g., experts)—a division of linguistic labor.] [41]

That this reality might threaten our conception of ourselves—or, rather, might threaten the professional identity of education professors—is not an argument against its existence. We are ultimately after the truth, not what makes us feel good about ourselves. We learn some things actively by ourselves, no doubt, but we also learn many things more or less 'passively' from other people. This observation leads, it seems, to a distinct worry that, if we learn so easily from others' say-so, we are all at constant risk of being duped, of being suckers, deferring too easily to others. But if this is true—if we are all too credulous because we learn from others (or when we learn from others)—the cure for the problem of too much deference is not to pretend it isn't true. The cure must ultimately involve investigating the reality of the situation honestly and carefully:

Deference to the notion of authority, the taking of things on trust without regard for understanding, the resignation of individual judgement and so on are all things we have learned to deplore. . . . [Yet] the fact that habits of supine deference [are] commonplace has no tendency to show that the conception of knowledge which sanctions the idea of its authoritative transmission and provides the soil in which these habits are able to flourish is itself wrong . . . A culture in which individual judgement is suspended is eo tanto corrupt. But the cure for this malady is not conceptual revision. The concept of knowledge is what it is come what may . . . The cure rather is to seek a clearer understanding of the nature of knowledge and the mechanism of transmission so that we can see how these attitudes can arise, and how corruption can set in. [42]

Postscript

None of the above makes complete sense without the contextual stages of the Industrial Revolution, the American Civil War, Reconstruction, and the Progressive Era in the United States—not to mention Darwin and Einstein and the Nineteenth Amendment. Here is an interesting biography if one wants to start from this later stage and move backwards.

References

[1] Guinness World Records. (n.d.). Widest avenue. Guinness World Records. Retrieved March 16, 2025, from https://www.guinnessworldrecords.es/world-records/108080-widest-avenue

[2] Wikipedia contributors. (n.d.). Avenida 9 de Julio. Wikipedia. Retrieved March 16, 2025, from https://en.wikipedia.org/wiki/Avenida_9_de_Julio

[3] Time. (1954, May 3). Education: Change on 120th Street. Time.

[4] Mirel, J. (2011). Bridging the 'Widest Street in the World': Reflections on the history of teacher education. American Educator, 35(2), 6–12.

[5] Clifford, G. J., & Guthrie, J. W. (1988). Ed school: A brief for professional education. University of Chicago Press.

[6] National Center for Education Statistics. (n.d.). Back-to-school statistics. U.S. Department of Education. Retrieved March 16, 2025, from https://nces.ed.gov/fastfacts/display.asp?id=372

[7] Neem, J. (2020, March 26). Anti-intellectualism and education reform. Phi Delta Kappan. https://kappanonline.org/anti-intellectualism-education-reform-public-schools-neem/

[8] Leef, G. (2010, January 6). Good news/bad news on education schools. The James G. Martin Center for Academic Renewal. https://jamesgmartin.center/2010/01/good-newsbad-news-on-education-schools/

[9] Seidenberg, M. S. (2017). Language at the speed of sight: How we read, why so many can't, and what can be done about it. Basic Books.

[10] Asher, L. (2019, March 6). How ed schools became a menace to higher education. Quillette. https://quillette.com/2019/03/06/how-ed-schools-became-a-menace-to-higher-education/

[11] Hanushek, E. A., Peterson, P. E., & Woessmann, L. (2013). Endangering prosperity: A global view of the American school. Brookings Press.

[12] Hirsch, E. D., Jr. (1996). The schools we need: And why we don't have them. Doubleday.

[13] Jacobs, J. A. (2013). In defense of disciplines: Interdisciplinarity and specialization in the research university. University of Chicago Press.

[14] Bok, D. (2006). Our underachieving colleges: A candid look at how much students learn and why they should be learning more. Princeton University Press.

[15] Snow, C. P. (1959). The two cultures and the scientific revolution (Rede Lecture 1959). University of Cambridge. [See also: Carafoli, E., & Danieli, G. A. (Eds.). (2009). The two cultures: Shared problems. Springer.]

[16] Labaree, D. F. (2006). The trouble with ed schools. Yale University Press.

[17] Hanford, E. (Host). (2022–present). Sold a Story [Audio podcast]. American Public Media. https://features.apmreports.org/sold-a-story/

[18] Hirsch, The Schools We Need, 1996.

[19] McCain, K. (n.d.). Problem of the criterion. Internet Encyclopedia of Philosophy. https://iep.utm.edu/problem-of-the-criterion/

[20] Chisholm, R. M. (1982). The Foundations of Knowing. University of Minnesota Press.

[21] In contemporary epistemology, 'testimony' is used as an umbrella term to refer to all those instances where we form a belief, or acquire knowledge, on the basis of what others have told us.

[22] Wikipedia contributors. (n.d.). Jean Piaget. Wikipedia, The Free Encyclopedia. Retrieved March 18, 2025, from https://en.wikipedia.org/wiki/Jean_Piaget

[23] Wikipedia contributors. (n.d.). Lev Vygotsky. Wikipedia, The Free Encyclopedia. Retrieved March 18, 2025, from https://en.wikipedia.org/wiki/Lev_Vygotsky

[24] Cole, M. (1996). Cultural psychology: A once and future discipline. Harvard University Press.

[25] Tomasello, M. (1999). The cultural origins of human cognition. Harvard University Press.

[26] Donaldson, M. C. (1978). Children's minds. HarperCollins.

[27] Miller, P. H. (2016). Theories of developmental psychology (6th ed.). Worth Publishers.

[28] Rogoff, B. (2003). The cultural nature of human development. Oxford University Press.

[29] Levinson, S. C. (2003). Space in language and cognition: Explorations in cognitive diversity. Cambridge University Press.

[30] Lupien, S. J., et al. (2009). Stress effects on the hippocampus: A crucial role for neurotrophins in stress-related damage, mood disorders, and memory deficits. Progress in Neurobiology, 89(4), 279–295.

[31] Flavell, J. H., Miller, P. H., & Miller, S. A. (2002). Cognitive development (4th ed.). Prentice-Hall.

[32] Gredler, M. E. (2012). Understanding Vygotsky for the classroom: Is it too late? Educational Psychology Review, 24(1), 113–131.

[33] Flavell, J. H. (1992). Cognitive development: Past, present, and future. Developmental Psychology, 28(6), 998–1005.

[34] Didau, D. (2017, January 13). Problems with the 'zone of proximal development'. The Learning Spy. https://learningspy.co.uk/featured/problems-zone-proximal-development/

[35] Libet, B. (1985). Unconscious cerebral initiative and the role of conscious will in voluntary action. Behavioral and Brain Sciences, 8(4), 529–566.

[36] Harris, S. (2012). Free will. Free Press.

[37] It is worth addressing here a real confusion on these points. Yes, learning sciences show that, in many situations, increasing cognitive or otherwise behavioral 'activity' during learning can significantly enhance that learning. [For example, see Peter C. Brown, Henry L. Roediger, and Mark A. McDaniel, Make It Stick: The Science of Successful Learning (Belknap Press, 2014)]. But this cannot demonstrate or prove in any way that learning in general is best described or conceived of scientifically as 'actively constructing knowledge.' The point is that the science of 'active' learning is respectably separate from—and, thus, not likely dependent on—the bad philosophy of educational constructivism. Even a broken clock is right twice a day.

[38] Fox, R. (2001). Constructivism examined. Oxford Review of Education, 27(1), 23–35.

[39] Fox, 2001.

[40] Fox, 2001.

[41] Putnam, H. (1973). Meaning and reference. The Journal of Philosophy, 70(19), 699–711.

[42] Welbourne, M. (1986). The community of knowledge. Aberdeen University Press.
