AI and the Hard Questions of Education

Why does it feel like the only use-case for AI is education? What does that mean for us?

Welcome to Scholastic Alchemy! I’m James and I write mostly about education. I find it fascinating and at the same time maddening. Scholastic Alchemy is my attempt to make sense of and explain the perpetual oddities around education, as well as to share my thoughts on related topics. On Wednesdays I post a long-ish dive into a topic of my choosing. On Fridays I post some links I’ve encountered that week and some commentary about what I’m sharing. Scholastic Alchemy will remain free for the foreseeable future but if you like my work and want to support me, please consider a paid subscription. If you have objections to Substack as a platform, I maintain a parallel version using BeeHiiv and you can subscribe there.

Everyone wants LLMs to destroy schools

From the moment modern LLMs entered the broader public consciousness in late 2022, one of the biggest themes in the discourse has been how AI will wreck education. I credit some of that to Ben Thompson’s post about AI Homework. I’ve commented on that post a bit before and offered some alternative views. Thompson’s article convinced a lot of people that school, because it is just about learning facts and mastering a few basic skills, will be rendered worthless by technologies that know all the facts and can themselves employ basic skills. In this telling, the purpose of education in the post-AI world is for students to be verifiers and editors.

Moreover, instead of futilely demanding that students write essays themselves, teachers insist on AI. Here’s the thing, though: the system will frequently give the wrong answers (and not just on accident — wrong answers will be often pushed out on purpose); the real skill in the homework assignment will be in verifying the answers the system churns out — learning how to be a verifier and an editor, instead of a regurgitator.

What is compelling about this new skillset is that it isn’t simply a capability that will be increasingly important in an AI-dominated world: it’s a skillset that is incredibly valuable today. After all, it is not as if the Internet is, as long as the content is generated by humans and not AI, “right”; indeed, one analogy for ChatGPT’s output is that sort of poster we are all familiar with who asserts things authoritatively regardless of whether or not they are true. Verifying and editing is an essential skillset right now for every individual.

I don’t necessarily disagree, but I do think Thompson and damned near everyone writing about AI take a very narrow stance with regard to education. Notice the function of learning for Thompson. The purpose of school is to prepare you with a specific skillset that is only important outside of school, in the “real” world of jobs and the economy. I’ve argued before that this is a bastardized version of Human Capital. The model should not be one whereby schools are required to chase specific skills for specific kinds of employment, but one whereby they inculcate generalized knowledge and skills that enable the acquisition of specific skills later on. Notice also that schools today are, according to Thompson, teaching kids to be regurgitators. I don’t think Thompson means any harm here, and his point is one everyone in education should take seriously.

More recently I made a comparison between dead internet theory and the potential for a totally non-human cycle of schooling where kids use AI for schoolwork and then schools grade it with AI. I called it dead schooling theory. The larger point is that we have to reject transactional or instrumental views of education; otherwise we’re actually going to get dead schools and a generation of kids wholly dependent on AI products. As more and more attention falls on the use of AI in K-12 and Higher Ed, the impacts will start to be better understood. But if you’re like Thompson and many, many other people who immediately connect AI and schooling, all you can think of is that school is not important on its own. All you can think of is that learning is meaningless without something down the line to give meaning to that learning.

And when that’s the view, it makes a lot of sense to want schools, especially K-12 schools, to look radically different. Oftentimes these are not well-thought-out requests. People don’t want to ask the hard questions about education. If it needs to be radically different, radically different how? What kinds of critical thinking, evaluation, verification, and editing aren’t happening now, AND what would it look like to make those part of the curriculum? How does the idea that schools need to be more rigorous, more focused on high-level analysis and synthesis across data sources and AI outputs, mesh with the growing policy push for “back to basics”? If we’re going to cheerlead the demolition of schooling by AI, would it be too much to ask for a clearer and more thought-out replacement?

Ezra’s not listening?

I typically enjoy the Ezra Klein show and the recent episode he recorded with Rebecca Winthrop was no different. Thanks, Alex, for sending that along! The whole point of the episode is that Ezra thinks we have to rethink the purpose of education. He begins with exactly the kind of question I want to hear from people who connect AI and education.

[EK] So I have a 3-year-old and a 6-year-old. I feel like I cannot predict with A.I. what it is that society will want from or reward in them in 15 or 16 years, which makes these questions in the interim — How should they be educated? What should they be educated toward? — feel really uncertain to me. My confidence is very, very low that schools are set up now for the world they’re going to graduate into.

Hey, great! Instead of just saying it has to be different, he has a guest on to talk about what that difference could look like. One thing I couldn’t shake as I listened though was that Ezra didn’t seem to be listening! Here are a few examples from the transcript.

Winthrop makes the point that school can’t just be about the transaction.

[RW] People always think of education as a transactional transmission of knowledge, which is one important piece of it. But it is actually so much more than that: learning to live with other people, to know yourself and for developing the flexible competencies to be able to navigate a world of uncertainty. Those are the “whys” for me.

I might ask you: What are your hopes and dreams for your kids under the “why,” before we get to the details of the skills?

After initially seeming open to a broader definition, Ezra echoes the idea that education lacks any serious merits of its own:

[EK] The fact that maybe they developed their faculties as a human being or learned things that were beautiful or fascinating, that’s all great. But if they do all that and they don’t get a good job, then we failed them. And if they do none of that, but they do get a good job, then we succeeded.

I think that’s been the reality of education, but I also think that reality relies a little bit on an economy in which we’ve asked people to act very often as machines of a kind. And now we’ve created these machines that can act or mimic as people of a kind, so now the whole transaction is being thrown into some chaos.

Later he regurgitates the old chestnut that schools can’t teach anyone but the average kid.

But it’s not obvious to me that schools should be tuned for me. One thing that I recognize, as someone who studies bureaucracies, is that if you just think of U.S. public education — to say nothing of private education or global education — it’s educating a lot of kids. And its ability to tune itself to every kid is going to be pretty modest.

Winthrop vehemently disagrees and gives pushback, but Ezra is incredulous, demanding examples, and when he gets one, he connects it to the learning he did out of school, later in life as a political blogger. He seems unable to believe that schools can deliver any experience beyond the one he had in middle and high school. (Side note: Ezra’s account reads like a textbook example of a gifted kid who falls through the cracks, and that should tell us something more important about schools than the rest of this conversation.)

He adopts the AI optimist claim that AI products will be better than teachers but when Winthrop asks him what AI will be better at, he can’t say.

[EK]Let me push you on this for a second, because if I’m taking the position of the A.I. optimist, what I’d say is: No, I’m not saying that. I’m saying the A.I. will be better than the teachers.

[RW]Better at what?

If we are saying that A.I. is going to be better than the median for many people at many kinds of work, why would we not assume that this system we’ll be able to build in six years, given how fast these things are developing, won’t per kid be better than the teacher?

I’m not saying I believe this. But I want to make you argue with the A.I. optimist case.

Yes, you’re pushing on it. I get it. But the question is: Better at what?

It keeps going, with Winthrop suggesting that teachers-as-facilitators to AI tutors is only one limited vision while Ezra seemingly can’t shake the “AI-pilled” vision of a Young Lady’s Illustrated Primer for every child. Eventually, though, he digresses and lets his anxiety over his kids’ future take center stage again.

[EK] Here’s where I actually am: I think we’ve just been going through a catastrophic experiment with screens and children.

And right now, we are starting to figure out that this was a bad idea. Schools are banning phones. My sense is that they are not relying very much on laptops and iPads. There was a big vogue for a while that every kid gets their own laptop or tablet. I think that’s beginning to go away, if I’m reading the tea leaves of this right. So I feel a bit better about that as a parent of young kids.

I really feel badly for the parents whose kids have been navigating this over the past 10 years or so. And right now I see A.I. coming, and I don’t think we understand it at all. I don’t think we understand how to teach with it. I don’t think the studies we’re doing right now are good yet — there are too many other effects we’re not going to be measuring.

There’s the narrow thing that a program does, and then there’s what it does for a kid to be staring at a screen all the time in a deeper way. I believe human beings are embodied. And if you made me choose between sending my kids to a school that has no screens at all and one that is trying the latest in A.I. technology, I would send them to the school with no screens at all in a second.

But we’re going to be working through this somehow. And what scares me, putting aside what world my kids graduate into, is their moving into schools at the exact time that educators don’t know what the hell to do with this technology. And they’re about to try a lot of things that don’t work and probably try it badly.

I agree! So does Winthrop. I even have an explanation. But, to the point I want to make about Ezra here, he seems really intent on not listening to Winthrop. She offers examples, numbered lists, and other context for broadening or challenging the ideas Ezra puts forward, but Ezra cuts her off, changes the topic, or flat out ignores what she says.

The Vision

Winthrop is making the case that one of the primary things schools should be focused on developing in their students is the capacity for deep sustained attention on an engaging subject. So, when Ezra again turns the topic back to his parental anxiety and feeling like nothing can be done and the kids are all doomed, Winthrop distills her point a bit and makes it more forcefully.

[RW] We 100 percent want kids to have the capacity for deep attention. You are thinking about your own kiddos, who are young, and I’m thinking about my own teenagers, who are 13 and 16.

And I see the undermining of attentive faculties from when my 16-year-old got his phone. For a long time, he didn’t want a phone because I’d been droning on and on for years — because he has me as a mother — about addiction and opportunity costs and that it’s OK to enjoy it a little bit, but you can’t sacrifice sleep and physical exercise and in-person communication.

And then he did get his phone. And he struggles with it. And he says: Mom, this is really hard.

It’s eroding his ability to do his homework or to follow something he wants to do. The only thing that it doesn’t seem to distract him from doing is playing the piano. Because he loves playing the piano.

So anything that we can do to actually ensure young people are developing the muscle. And it’s not just attention. Attention is the entry point, the doorway that gets you through. It’s actually reflection and meaning-making, which is what you get from deep reading and reading full books, which a lot of young people struggle to do today.

You also can get it from other means. You could get it from long Socratic dialogues in community with diverse people over time.

But it has to be an experience where you reflect. You think about meaning. You think about different perspectives. And it changes how you see the world.

And what you really want are some feedback loops that are beyond just grades and behavior to know: Is my kid developing agency over their learning?

And what I mean by that is: Are they able to reflect and think about things they’re learning in a way that they can identify what’s interesting and they can have the skills to pursue new information?

That right there is, I think, going to be the core skill. It is the core skill for learning new things in an uncertain world, which is I think one of the No. 1 things we think about.

In addition to that, I would say make sure kids are learning to interact with other human beings — any school that has them working with peers but even connecting with community members.

There you have it. Put your kids in an environment where they have access to things that are interesting to them, where they have the time and space to think deeply about those things, and where there is some embedded connection with the world at large. It takes Winthrop the whole episode to get there, and Ezra seems to have a really hard time comprehending why anyone would learn anything if it didn’t land them a high-paying job. I’m glad, though, that Ezra did finally allow a clear answer to filter through.

Sadly, I am skeptical that this is the path we will take. Too many people want a return to rote learning, foundational skills, and a focus on career preparation. That approach would foreclose the kind of open-ended exploratory learning that Winthrop envisions. Moreover, AI today seems perfectly suited to supporting rote learning, which means those will likely be the applications schools deploy first.

Thanks for reading!