[00:00:15] Jon M: Hi, I’m Jon Moscow.
[00:00:17] Amy H-L: I’m Amy Halpern-Laff. Welcome to Ethical Schools. Our guest today is independent journalist, speaker and blogger, Audrey Watters. Her new book, “Teaching Machines: The History of Personalized Learning,” was just published by MIT Press. Welcome, Audrey, and congratulations on the book!
[00:00:35] Audrey W: Thank you. Thank you for having me.
[00:00:37] Jon M: What is the significance of your title?
[00:00:41] Audrey W: I liked the play on words in the title. Of course, “teaching machines” refers to the actual physical machines that were invented and promoted in the mid 20th century by educational psychologists like B.F. Skinner. But there’s also the idea that schools are becoming quite mechanized, and that, in fact, students feel as though they’re kind of the machines, part of this larger machine. And then, of course, it’s also a nod to some of the stuff that’s happening today, though it’s not talked about in my book: the idea of machine learning and artificial intelligence, and that somehow the future of teaching and the future of learning are very much going to be the realm of the machine.
[00:01:33] Jon M: What is personalized learning?
[00:01:37] Audrey W: I guess I didn’t talk about the part after the colon. I love academic titles, the colon and then the rest of it. Personalized learning is something that’s fascinating to me because, in so many ways, it appeals to a fundamental American belief in the individual and individualism, and to the sense that schools perhaps are not individualized enough. Schools are a mass institution. A public education is responsible for educating the public. And so a complaint about schools for a very long time, and this isn’t a new complaint, although it’s associated with computers today, has been that schools are not very good at individualizing education. The rise of the teaching machines in the mid 20th century was very much intertwined with this idea of making sure that we could personalize learning. And strangely, paradoxically, the idea was to personalize learning by automating it through machinery.
[00:02:49] Jon M: So it’s an alternative to differentiated instruction. People talk about the need to personalize instruction by teachers being able to address students’ individual needs within the classroom setting. And this is saying, well, we’ll just give everybody a machine and they can go do it on their own.
[00:03:10] Audrey W: Exactly. That’s the idea. And the phrase “personalized learning” is appealing to a lot of people. But when you drill down into it, there are some very different visions of what people are talking about. Some of the personalized learning that we see today, the kind that’s particularly associated with algorithms and with computer-based education, isn’t what I would think of as personalized at all. It’s not as though, if you are a seven-year-old who loves penguins, the algorithm is going to engage your curiosity and inquiry and send you down a path to penguin research. Ideally, the programs work by delivering the standardized curriculum, just at a pace that is supposedly geared to you. So it’s not personalized in the sense that every one of us gets to follow our own curiosity. It’s just that the pace of instruction is supposedly geared towards each of us individually, so perhaps in some ways, but not entirely, closer to the differentiation that you talked about.
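[Editor’s note: A minimal sketch of what this pace-based “personalization” amounts to in practice. The curriculum list, mastery threshold, and function names below are illustrative assumptions, not any specific product: every student moves through the same fixed sequence, and only the rate of advancement varies.]

```python
# A hypothetical sketch of pace-based "personalized" learning:
# the curriculum is fixed and identical for every student;
# only how fast a student advances through it varies.

CURRICULUM = ["addition", "subtraction", "multiplication", "division"]
MASTERY_THRESHOLD = 0.8  # assumed cutoff; real products vary

def next_unit(current: int, last_score: float) -> int:
    """Advance only when the last score shows 'mastery'; otherwise
    repeat the current unit. No student ever leaves the sequence,
    and nothing here asks what the student is curious about."""
    if last_score >= MASTERY_THRESHOLD and current < len(CURRICULUM) - 1:
        return current + 1
    return current

# Two students, same curriculum, different pace:
print(CURRICULUM[next_unit(0, 0.95)])  # -> subtraction
print(CURRICULUM[next_unit(0, 0.55)])  # -> addition (repeat)
```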
[00:04:19] Amy H-L: Is that as true of today’s computers as it was of the teaching machines?
[00:04:26] Audrey W: There’s all of this hype that we hear today that computers, partially because they collect so much data about students, know not just where students are in terms of their potential mastery of a subject, but things like what time of day the student is working on a subject, how long they have the application open, where their computer is physically located. There’s just a massive amount of data that computers collect about students. And supposedly all of this is going to feed some magical algorithm that’s going to dial things in even further. But a lot of that’s hype and not true, and I’m not sure we’d want it if it was true. I do think that, in a lot of ways, what we’re seeing is very reminiscent of the stuff that happened in the mid 20th century, despite, again, all this talk from ed tech entrepreneurs about how innovative and exciting and brand new this is. It’s quite familiar. It’s actually quite reminiscent of the stuff that Skinner, for example, was doing pre-computer.
[00:05:40] Jon M: I’m processing your penguin example and this idea of personalized and differentiated instruction. Because if you’re doing project-based learning, then you are giving students a chance to go off and follow their bent around penguins, where somebody else can go off and follow their bent around pigeons, say. Are there any of these computer-based programs that try to do that kind of thing, that try to let you go off in your own direction?
[00:06:12] Audrey W: Not really, and we can think about the ways in which human teachers help students in finding those interests as well. Computer-based education is much less likely to be project-based. Sometimes it can be used in conjunction with a more project-based focus. But I think that, by and large, computer-assisted instruction is going to move students through, I don’t want to say the basics, because that’s not the right word for it, but something more like the fundamentals, which in some ways computers are better at doing. People are better at doing the projects and at helping students find their interests and passions for project-based learning. Computers aren’t so good at that. As Seymour Papert talked about when he wrote his book Mindstorms in 1980, computers are very good at running students through their paces, getting them to learn their multiplication tables quickly, things like that. A computer is unlikely to suggest to a student: if you’re interested in penguins, here are people to talk to, and perhaps here are five other flightless birds that might interest you as well.
[00:07:28] Amy H-L: I don’t know, Audrey. Amazon seems to do that pretty well.
[00:07:31] Audrey W: Does it? I’m not sure it does. ’Cause I buy one thing on Amazon and it suggests, like, five of the same thing.
[00:07:40] Amy H-L: In terms of books, they really can track your interests. And Netflix will figure out that I mainly watch documentaries. I’m not sure that the technology isn’t there, if they wanted to do it.
[00:07:53] Audrey W: I disagree. These are still very bounded recommendations, right. Amazon isn’t going to suggest to you books from your local bookstore, for example. It’s only within the Amazon world. It’s only within the Netflix world. Netflix is only going to suggest things that it has the licensing for. And in that way, some of the educational software is quite similar. Even if it were able to offer the kind of personalization we think of in terms of shopping recommendations or viewing recommendations, it’s still very likely to be circumscribed by the way the curriculum is already laid out for different grade levels. It’s unlikely to suggest to a fourth grader that they do trigonometry. It’s unlikely to suggest anything about ornithology, because ornithology doesn’t live in the K-12 curriculum as currently outlined by the 50 states. Part of the problem, though, or the challenge, is that these systems don’t ask for your interests.
[00:09:01] Amy H-L: It’s not as though you have a lot of choices in the way that these, that Amazon or Netflix…
[00:09:06] Audrey W: Yeah. Sometimes they do. And it’s always, I think, very insulting to students, the ways in which those choices are represented. Some of the so-called personalized study tools around grammar or math will ask students, what are your interests? And they’ll give them a selection: are you interested in sports, or are you interested in music? And then the digital worksheets that are generated have a fill-in-the-blank look to them. They’re not personalized. I saw one example fairly recently. Students could say that they were interested in music, and the worksheet generated a bunch of questions. The examples of the singers in the worksheet were like John Cougar Mellencamp. I’m a child of the eighties, right, I know who John Cougar Mellencamp is, but I’m going to bet my bottom dollar very few third graders today know “Jack and Diane.” It’s insulting, the way this kind of stuff works. Say you’re interested in sports. It might suggest, geolocated, say, for you and me: oh, well, you must be a Giants fan, or you must be an A’s fan. But maybe that’s not the sport we’re interested in. Maybe we are super interested in tennis or marathon running.
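[Editor’s note: To make the kind of shallow substitution described above concrete, here is a hypothetical sketch. The templates and substitution lists are invented for illustration: the “personalization” is just swapping a familiar noun into a canned question, while the math, the structure, and the answer stay identical.]

```python
# A hypothetical sketch of interest-based "personalization" as
# described above: one canned template per coarse interest category,
# with a single noun swapped in.

TEMPLATES = {
    "sports": "If the {team} scored 3 runs in each of 4 innings, how many runs total?",
    "music": "If {singer} sold 3 albums a day for 4 days, how many albums total?",
}
SUBSTITUTIONS = {"team": "Giants", "singer": "John Cougar Mellencamp"}

def worksheet_item(interest: str) -> str:
    """Generate one 'personalized' question. Whatever the student's
    interest, the underlying problem is exactly the same."""
    return TEMPLATES[interest].format(**SUBSTITUTIONS)

print(worksheet_item("music"))
# -> If John Cougar Mellencamp sold 3 albums a day for 4 days, ...
```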
[00:10:29] Amy H-L: So are you saying that the idea is bad or that it’s badly executed?
[00:10:34] Audrey W: One of the responsibilities of adults in general, and educators specifically, is to help students experience things that they don’t even know that they don’t know. Many children aren’t going to know that perhaps they are interested in the cello. It’s not something that comes up in your day-to-day life. Some children, from some households, will have cello experience, but it’s the responsibility of the adults to help expand their horizons and introduce them to things that they wouldn’t know they were interested in. If we let six- and seven-year-olds dictate what their line of inquiry might be, we might have a lot of serious penguin research. I don’t doubt that. But we might have a lot of research into other things that perhaps aren’t going to give children, and future adults, a rich understanding of the world around them. So I do think it’s our responsibility to help introduce children to things that they might not ordinarily know. That idea of supporting people’s interests, and helping them uncover what their interests might even be, is crucial. But that is the role of humans and less something that we can program.
Traditionally, that has often been the role of the librarian. And unfortunately, we are too quick to cut that kind of role in a school, and much too eager to introduce these kinds of algorithmic suggestions to students, which I don’t think are going to uncover new and exciting inquiry or foster open thinking among kids.
[00:12:10] Jon M: So a lot of this, it sounds like, goes to some of the fundamental questions about what the purpose of education is, or how people conceive of education, whether it’s to simply pass on the basics of received knowledge or whether, in a Deweyan sense, it’s to encourage exploration, encourage relationships, encourage thinking. And it’s interesting that some of this seems to come wrapped up in the promises that it’s going to be expanding education whereas it sounds as though it may be more suited to simply these are the facts that you’re supposed to know, and I’m going to help you — or not I, because it’s the computer — but the computer will help you learn these facts.
[00:13:06] Audrey W: Yeah. There’s a historian of education, Ellen Condliffe Lagemann, who often says, to help her students understand the history of education in the United States, that Dewey lost and Thorndike won. John Dewey’s vision, what we’ve been talking about, project-based learning, inquiry-based learning, this idea that as a student one is thinking about one’s future role as a citizen, that’s not the model that won out. The model that won out is the model of Thorndike, whom I talk a little bit about in my book. Thorndike was a very early educational psychologist, a professor at Columbia University, and one of the earliest proponents of studying education as a science, quantifying whatever can be quantified. So of course his legacy is all wrapped up in standardized testing. Like many researchers of his time, he did a lot of work on animals and inferred from animal behavior how humans learned. He’s responsible for the phrase “the learning curve,” for example: how quickly a mouse figures out how to get out of the maze. You can graph that, and that curve is the learning curve. Thorndike had this idea of a measurable standard, using standardized tests: education is a science, and if we just have more data and we test better, we can dial in the teaching. That’s Thorndike’s vision, and Ellen Condliffe Lagemann talks about that.
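[Editor’s note: For readers using this transcript in a class, learning curves of the kind described here are often modeled with a simple exponential decay. The particular equation below is a common textbook form, not something specified in the conversation:

$$T(n) = T_{\infty} + (T_{0} - T_{\infty})\, e^{-kn}$$

where $T(n)$ is the time to escape on trial $n$, $T_{0}$ is the time on the first trial, $T_{\infty}$ is the best achievable time, and $k$ is the learning rate. Plotting $T(n)$ against $n$ gives the falling curve that Watters describes Thorndike graphing.]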
And I like to add a little corollary to that, which is that to understand the history of education technology in the US, you have to understand that Skinner won and that Seymour Papert lost, right. Skinner maps quite neatly onto Thorndike: this idea that not only is this stuff quantifiable, but we can engineer it with machines, dial in the instruction. Papert’s vision is much more about the learner, much more closely associated with Dewey. What we see in ed tech is much more about what Papert called the machine programming the child, and much less about the open-ended inquiry that children might have, when it comes to technology or any other way of knowing the world.
[00:15:35] Jon M: Skinner’s obviously a super important figure in [inaudible] some of this. Can you talk a little bit more about him for people who aren’t as familiar?
[00:15:46] Audrey W: Okay. Yeah. I feel cursed. I feel like the kid in the movie The Sixth Sense who says, “I see dead people.” Since studying B.F. Skinner, I see behaviorism everywhere. Skinner was arguably one of the most well-known public scientists of the 20th century. He was a professor at Harvard, he was a professor elsewhere first, but he was a professor at Harvard. He was on television, on the covers of magazines. People knew his name. He was a well-known public intellectual. I joke sometimes that if there had been TED talks during his time, he would definitely have given some, probably about education technology, but possibly about how to train your dog, one or the other.
Perhaps the [inaudible] for training pigeons. He did a lot of work with pigeons to help develop his idea of operant conditioning, which is fundamental to behaviorism: positive reinforcement is how you shape behaviors. And for Skinner, behavior equals learning. That’s how we know if someone’s learning, through behavior. I don’t mean just physical behavior, how one acts. Skinner thought you couldn’t see the mind; you could only see behavior. So writing and speaking were also behaviors for Skinner.
He developed a teaching machine. He didn’t like how his daughter’s class worked. He thought that kids don’t get immediate positive reinforcement in the classroom, and that the way to get immediate positive reinforcement is through machines. So he went home and built a little machine that would do that: give students a problem, and when they answered it, they would get immediate feedback. And the way he designed it, it was immediate positive feedback, because they would get the question right. That was the basis for how he developed his notion of programmed instruction, and it’s foundational for how education technology has been built since. He wasn’t working with computers, but computer-based education looks an awful lot like Skinner’s vision.
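[Editor’s note: For readers who want to see how directly today’s drill software echoes this, here is a minimal sketch of Skinner-style programmed instruction. The frames and the pass/repeat logic are illustrative assumptions, not a reconstruction of Skinner’s actual device: present a small “frame,” require an active response, give immediate feedback, and advance in small steps.]

```python
# A minimal, hypothetical sketch of programmed instruction:
# small steps ("frames"), an active response from the learner,
# and immediate feedback after every single answer.

FRAMES = [
    ("2 + 2 = ?", "4"),
    ("2 + 3 = ?", "5"),
    ("3 + 3 = ?", "6"),
]

def run_program() -> None:
    for prompt, answer in FRAMES:
        while True:
            response = input(prompt + " ").strip()
            if response == answer:
                print("Correct!")  # immediate positive reinforcement
                break
            # No punishment, just another attempt: the frames are
            # meant to be small enough that errors are rare.
            print("Try again.")

if __name__ == "__main__":
    run_program()
```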
[00:17:58] Amy H-L: All this technology must have some economic impact.
[00:18:04] Audrey W: Schools spend a tremendous amount of money on technology. You can always debate whether it’s worth it, but schools have been seen as a potential market for a very long time. Businesses have seen schools as a market for their devices. It was something that Skinner himself struggled with. Many of the educational psychologists in the early and mid 20th century who were interested in developing these machines and selling them to schools struggled, for a variety of reasons, to get the product into the classroom. And in some ways, education technology entrepreneurs have faced that for a very long time. Do you sell to individual teachers? Do you sell to school districts? Do you sell to affluent parents who can afford a supplemental device? Entrepreneurs and businesses have taken different pathways to get their devices into the classroom. But throughout the 20th century, schools have invested in a variety of technologies in order to make education, again, this very American idea, better, cheaper, faster, more efficient. That’s always the promise of the machine: cheaper, faster, better, more efficient.
And now, of course, there are things like Chromebooks. Even before the pandemic, districts were buying them because they seemed like the path to the future. They were glitzy. Nobody would say, oh, I don’t want a computer. So at this point there’s a huge market in the computers themselves. As I alluded to with Thorndike, the history of education technology and educational testing are tightly intertwined. One of the reasons why, pre-pandemic, we saw a real explosion in the adoption of computers was, of course, because standardized testing was starting to move to the computer, mandated in some ways by things the Obama administration did with Race to the Top, for example, “encouraging,” with air quotes around it, states with money to move standardized testing onto devices, onto computers. So again, testing and tech are tightly intertwined. And then you can think about what that means for what the tech ends up looking like. If the technology that students are given is technology designed to facilitate standardized tests, that circumscribes the kinds of devices, and the kinds of things students get to do with those devices when they do end up getting their hands on them.
[00:21:04] Jon M: In a 2020 talk that you shared on your blog, you said that the technology shapes education and education shapes technology, and the two are shaped by ideologies, particularly capitalism and white supremacy. Could you unpack that for us?
[00:21:19] Audrey W: Yeah. Sure. Technology is developed by companies with ideas in mind that are about profit. I will say I am not a fan of B.F. Skinner. I’m not a fan of behaviorism, despite finding operant conditioning to be a great way to train my dog. I don’t think it’s a great way to train children. I don’t think we can see children and pigeons as interchangeable entities and apply his practices to both. That said, I didn’t want to make Skinner the villain of my book. I wanted to paint a richer picture of his interests and his struggles in getting his ideas to market. One of the things that you can see in my book, and that Skinner himself struggled with, is that here’s a guy who thought he had the best ideas. He was working on the cutting edge of educational psychology. He knew the science, the science of how students, or pigeons, learn. And he was going to build a machine that embodied the best possible science. But that came into conflict with the financial interests of the companies he tried to work with. They didn’t give a damn about the science. They wanted a machine that would be cheap to produce, that they could sell in bulk to schools, and that they could turn a profit on. That’s how companies work.
And we still see that today. What are the driving interests that dictate the kinds of machines, the kinds of software, that end up in the classroom? It’s not that the best science wins. It’s not the ideas that are fully fleshed out and peer reviewed by top cognitive scientists and educational researchers. Those aren’t the technologies that get adopted. They’re the technologies that school districts can afford. For example, you mentioned the Chromebooks. There’s a reason the Chromebook is so appealing to many school districts, and there’s a reason the Google suite of educational tools is appealing to so many districts. The Chromebooks aren’t free, but you can either pay for email and a suite of productivity software, or you can use this company that gives it away for free. So the best tools in terms of the science don’t win out. And that’s something that education technology reflects.
The white supremacy part is intertwined, frankly, with our schools as well. These are institutions that have a long history of racism: the teaching practices, the ways in which students are seen and disciplined and taught or not taught, the resources that go into schools. This is also a reflection of white supremacy. And I mentioned earlier the kinds of things that you can do with technology.
The things that students get to do with technology can be very different, based on the socioeconomic status of the student, the kind of student, the kind of school. And race plays a role as well. You’re much more likely to see education technology that’s putting students through their paces in schools that serve lower-socioeconomic-status students and students of color. Privileged students, white students, affluent students tend not to have a lot of drill-and-kill software. They tend to have opportunities to do more of the kind of hands-on, exploratory, project-based learning. They tend not to be so [inaudible]. The pedagogy and the technology tend not to be driven by standardized testing.
And increasingly, because technology is so intertwined with data collection and data analysis, we have to think about the ways in which surveillance and school discipline also play out in education technology. One of the popular tools over the pandemic, unfortunately, has been the online test proctoring software that many school districts and colleges have decided to implement. And this software is rife with all sorts of problems for students of color. It uses facial recognition technology, for example, that tends not to recognize darker skin. And so students of color are often flagged as cheating by the software, because there’s not enough light on their face, because the software struggles to see and surveil their faces. So there are lots of ways in which education technology reflects some of the deep problems that our institutions had pre-tech as well. That was a very long-winded answer. It’s very difficult to untangle these strands.
[00:26:38] Amy H-L: So do you see any constructive use of personalized learning software?
[00:26:43] Audrey W: No, I don’t think so. There are potentially lots of interesting uses of digital technology, but anything that rests on this idea of algorithms is something I’m super hesitant about, right. In some ways, algorithms are the antithesis of public education. One of the things that makes public education valuable is that it is accessible and, in some ways, not entirely, but in some ways, quite transparent. We know, for example, what’s taught in schools because there are public hearings, public debates, public school board meetings in which this kind of stuff is hashed out.
Of course, the classroom door gets closed and all that. This is one of the interesting things about pandemic-based learning: a lot of parents have seen into the classroom in ways that perhaps they hadn’t before. But by and large, the public part of public education means that it’s responsible to and transparent to the public, not just to parents, not just to students, but to all of us. Algorithms aren’t. Algorithms tend to be black boxes. We don’t see what goes into them. And so I’m hesitant about a future in which we don’t actually know where these recommendations are coming from. You can see this play out not just in the kinds of content recommendations that algorithms might make for students, the penguin question, but in algorithms that recommend careers and colleges to students. This is another example. We don’t know the biases that are embedded in the software that might recommend to a Latina student, for example, that she go to community college. And we don’t know why she is recommended community college and not an Ivy. Is it her grades? Is it her zip code? Is it the language spoken at home? That stuff is not transparent for school districts to evaluate. They buy software that makes recommendations, right. The algorithm isn’t something that school districts can evaluate. We don’t know why it would recommend certain things. And so I’m wary of those, because the biases are absolutely baked into the software.
Humans have biases as well, right. A guidance counselor might also make pretty crappy, potentially racist recommendations. But we have a different kind of view into that than we would into an algorithm, particularly with folks who would trust it and say, well, the algorithm said so, and then we don’t get to question why someone is placed in a particular class or why someone is recommended to do something. And this is already playing out, certainly with software that recommends which colleges students apply to, but also at the college level, with software recommending certain majors for students, recommending certain classes for students. And it’s not entirely clear why that recommendation occurred.
[00:30:01] Jon M: This is a different question from what you’re talking about with the algorithms making recommendations for colleges and so on, but can any of this software be a useful supplement in constructivist or project-based learning?
[00:30:17] Audrey W: This is an interesting question. I’m enamored with this penguin idea, right. So what happens if our little penguin lover wants to pursue something but struggles with… actually, I don’t know enough about penguins to continue this analogy. Let’s just say she or he struggles with algebra. How do we remediate the algebra in such a way that we aren’t actually impeding their continued projects around studying penguins? I do think that’s possible. I’m just not sure that’s what we see happening. And for me, and this is my own philosophical and political bent, I’m always interested in asking why it is that we ought to use the machine and not the human, right. Is it a lack of resources? Why aren’t we investing more resources in humans? If we are at a state where so many students need remediation, what’s happened along the way? Before we jump in and say, well, we can use software to catch everyone up to speed, we also need to diagnose what’s happening along the way.
And perhaps there are different human resources that we need to invest in as well. I mean that in terms of reinvesting in the student, but also investing in students feeling as though they have humans who care about and support them and are actually interested in helping them move forward. It’s very common to have students who struggle be put on computers. It’s very common for remediation, and it tends to be those credit recovery classes that aren’t actually great in terms of teaching and learning. It’s further drudgery: students who are struggling end up spending more time doing digital worksheets and even less time doing the kinds of things that might interest them, partially because they’re spending more time with a computer that’s just running them through the paces rather than engaging with somebody, a tutor or a mentor, who can actually help them and meet them where they’re at, in terms of both the practical skills they should be learning and the other things they’re interested in.
[00:32:35] Jon M: So this goes beyond the scope of your book, although it’s obviously implied by it, but why do you think that we as a country so consistently pick these paths? Why does Dewey lose out to Thorndike and Papert lose out to Skinner?
[00:32:54] Audrey W: That’s such a good question. So much of this is bound up in, as I said at the outset, a very American ideology, right. This idea of individualism, for example. We’re seeing that so painfully right now and throughout the pandemic, not just in terms of schools, but in lots of areas of life, where people aren’t able to think about the community, think about the society, and are much more interested in only thinking about themselves. We’re very individualistic.
Americans also love our gadgets. And we believe, and this is a positive thing, we do believe that technology is going to be the answer. So we’re very quick to try to techno-fix something rather than look at, say, the structural issues that might actually be harder, but more meaningful, to rectify. If Americans could push a button and fix problems, we would sign up for that in a heartbeat. We love the idea that a gadget is going to fix things, and computer technology is able to sell itself as that gadget, not just in terms of school, but in terms of anything. We would love to be able to buy something that makes loud noises and whirls about and is colorful and fixes things, rather than do the deep soul-searching and make the structural shifts that have to occur to actually make the world better.
[00:34:39] Amy H-L: You’re so right. I also think that we talk a lot about relationships, and I suppose there are some relationships that people can have with gadgets, as we’re starting to see, but SEL, and not just in a remedial sense, is so important. And I don’t think that we should be downplaying the significance of relationships.
[00:35:03] Audrey W: Yeah. That’s one of the things that is so fundamental about education, too: in some ways it is a practice of care. At its best, it’s a practice of caring for individual students, caring for the class, but also caring for the future, caring for society at large. Machines don’t care. We can program things that make it look like they care. Your phone can give you a little check-in, a push notification every morning asking how you’re feeling today, but the phone doesn’t care. Caring is something that’s important, and it’s something we should be very careful about as we increasingly adopt these ways of engineering education that distance us from, and actually diminish, the important role of care.

Surveillance and data analysis are the other piece. Many school administrators are being sold this idea: adopt this technology in order to better track, better manage, keep an eye on the students, perhaps from a good place. But surveillance is not care. Surveillance is something different from caring about your students. When you care, you notice when things are awry. It’s not that you get a blip in the database that says, “Johnny’s scores have dipped by 3%.” That’s a data analytics indicator. Rather, you can tell by the way students carry their bodies, by the way they look, by the way they engage with you and with one another. That’s an act of care, and machines won’t ever, ever do that, no matter what the science fiction says.
[00:36:51] Amy H-L: Yeah. I don’t think there’ll be too many writers talking about the fourth-grade software that encouraged them. So, now that we’re, at least we hope we’re, coming out of the pandemic, what are your thoughts on the experience of remote learning? Are there lessons to be learned from it?
[00:37:13] Audrey W: This is so interesting, and sad for me on one hand, because it could have been an opportunity for many of us to ask some… well, I should back up. It has been an opportunity for us to raise some pretty important questions about what we expect schools to do. I remember at the very beginning of the pandemic, you would see exasperated parents on Twitter saying, teachers should earn a million dollars a year, because I have two kids and how the hell does a teacher do this with 30? So there was a brief moment when people were like, wow, teaching is hard and teachers are amazing. We also saw the ways in which we have shifted onto schools so much more than just instruction, even though the funding and the regulatory mechanisms don’t account for this. We know now that schools and teachers are responsible for a lot more than just making sure that the curriculum is imparted to a classroom, right. They’re responsible for lunch. They’re responsible for social development. They’re responsible for physical care, keeping an eye out for abuse. The school is a community center. The school is a place where students can get medical care. Schools are responsible for a lot more than just instruction. I hope we remember that, and decide to fund schools, and think about what other institutions we need to fund in order to bolster community health.
But in terms of remote learning, it’s interesting. As we’re recording this, this is the week everyone’s heading back to school from the winter break, and many schools are facing another round of shutdowns. And parents, rightly so, are exasperated and reluctant to go back to being online. We’re going to have to develop more robust solutions than just expecting students from kindergarten onwards to spend all day on Zoom and thinking that that cuts it. Students have probably not learned a lot in terms of what we expect students to learn. They’ve learned a lot of other things, good or bad. But I don’t think the future is online learning. Technologists have been saying for a very long time, isn’t it going to be great when we can all just go to school online in our pajamas and spend all day on the internet rather than having to go to school? And the consensus from, I’ll say, 97% of students is, oh God, no, it’s the worst. So I hope that maybe we can dial back some of the unexamined techno-optimism and have more practical, realistic supports for the kinds of things that we expect schools to do.
[00:40:23] Jon M: Thank you so much, Audrey.
[00:40:26] Audrey W: Thank you.
[00:40:27] Jon M: Thank you, listeners. If you enjoyed this podcast, please share it with friends and colleagues. Subscribe wherever you get your podcasts and give us a rating or review. This helps other people to find the show. Check out our website, ethicalschools.org, for more episodes and articles, and to subscribe to our monthly emails. We post annotated transcripts of our interviews to make them easy to use in workshops or classes.
We work with consultants to offer customized SEL programs, with a focus on ethics, for schools and youth programs in the New York City and San Francisco Bay areas. Contact us at hosts@ethicalschools.org. We’re on Facebook, Instagram, and Twitter @ethicalschools. Our editor and social media manager is Amanda Denti. Until next week.