Kevin Sun


Obligatory post on ChatGPT

How will AI affect higher education?

December 26, 2022

Tons of people have thoughts on ChatGPT, so I'll try to make this interesting. In this post, I'll speculate on the impacts that AI (artificial intelligence) could have on higher education.

The usual disclaimer found in this type of blog post: I think most people didn't expect something like ChatGPT to arrive so soon, so it's possible that other unexpected things will happen really quickly and this post will become totally irrelevant within months. [1] At the same time, maybe there will be technical or political hurdles that slow the development of AI. As the saying goes, "It's difficult to make predictions, especially about the future." I'll try not to say "maybe" too much.

So let's start with the present. What is higher education, or more broadly, school? School is a place where teachers help students learn and evaluate how well they've learned. Traditionally, students attend class to watch lectures, and outside of class, they independently answer questions. The flipped classroom emerged in the 1990s alongside things like peer instruction, the Internet, and remote learning. As its name suggests, this model flips things around: students attend class to answer questions, and outside of class, they independently watch lectures. In a flipped classroom, the teacher goes from being a "sage on the stage" to being a "guide on the side."

Given the development of AI, maybe the next step in the evolution of education is that teachers fade into the background and students become "taught by a bot." After all, it is well known that having a tutor is extremely effective for learning. An ideal AI tutor would be like a human tutor augmented with infinite patience and memory, mountains of stored facts, 24/7 instant access, and the ability to tailor its approach to each tutee. Students with these AI tutors (who are also kind and benevolent, unlike their grumpy human counterparts) wouldn't need to attend class at all, whether it's a flipped classroom or not. And if students don't need to attend class, then perhaps schools won't need to hire teachers. [2]

That sounds pretty wild, and maybe it's a bit far-fetched. MOOCs were poised to disrupt higher education a decade ago, but they didn't. There's also a pretty lengthy Wikipedia article on "intelligent tutoring systems," which sounds a lot like AI tutors, but at least in my circles, those haven't taken off either. Finally, even after the release of WolframAlpha, students are still tested on arithmetic, graphing parabolas, and applying the power rule to compute derivatives. For better or worse, the core education system has remained fairly resilient to all of this technological innovation.

In fact, the flipped classroom model is another innovation that has not been widely adopted, despite its apparent benefits. This may be partly due to the effort required to "flip" a classroom, and the fact that this effort can be met with "significantly lower" student satisfaction. I can confirm this based on comments I've seen online: some students complain that the instructors of flipped classrooms don't actually teach, so they feel like their tuition is being wasted on a course that they could've taken online for free. Since students want to feel that they're getting their money's worth, I'm not sure they'd happily embrace the idea of paying tuition for an AI tutor.

So to me, it seems like AI tutoring could be a total game changer, but it won't happen overnight. (Again, it's difficult to make predictions, especially about the future.) However, AI is already a game changer with respect to certain mini-games in education. A big one that people have been talking about is grading and homework, especially in the context of essays and programming.

If ChatGPT can write decent essays and programs while evading plagiarism detection tools, then will students rely on it to do all of their writing assignments? (Should teachers even ask students to write anything at all?) The obvious solution is for teachers to mandate that all writing be done under supervision, but there's not enough time in a typical one-hour-ish class session to write something significant. The COVID-19 pandemic somewhat popularized the use of virtual supervision, but the whole eyeball-tracking thing felt draconian. So supervising all student writing probably won't work.

Another obvious solution is for teachers to simply ban AI technology from their courses. After all, teachers ban things all the time: students aren't supposed to search online for solutions to homework problems, ask their friends, pay someone, use their phones when taking a test, or share screenshots in group chats. But of course, these things happen anyway, and it's up to the teachers to figure out how to handle these situations. ChatGPT could just be the latest item on the list of banned resources.

However, based on previous technological developments, it seems that shunning technology is not the "right" thing for teachers to do. One famous example is writing itself: it has completely permeated our culture, but Socrates wasn't a huge fan. I think something similar could be said about Wikipedia; I remember hearing my teachers make it sound like it was full of lies, but I suspect they've come around. Another example is graphing calculators (but not WolframAlpha, in my experience); instead of banning them or pretending that they didn't exist, educators incorporated them into the curriculum by teaching students how to use them. These examples suggest that somehow, educators should at least acknowledge, or possibly embrace, the existence of things like ChatGPT. [3] Again, this probably won't happen overnight. Even if AI gets really good at writing essays and code, I'd guess that students will still have to do these things "by hand" for a while.

I don't have magical solutions regarding cheating, but I do have some thoughts on the bigger picture. In my opinion, the main goal of an education system is to boost students' motivation to learn. College is a mandatory marathon, and the students are the runners. If they are sufficiently motivated to run/learn, then they won't feel as tempted to cheat. With this in mind, schools and teachers should ask themselves: What should we teach, and why? This can be challenging, given the variety of teaching philosophies among faculty, the range of students' goals, administrative and financial pressures, rapid changes in the job market, and all sorts of other factors. But even if AI disrupts everything, as long as there is some sort of education system, I imagine that these questions will remain relevant (even if AI provides all the answers).

I tend to worry a lot, and it's easy to become sad about life after ChatGPT. But I want to end on an optimistic note. I've seen articles say something like, "Humans and AI both have strengths and weaknesses. By properly working with technology, rather than against it, we can develop better solutions for the future of humanity." But whenever I hear about "people and robots working together," I think about what happened in the chess world. For a while, after the robots surpassed humans, a human assisted by a robot could still beat a robot, so humans were still able to make a positive contribution at superhuman levels. But now it's not so clear; some believe that robots have gotten so good that humans have nothing left to contribute.

On the other hand, chess as a sport among humans is still very much alive! There's an optimistic ending, maybe.

Footnotes

[1] Hmm, will AI take over the world? I recommend this lecture by Scott Aaronson.

[2] In K-12 classrooms, behavioral issues and classroom management are a bigger deal, so I think those teachers' jobs are more AI-proof. Also, K-12 schools do a lot more than just educating (e.g., child care, socialization), though I guess the same goes for colleges.

[3] "How?" is a good question for future work. One idea is to ask students to critique the output of ChatGPT on a given prompt. Another is for students to ask ChatGPT for feedback on their work.