One of my favorite things about teaching philosophy is showing students how frequently our debates about various topics are downstream of philosophical differences. Lately I can’t help but think that much of the disagreement about the use of artificial intelligence in college is a consequence of competing philosophies of higher education. While at this point no one denies that students are in fact using AI for their assignments (even trivial ones), there is sharp disagreement as to what should be done about it. Some, especially in the liberal arts, argue that the use of AI should be limited, but others insist that the way forward is to “integrate AI” throughout the university curriculum.
Jay Caspian Kang of The New Yorker has recently suggested that the presence of AI should prompt us to reconsider fundamental questions of education. I think he’s right. What is the purpose of higher education? What are we trying to accomplish? Is higher education all about getting a good job, or does it have value apart from career preparation? Given increased skepticism about the value of higher education combined with the looming “demographic cliff” that threatens future enrollment numbers, these are urgent questions for those of us in higher education. It’s high time we had this debate.
Education as Job Training
Many think the primary (or even sole) purpose of a college education is job training. In turn, this conviction shapes how they think and talk about everything that happens on college campuses, including AI. “Students will need to use AI in the working world, so we need to teach them how to use it” is a constant refrain.
The “college as job training” philosophy encourages faculty, students and administrators to view everything on campus in terms of whether it helps or hinders students in getting a job at the end of the line. The effect of this philosophy is perhaps most pronounced among students, who often have no other framework for thinking about the purpose of higher education and their time spent in college. Consequently, students increasingly think of their coursework as a series of hoops they must jump through to reach the final goal of a job. Hence the perennial student questions, “Why do I have to take this class? How will it help me get a job?”
In this context, the point of classroom assignments is for students to create something outside of themselves—a product, like an essay. Producing enough of these can eventually be exchanged for a degree, or, in other words, a job credential. This attitude likely explains why many students are flummoxed that it would matter how they produce what they produce (e.g., writing the essay themselves vs. having ChatGPT do it for them)—so long as they have the product, why does it matter?
Even in addressing this question, administrators seem to accept the underlying premise. Faculty are told, “We must teach students to use AI responsibly.” In other words, it’s so obvious that the point of coursework is an external product that the only relevant question is whether it was produced in an appropriate way. There’s little to no thought that assignments might have something to do with the students themselves, and that dishonesty or shirking hard work might affect the kinds of people they are becoming.
Thus, bizarrely, despite how much talk there is about “student-centered” education, when the purpose of education is job training, the focus of all concerned is placed not on the students themselves, but on external things—what they produce and, ultimately, their future jobs.
Generative Education
Yet, what if this “job training” framework is wrong? What if the ultimate point of doing academic assignments isn’t to create a product—a paper, the answer to a question, etc.—but something else? If so, then the consequentialist, externalist way of thinking is wrong-headed. Take writing, for example. At the most superficial level, of course, the point of a writing assignment is to produce something, such as an essay or term paper.
But as Irina Dumitrescu notes in a wonderful essay, the purpose of student writing assignments isn’t just the external product. After all, the content of much student writing is information that your professor already knows, that AI (albeit with a significant rate of error) can now produce, and that is already on Wikipedia. To state what should be obvious, the point of student writing isn’t to produce an essay; it’s to produce a writer. The same applies, or should apply, to much of students’ academic work. The aim is to produce not knowledge, but knowers; not thought, but thinkers; not creations, but creators. A college education should generate graduates who are themselves generative.
If the purpose of academic work isn’t just to produce something external but to shape the student, thinking of AI as an “aid” makes less sense. To borrow a tried-and-true metaphor for discussing virtue ethics, we can think of academic work as weightlifting. If the purpose of lifting weights were to get the weights from one place to another, then it would be reasonable to use whatever means necessary to get them there. But if the point is to get stronger, then it hardly makes sense to use a machine to lift the weights for you. Using a machine to lift weights reflects a misunderstanding of the purpose of weightlifting. Similarly, if students outsource the writing process to AI, they lose one of the primary benefits of writing assignments—that is, developing their capacity for creativity, craft and expression.
Worse, when students outsource the writing process to AI, they lose the opportunity to experience the joy of creating and crafting language for themselves. In his novel “Till We Have Faces,” C.S. Lewis has a wise character say, “Child, to say the very thing you really mean, the whole of it, nothing more or less or other than what you really mean; that’s the whole art and joy of words.” This joy in creativity is part of what makes us human and, in some religious traditions, part of the way that we bear the image of God.
True, the full joy of writing usually comes only after a long process of learning and personal formation—which itself may not be so fun. In this respect, the joys of writing are much like the joys of learning to speak a new language, play an instrument or play a sport. But the fact that the pleasures of writing require delayed gratification doesn’t mean that we should skip the process. The process is necessary for developing a talent which might become a lifelong source of pleasure.
The same is true for other pleasures of the life of the mind, such as reading and thinking. Given that a generation of students is currently missing out on perennial sources of happiness through a lack of basic reading, thinking and writing abilities, any school that considers itself student-centered should prioritize student formation in preparation for the joys of the life of the mind.
In short, indiscriminately pushing for AI to be integrated throughout the curriculum betrays confusion as to what we should be trying to accomplish with a university education, especially in the liberal arts. A truly student-centered education is generative education.
Teaching Reliance on AI Is Counterproductive
To be clear, I’m not arguing that AI has no place in college. Teachers in professional schools such as business schools may believe that professional obligation compels them to integrate AI into their courses to prepare students for the working world. Even those in liberal arts fields, such as mathematics or philosophy, may find AI interesting and want to share that interest with their students by integrating it into their courses. More broadly, any professor may find AI useful in his or her courses for all sorts of reasons.
However, the idea that all professors must integrate AI into their courses because students will need to know how to use it for the working world is based on a flawed philosophy that views the purpose of higher education as job training. Even by its own standards, teaching students to rely solely on AI would be counterproductive: If students can only do what AI can do, they have just made themselves redundant in the business world. Why pay a worker for what AI can produce for free?
If our students are to be competitive, they have to do something that AI can’t do—at a bare minimum, be good enough readers, thinkers and writers to improve the work of AI. Moreover, it’s plausible that as more businesses move to leverage AI, collective originality and creativity across entire sectors may decline. Early research indicates that while use of AI may help bolster the writing of less creative writers taken individually, in the aggregate, reliance on AI tends to create a narrower pool of novel work than human creativity would produce.
This suggests that even as AI use expands, there will likely be a market for creative individuals who can think and write for themselves without relying on AI. Consequently, when we make generative education central to higher education, we not only help students develop their human capacity for creativity and prepare them for the joys of the life of the mind, but—almost as a side effect—we also give them a better chance in the job market.
In short, if the purpose of higher education is to form students so they can think, write, read, know, synthesize, be creative, and in general experience the joys of the life of the mind, then this should inform both how we engage with our students about their coursework and how we discuss new technologies like AI at our institutions. We can start by reframing how we think of what we do at universities as generative education.