Sept. 7, 2023
To the Editor:
Re “How Schools Can Cope and Grow When Their Students Are Using A.I.,” by Kevin Roose (The Shift column, Business, Aug. 29):
Mr. Roose’s suggestion that educators embrace generative A.I. and view it as an “opportunity” or “classroom collaborator,” not as an “enemy,” seems typical of a tech enthusiast.
Of course, he is right that university professors like me will have to adjust our assignments to involve more in-class exams, classroom work and scaffolded projects with multiple check-ins. As a history professor, I also consciously assign books that are not available on the internet to limit the ability of A.I. tools to respond to essay prompts. For A.I. is the enemy.
What I want, most of all, is for students to read books that help them appreciate the complexity of the past, to digest factual information and to think deeply about the subject. Struggling to find the words and structure to express one’s ideas is a catalyst for thought, as any writer knows.
What can make the college experience transformative is the learning that comes from reflection. Shortcuts, whether traditional plagiarism or this new form of plagiarism, contribute to an atmosphere of intellectual disengagement.
Julie Hessler
Eugene, Ore.
To the Editor:
Kevin Roose builds on a flawed premise: All kids are using A.I., so schools should accept that reality.
We attempted this strategy with cellphones, as teachers tried to use them “productively” for classroom polls, web searches and other such activities. It turns out that letting phones in was a disaster we are still trying to contain.
Let’s not make the same mistake. This doesn’t mean we should never let A.I. in, but we should at least start to do so carefully.
Jeremy Glazer
Philadelphia
The writer is a former high school teacher and a professor at the College of Education at Rowan University.
To the Editor:
Reading Kevin Roose’s column inspired a simple thought experiment. What if a research biologist had developed a highly innovative breed of genetically engineered seeds and, instead of carefully testing them in a restricted area, went out and scattered them at random across the entire countryside? Such a reckless researcher would face a firestorm of condemnation.
Yet isn’t that exactly what the developers of generative A.I. products have done to the landscape of education? With little notice and zero safeguards, they’ve released a product that makes mass cheating easy to commit and often difficult to detect.
The effects on our educational ecosystem are potentially devastating. Where is the outrage over such callous disregard for the consequences of their actions?
Conrad Berger
Hyattsville, Md.