Students and teachers in New York City schools can no longer access ChatGPT, OpenAI's text-generating language model, following concerns that it could "signal the end of high school English."
As reported by Chalkbeat, Jenna Lyle, a spokeswoman for the New York City Department of Education, said that "negative effects on student learning and concerns about content accuracy and safety" led to the ban.
Simply put, the local education authority is concerned that students will use the AI-powered ChatGPT to write their graded work for them, making it less likely that they will engage with the material, and harder for those marking that work to tell it apart from work written entirely by a human.
ChatGPT and the “threat” to education
Teachers are also concerned about the risk of ChatGPT providing incorrect information to students, though this may be less problematic than the possibility of the AI serving up offensive and racist content.
On this basis alone, it makes sense to filter ChatGPT's content. However, the argument that ChatGPT will single-handedly destroy the humanities in high school, as a teacher suggested in The Atlantic in December 2022, may be somewhat hyperbolic.
It's true that ChatGPT makes a mockery of the way the humanities are currently taught, making short work of the rigid, formulaic ways students are instructed to write at both the high school and undergraduate levels.
However, to say that "high school English", or the humanities in general, are under such threat is to assume that there is only one way to teach those subjects, and to overestimate their current value to students.
If high school students are so unenthusiastic about the subject in front of them that they are driven to let an AI do the work rather than engage with it, this should set off alarm bells for educators, not because their system is falling apart, but because the system was never fit for purpose in the first place.
Robert Pondiscio, a senior fellow at the American Enterprise Institute (AEI), argued in an op-ed in December 2022 that the purpose of secondary education is to achieve "language proficiency", rather than simply to engage with knowledge. He claims that AI's threat to education is exaggerated, because AI produces work that students cannot fully comprehend, "let alone consider their own work."
In short, he believes that work produced by AI is not fit for purpose in an educational setting.
In the same month, another English teacher, Peter Greene, made a similar argument in Forbes, suggesting that teachers run their assignment prompts through ChatGPT, and if it reliably produces a good response, the assignment should be "revised, reformulated, or simply scrapped".
And it is a stubborn unwillingness to adapt how students are taught, rather than fear about AI, that risks taking hold in higher education as well.
If undergraduates would rather use AI than engage with ideas, perhaps higher education institutions should consider whether they offer anything of value, when their structures for engaging with knowledge and ideas can be sidestepped by a readily available language model.
The fact that students are willing to do this, whatever their conviction in the process, should show that higher education has become a stagnant degree mill, with students there simply to collect the piece of paper at the end. It already shows this, but no one at the top wants to listen.