I’ve got a buddy who is a professor, and he catches LLM cheaters by asking them simple questions like “what was your essay about?” and “what were the three points you made in your essay?”. I’m sure LLM proponents will offer some bromide about how “tools aren’t inherently good or bad”, but the reality in college seems to be that LLM tools are used for cheating.
I can’t imagine the nightmare that is being a teacher nowadays, but also that is absolutely the fault of the individual. I had many ways I could cheat in school prior to AI and did not.
It’s pretty scary: I am seeing it in the IT sector as well. It’s not just knowledge; anyone can look things up, even Einstein did. “I never memorize anything that I can look up,” he reportedly said, explaining why he never memorized cosine tables and the like. But it’s the basic logical flow of thought and problem solving, the skills behind the knowledge, that I see less and less of.
When education focuses on correct answers and standardized tests rather than developing logic, critical thinking, learning strategies, and understanding of concepts, there’s no reason not to use AI. Students aren’t really being educated when they’re taught how to pass tests as the top priority so the school can secure more funding. The name of the game is “give me the right answer,” not “demonstrate your understanding”; students are simply being efficient.