
AI has turned university exams into an "unsolvable problem"

Generative artificial intelligence has disrupted traditional exam approaches and turned universities' knowledge assessment systems into an "unsolvable problem," scientists warn.


A study published in the journal Assessment & Evaluation in Higher Education describes how generative tools like ChatGPT have undermined conventional knowledge assessment methods in universities. According to the authors, exams have become a "wicked problem": a complex challenge without a definitive solution.

Researchers surveyed twenty professors from a major Australian university and found widespread confusion: educators are overwhelmed, see no "right" solution, and acknowledge that any attempt to create AI-free exams generates new complications. Some teachers view AI as a necessary professional tool, while others consider it a form of cheating.

Faculty members describe difficult trade-offs. Attempting to combine assignments with and without permitted AI use proved to be a "nightmare" that doubled workloads. Stricter assignments often test compliance with rules rather than creativity. Oral exams, which are harder to fake with ChatGPT, were deemed impractical for large student groups.

In an article for The Conversation, the authors emphasized that "wicked problems" have no right or wrong solutions; every new approach brings unforeseen consequences. Instead of searching for a "silver bullet," universities should acknowledge the inevitability of compromises and continuously review assessment formats.

In practice, instructors are testing hybrid approaches: handwritten assignments to identify students' "own voice," oral presentations, and personalized tasks. Some use AI for routine work, such as creating lesson plans and tests, to free up time for student interaction. Others prohibit AI in lower-level courses, fearing fake citations and generic content.

Experts outside academia believe this goes beyond cheating concerns. Economist Tyler Cowen notes that AI has exposed weaknesses in a system heavily reliant on homework and easily verifiable tests. LinkedIn co-founder Reid Hoffman predicts students should prepare for a new generation of exams, ranging from oral interviews to tests where AI itself becomes the examiner.

