
How to Teach Critical Thinking When AI Does the Thinking


Big Four consulting firm Deloitte just repaid $291,000 to the Australian government after admitting it used ChatGPT to produce a compliance review riddled with errors. The report contained nonexistent references, fabricated citations, and invented court cases. University of Sydney academic Christopher Rudge noted multiple "hallucinations" that appeared unsupported by any actual evidence.

This wasn't a student cheating on a homework assignment with ChatGPT. This was a multi-billion-dollar consultancy whose employees decided to outsource their expertise to an algorithm. The result was exactly what happens when highly paid professionals stop thinking: garbage wrapped in professional formatting.

AI wasn't the failure here. It did what it always does: it completed the user's request. The consultants failed because they didn't know how to think with the tool. They treated it like what Paulo Freire called a "banking education" system: deposit your request, withdraw your answer, never question the transaction.

The Deloitte example should put every educator on notice. If professionals of this caliber are outsourcing cognition to AI, why would we expect students to behave differently?

Unfortunately, we are entering a period in which everyone is using AI and very few people want to admit it. The institutions teaching students are doing the exact same thing.

According to Anthropic's 2025 education report, professors rate AI-assisted grading as the "least effective" educational application. Yet 48.9% of their grading conversations with AI involve full automation, letting the algorithm do the work they're paid to do. They're also using AI to create........

© Psychology Today