Thinking of putting your brain in neutral and using AI to write your resume? Think again.
Deloitte Australia recently stuffed up a major government report because its AI tool got things wrong by ‘hallucinating’.
These so-called hallucinations are a by-product of the AI’s pattern-matching process, especially when the model has limited or biased training data and lacks real-world understanding of the problem it’s trying to solve.
This leads to the AI model guessing and delivering results that sound right but aren’t factually correct. Up there for thinking.
AI models resort to guessing when they are short on real information. An AI model would rather use dubious sourcing and incorrect interpretation to deliver a wrong result than nothing at all.
Just as well it’s not landing a 747 at Heathrow.
Deloitte was forced to issue a partial refund to the federal government.
The original report, which cost the Department of Employment and Workplace Relations $440,000, contained a completely fabricated quote from a federal court judgment and invented academic references.
The AI-fabricated references, bogus citations and grammatical errors were outed by one of the academics cited in the original report.
It would be fanciful to think that this is an isolated case of AI going off the rails.
Corporate Australia is wildly excited about this productivity-charging technology.
It’s the new toy that everyone wants to play with before reading the instruction manual.
We should hear the AI investment bubble burst just after Christmas, as the companies that make these dodgy robots have over-promised and under-delivered.
There is a big gulf between what AI can do in theory and what it delivers in practice.
Deloitte failed to get the report checked by humans before it was stamped customer-ready.
The problem is Deloitte markets itself as a firm that can educate its corporate clients on how best to deploy AI. Oh dear.
It makes you wonder just how many worms are in the can being opened by AI.