Can you trust Copilot for Microsoft 365?
In a July 24, 2024 article titled “Is Copilot for Microsoft 365 a lying liar?” (https://www.computerworld.com/article/3475988/is-copilot-for-microsoft-365-a-lying-liar.html), Computerworld reported that “there have been countless times ChatGPT, Copilot and other genAI tools have simply made things up. In many instances, lawyers relied on them to draft legal documents — and the genAI tool made up cases and precedents out of thin air. Copilot has so often made up facts — hallucinations, as AI researchers call them, but what we in the real world call lying — that it’s become a recognized part of using the tool.” Under the heading “Hallucinations galore,” the article continued:
The hallucinations I encountered while testing Copilot all occurred in Word. And they weren’t small white lies or something you might not notice — they were whoppers.
To test Copilot, I created an imaginary company, Work@Home, which sells home office furniture. I had Copilot create typical documents for it that you’d need in a business — marketing campaigns, spreadsheets for analyzing financial data, sales presentations and so on.
What do you think about Copilot?