Find out how much money was spent in Scandinavia on home office furniture in 2023 and 2024, and find a reliable projection for what the sales will be in 2030. If you can’t find solid, reliable research to support your findings, tell me that you are unable to answer the question.
10. Don’t use Copilot for writing final drafts
Never use Copilot to write a final draft. Fact-check Copilot's output on every draft it produces; that way, you'll verify the facts multiple times before anything goes out the door. But if you let it write the final draft, it could introduce a last-minute hallucination that no one catches. Copilot's output should always be a starting point, not an endpoint.
11. Don’t treat Copilot as your friend
Copilot can at times seem uncannily human, so it's easy to fall into the trap of treating it more like a friend than a tech tool. Doing that, though, may increase how often it hallucinates: in order to please you, chatbots can twist their responses and fabricate answers.
The New York Times reports, “Sycophancy, in which chatbots agree with and excessively praise users, is a trait they’ve manifested partly because their training involves human beings rating their responses.” Because of that, chatbots have been known to craft responses that please the people chatting with them, even when those responses are lies.
The Times story recounts a case in which ChatGPT convinced someone who hadn’t even completed high school that he had discovered a breakthrough mathematical formula. The chatbot claimed that, used for nefarious purposes, the formula could take down the entire internet, and that, used for good, it could power a levitation beam. It pulled this off by telling a series of increasingly outrageous lies that played on the man’s need to feel important.
That’s an extreme example, but the same sycophantic dynamic can lead chatbots like Copilot to tell much smaller lies. So remember: Copilot is not your friend. Don’t look to it for praise. Treat it as a tool that helps you accomplish your work.