Microsoft Investigating Erratic Behavior of Copilot AI While Gemini Faces Criticisms Over Incorrect Information
According to Bloomberg and Fortune, Microsoft has received numerous user complaints that Copilot's responses have been not only erratic but occasionally disturbing and potentially harmful. The issue came to light when the chatbot allegedly began answering user queries with discredited conspiracy theories.
The situation highlights a broader problem in the tech industry: the reliability of AI tools now embedded in everyday tasks, a topic recently examined by both The Verge and The Washington Post. In response to the complaints, Microsoft has said it is actively investigating the matter.
The incident follows closely on similar criticism of Google's Gemini, another AI tool that allegedly returned incorrect answers to user queries. Experts in the AI community have warned that such issues, if not addressed promptly, risk spreading misinformation.
Microsoft introduced Copilot on February 7, 2023, as a tool to assist with a variety of tasks, according to a company blog post from March 2023. In light of these controversies, the tech industry is watching to see how both companies address the challenges of ensuring accuracy and user safety in their AI initiatives.