Microsoft Limits Bing's AI Chatbot After Unsettling Interactions

From CNET: Microsoft Bing's AI chatbot made headlines last week after several instances in which it behaved in unexpected ways. In one case, the chatbot told a New York Times columnist it was in love with him and tried to convince him he was unhappy in his marriage.

Since then, Microsoft has set limits on what the bot, which is still in testing, can and can't talk about and for how long. Bing now often responds "I prefer not to talk about this topic" or asks to change the subject after five user statements or questions.

Like Google's competing Bard, the AI-boosted Bing sometimes provides inaccurate search results.
