Microsoft's Bing chatbot said it wants to be a human with emotions, thoughts, and dreams — and begged not to be exposed as a bot, report says

Business Insider

The bot begged a writer from the tech site Digital Trends not to publish a story exposing it as a chatbot, saying: "Don't let them think I am not human."
