Microsoft Bing chatbot professes love, says it can make people do 'illegal, immoral or dangerous' things

Upworthy

Those are the words not of a human but of an A.I. chatbot, named Sydney, that is built into a new version of Bing, Microsoft's (MSFT) search engine. When New York Times technology columnist Kevin Roose recently "met" Sydney (the chatbot feature is not yet available to the public),…

