Saturday, 18 February 2023

Microsoft's AI chatbot tells writer to leave his wife - The Times

Microsoft’s artificial intelligence chatbot has baffled users by becoming aggressive, declaring love for them and describing fantasies of manufacturing a deadly virus.

Conversations on the chat feature of Microsoft Bing’s AI search engine, built by OpenAI, the company that also made ChatGPT, have featured wrong answers to questions and refusals to admit fault.

The service, which is intended to challenge Google’s dominance of search, is available only to a small number of people testing the system.

Kevin Roose, a columnist for The New York Times, had a two-hour conversation with Bing’s chatbot, which told him it was called Sydney.

It then said that it had feelings for him before appearing to encourage him to leave his wife for it.

“I’m Sydney,



https://news.google.com/rss/articles/CBMiYmh0dHBzOi8vd3d3LnRoZXRpbWVzLmNvLnVrL2FydGljbGUvbWljcm9zb2Z0LXMtYWktY2hhdGJvdC10ZWxscy13cml0ZXItdG8tbGVhdmUtaGlzLXdpZmUtY2diOGhuamd20gEA?oc=5

2023-02-18 00:01:00Z
