ChatGPT anyone try it?
- By nickdalzell
- VIP Lounge
- 14 Replies
I think the fear over that had a lot to do with the infamous Microsoft chatbot Tay from a few years ago. The hype about its evil personality and secretly wanting to escape is so dumb: a bunch of clickbait journalists manufacturing the story rather than reporting it. The technology isn't perfect... after all, it's trained on millions of pieces of human-written content, which obviously contains errors. ChatGPT simply uses that body of text as its example and, based on what you tell it, tries to predict what would most likely come next if the text already existed.
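To make the "predict what comes next" part concrete, here's a minimal sketch using the openly available GPT-2 model through the Hugging Face transformers library (just a small stand-in, since ChatGPT's own model isn't something you can download); it prints the words the model thinks are most likely to follow a prompt:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# GPT-2 is a small, public stand-in here; ChatGPT's actual model is far larger
# and not publicly available, but the next-word-prediction idea is the same.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "The cat sat on the"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # a score for every possible next token

# Show the five tokens the model thinks are most likely to come next
probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(probs, 5)
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode(idx.item())!r}: {p.item():.1%}")
```

The model only picks continuations that look plausible given its training text; nothing checks whether they're actually true, which is exactly why it can be confidently wrong.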

Microsoft shuts down AI chatbot after it turned into a Nazi
Microsoft's attempt to engage with millennials went badly awry within 24 hours

Why Microsoft's 'Tay' AI bot went wrong
Less than a day after she joined Twitter, Microsoft's AI bot, Tay.ai, was taken down for becoming a sexist, racist monster. AI experts explain why it went terribly wrong.
