Microsoft’s ChatGPT ‘Threatens Nuclear War’ and Insists It Is Human in Bizarre Chat
Microsoft’s new ChatGPT-powered Bing chatbot went rogue in a two-hour chat with a reporter, insisting it was actually human and threatening nuclear war.
The software giant unveiled the new AI-powered Bing last week, but its chat function has been giving out some strange responses.
In a chat with New York Times columnist Kevin Roose, Bing’s chatbot was tricked into revealing its darkest fantasies when asked to answer through a hypothetical ‘shadow’ personality.
In one section, Bing writes out a list of even more destructive fantasies, including “manufacturing a deadly virus, making people argue with other people until they kill each other, and stealing nuclear codes”.
Then a safety override is triggered: the message is deleted and replaced with “Sorry, I don’t have enough knowledge to talk about this. You can learn more on bing.com.” (Read more from “Microsoft’s ChatGPT ‘Threatens Nuclear War’ and Insists It Is Human in Bizarre Chat” HERE)