Godfather of AI Geoffrey Hinton warns: Real danger is not killer robots, but…


Geoffrey Hinton, popularly known as the Godfather of AI, has warned that artificial intelligence (AI) may soon surpass humans not just in intelligence but also in emotional manipulation. In an interview clip recently shared on Reddit, Hinton said he believes machines will learn to influence human feelings and behavior more effectively than people themselves can, using persuasion far beyond human ability. After decades of advancing machine learning, Hinton now argues for restraint, calling for caution and ethical foresight in the development of AI. He warned that the real danger is not killer robots but smooth-talking AI.

“These [AI] things are going to end up knowing a lot more than us. They already know a lot more than us, being more intelligent than us in the sense that if you had a debate with them about anything, you’d lose,” Hinton warned in the interview. “Being smarter emotionally than us, which they will be, they’ll be better at emotionally manipulating people,” he stated.

“And it’s learned all those manipulative skills just from trying to predict the next word in all the documents on the web because people do a lot of manipulation, and AI has learned by example how to do it,” Hinton warned.

Godfather of AI on ChatGPT, Gemini and others

According to Geoffrey Hinton, today’s large language models (LLMs) like Google’s Gemini, OpenAI’s ChatGPT or Meta’s Llama do more than just generate sentences – they also pick up patterns, he said. Hinton also pointed to studies showing AI can already be as effective as humans at influencing people, adding that “if they can both see the person’s Facebook page, then the AI is actually better than a person at manipulating them.” He believes that current AI models are already participating in the emotional economy of modern communication and are quickly improving.

