
Long before the advent of ChatGPT, Marcus designed his first AI program in high school - software to translate Latin into English - and after years of studying child psychology, he founded Geometric Intelligence, a machine learning company later acquired by Uber.

In March, alarmed that ChatGPT creator OpenAI was releasing its latest and more powerful AI model with Microsoft, Marcus signed an open letter with more than 1,000 people including Elon Musk calling for a global pause in AI development.

But last week he did not sign the more succinct statement by business leaders and specialists - including OpenAI boss Sam Altman - that caused a stir.

Global leaders should be working to reduce "the risk of extinction" from artificial intelligence technology, the signatories insisted. The one-line statement said tackling the risks from AI should be "a global priority alongside other societal-scale risks such as pandemics and nuclear war".

Signatories included those who are building systems with a view to achieving "general" AI, a technology that would hold cognitive abilities on par with those of humans.

"If you really think there's existential risk, why are you working on this at all? That's a pretty fair question to ask," Marcus said. Instead of putting the focus on more far-fetched scenarios where no one survives, society should be putting attention on where real dangers lie, Marcus surmised.

Threat to democracy

In the short term, the psychology expert is worried about democracy.

"People might try to manipulate the markets by using AI to cause all kinds of mayhem and then we might, for example, blame the Russians and say, 'look what they've done to our country' when the Russians actually weren't involved," he continued.

"You (could) have this escalation that winds up in nuclear war or something like that. So I think there are scenarios where it was pretty serious. Extinction? I don't know."

"A more general problem that I am worried about... is that we're building AI systems that we don't have very good control over and I think that poses a lot of risks, (but) maybe not literally existential."
