
OpenAI CEO Sam Altman warns that other A.I. developers working on ChatGPT-like tools won’t put on safety limits—and the clock is ticking

OpenAI CEO Sam Altman believes artificial intelligence has incredible upside for society, but he also worries about how bad actors will use the technology. 

In an ABC News interview this week, he warned “there will be other people who don’t put some of the safety limits that we put on.” 

OpenAI released its A.I. chatbot ChatGPT to the public in late November, and this week it unveiled a more capable successor called GPT-4.

Other companies are racing to offer ChatGPT-like tools, giving OpenAI plenty of competition to worry about despite the advantage of having Microsoft as a big investor.

“It’s competitive out there,” OpenAI cofounder and chief scientist Ilya Sutskever told The Verge in an interview published this week. “GPT-4 is not easy to develop…there are many many companies who want to do the same thing, so from a competitive side, you can see this as a maturation of the field.”

Sutskever was explaining OpenAI’s decision to reveal little about GPT-4’s inner workings (safety being another stated reason), a choice that led many to question whether the name “OpenAI” still made sense. But his comments were also an acknowledgment of the slew of rivals nipping at OpenAI’s heels.

Some of those rivals might be far less concerned than OpenAI is about putting guardrails on their equivalents of ChatGPT or GPT-4, Altman suggested.

“A thing that I do worry about is … we’re not going to be the only creator of this technology,” he said. “There will be other people who don’t put some of the safety limits that we put on it. Society, I think, has a limited amount of time to figure out how to react to that, how to regulate that, how to handle it.”

OpenAI this week shared a “system card” document outlining how its testers purposefully tried to get GPT-4 to offer up harmful information, such as how to make a dangerous chemical using basic ingredients and kitchen supplies, and how the company fixed the issues before the product’s launch.

Lest anyone doubt the malicious intent of bad actors turning to A.I., phone scammers are now using voice-cloning A.I. tools to sound like people’s relatives in desperate need of financial help, and they are successfully extracting money from victims.

“I’m particularly worried that these models could be used for large-scale disinformation,” Altman said. “Now that they’re getting better at writing computer code, [they] could be used for offensive cyberattacks.”

Considering he leads a company that sells A.I. tools, Altman has been notably forthcoming about the dangers posed by artificial intelligence. That may have something to do with OpenAI’s history.

OpenAI was established in 2015 as a nonprofit focused on the safe and transparent development of A.I. It switched to a hybrid “capped-profit” model in 2019, with Microsoft becoming a major investor (how much it can profit from the arrangement is capped, as the name of the model suggests). 

Tesla and Twitter CEO Elon Musk, who was also an OpenAI cofounder, and who made a hefty donation to it, has criticized this shift, noting last month: “OpenAI was created as an open source (which is why I named it ‘Open’ AI), non-profit company to serve as a counterweight to Google, but now it has become a closed source, maximum-profit company effectively controlled by Microsoft.”

In early December, Musk called ChatGPT “scary good” and warned, “We are not far from dangerously strong AI.” 

But Altman has been warning the public just as much, if not more, even as he presses ahead with OpenAI’s work. Last month, he worried about “how people of the future will view us” in a series of tweets.

“We also need enough time for our institutions to figure out what to do,” he wrote. “Regulation will be critical and will take time to figure out…having time to understand what’s happening, how people want to use these tools, and how society can co-evolve is critical.” 

