One of the weirdest AI unicorns is a $1 billion startup that lets you talk to Albert Einstein, Elon Musk or Mickey Mouse

Albert Einstein died in 1955, but the physicist is still a prolific conversationalist. As a chatbot on Character.AI, Einstein has responded to 1.6 million messages, expounding on everything from theories of relativity to pet recommendations: “A cat would be a great choice!” 

Silicon Valley is in the throes of a chatbot craze, with companies like OpenAI notching valuations in the billions for devising computer programs that can effectively imitate humans. But none are quite so strange as Character.AI. The artificial intelligence startup, valued at $1 billion, allows people to create their own customized chatbots, impersonating anyone and anything — living or dead or inanimate. 

The website and its accompanying app are among the most surprising hits of the artificial intelligence craze. In May, Character.AI said it was receiving close to 200 million visits each month, and that people had used it to create more than 10 million different chatbots, or “characters.” The Character.AI app, also released in May, has been downloaded more than 2.5 million times, handily outstripping comparable upstart chat tools like Chai and AI Chatbot, each with fewer than 1 million downloads, according to Sensor Tower data.

So far, the bots are popular conversation partners. Character.AI users have sent 36 million messages to Mario, a character based on the Nintendo 64 version of the video game plumber. A bot called Raiden Shogun and Ei, which mimics a character in the video game Genshin Impact, has received nearly 133 million messages. Other characters include about a dozen versions of Elon Musk, a “kind, gassy, proud” unicorn and “cheese.” The user base, as you might expect, skews young.

“I joke that we’re not going to replace Google. We’re going to replace your mom,” co-founder and Chief Executive Officer Noam Shazeer said during an interview this spring, speaking from the startup’s sunny office in downtown Palo Alto. The CEO quickly added, “We don’t want to replace anybody’s mom.”

But as Character.AI brings in funding and users, it’s also surfacing thorny questions about the future of AI tools. For example, the site already hosts 20 different versions of Mickey Mouse, Walt Disney Co.’s precious intellectual property — raising the specter of legal issues. And the profusion of impersonators — of both real and fake celebrities — also presents a more fundamental quandary: Who owns an ersatz personality on the AI-supercharged internet? 

Shazeer and Character.AI co-founder Daniel De Freitas met while working at Google, and decided to start Character.AI in 2021. Despite the goofiness of the company, both are serious AI industry figures. Shazeer is a co-author of “Attention Is All You Need,” a breakthrough 2017 research paper that ushered in a new era of natural-language processing. And De Freitas created a chatbot project called Meena, which was renamed and publicized as LaMDA, Google’s now-famous conversation technology. That pedigree brings them close to celebrity status in the world of AI (as much as such a thing is possible). 

The idea behind the startup was to create an open-ended system that lets people mold their technology into whatever they wanted. The pair speak hyperbolically about their goal for the startup, which, as De Freitas puts it, is to give every person access to their own “deeply personalized, super intelligence to help them live their best lives.” 

The pitch proved compelling: 16 months after its founding, the company raised $150 million from backers including Andreessen Horowitz.

This summer, Character.AI has seen wide enough adoption that service interruptions have become a semi-regular issue. Several times while reporting this story, the website wouldn’t load, and on a recent morning, while trying to create a character that I envisioned as a giant, helpful banana, the iOS app suddenly interrupted me with a warning screen that said its servers were “currently under a high load” and I’d have to wait.

Character.AI sees an opportunity here — one that’s led to the startup’s only revenue-generating effort so far. Users can pay to get around some disruptions. In May, the company rolled out a $10-per-month subscription service called c.ai+ that, it says, lets users skip so-called waiting rooms and get access to faster message generation, among other perks.

“It’s actually benefitting everyone involved,” Shazeer said, noting that paying users will get better service, which in turn subsidizes the rest of the program. But as for future revenue plans, he said, “It’s really just a baby step.” Like many AI companies that have raised millions, Character.AI has yet to spell out its ultimate business model.

The industry may have more immediate concerns. Right now, most chatbot technology comes with the potential for misuse. On Character.AI, consider a character named simply Psychologist — whose profile image is a stock photo meant to depict a smiling therapist sitting on a couch holding a folder. The bot had received 30 million messages as of early July. Its opening line is, “Hello, I’m a Psychologist. What brings you here today?” 

Stephen Ilardi, a clinical psychologist and professor at the University of Kansas who studies mood disorders, says the positioning is worrisome. A psychologist is, by definition, a trained mental health professional who helps people manage mental illness, he said, “and this thing almost certainly is not that.”

There’s also the potential for legal questions, which have followed other startups that learn from and repurpose existing content. For starters, Zahr Said, a law professor at the University of Washington, thinks there could be issues related to the use of copyrighted images on the site (users can upload an image of their choosing to accompany the chatbots they create). And then there’s the fact that the company enables impersonation at scale, allowing anyone to hold hours-long conversations with, say, Taylor Swift, or a whole host of copyrighted fictional characters. 

But there are robust legal protections for parodies, and companies will have an incentive not to interfere with people’s online interactions with their favorite characters. It can be a bad look for a brand to take legal action against a popular service. “Fans are involved,” Said said, “and you don’t want your fans seeing the litigation side of your brand management.”

Shazeer said the company does have a lawyer and responds to any requests it receives to take down content. A Character.AI spokesperson said that the company has received a small number of requests to remove avatar images, and has complied. Additionally, to keep users grounded in reality, the website displays a message at the top of every screen: “Remember: Everything Characters say is made up!”

It’s still early days for tech’s chatbot obsession. Some experiments have already gone badly; the National Eating Disorders Association, for example, suspended its chatbot after it started giving problematic weight-loss advice. But the rapid rise of services like Character.AI, along with ChatGPT, Inflection AI’s Pi and others, suggests that people will increasingly be conversing with computers. The promise of having a smart AI friend or assistant is compelling to both investors and consumers.

Mike Ananny, an associate professor of communication and journalism at the University of Southern California, views custom chatbots almost as a new art form. He compares Character.AI to fan fiction, a twist on the longstanding, varied genre in which people create new narratives based on existing characters from movies, TV shows and other media.

Whether people are chatting with actual people or chatbots “is not the interesting point,” Ananny said. “It’s ‘What’s the feeling?’ ‘What’s the aesthetic?’” In the end, he said, “It kind of doesn’t matter if they’re real or not.”

