Sunday, December 22, 2024
Technology

Mistral AI releases new model to rival GPT-4 and its own chat assistant

Paris-based AI startup Mistral AI is gradually building an alternative to OpenAI and Anthropic, as its latest announcement shows. The company is launching a new flagship large language model called Mistral Large. When it comes to reasoning capabilities, it is designed to rival other top-tier models, such as GPT-4 and Claude 2.

In addition to Mistral Large, the startup is also launching its own alternative to ChatGPT with a new service called Le Chat. This chat assistant is currently available in beta.

If you’re not familiar with Mistral AI, the company is better known for its capitalization table, as it raised an obscene amount of money in very little time to develop foundational AI models. The company was officially incorporated in May 2023. Just a few weeks after that, Mistral AI raised a $113 million seed round. In December, the company closed a $415 million funding round, with Andreessen Horowitz (a16z) leading the round.

Founded by alums from Google’s DeepMind and Meta, Mistral AI originally positioned itself as an AI company with an open source focus. While Mistral AI’s first model was released under an open source license with access to model weights, that’s not the case for its larger models.

Mistral AI’s business model increasingly resembles OpenAI’s, as the company offers Mistral Large through a paid API with usage-based pricing. It currently costs $8 per million input tokens and $24 per million output tokens to query Mistral Large. In AI jargon, tokens represent small chunks of words — for example, the word “TechCrunch” would be split into two tokens, “Tech” and “Crunch,” when processed by an AI model.
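For developers, querying the model is an ordinary chat-completions call over that paid API. The sketch below uses Python and the requests library; the endpoint URL, the "mistral-large-latest" model identifier and the payload shape reflect Mistral's public API documentation at the time of writing, so treat them as assumptions and check the current docs before relying on them.

import os

import requests

# Minimal sketch of a Mistral Large query over the paid API.
# Assumptions: the chat-completions endpoint and the
# "mistral-large-latest" model name; verify against Mistral's docs.
API_URL = "https://api.mistral.ai/v1/chat/completions"
API_KEY = os.environ["MISTRAL_API_KEY"]  # your own API key

payload = {
    "model": "mistral-large-latest",
    "messages": [
        {"role": "user", "content": "Summarize this article in two sentences."},
    ],
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
data = response.json()

print(data["choices"][0]["message"]["content"])
print(data["usage"])  # billed input/output token counts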

By default, Mistral Large supports a context window of 32k tokens (generally more than 20,000 words in English). The model handles English, French, Spanish, German and Italian.

As a comparison, GPT-4 Turbo, which has a 128k-token context window, currently costs $10 per million input tokens and $30 per million output tokens. That makes Mistral Large about 20% cheaper than GPT-4 Turbo (put another way, GPT-4 Turbo costs 1.25x as much). Things are changing at a rapid pace, and AI companies update their pricing regularly.
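To make the per-token pricing concrete, here is a quick back-of-the-envelope comparison in Python using the prices quoted above; the example token counts are purely illustrative, not figures from either company.

# Back-of-the-envelope cost comparison using the per-million-token
# prices (USD) quoted in this article; token counts are illustrative.
PRICES = {
    "mistral-large": {"input": 8.00, "output": 24.00},
    "gpt-4-turbo": {"input": 10.00, "output": 30.00},
}

def query_cost(model, input_tokens, output_tokens):
    """Cost in USD of one query at the quoted per-million-token rates."""
    p = PRICES[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

# Example: a 2,000-token prompt producing a 500-token answer.
for model in PRICES:
    print(f"{model}: ${query_cost(model, 2_000, 500):.4f}")
# mistral-large: $0.0280
# gpt-4-turbo: $0.0350  (Mistral Large comes out about 20% cheaper)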

But how does Mistral Large stack up against GPT-4 and Claude 2? As always, it’s very hard to tell. Mistral AI claims that it ranks second after GPT-4 based on several benchmarks. But there could be some benchmark cherry-picking and disparities in real-life usage. We’ll have to dig more to see how it performs in our tests.


An alternative to ChatGPT

Mistral AI is also launching a chat assistant today called Le Chat. Anyone can sign up and try it out on chat.mistral.ai. The company says that it is a beta release for now and that there could be “quirks.”

Access to the service is free (for now) and users can choose between three different models — Mistral Small, Mistral Large and a prototype model called Mistral Next, which has been designed to be brief and concise. It’s also worth noting that Le Chat can’t access the web when you use it.

The company also plans to launch a paid version of Le Chat for enterprise clients. In addition to central billing, enterprise clients will be able to define moderation mechanisms.

A partnership with Microsoft

Finally, Mistral AI is also using today’s news drop to announce a partnership with Microsoft. In addition to Mistral’s own API platform, Microsoft is going to provide Mistral models to its Azure customers.

It’s another model in Azure’s model catalog, which doesn’t seem that big of a deal. And yet, it also means that Mistral AI and Microsoft are now holding talks for collaboration opportunities and potentially more. The first benefit of that partnership is that Mistral AI will likely attract more customers with this new distribution channel.

As for Microsoft, the company is the main investor in OpenAI’s capped-profit subsidiary. But it has also welcomed other AI models on its cloud computing platform. For instance, Microsoft and Meta have partnered to offer Llama large language models on Azure.

This open partnership strategy is a nice way for Microsoft to keep Azure customers in its product ecosystem. It might also help when it comes to antitrust scrutiny.

Correction: A previous version of this article compared Mistral Large’s pricing with an older version of OpenAI’s GPT API. Mistral Large is about 20% cheaper than GPT-4 Turbo, the most recent version of the GPT API.

