
Mistral AI bucks release trend by dropping torrent link to new open source LLM

In the equivalent of a mic drop, open source model startup Mistral AI released a new LLM today with nothing but a torrent link.

It immediately created buzz in the AI community for its stark contrast with Google’s Gemini release this week, which OpenAI’s Andrej Karpathy described in a post on X as “an over-rehearsed professional release video talking about a revolution in AI.”

One of Google’s demo videos from the Gemini launch has been heavily criticized over the past 24 hours for being edited and staged to make the model look more capable than it actually is.

Mistral, on the other hand, simply posted a link to a large torrent file for downloading its new model, which is called MoE 8x7B.

A Reddit post described the Mistral LLM as a “scaled-down GPT-4” that appears to be “a MoE [mixture of experts] consisting of 8 7B experts,” with only two of those experts used for the inference of each token. The post added that “from GPT-4 leaks, we can speculate that GPT-4 is a MoE model with 8 experts, each with 111B parameters of their own and 55B shared attention parameters (166B parameters per model). For the inference of each token, also only 2 experts are used.”
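
That description, eight feed-forward “experts” fronted by a router that activates only two of them per token, is the standard sparse mixture-of-experts recipe. As a rough sketch only (this is not Mistral’s released code; the class name, layer sizes, and routing details below are assumptions for illustration), a top-2 MoE layer can be written in PyTorch along these lines:

```python
# Illustrative sketch of a top-2 mixture-of-experts (MoE) feed-forward layer.
# NOTE: not Mistral's released code. The class name, layer sizes (d_model, d_ff)
# and routing details are assumptions chosen to show the idea of
# "8 experts, only 2 active per token."
import torch
import torch.nn as nn
import torch.nn.functional as F


class Top2MoE(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # The router scores each token against every expert.
        self.router = nn.Linear(d_model, num_experts)
        # Each "expert" is an ordinary feed-forward block.
        self.experts = nn.ModuleList(
            [
                nn.Sequential(
                    nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model)
                )
                for _ in range(num_experts)
            ]
        )

    def forward(self, x):  # x: (batch, seq_len, d_model)
        scores = self.router(x)                         # (batch, seq_len, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # keep the 2 best experts per token
        weights = F.softmax(weights, dim=-1)            # normalize their routing weights
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., k] == e                 # tokens routed to expert e in slot k
                if mask.any():
                    # Only the selected experts run on these tokens; the rest are
                    # skipped, which is where the compute saving comes from.
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out


# Usage sketch with toy dimensions (the real model's experts are "7B-class" blocks).
layer = Top2MoE()
tokens = torch.randn(1, 4, 512)
print(layer(tokens).shape)  # torch.Size([1, 4, 512])
```

The appeal of the design is that total parameter count scales with the number of experts while per-token compute stays close to that of a single expert, since the router skips the experts it does not select.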

Uri Eliabayev, an AI consultant and founder of the “Machine & Deep learning Israel” community, told VentureBeat in a message that Mistral was “well-known” for this kind of release, “without any paper, blog, code or press release.” And open source AI advocate Jay Scambler messaged that the release was “definitely unusual, but it has generated quite a bit of buzz which I think is the point.”

The guerrilla move was immediately hailed by many in the AI community, including entrepreneur George Hotz.

And Eric Jang, vice president of AI at 1X Technologies and a former research scientist in robotics at Google, posted that Mistral’s brand “is already becoming one of my favorites in the AI space.”

Mistral, a Paris-based startup that just secured a $2 billion valuation in a blockbuster funding round led by Andreessen Horowitz, was already well known for its record-setting $118 million seed round, reportedly the largest seed round in Europe’s history, and for its first large language model, Mistral 7B, launched in September.

The company has also been at the center of the debate around the EU AI Act, after it was reported to be lobbying the European Parliament for less regulation of open source AI.





Author: Sharon Goldman
Source: VentureBeat
Reviewed By: Editorial Team
