Woah.
Well... way to put France even further on the map in the AI space, folks. This is staggering.
As if having Ministral 3 14b under Apache-2.0 wasn't exciting enough, this massive model arriving under Apache-2.0 as well is a dream come true. The impact of this release is so huge that, even at a glance, I think the entire AI landscape will be irreversibly changed for the better.
It's funny: getting even more from a group can make you a little greedy. In that spirit, I'm dreaming that the 20-40b and 70-140b parameter spaces, two model sizes I think are REALLY critical, get some love in this new era of Mistral. I'm a little worried now that Large has shifted way, WAY bigger: Ministral is already at iteration 3, Mistral Small 3 is already out (and now apparently not worth running thanks to your new Ministral), and Mistral Medium has traditionally been closed, so I'm a little doubtful, but I figured I'd say something anyway!

It feels selfish to even bring it up, but Small 24b and Large 123b were so instrumental for edge computing, and while the 700b is probably better for the whole world in the long run, Mistral has always been a staple in that middle parameter space, which I believe is direly needed for AI harm reduction. There are a lot of folks out there doing great work on small LMs, since the cost of entry and experimentation is much lower. If end users, and folks who can't afford data-center hardware or simply don't want it, can run your tech locally? We're talking healthier governments, independent businesses, personal computing environments, it's a win across the board. I personally think even the previous 24b or the new 14b might be a little thin for some of these cases, but a 700b is just way too much for anyone on the edge, since it would have to run multi-card or clustered. Obviously, with Qwen3-Next, Kimi-Linear, GLM Air, MiniMax M2, GPT-OSS 120b, etc., it's not just me who thinks that space is critical. Regardless of all that:
A ~700b-param-sized thank you to the team responsible. This is a victory for humankind. Hopefully you folks are celebrating right now, because it's well deserved.