
Meta Open-Sources Llama 4 With 400 Billion Parameters, Challenging Closed AI Models


Amit Yadav

Mar 7, 2026 · 2 min read

Meta AI has released Llama 4, its most powerful open-source large language model featuring 400 billion parameters. Available for free download under a permissive research licence, the release challenges the dominance of closed proprietary models from OpenAI and Google.

Meta AI has dropped Llama 4, the latest generation of its open-weight large language model, featuring a massive 400 billion parameter architecture trained on over 30 trillion tokens. The model is freely available on Hugging Face and Meta's official AI portal, continuing the company's bet on open-source AI as a strategic differentiator against rivals like OpenAI and Anthropic.

In internal evaluations, Llama 4 scores competitively with GPT-4o and Claude 3.5 Sonnet on standard benchmarks, while outperforming both on multilingual tasks and coding in low-resource programming languages. The model uses a mixture-of-experts (MoE) architecture, meaning only a fraction of its 400 billion parameters are active during any single inference — making it significantly more efficient to run than its parameter count suggests.
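To make the efficiency point concrete, here is a toy sketch of top-k mixture-of-experts routing: a gate scores all experts for a token, but only the top two actually run, so only their parameters are touched. This is a minimal illustration of the general MoE idea, not Meta's actual architecture; every name and dimension below is invented for the example.

```python
import numpy as np

def moe_forward(x, experts, gate_w, top_k=2):
    """Route one token through only top_k of the available experts.

    x:        (d,) token embedding
    experts:  list of (d, d) weight matrices, one per expert
    gate_w:   (num_experts, d) gating weights
    """
    logits = gate_w @ x                 # score every expert for this token
    top = np.argsort(logits)[-top_k:]   # indices of the top_k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()            # softmax over the chosen experts only
    # Only the selected experts' parameter matrices are used here; the
    # other experts contribute nothing to this token's forward pass.
    return sum(w * (experts[i] @ x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, num_experts = 8, 16
experts = [rng.standard_normal((d, d)) for _ in range(num_experts)]
gate_w = rng.standard_normal((num_experts, d))
x = rng.standard_normal(d)

y = moe_forward(x, experts, gate_w, top_k=2)  # 2 of 16 experts active
```

With 2 of 16 experts active per token, roughly an eighth of the expert parameters participate in any single forward pass, which is why an MoE model's inference cost tracks its active parameters rather than its total count.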

Meta has simultaneously released a smaller 70 billion parameter variant optimised for on-device and edge deployment, targeting mobile phones and embedded systems. This opens the door for developers building privacy-first applications that cannot send data to the cloud.

"We believe the future of AI is open," said Yann LeCun, Meta's Chief AI Scientist, in a blog post accompanying the launch. "Closed models concentrate power. Open models distribute it." The statement is a direct jab at OpenAI's decision to keep GPT-5 closed-source, a choice that has sparked debate across the research community.

The release is expected to accelerate fine-tuning activity across academia and industry. Within hours of publication, the model had been downloaded over 200,000 times on Hugging Face, a new record for the platform, and developers are already experimenting with domain-specific fine-tunes in legal, medical, and financial verticals.
