Meta

Llama 4 Maverick

Llama 4 Maverick 17B Instruct (128E) is a high-capacity multimodal language model from Meta, built on a mixture-of-experts (MoE) architecture with 128 experts and 17 billion active parameters per forward pass.
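To make the MoE idea concrete, here is a minimal sketch of generic top-k expert routing: a gate scores each token, only the highest-scoring experts run for that token, and their outputs are mixed by the gate weights. This is an illustrative toy (the `moe_forward` function, the tiny linear "experts", and the dimensions are all invented for the example), not Meta's actual Llama 4 implementation.

```python
import numpy as np

def moe_forward(x, gate_w, experts, top_k=1):
    """Toy mixture-of-experts layer: route each token to its top-k
    experts by gate score and mix their outputs by softmax weight."""
    logits = x @ gate_w                           # (tokens, num_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:] # indices of top-k experts
    sel = np.take_along_axis(logits, top, axis=-1)
    # softmax over only the selected experts' logits
    weights = np.exp(sel - sel.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    out = np.zeros_like(x)
    for t in range(x.shape[0]):                   # for each token...
        for j, e in enumerate(top[t]):            # ...run only its chosen experts
            out[t] += weights[t, j] * experts[e](x[t])
    return out

rng = np.random.default_rng(0)
d, n_experts, n_tokens = 8, 4, 3
gate_w = rng.normal(size=(d, n_experts))
# each "expert" is a small linear map; unrouted experts never execute
expert_ws = [rng.normal(size=(d, d)) for _ in range(n_experts)]
experts = [lambda v, w=w: v @ w for w in expert_ws]
x = rng.normal(size=(n_tokens, d))
y = moe_forward(x, gate_w, experts, top_k=1)
print(y.shape)  # (3, 8)
```

Because only the routed experts run per token, active compute stays near the 17B-parameter level even though the total parameter count across all 128 experts is far larger.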

Developer: Meta
Context length: 1,048,576 tokens
Capabilities: Vision
Launch: 4/5/2025