Meta: Llama 4 Scout
Llama 4 Scout 17B Instruct (16E) is a mixture-of-experts (MoE) language model developed by Meta, activating 17 billion parameters out of a total of 109B spread across 16 experts. It supports native multimodal input (text and images) and produces text output.
Available Providers (1)
| Provider | Model ID | Input Cost | Output Cost | Context (tokens) | Max Output (tokens) | Docs |
|---|---|---|---|---|---|---|
| | meta-llama/llama-4-scout | $0.08/MTok | $0.30/MTok | 327.7K | 16.4K | |
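The per-million-token rates in the table make request costs easy to estimate. A minimal sketch (the token counts in the example are illustrative, not from the source):

```python
# Estimate the cost of a request at this provider's listed rates.
INPUT_PER_MTOK = 0.08   # USD per million input tokens (from the table)
OUTPUT_PER_MTOK = 0.30  # USD per million output tokens (from the table)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the listed rates."""
    return (input_tokens * INPUT_PER_MTOK + output_tokens * OUTPUT_PER_MTOK) / 1_000_000

# Example: a 50K-token prompt producing a 2K-token reply.
print(round(request_cost(50_000, 2_000), 6))  # 0.0046
```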
Capabilities
- Reasoning
- Tool Calling
- Attachments
- Open Weights
- Structured Output
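Since the model advertises structured output, a request can constrain the reply to a JSON schema. A hedged sketch of a request body following the OpenAI-style `response_format` convention (the endpoint URL, auth, and exact schema support vary by provider; the schema below is a made-up example):

```python
import json

# Hypothetical request body for an OpenAI-compatible chat completions
# endpoint. Only the model ID comes from the provider table above;
# everything else is illustrative.
payload = {
    "model": "meta-llama/llama-4-scout",
    "messages": [
        {"role": "user", "content": "List three MoE design trade-offs."}
    ],
    # Ask the server to return JSON conforming to a schema.
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "trade_offs",
            "schema": {
                "type": "object",
                "properties": {
                    "trade_offs": {"type": "array", "items": {"type": "string"}}
                },
                "required": ["trade_offs"],
            },
        },
    },
}

print(json.dumps(payload, indent=2))
```

The same payload shape works for tool calling by swapping `response_format` for a `tools` array, where supported.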