
Meta: Llama 4 Scout

Llama 4 Scout 17B Instruct (16E) is a mixture-of-experts (MoE) language model developed by Meta, activating 17 billion of its 109 billion total parameters per token. It supports native multimodal input...

Quality Score: 99/100 (composite of price, context, and capability)
Input Price: $0.08 per 1M tokens
Output Price: $0.30 per 1M tokens
Context Window: 327,680 tokens
Model ID: meta-llama/llama-4-scout
Vendor: meta-llama
Tokenizer: Llama4
Input Modalities: text, image
Output Modalities: text
Max Output: 16,384 tokens
Tool Calling: ✓ supported
Structured Output: ✓ supported
Reasoning Mode: not supported
Vision: ✓ accepts images
Audio: not supported
Moderated: no
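As a quick sanity check on the listed rates ($0.08 per 1M input tokens, $0.30 per 1M output tokens), per-request cost is a simple linear function of token counts. A minimal sketch follows; the 10,000-input / 1,000-output request is a hypothetical example, not a figure from this page:

```python
# Per-1M-token rates taken from the listing above.
INPUT_PRICE_PER_M = 0.08   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 0.30  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of a single request."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Hypothetical request: 10,000 prompt tokens, 1,000 completion tokens.
cost = request_cost(10_000, 1_000)
print(f"${cost:.4f}")  # → $0.0011
```

At these rates, output tokens cost roughly 3.75× as much as input tokens, so long completions dominate the bill.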
