vanta-research/scout-4b
Text Generation
- Concurrency Cost: 1
- Model Size: 4.3B
- Quant: BF16
- Ctx Length: 32k
- Published: Oct 29, 2025
- License: gemma
- Capabilities: Vision
- Architecture: Transformer
Scout is a 4.3 billion parameter language model developed by VANTA Research, fine-tuned from Google's Gemma 3 4B Instruct model. It specializes in constraint-aware reasoning and adaptive problem-solving, excelling at tactical analysis, systematic problem decomposition, and operational decision-making. With a 131,072-token context length, Scout is optimized for scenarios requiring meta-cognitive problem-solving and risk/reward triage, making it well suited to IT operations, strategic planning, and technical debugging.
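Featherless serves hosted models through an OpenAI-compatible chat completions API, so querying Scout looks like any other chat request. The sketch below builds such a request payload; the base URL, environment-variable name, and the example prompt are assumptions for illustration, not details taken from this page.

```python
import json

# Assumed OpenAI-compatible endpoint and model id (the id matches this page).
BASE_URL = "https://api.featherless.ai/v1"
MODEL = "vanta-research/scout-4b"

def build_request(prompt: str, max_tokens: int = 512) -> dict:
    """Assemble a chat-completions payload for the model."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_request("Decompose this outage report into likely root causes.")
print(json.dumps(payload, indent=2))

# To actually send it (requires an API key, assumed env var name):
# import os, requests
# resp = requests.post(
#     f"{BASE_URL}/chat/completions",
#     headers={"Authorization": f"Bearer {os.environ['FEATHERLESS_API_KEY']}"},
#     json=payload,
# )
# print(resp.json()["choices"][0]["message"]["content"])
```

The live call is left commented out so the sketch runs without credentials.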
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p.
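The sampler parameters listed above are passed alongside the prompt in the request body. A minimal sketch of such a configuration follows; the numeric values are common illustrative defaults, not the (uncaptured) user configurations from this page, and support for the non-OpenAI fields (top_k, repetition_penalty, min_p) is assumed from typical open-model serving APIs.

```python
# Illustrative sampler configuration for an OpenAI-compatible request body.
sampler_config = {
    "temperature": 0.7,        # randomness of token selection
    "top_p": 0.9,              # nucleus sampling: keep smallest set of tokens with prob mass >= 0.9
    "top_k": 40,               # restrict sampling to the 40 most likely tokens
    "frequency_penalty": 0.0,  # penalize tokens proportionally to how often they appeared
    "presence_penalty": 0.0,   # penalize tokens that have appeared at all
    "repetition_penalty": 1.1, # multiplicative penalty on already-generated tokens
    "min_p": 0.05,             # drop tokens below 5% of the top token's probability
}

request_body = {
    "model": "vanta-research/scout-4b",
    "messages": [{"role": "user", "content": "Summarize the incident log."}],
    **sampler_config,
}
print(sorted(sampler_config))
```

Lower temperature with a modest repetition_penalty is a common starting point for systematic, analysis-style outputs like the ones this model targets.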