AtlaAI/Selene-1-Mini-Llama-3.1-8B
Text Generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Jan 22, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights

AtlaAI/Selene-1-Mini-Llama-3.1-8B is an 8-billion-parameter small language model as a judge (SLMJ) developed by Atla. Post-trained from Llama-3.1-8B, it excels at evaluation tasks, outperforming larger models such as GPT-4o on RewardBench, EvalBiasBench, and AutoJ. The model is optimized for general-purpose evaluation, supporting absolute scoring, classification, and pairwise preference tasks, with a 128K context length.
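As a sketch of how the three evaluation modes map onto an OpenAI-compatible chat request, the snippet below builds an absolute-scoring payload. The prompt wording here is illustrative, not Atla's official Selene template, and no endpoint is called; consult Atla's documentation for the prompt formats the model was trained on.

```python
# Hypothetical sketch: building a judge request for an OpenAI-compatible
# chat-completions endpoint. The prompt template is an assumption, not
# Atla's official Selene evaluation prompt.

def build_judge_request(instruction: str, response: str) -> dict:
    """Return a chat-completion payload asking the judge to score a response 1-5."""
    prompt = (
        "You are an evaluator. Score the response to the instruction "
        "on a 1-5 scale and briefly justify the score.\n\n"
        f"Instruction: {instruction}\n"
        f"Response: {response}"
    )
    return {
        "model": "AtlaAI/Selene-1-Mini-Llama-3.1-8B",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.0,  # deterministic judging
    }

payload = build_judge_request("Name the capital of France.", "Paris.")
```

Pairwise preference works the same way: include both candidate responses in the prompt and ask the judge to pick A or B.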


Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model:

temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p