Sao10K/L3-8B-Stheno-v3.2
Text Generation · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · Concurrency Cost: 1 · Published: Jun 5, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights · Status: Warm

Sao10K/L3-8B-Stheno-v3.2 is an 8 billion parameter language model developed by Sao10K, fine-tuned from a Llama-3 base. This iteration, Stheno-v3.2-Zeta, is optimized for balanced SFW/NSFW storywriting and narration, alongside improved assistant-style tasks. It features enhanced multi-turn coherency and better adherence to prompts and instructions, making it suitable for diverse conversational and creative generation applications.


Popular Sampler Settings

The top three parameter combinations used by Featherless users for this model draw on the following sampler parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
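The parameters above map directly onto the sampling fields of an OpenAI-compatible chat-completions request. A minimal sketch of such a request payload is below; the numeric values are placeholders, not the actual configs from the page (those were not captured here), and the endpoint URL and extra fields (`top_k`, `repetition_penalty`, `min_p` are extensions beyond the standard OpenAI schema) are assumptions about the serving API.

```python
import json

# Hypothetical sampler values -- placeholders, NOT the configs shown on the page.
payload = {
    "model": "Sao10K/L3-8B-Stheno-v3.2",
    "messages": [{"role": "user", "content": "Write a short scene in a rainy city."}],
    "temperature": 1.0,         # overall randomness of sampling
    "top_p": 0.95,              # nucleus sampling: keep tokens within 95% cumulative prob
    "top_k": 40,                # keep only the 40 most likely tokens
    "frequency_penalty": 0.0,   # penalize tokens by how often they already appeared
    "presence_penalty": 0.0,    # penalize tokens that appeared at all
    "repetition_penalty": 1.1,  # multiplicative penalty on repeated tokens
    "min_p": 0.05,              # drop tokens below 5% of the top token's probability
}

print(json.dumps(payload, indent=2))

# Sending it would require an API key against an OpenAI-compatible endpoint
# (URL assumed), e.g. with urllib.request:
#
# import urllib.request
# req = urllib.request.Request(
#     "https://api.featherless.ai/v1/chat/completions",
#     data=json.dumps(payload).encode(),
#     headers={"Authorization": "Bearer <KEY>",
#              "Content-Type": "application/json"},
# )
```

In practice `temperature`, `top_p`, `frequency_penalty`, and `presence_penalty` are standard OpenAI fields, while `top_k`, `repetition_penalty`, and `min_p` are common extensions supported by many open-weight serving stacks.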