Delta-Vector/Austral-70B-Winton
Text generation · Model size: 70B · Quant: FP8 · Context length: 32k · Published: Jun 25, 2025 · License: apache-2.0 · Architecture: Transformer · Concurrency cost: 4 · Open weights

Delta-Vector/Austral-70B-Winton is a 70 billion parameter Llama-based language model fine-tuned by Delta-Vector. It is optimized for generalist roleplay and adventure scenarios, improving coherence and intelligence while preserving creative capability. Building on the Austral-70B-Preview base model, it uses KTO (Kahneman-Tversky Optimization) training to refine its outputs. With a 32,768-token context length, it is designed for engaging and consistent narrative generation.


Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model.

- temperature — scales the randomness of token selection
- top_p — nucleus sampling: sample only from the smallest token set whose cumulative probability exceeds p
- top_k — restrict sampling to the k most likely tokens
- frequency_penalty — penalizes tokens proportionally to how often they have already appeared
- presence_penalty — penalizes tokens that have appeared at all
- repetition_penalty — multiplicative damping of repeated tokens
- min_p — drops tokens whose probability falls below a fraction of the top token's probability
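As a rough illustration of how these sampler settings might be passed to the model, the sketch below assembles a request payload in the style of an OpenAI-compatible chat-completions API. The endpoint shape and every numeric value here are illustrative assumptions, not a recommended configuration; only the parameter names come from the list above.

```python
# Hypothetical sketch: building a request payload with explicit sampler
# settings for Delta-Vector/Austral-70B-Winton. Values are placeholders.

def build_sampler_payload(prompt: str) -> dict:
    """Assemble a chat-completions payload with the sampler parameters
    listed above. All values are illustrative, not tuned settings."""
    return {
        "model": "Delta-Vector/Austral-70B-Winton",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 1.0,          # randomness of token selection
        "top_p": 0.95,               # nucleus sampling cutoff
        "top_k": 40,                 # keep only the 40 most likely tokens
        "frequency_penalty": 0.0,    # penalize frequent repeats
        "presence_penalty": 0.0,     # penalize any reuse
        "repetition_penalty": 1.05,  # multiplicative repetition damping
        "min_p": 0.05,               # drop tokens below 5% of the top token's probability
        "max_tokens": 512,           # response length cap (assumed parameter)
    }

payload = build_sampler_payload("Begin an adventure in a desert ruin.")
print(payload["model"])
```

Not every inference backend accepts all of these fields (for example, `top_k`, `repetition_penalty`, and `min_p` are extensions beyond the core OpenAI schema), so unsupported keys may need to be dropped depending on the server.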