Sao10K/72B-Qwen2.5-Kunou-v1
TEXT GENERATION
Concurrency Cost: 4 · Model Size: 72.7B · Quant: FP8 · Ctx Length: 32K · Published: Dec 6, 2024 · License: qwen · Architecture: Transformer

Sao10K/72B-Qwen2.5-Kunou-v1 is a 72.7 billion parameter causal language model based on the Qwen2.5 architecture, developed by Sao10K. It is designed as a generalist model with a particular focus on roleplay and creative instruction tasks, building on a refined dataset drawn from the earlier Euryale and Stheno lineages. The model supports a context length of up to 131,072 tokens, enabling extended conversations and complex interactions. As a successor to those earlier models, it uses a cleaned and improved dataset for stronger performance in creative domains.


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model cover the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
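As a sketch of how these settings are typically supplied, the snippet below builds a request payload for an OpenAI-compatible chat completions endpoint (the style of API Featherless exposes). All parameter values here are illustrative placeholders, not the actual popular configs, which are not listed on this page; note that `top_k`, `repetition_penalty`, and `min_p` are extensions beyond the core OpenAI schema, supported by many open-model serving backends.

```python
import json

# Illustrative sampler values only -- the real "popular" configs are
# shown as tabs on the hosting page and are not reproduced here.
sampler_settings = {
    "temperature": 1.0,
    "top_p": 0.95,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,  # non-standard OpenAI field; backend-dependent
    "min_p": 0.05,               # non-standard OpenAI field; backend-dependent
}

# Merge the sampler settings into a chat completion request body.
payload = {
    "model": "Sao10K/72B-Qwen2.5-Kunou-v1",
    "messages": [{"role": "user", "content": "Write a short scene."}],
    **sampler_settings,
}

print(json.dumps(payload, indent=2))
```

This payload would then be POSTed to the provider's `/v1/chat/completions` endpoint with an API key; only the sampler keys shown above correspond to the settings listed in this section.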