Sao10K/32B-Qwen2.5-Kunou-v1 is a 32.8-billion-parameter causal language model built on the Qwen2.5 architecture and developed by Sao10K. It is designed as a generalist model with a focus on roleplay, building on refined datasets from the earlier Euryale and Stheno model lineages. With a 131,072-token context length, it is well suited to extended conversational and creative text generation tasks.
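A minimal inference sketch, assuming the model is hosted under this repository id and loads through the standard Hugging Face `transformers` API; the dtype, chat template usage, and sampling settings are illustrative defaults, not a configuration recommended by the model author:

```python
MODEL_ID = "Sao10K/32B-Qwen2.5-Kunou-v1"


def generate_reply(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a chat completion from the model.

    Note: at 32.8B parameters the weights need roughly 66 GB of
    accelerator memory in bfloat16; use quantization for smaller GPUs.
    Imports are deferred so the module can be inspected without
    torch/transformers installed.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,
        device_map="auto",  # shard across available GPUs
    )

    # Qwen2.5-based models ship a chat template in the tokenizer config.
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(
        input_ids,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,
    )
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )
```

On suitable hardware, `generate_reply("Describe a rainy harbor town.")` returns the model's completion; the long context window mainly matters when the `messages` list carries an extended conversation history.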