Sao10K/14B-Qwen2.5-Kunou-v1
Sao10K/14B-Qwen2.5-Kunou-v1 is a 14.8 billion parameter language model based on the Qwen2.5 architecture, developed by Sao10K. It is designed as a generalist and roleplay-oriented language model and serves as the smallest variant in the Kunou series. It uses a refined dataset that builds on the creator's previous smaller models. With a context length of 131,072 tokens, it aims to provide robust performance for conversational and creative text generation tasks.
Sao10K/14B-Qwen2.5-Kunou-v1 Overview
Sao10K/14B-Qwen2.5-Kunou-v1 is a 14.8 billion parameter model built on the Qwen2.5 architecture, developed by Sao10K. This model is positioned as a "little sister" variant within the Kunou series, which also includes larger 32B and 72B versions. It is primarily intended as a generalist language model with a specific focus on roleplay scenarios.
Key Characteristics
- Architecture: Based on the Qwen2.5 model family.
- Parameter Count: 14.8 billion parameters.
- Context Length: Supports a substantial context window of 131,072 tokens.
- Dataset: Utilizes a cleaned and improved dataset, described as a successor to those used in earlier models like Euryale and Stheno.
Recommended Usage
For optimal performance, the creator recommends the ChatML prompt format. Suggested inference settings are a temperature of 1.1 and a min_p of 0.1, and the creator notes that the system prompt plays a crucial role in output quality. The model is particularly well-suited to general text generation and roleplay-centric conversations.
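The recommended setup above can be sketched in code. This is a minimal illustration, not an official snippet: it hand-formats a conversation in ChatML (in practice, most inference servers apply this automatically via the model's chat template) and bundles the creator's suggested sampling values. The helper name and system prompt text are assumptions for illustration.

```python
def build_chatml_prompt(system: str, messages: list[tuple[str, str]]) -> str:
    """Format a conversation in ChatML, the prompt format recommended for Kunou.

    Each turn is wrapped in <|im_start|>role ... <|im_end|> markers, and the
    prompt ends with an open assistant turn so the model continues from there.
    """
    parts = [f"<|im_start|>system\n{system}<|im_end|>"]
    for role, content in messages:
        parts.append(f"<|im_start|>{role}\n{content}<|im_end|>")
    parts.append("<|im_start|>assistant\n")  # cue the model to respond
    return "\n".join(parts)

# Sampling settings suggested by the model creator for this model.
SAMPLING = {"temperature": 1.1, "min_p": 0.1}

# Hypothetical example conversation.
prompt = build_chatml_prompt(
    "You are a creative roleplay partner.",  # the system prompt matters a lot here
    [("user", "Describe the tavern we just entered.")],
)
print(prompt)
```

When calling an OpenAI-compatible endpoint instead, the same `temperature` and `min_p` values can usually be passed directly as request parameters, and the server applies the chat template for you.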