Sao10K/14B-Qwen2.5-Kunou-v1

Status: Warm
Visibility: Public
Parameters: 14.8B
Quantization: FP8
Context length: 131072
License: qwen
Source: Hugging Face
Overview

Sao10K/14B-Qwen2.5-Kunou-v1 is a 14.8 billion parameter model built on the Qwen2.5 architecture, developed by Sao10K. This model is positioned as a "little sister" variant within the Kunou series, which also includes larger 32B and 72B versions. It is primarily intended as a generalist language model with a specific focus on roleplay scenarios.

Key Characteristics

  • Architecture: Based on the Qwen2.5 model family.
  • Parameter Count: 14.8 billion parameters.
  • Context Length: Supports a substantial context window of 131072 tokens.
  • Dataset: Utilizes a cleaned and improved dataset, described as a successor to those used in earlier models like Euryale and Stheno.
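For a rough sense of what the FP8 quantization implies for hardware, a back-of-envelope estimate (my own arithmetic, not a figure from the model card) puts the weights alone at roughly 14 GiB, before KV-cache and activations:

```python
# Back-of-envelope VRAM estimate for the FP8 weights (1 byte per parameter).
# KV-cache and activations add more on top; this is only the weight footprint.
params = 14.8e9          # parameter count from the model card
bytes_per_param = 1      # FP8 stores one byte per weight
weight_gib = params * bytes_per_param / 2**30
print(f"~{weight_gib:.1f} GiB for weights alone")  # roughly 13.8 GiB
```

At FP16 the same weights would need about twice that, which is why the FP8 build is attractive for single-GPU serving.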

Recommended Usage

For optimal performance, the creator recommends the ChatML prompt format, with a temperature of 1.1 and a min_p of 0.1; the creator also notes that the system prompt plays a crucial role in output quality. The model is well suited to general text generation and roleplay-centric conversation.
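The recommendations above can be sketched as a request payload. The ChatML markers (`<|im_start|>`/`<|im_end|>`) are the standard format used by Qwen-family models; the payload shape assumes an OpenAI-compatible completions server (e.g. vLLM), and the system/user strings are placeholders:

```python
# Sketch: single-turn ChatML prompt plus the creator-recommended sampling
# settings. Payload shape assumes an OpenAI-compatible server; adapt as needed.

def build_chatml_prompt(system: str, user: str) -> str:
    """Format a one-turn conversation in ChatML, ending at the assistant turn."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

payload = {
    "model": "Sao10K/14B-Qwen2.5-Kunou-v1",
    "prompt": build_chatml_prompt(
        # The system prompt matters: it strongly shapes output quality.
        "You are Kunou, an engaging roleplay partner.",  # placeholder persona
        "Describe the tavern we just entered.",          # placeholder user turn
    ),
    "temperature": 1.1,  # recommended by the creator
    "min_p": 0.1,        # recommended by the creator
}
```

In practice most servers can apply the ChatML template for you via a chat-completions endpoint; building the prompt by hand, as here, just makes the format explicit.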