GaLLM-14B-v0.1 Overview
GaLLM-14B-v0.1 is a 14.8-billion-parameter language model developed by CjangCjengh, fine-tuned from Sakura-14B-Qwen2.5-Base-ParallelPT-v1. It is trained specifically on galgame data, enabling it to role-play a wide range of characters from games in Japanese, Chinese, and Korean. It supports a context length of 131,072 tokens, allowing for extended, coherent character interactions.
Key Capabilities
- Specialized Role-Playing: Excels at adopting specific character personas from galgames, including both male and female roles.
- Multilingual Support: Capable of generating dialogue in Japanese, Chinese, and Korean, with character lists provided for each language.
- System Prompt Control: Character and game context are defined through a structured system prompt that specifies the game scenario, the character the model plays, and the role the user plays.
- High Context Length: The 131,072-token context window supports longer, more intricate, and more consistent conversations.
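As a concrete illustration of system-prompt control, the helper below assembles an OpenAI-style chat message list naming the game, the character the model should play, and the user's role. The prompt wording, field names, and character names here are illustrative assumptions, not the model's documented template; consult the GaLLM-14B-v0.1 model card for the exact format it was trained on.

```python
def build_galgame_messages(game, model_role, user_role, user_line):
    """Assemble a chat message list for a character role-play session.

    The system-prompt wording below is an illustrative sketch; the
    actual GaLLM-14B-v0.1 template may differ.
    """
    system_prompt = (
        f"Game: {game}\n"
        f"You are playing the character {model_role}. "
        f"The user is playing {user_role}. "
        "Stay in character at all times."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_line},
    ]

# Example session with hypothetical names; pass the result to any
# OpenAI-compatible chat completion endpoint serving the model.
messages = build_galgame_messages(
    game="Example Galgame",
    model_role="Aoi",
    user_role="Protagonist",
    user_line="おはよう、葵。",
)
```

The same structure extends naturally to multilingual use: only the `user_line` (and, if desired, the prompt wording) changes between Japanese, Chinese, and Korean sessions.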
Good for
- Interactive Storytelling: Ideal for applications requiring dynamic character interactions within a game-like narrative.
- Galgame Simulation: Perfect for simulating conversations with specific characters from popular galgames.
- Character-driven Chatbots: Suitable for creating chatbots that maintain a consistent persona based on predefined game characters.
- Multilingual Role-Play: Developers needing character role-play in Japanese, Chinese, or Korean will find this model particularly useful.