CjangCjengh/GaLLM-14B-v0.1
Text generation · Model size: 14.8B · Quantization: FP8 · Context length: 32k · Published: Mar 8, 2025 · License: CC BY-NC-SA 4.0 · Architecture: Transformer · Open weights

CjangCjengh/GaLLM-14B-v0.1 is a 14.8-billion-parameter language model developed by CjangCjengh, fine-tuned from SakuraLLM/Sakura-14B-Qwen2.5-Base-ParallelPT-v1. It specializes in role-playing characters drawn from Japanese, Chinese, and Korean galgame data, and supports a context window of up to 131,072 tokens. Its primary use case is generating character-specific dialogue and interactions from a supplied game name and character name.
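A minimal sketch of querying the model with Hugging Face `transformers` is shown below. The prompt template in `build_prompt` is hypothetical — how the fine-tune actually expects the game and character names to be supplied should be checked against the model repository — and the generation settings are illustrative defaults, not recommendations from the author.

```python
def build_prompt(game: str, character: str, user_line: str) -> str:
    """Assemble a role-play prompt from a game name, a character name,
    and a user utterance. This template is a placeholder; the actual
    input format is documented in the model repository."""
    return (
        f"Game: {game}\n"
        f"Character: {character}\n"
        f"User: {user_line}\n"
        f"{character}:"
    )

def generate_reply(game: str, character: str, user_line: str,
                   max_new_tokens: int = 128) -> str:
    # Imported lazily so the prompt helper stays dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "CjangCjengh/GaLLM-14B-v0.1"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(build_prompt(game, character, user_line),
                       return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens so only the newly generated reply remains.
    return tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:],
                            skip_special_tokens=True)
```

Loading the 14.8B FP8 checkpoint requires a GPU with sufficient memory; `device_map="auto"` lets `transformers` place the weights across available devices.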
