yoonicorn/kor_historyModel
Vision · Concurrency Cost: 1 · Model Size: 4.3B · Quant: BF16 · Ctx Length: 32k · Published: Apr 2, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

yoonicorn/kor_historyModel is a 4.3-billion-parameter instruction-tuned causal language model developed by yoonicorn, built on Google's Gemma-3-4b-it. It is fine-tuned on the yoonicorn/kor_history dataset, which specializes it in Korean history. With a 32,768-token context window, the model is intended for generating accurate, contextually relevant Korean-language text on historical topics.
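A minimal usage sketch with the Hugging Face `transformers` library, assuming the model is published under this ID on the Hub and that a `transformers` version with Gemma 3 support is installed (the example prompt and generation settings are illustrative, not from the model card):

```python
# Hypothetical usage sketch: load the model in BF16 (as listed above) and
# ask a Korean-history question via the Gemma-style chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yoonicorn/kor_historyModel"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Instruction-tuned Gemma models expect chat-formatted input.
messages = [{"role": "user", "content": "조선 세종대왕의 주요 업적을 설명해 주세요."}]
# ("Please explain the major achievements of King Sejong of Joseon.")
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Note that generating with a 4.3B BF16 model requires roughly 9 GB of accelerator or system memory; add `device_map="auto"` (with `accelerate` installed) to place the weights automatically.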
