Saxo/Linkbricks-Horizon-AI-Korean-Pro-27B
Text Generation · Concurrency Cost: 2 · Model Size: 27B · Quant: FP8 · Ctx Length: 32k · Published: Jul 5, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights
Saxo/Linkbricks-Horizon-AI-Korean-Pro-27B is a 27-billion-parameter Korean language model developed by Linkbricks Horizon-AI and fine-tuned from the gemma-2-27b-it base model. It underwent Continued Pre-training (CPT), Supervised Fine-tuning (SFT), and Direct Preference Optimization (DPO) on a corpus of 90 million Korean news articles plus cross-lingual data covering Korean, Chinese, English, and Japanese. The model targets high-dimensional analysis of customer reviews and social posts, as well as coding, writing, mathematics, and complex logical reasoning tasks.
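Featherless exposes its hosted models through an OpenAI-compatible API, so a minimal way to try the model is a standard chat-completions call. The sketch below assumes the https://api.featherless.ai/v1 base URL and a FEATHERLESS_API_KEY environment variable; both are illustrative assumptions, not values taken from this page.

```python
# Minimal sketch: querying the model over an OpenAI-compatible
# chat-completions endpoint. Base URL and env-var name are assumptions;
# check your provider's documentation.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",       # assumed endpoint
    api_key=os.environ["FEATHERLESS_API_KEY"],      # hypothetical env var
)

response = client.chat.completions.create(
    model="Saxo/Linkbricks-Horizon-AI-Korean-Pro-27B",
    messages=[
        # "Summarize today's weather in Seoul in one sentence."
        {"role": "user", "content": "서울의 오늘 날씨를 한 문장으로 요약해줘."}
    ],
)
print(response.choices[0].message.content)
```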
Popular Sampler Settings
The top 3 parameter combinations used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
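Since the values above did not render, the sketch below only shows how these seven sampler fields map onto a request body; every number is a hypothetical placeholder, not one of the actual top-3 configs. temperature, top_p, frequency_penalty, and presence_penalty are standard OpenAI-schema fields, while top_k, min_p, and repetition_penalty are extensions assumed to be accepted by the serving backend.

```python
# Hypothetical sampler configuration: the real "top 3" values are not
# shown on this page, so every number below is a placeholder.
import os

import requests

payload = {
    "model": "Saxo/Linkbricks-Horizon-AI-Korean-Pro-27B",
    # "Run a sentiment analysis on this customer review."
    "messages": [{"role": "user", "content": "이 고객 리뷰를 감성 분석해줘."}],
    # Standard OpenAI-schema sampling parameters (placeholder values):
    "temperature": 0.7,
    "top_p": 0.9,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    # Extensions assumed to be accepted by the serving backend:
    "top_k": 40,
    "min_p": 0.05,
    "repetition_penalty": 1.1,
}

resp = requests.post(
    "https://api.featherless.ai/v1/chat/completions",  # assumed endpoint
    headers={"Authorization": f"Bearer {os.environ['FEATHERLESS_API_KEY']}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```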