OLAResearch/OLAF2-14B
Task: Text generation
Concurrency cost: 1
Model size: 14.8B
Quantization: FP8
Context length: 32K
Published: Jan 21, 2025
License: apache-2.0
Architecture: Transformer
OLAResearch/OLAF2-14B is a 14.8-billion-parameter Korean language model developed by OLAResearch for complex reasoning, mathematical problem-solving, and general language understanding. It includes a dedicated Reasoning Mode that produces detailed step-by-step reasoning for STEM tasks and, with test-time scaling, can reach performance levels surpassing GPT-4o. The model supports a context length of up to 32K tokens, making it well suited to Retrieval-Augmented Generation (RAG) and other tasks requiring extensive context comprehension.
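For RAG use, retrieved passages must fit inside the 32K-token window alongside the prompt and generation headroom. A minimal sketch of greedy context packing is below; the 4-characters-per-token estimate and the 2K-token reserve are assumptions for illustration (in practice, count tokens with the model's own tokenizer, and note that Korean text may tokenize at a different rate):

```python
# Sketch: packing retrieved passages into OLAF2-14B's 32K-token context window.
# CHARS_PER_TOKEN is a crude heuristic, not the model's actual tokenizer;
# RESERVED headroom for the prompt and generated answer is an assumed value.

CTX_TOKENS = 32_000      # advertised context length
RESERVED = 2_000         # assumed headroom for prompt + generation
CHARS_PER_TOKEN = 4      # rough estimate; replace with real token counts

def estimate_tokens(text: str) -> int:
    """Approximate token count from character length."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def pack_passages(passages, budget=CTX_TOKENS - RESERVED):
    """Greedily keep passages in order until the token budget is exhausted."""
    kept, used = [], 0
    for passage in passages:
        cost = estimate_tokens(passage)
        if used + cost > budget:
            break
        kept.append(passage)
        used += cost
    return kept
```

A real pipeline would swap `estimate_tokens` for exact counts from the model's tokenizer, but the budgeting logic stays the same.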