kakaocorp/kanana-1.5-8b-instruct-2505
Text generation · Open weights
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 8K
Published: May 21, 2025
License: apache-2.0
Architecture: Transformer

Kanana 1.5-8B-Instruct-2505 is an 8-billion-parameter instruction-tuned language model developed by kakaocorp as part of the Kanana 1.5 model family. It offers enhanced capabilities in coding, mathematics, and function calling, and natively supports a 32K-token context length, extendable to 128K using YaRN. The model is designed for complex real-world problems that require robust reasoning and extended conversational coherence.
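As a sketch of the context extension mentioned above, the snippet below builds a YaRN rope-scaling configuration in the dictionary convention used by Hugging Face transformers. The specific key names and values are assumptions derived from the stated 32K native and 128K extended context lengths, not an official kakaocorp configuration.

```python
# Hedged sketch: extending the native 32K context toward 128K via YaRN
# rope scaling. Keys follow the transformers rope_scaling convention;
# values are assumptions based on the 32K -> 128K ratio in the card.
rope_scaling = {
    "rope_type": "yarn",
    "factor": 131072 / 32768,  # 128K target / 32K native = 4.0
    "original_max_position_embeddings": 32768,
}

# Typical usage (not run here; loading downloads the 8B weights):
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained(
#     "kakaocorp/kanana-1.5-8b-instruct-2505",
#     rope_scaling=rope_scaling,
# )
```

Whether YaRN scaling must be passed at load time or is already baked into the released config is provider-dependent; check the repository's `config.json` before relying on the extended window.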
