ingeol/kosaul_v0.2
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · License: MIT · Architecture: Transformer · Open Weights

The KoSaul-8B model, developed by Ingeol Baek, is an 8 billion parameter Llama 3-based language model continually pre-trained on Korean legal and medical datasets. It supports an 8192-token context length and achieves lower perplexity on legal data than other Korean LLMs. The model is intended for tasks that require specialized knowledge of the Korean legal and medical domains.
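As a minimal sketch of how the model might be used, the snippet below loads it with Hugging Face transformers and runs a short Korean legal prompt. It assumes the weights are available on the Hub under the repo ID "ingeol/kosaul_v0.2"; the dtype and device settings are illustrative and should be adjusted to your hardware.

```python
# Minimal sketch: loading KoSaul-8B via Hugging Face transformers.
# Assumes the weights are hosted under "ingeol/kosaul_v0.2" (illustrative).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ingeol/kosaul_v0.2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 8B parameters; use half precision to fit on a single GPU
    device_map="auto",
)

# Example Korean legal-domain prompt (illustrative only)
prompt = "계약 해지 시 위약금 조항의 효력에 대해 설명해 주세요."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```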
