shleeeee/mistral-ko-OpenOrca-Platypus-v2
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · License: other · Architecture: Transformer · Status: Cold

shleeeee/mistral-ko-OpenOrca-Platypus-v2 is a fine-tuned language model based on Mistral-7B, developed by shleeeee (Seunghyeon Lee) and oopsung (Sungwoo Park). The model is optimized for Korean language processing, and its primary application is in tasks requiring strong Korean language understanding and generation.


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model each set the following sampler parameters (the specific values are shown per configuration on the model page):

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
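The parameters above split into two groups when calling the model: the OpenAI-native sampling knobs (temperature, top_p, frequency_penalty, presence_penalty) and extended ones (top_k, repetition_penalty, min_p) that OpenAI-compatible endpoints typically accept via an extra request payload. Below is a minimal sketch of that split; the numeric values are illustrative placeholders, not the actual top-3 Featherless configurations, and the helper name `split_params` is an assumption for this example.

```python
# Illustrative sampler configuration -- placeholder values only,
# NOT the actual Featherless user configurations for this model.
sampler_config = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

# Parameters the standard OpenAI chat-completions schema accepts directly;
# the remainder usually travels in the request's extra body.
OPENAI_NATIVE = {"temperature", "top_p", "frequency_penalty", "presence_penalty"}

def split_params(config: dict) -> tuple[dict, dict]:
    """Split a sampler config into OpenAI-native kwargs and extended params."""
    native = {k: v for k, v in config.items() if k in OPENAI_NATIVE}
    extra = {k: v for k, v in config.items() if k not in OPENAI_NATIVE}
    return native, extra

native, extra = split_params(sampler_config)
```

With an OpenAI-compatible client, `native` would be passed as regular keyword arguments and `extra` through the client's extra-body mechanism, so the extended samplers reach the inference backend unchanged.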