ifuseok/sft-solar-10.7b-v1.1
Text generation · Concurrency cost: 1 · Model size: 10.7B · Quant: FP8 · Context length: 4k · Published: Jun 20, 2024 · Architecture: Transformer

The ifuseok/sft-solar-10.7b-v1.1 model is a 10.7-billion-parameter language model fine-tuned from the Upstage SOLAR-10.7B-Instruct-v1.0 base model. It was trained on a diverse set of Korean instruction datasets, including nlpai-lab/databricks-dolly-15k-ko and kyujinpy/KOR-OpenOrca-Platypus-v3, and is geared toward generating text from Korean instructions, making it suitable for a range of Korean natural language processing tasks.
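As a quick illustration of single-turn use: SOLAR-10.7B-Instruct derivatives commonly follow Upstage's `### User:` / `### Assistant:` prompt style. The helper below is a hypothetical sketch of that formatting, assuming this fine-tune keeps the base model's template; check the model's tokenizer chat template before relying on it.

```python
def build_solar_prompt(user_message: str) -> str:
    """Format a single-turn prompt in the SOLAR instruct style.

    Assumption: ifuseok/sft-solar-10.7b-v1.1 keeps the base model's
    "### User:" / "### Assistant:" template; verify against the
    tokenizer's chat template before relying on this.
    """
    return f"### User:\n{user_message}\n\n### Assistant:\n"

# Example: a Korean instruction ("What is the capital of Korea?")
prompt = build_solar_prompt("한국의 수도는 어디인가요?")
```

The returned string would then be passed to the model as the completion prompt, with generation stopping at the next `### User:` marker.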


Popular Sampler Settings

The three parameter combinations most used by Featherless users for this model cover the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
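For illustration, these samplers map directly onto the request body of an OpenAI-compatible completions endpoint, such as the one Featherless exposes. The sketch below builds such a payload; the numeric values are placeholders chosen for the example, not the measured top configurations from this page.

```python
import json

# Placeholder sampler values for illustration only -- the page's actual
# top-3 configurations are not reproduced here.
payload = {
    "model": "ifuseok/sft-solar-10.7b-v1.1",
    "prompt": "### User:\n...\n\n### Assistant:\n",
    "max_tokens": 256,
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

# Serialize for an HTTP POST to a completions endpoint.
body = json.dumps(payload)
```

Any subset of these keys can be sent; parameters omitted from the request fall back to the server's defaults.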