AIdenU/SOLAR-10.7b-ko-Y24_v0.1
Text Generation · Concurrency Cost: 1 · Model Size: 10.7B · Quant: FP8 · Ctx Length: 4k · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm
AIdenU/SOLAR-10.7b-ko-Y24_v0.1 is a 10.7 billion parameter causal language model developed by AIdenU, based on the SOLAR-10.7B-v1.0 architecture. This model is specifically fine-tuned for Korean language processing, leveraging its base model's capabilities for general language understanding and generation. It is designed for applications requiring robust performance in Korean conversational AI and text generation tasks, with a context length of 4096 tokens.
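Below is a minimal usage sketch with the Hugging Face transformers library, assuming the model follows the standard causal-LM interface of its SOLAR-10.7B base; the prompt, dtype, and generation settings are illustrative choices, not recommendations from the model card.

```python
# Minimal sketch: load the model and generate Korean text with Hugging Face transformers.
# Assumes the standard AutoModelForCausalLM interface; dtype, device, and prompt are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AIdenU/SOLAR-10.7b-ko-Y24_v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # adjust to bfloat16 or a quantized load depending on hardware
    device_map="auto",
)

prompt = "한국의 수도는 어디인가요?"  # "What is the capital of Korea?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=256,  # keeps the request well inside the 4096-token context window
    do_sample=True,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```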
Popular Sampler Settings
The top 3 parameter combinations used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
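As a hedged illustration of how these sampler knobs are typically supplied, the sketch below sends an OpenAI-compatible chat completion request. The base URL and every numeric value are placeholders for illustration only; they are not the popular configurations referenced above, and the non-standard parameters top_k, repetition_penalty, and min_p are passed via extra_body on the assumption that the server accepts them.

```python
# Hedged sketch: passing sampler settings through an OpenAI-compatible client.
# The base_url is an assumption and every numeric value is a placeholder,
# not one of the actual "popular" configurations for this model.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="AIdenU/SOLAR-10.7b-ko-Y24_v0.1",
    messages=[{"role": "user", "content": "안녕하세요! 간단히 자기소개를 해주세요."}],
    temperature=0.7,         # placeholder
    top_p=0.9,               # placeholder
    frequency_penalty=0.0,   # placeholder
    presence_penalty=0.0,    # placeholder
    extra_body={             # non-standard sampler knobs, if the server supports them
        "top_k": 40,               # placeholder
        "repetition_penalty": 1.1, # placeholder
        "min_p": 0.05,             # placeholder
    },
)
print(response.choices[0].message.content)
```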