kyujinpy/SOLAR-Platypus-10.7B-v2
TEXT GENERATION
- Concurrency Cost: 1
- Model Size: 10.7B
- Quant: FP8
- Context Length: 4k
- Published: Dec 13, 2023
- License: cc-by-nc-sa-4.0
- Architecture: Transformer
- Open Weights
kyujinpy/SOLAR-Platypus-10.7B-v2 is a 10.7-billion-parameter auto-regressive language model developed by Kyujin Han. It is based on the Llama2 architecture and was fine-tuned from upstage/SOLAR-10.7B-v1.0 using Q-LoRA on the garage-bAInd/Open-Platypus dataset, with a 4096-token context length. It is intended for general text generation tasks.
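Because the model was fine-tuned on Open-Platypus, prompts are typically formatted in the Alpaca instruction style common to Platypus fine-tunes. The exact template is an assumption here (it is not stated on this page), so treat this as a sketch and verify against the upstream model card:

```python
def build_platypus_prompt(instruction: str, model_input: str = "") -> str:
    """Build an Alpaca-style instruction prompt.

    NOTE: this template is an assumption based on other Platypus
    fine-tunes; confirm the exact format on the upstream model card.
    """
    if model_input:
        return (
            "### Instruction:\n"
            f"{instruction}\n\n"
            "### Input:\n"
            f"{model_input}\n\n"
            "### Response:\n"
        )
    return (
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_platypus_prompt("Summarize the SOLAR architecture in one sentence.")
```

Keep in mind that the prompt plus generated tokens must fit within the model's 4096-token context window, so long inputs need truncation before formatting.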
Popular Sampler Settings
The sampler parameters tuned by Featherless users for this model:
- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p