megastudyedu/M-SOLAR-10.7B-v1.3
Text Generation · Concurrency Cost: 1 · Model Size: 10.7B · Quant: FP8 · Ctx Length: 4K · License: cc-by-nc-nd-4.0 · Architecture: Transformer
M-SOLAR-10.7B-v1.3 is a 10.7 billion parameter language model developed by megastudyedu, Prediction, and MICE. It is an enhanced version of megastudy/M-SOLAR-10.7B-v1.1-beta, further fine-tuned on additional in-house datasets and an increased amount of randomly shuffled generation data. The model is designed for general language understanding and generation tasks, building on its predecessor's capabilities.
Popular Sampler Settings
The top 3 parameter combinations used by Featherless users for this model. Each configuration specifies the following sampler parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
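As a minimal sketch of how these sampler parameters are typically passed, the snippet below builds a request payload for an OpenAI-compatible chat completions endpoint. The numeric values are illustrative placeholders, not the actual configurations from this page, and whether a given endpoint accepts every field (e.g. `repetition_penalty`, `min_p`) is an assumption.

```python
# Sketch only: placeholder values, and the assumption that the serving
# endpoint accepts all of these sampler fields, are not confirmed here.

def build_request(prompt: str) -> dict:
    """Assemble a chat-completions payload with explicit sampler settings."""
    return {
        "model": "megastudyedu/M-SOLAR-10.7B-v1.3",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,        # randomness of token selection
        "top_p": 0.9,              # nucleus sampling probability cutoff
        "top_k": 40,               # restrict sampling to the k likeliest tokens
        "frequency_penalty": 0.0,  # penalize tokens by how often they appeared
        "presence_penalty": 0.0,   # penalize tokens that appeared at all
        "repetition_penalty": 1.1, # multiplicative discouragement of repeats
        "min_p": 0.05,             # drop tokens below this relative probability
    }

payload = build_request("Summarize the model in one sentence.")
print(sorted(k for k in payload if k not in ("model", "messages")))
```

The payload would then be sent as the JSON body of a POST request to the provider's chat completions route, with authentication handled per the provider's documentation.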