rishiraj/smol-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 3, 2023 · License: apache-2.0 · Architecture: Transformer
rishiraj/smol-7b is a 7-billion-parameter instruction-tuned causal language model developed by rishiraj. It is a fine-tuned version of openchat/openchat_3.5, trained on the HuggingFaceH4/no_robots dataset. The model scores 65 on the MMLU benchmark, which made it the highest-ranked 7B chat model on MMLU at the time of its release. It is optimized for general chat applications and reasoning tasks.
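As a rough usage sketch with the `transformers` library: since the model is fine-tuned from openchat/openchat_3.5, it presumably follows the OpenChat 3.5 chat format ("GPT4 Correct User: … &lt;|end_of_turn|&gt;GPT4 Correct Assistant:"), but this is an assumption — verify against the tokenizer's `chat_template` before relying on it. The prompt-building helper below is hypothetical; the generation step requires downloading the 7B weights.

```python
from typing import Dict, List


def build_openchat_prompt(messages: List[Dict[str, str]]) -> str:
    """Assemble an OpenChat-3.5-style prompt string.

    This template is assumed from the base model (openchat/openchat_3.5);
    prefer tokenizer.apply_chat_template() in practice, which reads the
    template shipped with the model.
    """
    parts = []
    for m in messages:
        role = "GPT4 Correct User" if m["role"] == "user" else "GPT4 Correct Assistant"
        parts.append(f"{role}: {m['content']}<|end_of_turn|>")
    # Trailing assistant header cues the model to generate its reply.
    parts.append("GPT4 Correct Assistant:")
    return "".join(parts)


def run_demo() -> None:
    """Generation sketch; requires `pip install transformers torch` and
    downloads the model weights, so it is not invoked here."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("rishiraj/smol-7b")
    model = AutoModelForCausalLM.from_pretrained("rishiraj/smol-7b", device_map="auto")
    prompt = build_openchat_prompt(
        [{"role": "user", "content": "Explain FP8 quantization briefly."}]
    )
    inputs = tok(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=128)
    print(tok.decode(out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

In practice, `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` is the safer route, since it uses whatever template the model repository actually ships.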