sedrickkeh/mistral_openhermes_v3
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Ctx Length: 4k
Published: Oct 25, 2024
License: apache-2.0
Architecture: Transformer
Tags: Open Weights, Cold
sedrickkeh/mistral_openhermes_v3 is a 7-billion-parameter language model fine-tuned from mistralai/Mistral-7B-v0.1. It was trained with a 4096-token context length and reached a final loss of 0.5579 on its evaluation set. The training dataset and intended use cases are not documented, so how it differs from the base model, and which applications it targets, is unknown.
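The card ships no usage instructions. Assuming the repository follows the standard Hugging Face Transformers layout that Mistral-7B fine-tunes normally use (this is an assumption, not stated on the card), loading the model and keeping requests inside the 4096-token window might look like the sketch below. Only the repo id and context length come from the card; everything else is generic Transformers usage.

```python
# Hypothetical usage sketch (not from the model card): loads the checkpoint
# with Hugging Face Transformers, assuming the repo exposes the standard
# config/tokenizer files that a Mistral-7B fine-tune normally ships with.

MODEL_ID = "sedrickkeh/mistral_openhermes_v3"  # repo id from the card
MAX_CTX = 4096                                 # context length from the card

def max_prompt_tokens(max_new_tokens: int, ctx: int = MAX_CTX) -> int:
    """Tokens left for the prompt once the generation budget is reserved."""
    return ctx - max_new_tokens

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    # Truncate so prompt + generated tokens stay inside the 4k window.
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True,
                       max_length=max_prompt_tokens(max_new_tokens))
    inputs = {k: v.to(model.device) for k, v in inputs.items()}
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

The truncation guard matters because the base model was trained at a 4096-token context; prompts longer than `MAX_CTX - max_new_tokens` would otherwise push generation past the window.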