macadeliccc/Mistral-7B-v0.2-OpenHermes
Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · Concurrency Cost: 1 · Published: Mar 25, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

macadeliccc/Mistral-7B-v0.2-OpenHermes is a 7 billion parameter language model, fine-tuned from alpindale/Mistral-7B-v0.2 on the teknium/OpenHermes-2.5 dataset. Developed by macadeliccc, the model is well suited to Retrieval Augmented Generation (RAG) use cases, offering a specialized foundation for applications that require stronger factual grounding. It was trained in 13 hours on a single A100 GPU, leveraging Unsloth and Hugging Face's TRL library.
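A minimal inference sketch with the Hugging Face transformers library, assuming the checkpoint loads with the standard AutoModel classes and that the tokenizer ships a chat template (OpenHermes models conventionally use ChatML). The retrieved passage and question are illustrative placeholders showing a RAG-style prompt, where retrieved context is prepended so the model can ground its answer in it:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "macadeliccc/Mistral-7B-v0.2-OpenHermes"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# RAG-style prompt: the retrieved passage is placed in the prompt as
# context (the passage text here is a hypothetical placeholder).
context = "The Eiffel Tower was completed in 1889 for the World's Fair in Paris."
messages = [
    {"role": "system", "content": "Answer using only the provided context."},
    {
        "role": "user",
        "content": f"Context:\n{context}\n\nQuestion: When was the Eiffel Tower completed?",
    },
]

# apply_chat_template assumes the tokenizer defines a chat template.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```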
