asapse/DIOD-Mistral-0.2
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 19, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

DIOD-Mistral-0.2 is a 7-billion-parameter language model developed by asapse, fine-tuned from OpenHermes-2-Mistral-7B. It was trained on the argilla/distilabel-intel-orca-dpo-pairs preference dataset to enhance its conversational and instruction-following capabilities. With a 4096-token context length, it is designed for general-purpose text generation and understanding tasks.
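A minimal usage sketch for the model described above, assuming the weights are published on the Hugging Face Hub under the id `asapse/DIOD-Mistral-0.2` and load through the standard `transformers` auto classes (the id and prompt here are illustrative, not confirmed by this page):

```python
# Hypothetical usage sketch: assumes "asapse/DIOD-Mistral-0.2" resolves on the
# Hugging Face Hub; substitute the actual repository id if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "asapse/DIOD-Mistral-0.2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Explain the difference between a list and a tuple in Python."
inputs = tokenizer(prompt, return_tensors="pt")
# Keep prompt plus generated tokens within the 4096-token context window.
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the card lists an FP8 quantization, a deployment may instead serve a quantized build; the sketch above loads the weights at their default precision.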