traversaal-ai/traversaal-2.5-Mistral-7B
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 31, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

traversaal-ai/traversaal-2.5-Mistral-7B is a 7-billion-parameter language model developed by traversaal-ai, fine-tuned with Direct Preference Optimization (DPO) from the teknium/OpenHermes-2.5-Mistral-7B base model. It features a 4096-token context length and incorporates hyperparameter optimizations. The model is designed for general language tasks and builds on a base model that was supervised fine-tuned with LoRA using QWEN-72B.
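To illustrate the DPO objective mentioned above: DPO trains on preference pairs by increasing the log-probability margin of the chosen response over the rejected one, relative to a frozen reference model. The sketch below is a minimal, self-contained illustration of the standard DPO loss formula (it is not code from traversaal-ai, and the numeric values are made-up examples):

```python
import math

def dpo_loss(logp_chosen: float, logp_rejected: float,
             ref_logp_chosen: float, ref_logp_rejected: float,
             beta: float = 0.1) -> float:
    """Standard DPO loss for a single preference pair.

    logp_*      -- summed log-probs of each response under the policy
    ref_logp_*  -- the same quantities under the frozen reference model
    beta        -- temperature controlling deviation from the reference
    """
    # Implicit reward margin: how much the policy favors the chosen
    # response over the rejected one, relative to the reference model.
    margin = beta * ((logp_chosen - ref_logp_chosen)
                     - (logp_rejected - ref_logp_rejected))
    # Negative log-sigmoid of the margin; small when the policy
    # clearly prefers the chosen response.
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# Example with illustrative (made-up) log-probabilities:
loss = dpo_loss(logp_chosen=-12.0, logp_rejected=-10.0,
                ref_logp_chosen=-14.0, ref_logp_rejected=-9.0)
# margin = 0.1 * (2.0 - (-1.0)) = 0.3, so loss ≈ 0.5544
```

Minimizing this loss pushes the policy's preference margin above the reference model's, which is how the fine-tuning stage described above aligns the model with the preference data.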