jondurbin/airoboros-m-7b-3.0
TEXT GENERATION
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Ctx Length: 4k
Published: Oct 3, 2023
License: apache-2.0
Architecture: Transformer
Open Weights

jondurbin/airoboros-m-7b-3.0 is an experimental 7-billion-parameter language model by jondurbin, built on the Mistral-7B architecture. It is fine-tuned primarily on synthetic data from the airoboros-3.0 dataset and focuses heavily on instruction following rather than casual chat or roleplay. The model adds features such as MathJSON output for deterministic calculations and improved multi-turn coherence via human-generated roleplay conversations, making it well suited to complex instruction-based tasks and context-obedient question answering.
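The MathJSON feature works by having the model emit a calculation as a structured JSON expression (e.g. `["Add", 4, ["Multiply", 3, 5]]`) that the caller evaluates deterministically in code, rather than trusting the model's own arithmetic. The sketch below is a minimal, hypothetical evaluator for a tiny subset of MathJSON operators; the real MathJSON specification covers far more functions, and the exact operator names the model emits should be checked against its outputs.

```python
import json
import math

# Hypothetical evaluator for a small subset of MathJSON operators.
# The actual MathJSON spec defines many more; this is an illustrative sketch.
OPS = {
    "Add": lambda args: sum(args),
    "Subtract": lambda args: args[0] - sum(args[1:]),
    "Multiply": lambda args: math.prod(args),
    "Divide": lambda args: args[0] / args[1],
    "Power": lambda args: args[0] ** args[1],
    "Sqrt": lambda args: math.sqrt(args[0]),
}

def eval_mathjson(expr):
    """Recursively evaluate a MathJSON expression.

    Numbers evaluate to themselves; a list is [operator, *operands],
    where each operand may itself be a nested expression.
    """
    if isinstance(expr, (int, float)):
        return expr
    head, *args = expr
    return OPS[head]([eval_mathjson(a) for a in args])

# Example: a JSON block the model might emit for "what is 4 + 3 * 5?"
raw = '["Add", 4, ["Multiply", 3, 5]]'
print(eval_mathjson(json.loads(raw)))  # -> 19
```

Evaluating the expression host-side is what makes the result deterministic: the model only has to produce the correct structure, and the arithmetic itself is done exactly in ordinary code.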
