jondurbin/airoboros-l2-7b-3.0
Text Generation · Model size: 7B · Quantization: FP8 · Context length: 4k · License: llama2 · Architecture: Transformer · Concurrency cost: 1 · Open weights

jondurbin/airoboros-l2-7b-3.0 is a 7-billion-parameter experimental language model developed by jondurbin, built on the Llama-2 architecture with a 4096-token context length. It is fine-tuned primarily on synthetic data from the airoboros-3.0 dataset and focuses on instruction following rather than casual chat. The model also includes features such as MathJSON output for deterministic calculations and a context-obedient question-answering mode, in which answers are grounded strictly in supplied source text.
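As a sketch of how prompts for this model are typically assembled, the helpers below build two plain strings: the standard Llama-2 chat layout (`[INST]`/`<<SYS>>` tags) and the `BEGININPUT`/`BEGININSTRUCTION` layout that airoboros uses for context-obedient QA. The exact system prompt and question are illustrative assumptions, not part of the model card.

```python
def build_prompt(system: str, instruction: str) -> str:
    """Wrap a system prompt and user instruction in the Llama-2 chat template."""
    return f"[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{instruction} [/INST]"


def build_context_qa(context: str, question: str) -> str:
    """Wrap source text and a question in airoboros' context-obedient QA layout,
    which instructs the model to answer only from the provided input block."""
    return (
        "BEGININPUT\n" + context + "\nENDINPUT\n"
        "BEGININSTRUCTION\n" + question + "\nENDINSTRUCTION"
    )


# Example: a chat-style prompt (system prompt is a placeholder assumption)
print(build_prompt("You are a helpful assistant.", "What is 2 + 2?"))
```

Either string can then be passed to any inference endpoint or local runtime that serves the model; the 4096-token context limit applies to the combined prompt and completion.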