jondurbin/airoboros-l2-13b-2.2
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · License: llama2 · Architecture: Transformer

jondurbin/airoboros-l2-13b-2.2 is an experimental 13-billion-parameter language model fine-tuned by jondurbin, primarily on synthetic data generated by the Airoboros project. Built on the Llama-2 architecture with a 4096-token context length, it was trained heavily on instruction/response pairs and general-purpose tasks. It uses a "clean" dataset version without de-alignment data and is optimized for context-obedient question answering, coding, agent/function calling, and chain-of-thought reasoning.
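As a minimal sketch of how an instruction-tuned model like this might be prompted, the snippet below builds a plain `USER:`/`ASSISTANT:` chat prompt. The exact template, the system message, and the idea of passing the string to a text-generation backend are all assumptions for illustration; the format the fine-tune was actually trained on should be confirmed against the model card.

```python
def build_prompt(system: str, user: str) -> str:
    """Build a plain USER:/ASSISTANT: chat prompt string.

    The template used here is an assumption for illustration, not
    the documented Airoboros format; verify against the model card.
    """
    return f"{system}\nUSER: {user}\nASSISTANT:"

# The resulting string would be handed to a text-generation backend,
# e.g. a Hugging Face transformers pipeline loaded with the model ID
# "jondurbin/airoboros-l2-13b-2.2" (loading the 13B weights is omitted here).
prompt = build_prompt("A chat.", "Summarize chain-of-thought prompting in one sentence.")
print(prompt)
```

Because the model is advertised for context-obedient question answering, a real prompt would typically place the reference context in the system or user turn before the question.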
