jondurbin/airocoder-34b-2.1
Text generation · Model size: 34B · Quant: FP8 · Context length: 32k · Published: Aug 30, 2023 · License: llama2 · Architecture: Transformer · Concurrency cost: 2 · Open weights
jondurbin/airocoder-34b-2.1 is a 34-billion-parameter language model based on CodeLlama-34B, fine-tuned specifically for code-related tasks. It applies a "code" expert adapter from LMOE, concentrating its capabilities on code generation and understanding. With a 32,768-token context length, it suits developers who need robust code intelligence over large inputs.
jondurbin/airocoder-34b-2.1: Code-Optimized Language Model
jondurbin/airocoder-34b-2.1 is a 34-billion-parameter model built on the CodeLlama-34B architecture. What sets it apart is its specialized fine-tuning: a "code" expert adapter derived from LMOE (Large Mixture of Experts) techniques, which sharpens its proficiency in programming-related tasks.
Key Capabilities
- Code-centric Performance: Optimized for understanding, generating, and assisting with code.
- Specialized Fine-tuning: Utilizes an LMOE "code" expert adapter for enhanced code intelligence.
- Large Context Window: Features a 32,768 token context length, suitable for handling extensive codebases or complex programming problems.
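When feeding large codebases into the 32,768-token window, a rough pre-check helps avoid truncated prompts. The sketch below uses a common ~4-characters-per-token heuristic for code; the exact count depends on the model's own tokenizer, so treat the numbers as an approximation, not a guarantee.

```python
# Rough context-window check for a 32,768-token model.
# The 4-chars-per-token ratio is a heuristic assumption; the real
# count comes from the model's tokenizer.
CTX_LEN = 32_768

def fits_context(text: str, reserved_for_output: int = 1024,
                 chars_per_token: int = 4) -> bool:
    """Return True if `text` likely fits in the context window,
    leaving `reserved_for_output` tokens for generation."""
    est_tokens = len(text) // chars_per_token
    return est_tokens + reserved_for_output <= CTX_LEN
```

A stricter integration would tokenize with the model's actual tokenizer before sending the request; the heuristic is just a cheap first pass.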
Good For
- Code Generation: Creating new code snippets or functions.
- Code Completion: Assisting developers by suggesting relevant code.
- Code Understanding: Analyzing and interpreting existing code structures.
- Developer Tools: Integration into IDEs or other programming environments for intelligent assistance.
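For developer-tool integration, many hosts expose models like this behind an OpenAI-compatible completions endpoint. The sketch below only builds the JSON request body; the endpoint URL, hosting details, and supported parameters are assumptions about a typical deployment, not part of this card.

```python
import json

# Hypothetical request payload for an OpenAI-compatible completions
# endpoint. Endpoint availability and parameter support are assumptions.
def build_request(prompt: str, max_tokens: int = 256) -> str:
    payload = {
        "model": "jondurbin/airocoder-34b-2.1",
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0.2,  # low temperature tends to suit code generation
    }
    return json.dumps(payload)
```

An IDE plugin would POST this body to the host's completions route and insert the returned text at the cursor.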