ajibawa-2023/Code-290k-13B
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Published: Jan 16, 2024 · License: cc-by-nc-nd-4.0 · Architecture: Transformer · Open Weights

ajibawa-2023/Code-290k-13B is a 13-billion-parameter language model, fine-tuned from the Llama-2 base model and designed for multi-language code generation with detailed explanations. Fine-tuned on a dataset of 290,000 code examples spanning Python, Java, JavaScript, Go, C++, Rust, Ruby, SQL, and more, it produces both functional code and a comprehensive accompanying explanation. The model is aimed at developers who want not just working code, but a clear understanding of its logic and implementation.
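The model can be driven through the Hugging Face `transformers` library. The sketch below is illustrative only: the `build_prompt` helper and the Alpaca-style instruction template are assumptions (verify the exact prompt format on the model card before relying on it), and `generate` is a minimal loading/generation wrapper, not the author's reference code.

```python
# Sketch of prompting Code-290k-13B via Hugging Face transformers.
# The Alpaca-style template below is an ASSUMPTION -- check the model
# card for the exact prompt format this fine-tune expects.

def build_prompt(instruction: str) -> str:
    """Wrap a coding request in an instruction-following template."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )


def generate(instruction: str, max_new_tokens: int = 512) -> str:
    """Load the model and generate code plus explanation.

    Note: loading downloads the full 13B weights (tens of GB); run on a
    GPU with sufficient memory, or use a quantized variant.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "ajibawa-2023/Code-290k-13B"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

For example, `generate("Write a Python function that reverses a string, and explain it.")` would return generated code followed by the model's explanation, in line with the training objective described above.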
