emre/llama-2-13b-code-chat
Text generation · Model size: 13B · Quant: FP8 · Context length: 4k · License: apache-2.0 · Architecture: Transformer · Open weights

emre/llama-2-13b-code-chat is a 13-billion-parameter Llama 2 model, fine-tuned from llama-2-13b-chat-hf using QLoRA on the mlabonne/CodeLlama-2-20k dataset. It is designed for code generation and understanding, serving as a Llama 2 counterpart to CodeAlpaca. The model is strongest at generating Python code, is intended primarily for educational use, and supports a context length of 4096 tokens.
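Since the model is fine-tuned from llama-2-13b-chat-hf, prompts presumably follow the standard Llama 2 chat format; this card does not state the exact template used during QLoRA fine-tuning, so the helper below (`build_prompt` is a hypothetical name) is a sketch under that assumption:

```python
def build_prompt(user_msg: str,
                 system_msg: str = "You are a helpful coding assistant.") -> str:
    # Assumed Llama 2 chat format: an [INST] block wrapping an
    # optional <<SYS>> system message followed by the user turn.
    return (
        "<s>[INST] <<SYS>>\n"
        f"{system_msg}\n"
        "<</SYS>>\n\n"
        f"{user_msg} [/INST]"
    )

# Example: a code-generation request in the assumed prompt format.
prompt = build_prompt("Write a Python function that reverses a string.")
print(prompt)
```

The model's completion would follow after the closing `[/INST]` tag; verify against the base model's chat template before relying on this format.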
