Changlong1/ttLlama-7b
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Nov 21, 2023 | License: llama2 | Architecture: Transformer | Open Weights

Changlong1/ttLlama-7b is a 7-billion-parameter Code Llama model, fine-tuned with QLoRA on the mlabonne/Evol-Instruct-Python-1k dataset. Building on Code Llama's foundational capabilities in code synthesis and understanding, this fine-tune targets Python code generation and comprehension in particular.
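As a sketch of how one might prompt an instruction-tuned Code Llama derivative like this, the snippet below wraps an instruction in the standard Llama-2 `[INST]` chat template. Note this is an assumption: the model card does not state which prompt format this particular fine-tune expects, and the commented-out generation code is a generic `transformers` usage pattern, not instructions from the model author.

```python
def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the Llama-2 [INST] template (assumed format)."""
    return f"<s>[INST] {instruction.strip()} [/INST]"

prompt = build_prompt("Write a Python function that reverses a string.")
print(prompt)

# Generation would then look roughly like this (requires the model download):
# from transformers import AutoTokenizer, AutoModelForCausalLM
# tok = AutoTokenizer.from_pretrained("Changlong1/ttLlama-7b")
# model = AutoModelForCausalLM.from_pretrained("Changlong1/ttLlama-7b")
# out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=256)
# print(tok.decode(out[0], skip_special_tokens=True))
```

Because the context length is listed as 4k, prompts plus generated tokens should stay within that budget.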
