mlabonne/PyLlama-7b
Text generation · Open weights
- Model size: 7B
- Quantization: FP8
- Context length: 4k
- License: apache-2.0
- Architecture: Transformer
- Concurrency cost: 1

PyLlama-7b is a 7 billion parameter CodeLlama-based model developed by mlabonne, fine-tuned with QLoRA (4-bit precision) on the Evol-Instruct-Python-26k dataset. It specializes in Python code generation and understanding, building on its CodeLlama base for strong performance on programming tasks, and is aimed at developers who want an efficient, specialized LLM for Python-centric applications.
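As a sketch of how the model might be prompted, the snippet below builds an Alpaca-style instruction prompt (a common convention for Evol-Instruct fine-tunes; the exact template for PyLlama-7b is an assumption, so verify it against the model card) and shows, in comments, how generation could be wired up with Hugging Face `transformers`:

```python
# Hypothetical usage sketch for PyLlama-7b. The Alpaca-style prompt
# template below is an assumption, not confirmed by the model card.

def build_prompt(instruction: str) -> str:
    """Wrap a Python task description in an Alpaca-style instruction prompt."""
    return (
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_prompt("Write a Python function that reverses a string.")

# With transformers installed and enough GPU memory, generation could
# look like this (commented out so the sketch stays self-contained):
#
# from transformers import pipeline
# generator = pipeline("text-generation", model="mlabonne/PyLlama-7b")
# print(generator(prompt, max_new_tokens=128)[0]["generated_text"])

print(prompt)
```

With the model's 4k context window, longer tasks (e.g. refactoring a whole module) would need the instruction and any included code to fit within that budget.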
