ashwincv0112/codellama-python7b
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · License: llama2 · Architecture: Transformer · Open weights

Code Llama is a collection of pretrained and fine-tuned generative text models developed by Meta, ranging from 7 billion to 34 billion parameters. This model, ashwincv0112/codellama-python7b, is the 7-billion-parameter Python specialist variant, fine-tuned for code synthesis and understanding in Python and intended for both commercial and research use, with a particular strength in Python code completion. It uses an optimized transformer architecture and was trained between January and July 2023.
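As a rough illustration of how a checkpoint like this is typically used for code completion, the sketch below loads it through the Hugging Face `transformers` library. This is an assumption about the intended workflow, not an official usage recipe from the model author; it presumes the weights are hosted on the Hugging Face Hub under the repository id shown, that `transformers` (and `accelerate`, for `device_map="auto"`) are installed, and that you have hardware able to hold a 7B-parameter model.

```python
MODEL_ID = "ashwincv0112/codellama-python7b"  # repo id from this model card


def complete(prompt: str, max_new_tokens: int = 64) -> str:
    """Complete a Python code prompt with the Code Llama Python 7B checkpoint.

    transformers is imported lazily here because calling this function
    triggers a multi-gigabyte checkpoint download; defining it does not.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # device_map="auto" (requires `accelerate`) places layers on available GPUs/CPU.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


# Example call (downloads the checkpoint on first use):
# print(complete("def fibonacci(n):\n    "))
```

Note that the Python-specialist Code Llama variants are tuned for left-to-right completion of Python source, so prompts are ordinarily partial function or module text rather than natural-language instructions.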
