codellama/CodeLlama-70b-hf

Text generation · Concurrency cost: 4 · Model size: 69B · Quantization: FP8 · Context length: 32k · Published: Jan 29, 2024 · License: llama2 · Architecture: Transformer · Open weights

CodeLlama-70b-hf is a 69-billion-parameter base model from Meta's Code Llama family, designed for general code synthesis and understanding. This auto-regressive transformer supports a 32,768-token context window and was fine-tuned on sequences of up to 16k tokens. It excels at code completion and is intended for commercial and research use in English and relevant programming languages.


CodeLlama-70b-hf: A Foundation Model for Code

CodeLlama-70b-hf is the 69-billion-parameter base model in Meta's Code Llama family, built on an optimized transformer architecture for code generation. It is designed for general code synthesis and understanding, supports a context window of 32,768 tokens, and was fine-tuned on sequences of up to 16k tokens. It belongs to a larger collection of models ranging from 7B to 70B parameters, which also includes specialized variants for Python and for instruction-following.

Key Capabilities

  • Code Completion: Completes partial code from the surrounding context, the primary task the base model is trained for.
  • General Code Understanding: Comprehends and processes structures across a range of programming languages.
  • Scalable Architecture: Uses an auto-regressive transformer architecture, trained between January 2023 and January 2024.
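"Auto-regressive" here means the model produces one token at a time, appending each prediction to the sequence and feeding it back in as input. A minimal sketch of that decoding loop, with a toy next-token function standing in for the real transformer:

```python
from typing import Callable, List

def autoregressive_decode(
    prompt: List[str],
    next_token: Callable[[List[str]], str],
    max_new_tokens: int,
    stop: str = "<eos>",
) -> List[str]:
    """Greedy auto-regressive decoding: each new token is predicted from the
    full sequence so far, then appended and fed back in as context."""
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        tok = next_token(tokens)
        if tok == stop:  # end-of-sequence token terminates generation
            break
        tokens.append(tok)
    return tokens

# Toy stand-in for the model: a lookup table that deterministically
# "completes" the body of a small function.
CANNED = {"return": "a", "a": "+", "+": "b", "b": "<eos>"}

def toy_next_token(tokens: List[str]) -> str:
    return CANNED.get(tokens[-1], "<eos>")

result = autoregressive_decode(["def", "add(a,", "b):", "return"], toy_next_token, 10)
print(result)  # → ['def', 'add(a,', 'b):', 'return', 'a', '+', 'b']
```

The real model replaces `toy_next_token` with a forward pass over the transformer, but the outer loop is the same shape.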

Intended Use Cases

  • Commercial and Research Applications: Suitable for a wide array of code-related tasks in both research and commercial settings.
  • Code Synthesis: Ideal for generating new code based on prompts or existing context.
  • Code Understanding: Can be adapted for tasks requiring comprehension of code logic and structure.
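For code synthesis, the base model is typically driven through the Hugging Face `transformers` API. A minimal sketch, assuming `transformers` and `torch` are installed and you have access to the weights (the 70B model requires multiple high-memory GPUs or quantization to load):

```python
MODEL_ID = "codellama/CodeLlama-70b-hf"

def complete(prompt: str, max_new_tokens: int = 64) -> str:
    """Greedily complete a code prompt with CodeLlama-70b-hf.

    Imports are done lazily so that merely defining this sketch does not
    require a GPU environment with the model weights available.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # device_map="auto" shards the model across available accelerators.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    return tokenizer.decode(out[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(complete("def fibonacci(n: int) -> int:\n"))
```

As a base (non-instruct) model, it responds best to raw code prefixes like the one above rather than to conversational instructions.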

This model is governed by Meta's Llama 2 community license, which permits both commercial and research use. More details are available in the research paper "Code Llama: Open Foundation Models for Code".