OpenAssistant/codellama-13b-oasst-sft-v10
Task: Text generation
Concurrency cost: 1
Model size: 13B
Quantization: FP8
Context length: 4k
Published: Aug 26, 2023
License: llama2
Architecture: Transformer (open weights)

OpenAssistant/codellama-13b-oasst-sft-v10 is a 13-billion-parameter causal decoder-only transformer language model, fine-tuned by OpenAssistant from Meta's CodeLlama 13B. Optimized for conversational code-related tasks, the model uses the ChatML prompt format for compatibility with chat applications. It is designed to generate helpful, respectful, and honest responses, particularly in programming contexts, and supports a 4096-token context length.
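As a minimal sketch of the ChatML format mentioned above, the helper below renders a conversation into the `<|im_start|>`/`<|im_end|>`-delimited string the model expects. The function name and the example messages are illustrative, not part of the model card; only the ChatML delimiters themselves come from the format's convention.

```python
# Sketch: build a ChatML-format prompt such as the one
# OpenAssistant/codellama-13b-oasst-sft-v10 is fine-tuned on.
# build_chatml_prompt and the sample messages are hypothetical helpers.

def build_chatml_prompt(messages):
    """Render {"role", "content"} dicts as a ChatML string,
    ending with an open assistant turn for the model to complete."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    parts.append("<|im_start|>assistant\n")  # generation continues from here
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Reverse a string in Python."},
])
print(prompt)
```

The resulting string would then be tokenized and passed to the model; generation stops when the model emits its own `<|im_end|>` token.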