allenai/codetulu-2-7b
Text Generation
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Nov 13, 2023 · Architecture: Transformer

Codetulu 2 7B by AllenAI is a 7-billion-parameter instruction-tuned language model, fine-tuned from CodeLlama-7b-hf with a 4096-token context length. It is designed to act as a helpful assistant and was trained on a diverse mix of publicly available, synthetic, and human-created datasets, with a primary focus on English. The model is particularly suited for conversational AI applications requiring assistant-like behavior.
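As an instruction-tuned assistant model, it expects prompts in a chat format. A minimal sketch, assuming the `<|user|>`/`<|assistant|>` turn markers used by the Tulu 2 family (the helper function below is hypothetical, not part of any official API):

```python
def build_tulu_prompt(user_message: str) -> str:
    # Tulu 2 models delimit turns with <|user|> and <|assistant|> markers,
    # each followed by a newline; the model generates its reply after the
    # final <|assistant|> marker.
    return f"<|user|>\n{user_message}\n<|assistant|>\n"

prompt = build_tulu_prompt("Write a Python function that reverses a string.")
print(prompt)
```

The resulting string would then be tokenized and passed to the model for generation, keeping the combined prompt and completion within the 4096-token context window.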
