allura-org/Teleut-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Nov 24, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Teleut-7b by allura-org is a 7.6-billion-parameter language model: a replication attempt of Tulu 3 built on the Qwen 2.5 base architecture. It supports a context length of up to 131,072 tokens and performs strongly across benchmarks, particularly on reasoning and instruction-following tasks. The model is optimized for general-purpose conversational AI and complex instruction adherence.
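For conversational use, prompts need to follow the model's chat template. As a sketch under an assumption, not a confirmed detail of this model: Qwen 2.5 derivatives typically use the ChatML format (`<|im_start|>`/`<|im_end|>` markers), and in practice you would let the tokenizer's own `apply_chat_template` method render prompts rather than building them by hand. A minimal illustration of what that rendering looks like:

```python
# Hypothetical sketch of ChatML-style prompt construction, assuming
# Teleut-7b inherits Qwen 2.5's chat template. Verify against the
# model's actual tokenizer config before relying on this format;
# with Hugging Face transformers you would normally call
# tokenizer.apply_chat_template(messages, add_generation_prompt=True).

def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts as a ChatML prompt string."""
    parts = []
    for msg in messages:
        # Each turn is wrapped in <|im_start|>role ... <|im_end|> markers.
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    # Open an assistant turn to cue the model to generate its reply.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize Tulu 3 in one sentence."},
])
print(prompt)
```

The rendered string would then be tokenized and passed to the model for generation.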
