prithivMLmods/Bellatrix-Tiny-1B-R1
Text generation · Concurrency cost: 1 · Model size: 1B · Quant: BF16 · Context length: 32k · Published: Jan 31, 2025 · License: llama3.2 · Architecture: Transformer

Bellatrix-Tiny-1B-R1 by prithivMLmods is a 1-billion-parameter autoregressive language model built on an optimized transformer architecture. It is instruction-tuned for multilingual dialogue use cases, including agentic retrieval and summarization tasks, and is intended to outperform many comparable open-source models in reasoning-focused multilingual dialogue applications.
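Given the llama3.2 license and lineage noted above, the model's chat prompt is assumed here to follow the Llama 3 header/end-of-turn token convention; this is an assumption, not something stated on this page. In practice you would load the model with Hugging Face `transformers` and call `tokenizer.apply_chat_template`, which handles this formatting for you. A minimal pure-Python sketch of the assumed format:

```python
# Sketch of the Llama-3-style chat format Bellatrix-Tiny-1B-R1 is
# assumed to inherit from its base model. The special tokens below
# (<|begin_of_text|>, <|start_header_id|>, <|eot_id|>) are the Llama 3
# convention; verify against the model's tokenizer config before use.

def format_llama3_prompt(messages: list[dict]) -> str:
    """Render a list of {'role', 'content'} messages into a single
    prompt string, ending with an open assistant turn for generation."""
    parts = ["<|begin_of_text|>"]
    for m in messages:
        parts.append(
            f"<|start_header_id|>{m['role']}<|end_header_id|>"
            f"\n\n{m['content']}<|eot_id|>"
        )
    # Leave the assistant header open so the model continues from here.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)


prompt = format_llama3_prompt([
    {"role": "system", "content": "You are a helpful multilingual assistant."},
    {"role": "user", "content": "Summarize this article in two sentences."},
])
```

With `transformers` installed, the equivalent would be `AutoTokenizer.from_pretrained("prithivMLmods/Bellatrix-Tiny-1B-R1").apply_chat_template(messages, add_generation_prompt=True, tokenize=False)`, which also picks up any template deviations the model ships with.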
