Stopwolf/Tito-7B-slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4K · Published: Feb 28, 2024 · License: apache-2.0 · Architecture: Transformer

Stopwolf/Tito-7B-slerp is a 7-billion-parameter language model created by Stopwolf, produced by merging gordicaleksa/YugoGPT and mlabonne/AlphaMonarch-7B with the slerp (spherical linear interpolation) method. The merge is optimized for Serbian-language tasks: it scores strongly on Serbian LLM evaluation suites, making it suitable for applications that require robust Serbian language understanding and generation.
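For intuition, the slerp merge interpolates between the two parent models' weights along the arc of a hypersphere rather than along a straight line, which tends to preserve the geometry of each parent's weight space better than plain averaging. The sketch below is a minimal, illustrative implementation of the slerp formula on toy vectors; the actual merge (typically done with a tool such as mergekit) applies this tensor-by-tensor across all layers, and the function name and fallback threshold here are illustrative choices, not taken from the model card.

```python
import math

def slerp(v0, v1, t):
    """Spherical linear interpolation between two weight vectors.

    A minimal sketch of the slerp merge idea: interpolate along the
    arc between v0 and v1, weighted by t in [0, 1].
    """
    dot = sum(a * b for a, b in zip(v0, v1))
    norm0 = math.sqrt(sum(a * a for a in v0))
    norm1 = math.sqrt(sum(b * b for b in v1))
    # Clamp to guard against floating-point drift outside [-1, 1]
    cos_omega = max(-1.0, min(1.0, dot / (norm0 * norm1)))
    omega = math.acos(cos_omega)
    if omega < 1e-8:
        # Nearly parallel vectors: fall back to linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Halfway between two orthogonal toy "weight" vectors
merged = slerp([1.0, 0.0], [0.0, 1.0], 0.5)
print(merged)  # both components ≈ 0.7071 (unit vector at 45°)
```

Note that at t=0 or t=1 slerp returns one parent's weights unchanged, and intermediate t values blend the parents while keeping the interpolated vector on the arc between them.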
