cxrbon16/turkish-llama-MSFT-0.7
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Context Length: 8k · Published: Mar 26, 2026 · License: llama3 · Architecture: Transformer · Status: Cold

cxrbon16/turkish-llama-MSFT-0.7 is an 8-billion-parameter language model fine-tuned from ytu-ce-cosmos/Turkish-Llama-8b-v0.1. Built on the Llama architecture, it is adapted specifically for Turkish-language tasks. It was trained with a context length of 8192 tokens and reached a validation loss of 0.3326. Its primary application is Turkish natural language processing, where the fine-tuning improves performance over the base model.
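A minimal usage sketch with the Hugging Face `transformers` library, assuming the model is published under the `cxrbon16/turkish-llama-MSFT-0.7` repository id named above. The Turkish prompt, the generation parameters, and the `truncate_to_context` helper are illustrative, not part of the model card; the helper simply enforces the 8192-token context window stated above.

```python
MODEL_ID = "cxrbon16/turkish-llama-MSFT-0.7"
CTX_LEN = 8192  # context length stated on the model card


def truncate_to_context(token_ids, ctx_len=CTX_LEN):
    """Keep only the most recent ctx_len tokens so the input fits the window.

    Illustrative helper, not part of the model's own API.
    """
    return token_ids[-ctx_len:]


if __name__ == "__main__":
    # Heavy imports kept inside the guard so the helper above is importable
    # without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # FP8 weights are served quantized; bf16 is a safe local default
        device_map="auto",
    )

    # Example Turkish prompt (illustrative): "What is the capital of Turkey?"
    prompt = "Türkiye'nin başkenti neresidir?"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    inputs["input_ids"] = inputs["input_ids"][:, -CTX_LEN:]  # clamp to the context window

    output = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The truncation keeps the most recent tokens rather than the earliest, which is the usual choice for chat-style continuation where the latest context matters most.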
