tartuNLP/Tahetorn_9B
Text generation · Model size: 9B · Quantization: FP8 · Context length: 16k · Architecture: Transformer · Concurrency cost: 1 · Published: Dec 30, 2025
tartuNLP/Tahetorn_9B is a 9-billion-parameter language model developed by tartuNLP. It is presented as a general-purpose model; the current model card does not detail its differentiators, training data, or primary use cases. Its 16,384-token context length suggests suitability for tasks that require extended contextual understanding. Further information is needed to identify its unique strengths or specialized applications.
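A back-of-envelope sketch of what the listed specs imply for serving memory. This is not from the model card: it assumes FP8 stores one byte per parameter and ignores KV cache, activations, and runtime overhead, which add on top of the weight footprint.

```python
def weight_memory_gib(n_params: float, bytes_per_param: float = 1.0) -> float:
    """Approximate weight memory in GiB.

    bytes_per_param=1.0 corresponds to FP8 storage (an assumption,
    not a figure stated on the model card).
    """
    return n_params * bytes_per_param / 2**30


# 9B parameters at FP8 (1 byte each):
fp8_gib = weight_memory_gib(9e9)        # ≈ 8.4 GiB for weights alone
# The same model at FP16 (2 bytes each) would need roughly double:
fp16_gib = weight_memory_gib(9e9, 2.0)  # ≈ 16.8 GiB
print(f"FP8: {fp8_gib:.1f} GiB, FP16: {fp16_gib:.1f} GiB")
```

This illustrates why FP8 quantization matters at this scale: the weights fit comfortably on a single 16 GiB accelerator, whereas the FP16 variant would not.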