Tweeties/tweety-7b-tatar-v24a
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Apr 11, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Tweeties/tweety-7b-tatar-v24a is a 7-billion-parameter language model developed by François Remy (UGent), Alfiya Khabibullina (BeCode), et al., based on Mistral-7B-Instruct-v0.2. The model was trans-tokenized and fine-tuned for the Tatar language, replacing the original tokenizer with a novel native Tatar tokenizer. It is designed for basic language-modeling operations in Tatar, performs best in few-shot settings, and can be further fine-tuned for more complex tasks.
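Few-shot use typically means prepending a handful of demonstration input/output pairs to the query so the model can infer the task from context. Below is a minimal sketch of building such a prompt; the example pairs and the `{src} => {tgt}` template are illustrative assumptions, not a format prescribed by the model card, and the commented lines show how generation would be run with Hugging Face `transformers` (downloading the 7B weights).

```python
# Sketch: constructing a few-shot prompt for a Tatar completion task.
# The demonstration pairs and prompt template are illustrative assumptions;
# the model card does not prescribe a specific prompt format.

def build_few_shot_prompt(examples, query, template="{src} => {tgt}"):
    """Join demonstration pairs and the final query into one prompt string."""
    lines = [template.format(src=src, tgt=tgt) for src, tgt in examples]
    # Leave the target of the final query empty for the model to complete.
    lines.append(template.format(src=query, tgt="").rstrip())
    return "\n".join(lines)

examples = [
    ("hello", "сәлам"),      # hypothetical English -> Tatar pairs
    ("thank you", "рәхмәт"),
]
prompt = build_few_shot_prompt(examples, "good morning")
print(prompt)

# Generation itself (requires downloading the model weights) would look like:
# from transformers import AutoTokenizer, AutoModelForCausalLM
# tok = AutoTokenizer.from_pretrained("Tweeties/tweety-7b-tatar-v24a")
# model = AutoModelForCausalLM.from_pretrained("Tweeties/tweety-7b-tatar-v24a")
# out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=16)
```

Keeping the prompt-building step separate from the generation call makes it easy to vary the number of demonstrations, which is the main knob in few-shot settings.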
