rccmsu/ruadapt_mistral_7b_v0.1
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · License: apache-2.0 · Architecture: Transformer · Open weights

rccmsu/ruadapt_mistral_7b_v0.1 is a 7-billion-parameter model based on Mistral-7B-v0.1, fine-tuned by rccmsu for Russian language adaptation. The adaptation replaced the original tokenizer and then trained the embeddings and language model head on a 33 GB Russian dataset, followed by additional LoRA training. The model is designed to improve performance on Russian language tasks, though its reported metrics are slightly worse than the original Mistral-7B's on several benchmark datasets.
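A minimal usage sketch with the Hugging Face `transformers` library, assuming `transformers` and `torch` are installed and the checkpoint is reachable on the Hub (the prompt and generation parameters here are illustrative, not from the model card):

```python
def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Complete a Russian prompt with rccmsu/ruadapt_mistral_7b_v0.1.

    Imports are deferred so the function definition itself has no
    heavy dependencies; loading the 7B checkpoint happens on first call.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "rccmsu/ruadapt_mistral_7b_v0.1"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # device_map="auto" places weights on GPU if one is available.
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Example Russian prompt ("Moscow is"); output is model-dependent.
    print(generate("Москва — это"))
```

Because the tokenizer was replaced during adaptation, always load the tokenizer from this checkpoint rather than from the base Mistral-7B-v0.1 repository.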
