simecek/cswikimistral_0.1
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 15, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights
simecek/cswikimistral_0.1 is a 7-billion-parameter Mistral-7B model, fine-tuned by simecek with 4-bit QLoRA on Czech Wikipedia data. The model is adapted for Czech natural-language-processing tasks and offers improved understanding of the Czech language. Its primary use case is as a base for further fine-tuning toward applications such as summarization and question answering in Czech.
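To give a feel for the "4-bit" part of QLoRA mentioned above, here is a minimal, illustrative sketch of absmax 4-bit quantization in plain Python. This is a deliberate simplification: the actual QLoRA method uses the NF4 data type with block-wise scaling (via the bitsandbytes library), and the function names below are hypothetical, not part of any real API.

```python
def quantize_4bit(weights):
    """Absmax 4-bit quantization sketch: map floats onto 15 of the
    16 integer levels a 4-bit code can hold, in [-7, 7].

    Illustrative only -- real QLoRA uses NF4 with block-wise scales.
    """
    scale = max(abs(w) for w in weights) or 1.0
    codes = [round(w / scale * 7) for w in weights]
    return codes, scale

def dequantize_4bit(codes, scale):
    """Recover approximate float weights from the 4-bit codes."""
    return [c / 7 * scale for c in codes]

weights = [0.12, -0.53, 0.91, -0.07]
codes, scale = quantize_4bit(weights)
approx = dequantize_4bit(codes, scale)
# Each code fits in 4 bits; reconstruction error is at most scale / 14.
```

The point of storing the frozen base weights this way is memory: each weight takes 4 bits instead of 16 or 32, which is what makes fine-tuning a 7B model feasible on a single consumer GPU while the small LoRA adapters are trained in higher precision.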