IlyaGusev/saiga_mistral_7b_merged
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Nov 22, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights

IlyaGusev/saiga_mistral_7b_merged is a 7-billion-parameter language model based on the Mistral architecture, developed by IlyaGusev. It is a merged version of a LoRA fine-tune: the adapter weights have been folded back into the base model, so it ships as a standalone checkpoint. As a fine-tune, it targets the tasks it was trained on rather than general-purpose use. It is available in quantized formats, making it suitable for efficient inference across diverse hardware.
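Because the adapters are already merged, the checkpoint can in principle be loaded like any standard Hugging Face causal LM. The sketch below illustrates that; only the repo id comes from this card, while the dtype and device settings are illustrative assumptions, not a documented recipe.

```python
# Minimal usage sketch (assumed workflow, not an official recipe):
# a merged LoRA checkpoint needs no PEFT adapter loading step.

MODEL_ID = "IlyaGusev/saiga_mistral_7b_merged"  # repo id from this card

def load_model(model_id: str = MODEL_ID):
    """Lazily import transformers and load tokenizer + model.

    The import lives inside the function so the sketch itself has no
    hard dependency until you actually call it (the download is ~7B
    parameters, so this is deliberately not run at module import).
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # let transformers pick fp16/bf16 if available
        device_map="auto",    # spread layers across available devices
    )
    return tokenizer, model
```

Calling `load_model()` downloads the full merged weights; for the quantized variants mentioned above, a quantization config (e.g. bitsandbytes or an FP8 runtime) would be supplied at load time instead.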
