Ishwaryas/mongo-mistral-merged
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Architecture: Transformer · Published: Mar 25, 2026

Ishwaryas/mongo-mistral-merged is a 7-billion-parameter language model developed by Ishwaryas. As the name suggests, it is a merged variant, likely produced by combining a Mistral base model with one or more fine-tuned checkpoints via model-merging techniques. With a context length of 4096 tokens, it targets general language understanding and generation tasks, balancing output quality against computational cost.
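The actual merge recipe for this model is not documented. As a purely illustrative sketch, a Mistral-7B merge is often expressed as a mergekit configuration like the one below; the source model names, layer ranges, and SLERP method here are assumptions, not the real recipe used for this model.

```yaml
# Hypothetical mergekit config for a Mistral-7B-style merge.
# Both source models and the merge method are placeholders.
slices:
  - sources:
      - model: mistralai/Mistral-7B-v0.1        # assumed base checkpoint
        layer_range: [0, 32]
      - model: example-org/mistral-7b-finetune  # placeholder fine-tune
        layer_range: [0, 32]
merge_method: slerp                 # spherical interpolation of weights
base_model: mistralai/Mistral-7B-v0.1
parameters:
  t: 0.5                            # interpolation factor between the two
dtype: bfloat16
```

Running `mergekit-yaml` on such a config would write a merged checkpoint that loads like any other Mistral-architecture model.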
