jisukim8873/mistral-7B-alpaca-case-3-2
TEXT GENERATION
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Ctx Length: 4k
Published: Apr 1, 2024
License: apache-2.0
Architecture: Transformer
Tags: Open Weights, Cold

jisukim8873/mistral-7B-alpaca-case-3-2 is a 7-billion-parameter language model, apparently a Mistral 7B base fine-tuned on Alpaca-style instruction data, as the repository name suggests. With a 4096-token context window, it is suited to tasks with moderate context requirements. The model card does not document the fine-tuning procedure or data, so the exact intended use case remains unclear.
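A minimal sketch of working within the 4096-token context window, plus the typical Hugging Face loading route. The assumption that this repository is a standard `transformers` causal-LM checkpoint is not confirmed by the model card; the `generation_budget` helper is illustrative, not part of any published API.

```python
CTX_LEN = 4096  # context window stated on the model card


def generation_budget(prompt_tokens: int, ctx_len: int = CTX_LEN) -> int:
    """Tokens left for generation once the prompt occupies part of the
    fixed context window (prompt + completion must fit in ctx_len)."""
    return max(ctx_len - prompt_tokens, 0)


def load_model():
    """Typical transformers loading route. Assumption: the repo hosts a
    standard causal-LM checkpoint. Not executed here (7B download)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "jisukim8873/mistral-7B-alpaca-case-3-2"
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(repo)
    return tokenizer, model
```

For example, a 1024-token prompt leaves 3072 tokens of room for the completion; prompts at or beyond 4096 tokens leave none and must be truncated first.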
