jisukim8873/mistral-7B-alpaca-case-2-2
Text Generation
- Concurrency Cost: 1
- Model Size: 7B
- Quant: FP8
- Ctx Length: 4k
- Published: Apr 1, 2024
- License: apache-2.0
- Architecture: Transformer
- Tags: Open Weights, Cold

jisukim8873/mistral-7B-alpaca-case-2-2 is a 7-billion-parameter language model based on the Mistral architecture. The "alpaca-case" in its name suggests it was fine-tuned for instruction following, likely on Alpaca-style data. With a 4,096-token context window, it is suited to general text generation and understanding tasks where a compact yet capable model is required.
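A minimal sketch of running the model locally with Hugging Face `transformers` is shown below. The card does not specify an inference stack or prompt format, so both are assumptions here: the prompt builder follows the standard Alpaca instruction template that the "alpaca-case" name hints at, and the generation call is a generic `transformers` example.

```python
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Format a prompt in the Alpaca instruction style.

    Assumption: the 'alpaca-case' name suggests Alpaca-style fine-tuning;
    the model card itself does not document a prompt template.
    """
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )


if __name__ == "__main__":
    # Heavy imports kept here so the prompt helper is usable without them.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "jisukim8873/mistral-7B-alpaca-case-2-2"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = build_alpaca_prompt("Explain what a context window is.")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

    # The model has a 4k (4096-token) context; keep prompt + output inside it.
    output = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:],
                           skip_special_tokens=True))
```

The generation step is gated behind `__main__` because loading 7B weights requires a sizable download and GPU or ample RAM; the prompt helper works standalone.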
