jisukim8873/mistral-7B-alpaca-case-1-2
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Apr 1, 2024
License: apache-2.0
Architecture: Transformer
Open weights · Cold

jisukim8873/mistral-7B-alpaca-case-1-2 is a 7-billion-parameter language model based on the Mistral architecture. The 'alpaca-case' naming convention suggests it is a fine-tuned variant targeting instruction-following tasks. With a 4096-token context window, it is suited to general-purpose text generation and understanding across a range of natural language processing applications.
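Since the 'alpaca-case' naming suggests fine-tuning on Alpaca-style instruction data, a minimal sketch of building a prompt in the standard Stanford Alpaca template may be useful. Whether this exact template matches the one used for this particular fine-tune is an assumption; the `format_alpaca_prompt` helper below is illustrative only.

```python
def format_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Build a prompt in the standard Stanford Alpaca format.

    Note: it is an assumption that this fine-tune was trained with this
    exact template; check the model's training configuration if available.
    """
    if input_text:
        # Variant with an additional input/context field.
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    # Instruction-only variant.
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )


prompt = format_alpaca_prompt("Summarize the Mistral architecture in one sentence.")
print(prompt)
```

The resulting string would be tokenized and passed to the model for generation; keeping the full prompt under the 4096-token context limit is the caller's responsibility.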
