jisukim8873/mistral-7B-alpaca-case-0-2
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Apr 1, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

jisukim8873/mistral-7B-alpaca-case-0-2 is a 7-billion-parameter language model, likely based on the Mistral 7B architecture and, judging by its name, fine-tuned on Alpaca-style instruction data for general language generation. The model is shared by jisukim8873 and supports a context length of 4096 tokens. It is suited to instruction-following and free-form text generation across a variety of natural language processing applications.
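Since the name suggests an Alpaca-style fine-tune, prompts presumably follow the standard Alpaca instruction template. The sketch below shows that template and how the model could be loaded with the Hugging Face `transformers` library; both the template and the loading code are assumptions, as the model card does not document a prompt format.

```python
# Sketch: prompting jisukim8873/mistral-7B-alpaca-case-0-2.
# Assumption: the fine-tune uses the standard Alpaca instruction template;
# the exact prompt format is not documented on the model card.

ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def build_alpaca_prompt(instruction: str) -> str:
    """Wrap a user instruction in the assumed Alpaca template."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

prompt = build_alpaca_prompt("Summarize what a context window is in one sentence.")

# Loading the weights needs roughly 14 GB in FP16, so the download and
# generation are guarded; flip the condition to actually run inference.
if False:
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "jisukim8873/mistral-7B-alpaca-case-0-2"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    # Keep prompt plus output within the 4096-token context window.
    outputs = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The prompt builder is the only part that runs without the weights; everything model-specific stays behind the guard so the snippet can be read and tested offline.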
