abacaj/mistral-7b-sft
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Oct 2, 2023 · Architecture: Transformer · Concurrency cost: 1
The abacaj/mistral-7b-sft model is a 7-billion-parameter Mistral-based language model fine-tuned for instruction following. It processes prompts and generates coherent responses, with capabilities in logical reasoning and general question answering. Its 4096-token context length makes it suitable for tasks that require understanding and generating moderately long texts, and its primary strength is following instructions and giving direct answers to queries.
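A minimal sketch of prompting the model via the Hugging Face `transformers` library. The instruction template below is an assumption for illustration, not the model's documented format; check the model card for the exact prompt style used during fine-tuning.

```python
MAX_CONTEXT = 4096  # the model's context length


def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in a simple template (assumed, not official)."""
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    """Generate a response, reserving room in the context for new tokens."""
    # Imported lazily so build_prompt stays usable without the heavy deps.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("abacaj/mistral-7b-sft")
    model = AutoModelForCausalLM.from_pretrained("abacaj/mistral-7b-sft")

    inputs = tokenizer(
        build_prompt(instruction),
        return_tensors="pt",
        truncation=True,
        max_length=MAX_CONTEXT - max_new_tokens,  # leave space for the reply
    )
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

Loading the full model requires substantial memory; in practice an FP8-quantized deployment (as listed above) or a hosted inference endpoint is the more typical way to run it.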