smotoc/foxy_mistral7B_unsloth
Text Generation · 7B parameters · FP8 quantization · 4k context length · Concurrency cost: 1 · Published: Feb 7, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
smotoc/foxy_mistral7B_unsloth is a 7-billion-parameter Mistral-based causal language model developed by smotoc. It was fine-tuned from unsloth/mistral-7b-bnb-4bit using Unsloth and Hugging Face's TRL library for faster training. The model targets general language generation tasks, leveraging the Mistral architecture for efficient inference. Its 4096-token context length supports applications with moderate input and output sequences.
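A minimal usage sketch with the Hugging Face transformers library, assuming the repository is available on the Hub under this id and that your hardware can host a 7B model. The `truncate_to_context` helper is a hypothetical convenience added here to illustrate budgeting prompt tokens against the 4096-token window; it is not part of the model card.

```python
MODEL_ID = "smotoc/foxy_mistral7B_unsloth"  # assumed Hub repo id
MAX_CTX = 4096  # context length reported on the model card


def truncate_to_context(token_ids, max_new_tokens, max_ctx=MAX_CTX):
    """Keep only the most recent prompt tokens so that
    prompt + generated tokens fit inside the context window."""
    budget = max_ctx - max_new_tokens
    return token_ids[-budget:]


if __name__ == "__main__":
    # Heavy imports kept inside the entry point so the helper above
    # stays dependency-free; requires `pip install transformers torch`.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    ids = tokenizer("Write a haiku about foxes.", return_tensors="pt").input_ids
    ids = ids[:, truncate_to_context(list(range(ids.shape[1])), 128)[0]:]
    out = model.generate(ids, max_new_tokens=128)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

The truncation keeps the tail of the prompt rather than the head, on the assumption that the most recent context matters most for continuation; swap the slice direction if your application needs the opening tokens preserved instead.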