Tpuser32/mistral-7b-rl-resumeur-struct
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · License: apache-2.0 · Architecture: Transformer · Open weights · Cold
Tpuser32/mistral-7b-rl-resumeur-struct is a 7-billion-parameter Mistral-based causal language model developed by Tpuser32. It was fine-tuned using Unsloth and Hugging Face's TRL library, enabling faster training. It is designed for general text generation tasks, with a Mistral architecture and a 4096-token context length.
Model Overview
This model, Tpuser32/mistral-7b-rl-resumeur-struct, is a 7 billion parameter language model based on the Mistral architecture. It was developed by Tpuser32 and fine-tuned from unsloth/mistral-7b-bnb-4bit.
Key Characteristics
- Architecture: Mistral-7B base model.
- Training: Fine-tuned with Unsloth and Hugging Face's TRL library, enabling roughly 2x faster training.
- License: Distributed under the Apache-2.0 license.
Good For
- Users seeking a fine-tuned Mistral-7B variant.
- Exploring models trained with Unsloth for efficiency benefits.
- General text generation and understanding tasks where a 7B parameter model is suitable.
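As a minimal sketch of how a model like this could be used, the snippet below loads it with the standard Hugging Face `transformers` auto classes and trims the prompt to fit the 4096-token context window before generating. The `truncate_to_context` helper and the `generate` wrapper are illustrative names, not part of this repository; the exact prompt format expected by this fine-tune is not documented here.

```python
# Illustrative sketch: prompting a Mistral-7B fine-tune with transformers.
# MODEL_ID and CONTEXT_LENGTH come from the model card; everything else
# is an assumption for demonstration purposes.

MODEL_ID = "Tpuser32/mistral-7b-rl-resumeur-struct"
CONTEXT_LENGTH = 4096  # context length stated on the model card


def truncate_to_context(token_ids, max_new_tokens=256, ctx=CONTEXT_LENGTH):
    """Keep only the most recent tokens, leaving room for generation."""
    budget = ctx - max_new_tokens
    return token_ids[-budget:]


def generate(prompt, max_new_tokens=256):
    """Load the model and generate a completion.

    Requires `pip install transformers torch` and downloads ~7B weights,
    so this is not something to call casually on a laptop.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    ids = tokenizer(prompt, return_tensors="pt").input_ids[0].tolist()
    ids = truncate_to_context(ids, max_new_tokens)
    out = model.generate(
        torch.tensor([ids]).to(model.device),
        max_new_tokens=max_new_tokens,
    )
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

Since `generate` is only defined, not called, the truncation helper can be exercised on its own; with the defaults it keeps the last 4096 − 256 = 3840 tokens of an over-long prompt.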