doandune/LexGuard-Mistral-Risk-Merged

Text generation · 7B parameters · FP8 quantization · 4k context length · Concurrency cost: 1 · Published: Mar 14, 2026 · License: Apache 2.0 · Architecture: Transformer · Open weights

LexGuard-Mistral-Risk-Merged is a 7 billion parameter Mistral-based causal language model developed by doandune, fine-tuned from unsloth/mistral-7b-instruct-v0.2-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, enabling roughly 2x faster fine-tuning. It targets applications that need a Mistral 7B model with efficient training characteristics.


LexGuard-Mistral-Risk-Merged Overview

LexGuard-Mistral-Risk-Merged is a 7 billion parameter language model developed by doandune, built on the Mistral architecture. It is fine-tuned from the unsloth/mistral-7b-instruct-v0.2-bnb-4bit base model, using the Unsloth library and Hugging Face's TRL for efficient training.
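Because the base model is Mistral-7B-Instruct-v0.2, prompts presumably follow the standard Mistral `[INST] ... [/INST]` instruct template. The card itself does not state the template, so the sketch below is an assumption based on the base model's documented format (v0.2 has no dedicated system role, so any system note is prepended to the user turn):

```python
from typing import Optional


def build_prompt(user_message: str, system: Optional[str] = None) -> str:
    """Format one user turn in the Mistral instruct style (assumed template).

    Mistral-7B-Instruct-v0.2 wraps the turn in [INST] ... [/INST] after the
    BOS token; it has no separate system role, so a system note, if given,
    is simply prepended to the message.
    """
    content = f"{system}\n\n{user_message}" if system else user_message
    return f"<s>[INST] {content} [/INST]"


# Example:
#   build_prompt("Summarize this contract clause.")
#   → "<s>[INST] Summarize this contract clause. [/INST]"
```

In practice, applying the tokenizer's own `apply_chat_template` (if the repository ships a chat template) is safer than hand-building strings.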

Key Capabilities

  • Efficient Fine-tuning: Trains roughly 2x faster thanks to Unsloth's optimizations.
  • Mistral 7B Foundation: Inherits the strong general language understanding and generation capabilities of the Mistral 7B Instruct v0.2 model.
  • Apache 2.0 License: Provides flexibility for commercial and research use.
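Given the 7B parameter count and FP8 quantization listed in the card metadata, a back-of-the-envelope weight-memory estimate illustrates the footprint at different precisions (a rough sketch: it assumes exactly 7.0e9 parameters and ignores activations, KV cache, and runtime overhead):

```python
# Rough weight-memory estimate for a 7B-parameter model.
# Assumes exactly 7.0e9 parameters; the real model is slightly larger,
# and this excludes activations, KV cache, and framework overhead.
PARAMS = 7.0e9
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "fp8": 1}


def weight_gb(precision: str) -> float:
    """Approximate weight memory in GB at the given precision."""
    return PARAMS * BYTES_PER_PARAM[precision] / 1e9


for p in ("fp32", "fp16", "fp8"):
    print(f"{p}: ~{weight_gb(p):.0f} GB")
# fp32: ~28 GB, fp16: ~14 GB, fp8: ~7 GB
```

This is why the FP8 variant roughly halves the memory needed versus an FP16 deployment of the same model.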

Good For

  • Developers seeking a Mistral 7B model with a focus on accelerated fine-tuning.
  • Applications where efficient resource utilization during training is critical.
  • Projects requiring a robust, instruction-tuned base for further specialization.
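For developers evaluating the model, a minimal loading sketch using the Hugging Face transformers API is shown below. This is an assumption about how the merged checkpoint is consumed (the card does not document a loading recipe); imports are deferred inside the function so the sketch can be read without the packages or a network connection installed:

```python
def load_lexguard(model_id: str = "doandune/LexGuard-Mistral-Risk-Merged"):
    """Return (model, tokenizer) for the checkpoint via transformers.

    Sketch only: assumes the repository hosts a standard merged causal-LM
    checkpoint loadable with AutoModelForCausalLM. Imports are deferred so
    this file can be inspected or imported without transformers installed.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # or torch.bfloat16 on supported GPUs
        device_map="auto",          # requires the accelerate package
    )
    return model, tokenizer


# Usage (requires a GPU and network access; not executed here):
#   model, tok = load_lexguard()
#   inputs = tok("[INST] Summarize the risk factors. [/INST]",
#                return_tensors="pt").to(model.device)
#   out = model.generate(**inputs, max_new_tokens=128)
#   print(tok.decode(out[0], skip_special_tokens=True))
```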