rahulpuri54/Merge_base_model_30_adapters
TEXT GENERATION | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Mar 23, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold

rahulpuri54/Merge_base_model_30_adapters is a 7-billion-parameter, Mistral-based, instruction-tuned causal language model developed by rahulpuri54. It was fine-tuned from unsloth/mistral-7b-instruct-v0.3-bnb-4bit using Unsloth together with Hugging Face's TRL library, a combination reported to enable roughly 2x faster training. The model targets general instruction-following tasks, pairing the Mistral architecture with this efficient training pipeline.
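As an instruction-tuned causal language model, it can be queried with the standard Hugging Face transformers API. The sketch below is a minimal example, assuming the repository is available on the Hub under the id shown and that it follows the Mistral v0.3 instruct prompt format; verify both against the hosted files before use.

```python
# Minimal sketch of querying the merged model with transformers.
# The repo id comes from this model card; the chat-template details
# and hardware settings below are assumptions, not confirmed specifics.


def build_prompt(user_message: str) -> str:
    """Wrap a user message in Mistral-instruct [INST] tags.

    The BOS token is omitted here because the tokenizer adds it
    automatically when add_special_tokens=True (the default).
    """
    return f"[INST] {user_message} [/INST]"


def generate(user_message: str, max_new_tokens: int = 128) -> str:
    """Load the model and generate a completion (requires GPU/RAM for 7B)."""
    # Imports are deferred so the prompt helper works without transformers.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "rahulpuri54/Merge_base_model_30_adapters"
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(
        repo, torch_dtype=torch.bfloat16, device_map="auto"
    )
    inputs = tokenizer(build_prompt(user_message), return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Prompt construction alone is cheap; actual generation needs the weights.
    print(build_prompt("Summarize the Mistral 7B architecture."))
```

Since the adapters are already merged into the base weights, no PEFT/LoRA loading step is needed; the model loads like any plain Mistral checkpoint.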
