Mphuc213222/Ai_interview_merged

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 29, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

Mphuc213222/Ai_interview_merged is a 7-billion-parameter instruction-tuned causal language model developed by Mphuc213222. It is finetuned from unsloth/mistral-7b-instruct-v0.2-bnb-4bit using Unsloth and Hugging Face's TRL library for accelerated training, and it is optimized for instruction-following tasks, building on the Mistral architecture's efficiency and performance.
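The card does not include a usage snippet. Below is a minimal loading sketch, assuming the merged weights are hosted on the Hugging Face Hub under the repo id above and load through the standard transformers API; it is illustrative, not taken from the card.

```python
# Minimal loading sketch (assumption: the merged checkpoint sits under the
# repo id below and loads via the standard transformers API).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mphuc213222/Ai_interview_merged"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place layers on the available GPU(s)
    torch_dtype="auto",  # use the dtype stored in the checkpoint
)
```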


Overview

As a finetuned version of unsloth/mistral-7b-instruct-v0.2-bnb-4bit, the model inherits the Mistral architecture and the instruction-following behavior of its base, packaged as a standalone 7B checkpoint.

Key Characteristics

  • Base Model: Finetuned from unsloth/mistral-7b-instruct-v0.2-bnb-4bit.
  • Training Efficiency: Trained with Unsloth and Hugging Face's TRL library, reportedly 2x faster than a standard finetuning setup (see the sketch after this list).
  • Parameter Count: 7 billion parameters, balancing capability against computational requirements.
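The card does not publish its training script. The sketch below shows the general shape of an Unsloth + TRL supervised-finetuning run on the named base model; the dataset name, sequence length, and LoRA hyperparameters are placeholders, not values from the card.

```python
# Illustrative Unsloth + TRL finetuning sketch. Assumptions: dataset name,
# max_seq_length, and LoRA hyperparameters are placeholders; the real
# training configuration is not published on the model card.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

# Load the 4-bit base model the card names, with Unsloth's patched kernels.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/mistral-7b-instruct-v0.2-bnb-4bit",
    max_seq_length=4096,  # matches the 4k context length listed above
    load_in_4bit=True,
)

# Attach LoRA adapters; rank and target modules are typical defaults.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

dataset = load_dataset("your/interview-qa-dataset", split="train")  # placeholder

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",  # assumes a pre-formatted text column
    max_seq_length=4096,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()

# Merge the adapters into a standalone checkpoint, as the "merged" name implies.
model.save_pretrained_merged("Ai_interview_merged", tokenizer)
```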

Use Cases

This model is suitable for applications that need a 7B instruction-tuned model, particularly where efficient training and the strengths of the Mistral architecture are beneficial. Its finetuning suggests proficiency in understanding and executing instructions, as illustrated by the prompt sketch below.
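Mistral-Instruct models expect the [INST] ... [/INST] chat format, which transformers can apply automatically via the tokenizer's chat template. A hedged inference sketch, assuming the model keeps its base's chat template; the prompt content is a made-up example:

```python
# Illustrative instruction-following call (assumption: the model inherits
# the Mistral-Instruct chat template from its base model).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mphuc213222/Ai_interview_merged"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "user", "content": "Give me three common behavioral interview questions."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```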