Mphuc213222/Ai_interview_merged
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Mar 29, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

Mphuc213222/Ai_interview_merged is a 7-billion-parameter instruction-tuned causal language model developed by Mphuc213222. It is fine-tuned from unsloth/mistral-7b-instruct-v0.2-bnb-4bit using Unsloth together with Hugging Face's TRL library for accelerated training. Building on the efficiency and performance of the Mistral architecture, the model is optimized for instruction-following tasks.
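Since the model derives from Mistral-7B-Instruct-v0.2, prompts should follow the Mistral instruct chat format, with user turns wrapped in `[INST] ... [/INST]` tags. Below is a minimal sketch of a prompt-building helper under the assumption that the merged model keeps the base model's chat template; the function name is illustrative, and in practice `tokenizer.apply_chat_template` from the `transformers` library handles this automatically.

```python
def build_mistral_prompt(messages):
    """Assemble a Mistral-instruct style prompt string.

    Assumption: the merged model retains the v0.2 instruct template,
    i.e. "<s>[INST] user [/INST] assistant</s>[INST] ...".
    `messages` is a list of {"role": ..., "content": ...} dicts.
    """
    prompt = "<s>"
    for msg in messages:
        if msg["role"] == "user":
            prompt += f"[INST] {msg['content']} [/INST]"
        elif msg["role"] == "assistant":
            # Assistant replies are appended verbatim and closed with </s>.
            prompt += f" {msg['content']}</s>"
    return prompt
```

For example, a single user turn `"Tell me about transformers."` yields `"<s>[INST] Tell me about transformers. [/INST]"`, which can then be tokenized and passed to the model for generation.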