sharthakdey/interviewer-model
Task: Text generation
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Ctx Length: 4k
Published: Mar 19, 2026
License: apache-2.0
Architecture: Transformer
Weights: Open

The sharthakdey/interviewer-model is a 7-billion-parameter, Mistral-based, instruction-tuned causal language model developed by sharthakdey. It was finetuned from unsloth/mistral-7b-instruct-v0.2-bnb-4bit using Unsloth together with Hugging Face's TRL library for accelerated finetuning. The model is designed for conversational tasks, particularly those mimicking an interview scenario, and relies on its Mistral architecture for effective instruction following.
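Since the model is finetuned from Mistral-7B-Instruct-v0.2, it presumably inherits that base model's `[INST] ... [/INST]` chat format. The sketch below shows how an interview-style conversation could be rendered into that prompt format; the `build_prompt` helper is a hypothetical illustration, not part of the model's published tooling (in practice, `tokenizer.apply_chat_template` from the `transformers` library would do this for you).

```python
def build_prompt(turns):
    """Render alternating (role, text) turns into the Mistral-v0.2
    instruct format: user turns are wrapped in [INST] ... [/INST],
    assistant turns follow and are closed with </s>. The final user
    turn is left open so the model can generate the next answer."""
    prompt = "<s>"
    for role, text in turns:
        if role == "user":
            prompt += f"[INST] {text} [/INST]"
        else:  # assistant turn
            prompt += f" {text}</s>"
    return prompt

# Example: a single interviewer-style question awaiting a reply.
print(build_prompt([
    ("user", "Tell me about a time you resolved a conflict at work."),
]))
```

The resulting string can be tokenized and passed to the model for generation; multi-turn histories simply append prior assistant answers before the next `[INST]` block.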
