Remostart/Plutus_Tutor_model
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm
Remostart/Plutus_Tutor_model is a 4-billion-parameter instruction-tuned causal language model, fine-tuned from Qwen/Qwen3-4B-Instruct-2507, with a context length of 40960 tokens. Its exact training data and primary differentiators are not publicly detailed, but it inherits the Qwen3 architecture, making it suitable for tasks that call for a compact yet capable language model.
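Since the card documents no usage snippet, the following is a minimal sketch of single-turn chat inference via the Hugging Face `transformers` library, assuming the model ships a Qwen3-style chat template. The system-prompt wording and the `generate` helper are illustrative, not part of the card.

```python
def build_messages(question: str) -> list[dict]:
    """Format a single-turn chat prompt in the Qwen-style message schema."""
    # The system prompt here is an assumption; adjust to your use case.
    return [
        {"role": "system", "content": "You are a helpful tutor."},
        {"role": "user", "content": question},
    ]


def generate(question: str, max_new_tokens: int = 256) -> str:
    """Run one chat turn. Heavy: downloads the BF16 weights on first call."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Remostart/Plutus_Tutor_model"
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )
    # Render the message list through the model's own chat template.
    prompt = tok.apply_chat_template(
        build_messages(question), tokenize=False, add_generation_prompt=True
    )
    inputs = tok(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tok.decode(
        out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )
```

Because the model is served warm with a concurrency cost of 1, a hosted-inference endpoint may be more practical than local loading for most applications.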