Shiyu-Lab/Inputoutput_SFT_Qwen3_4B
Text generation · Model size: 4B · Quantization: BF16 · Context length: 32k · Published: Nov 2, 2025 · License: MIT · Architecture: Transformer · Open weights

Shiyu-Lab/Inputoutput_SFT_Qwen3_4B is a 4-billion-parameter language model based on the Qwen3 architecture, developed by Shiyu-Lab. The model is instruction-tuned and supports a 32,768-token context length, making it suitable for tasks that require extensive contextual understanding. Its primary application is general-purpose language generation and understanding, with the large context window enabling long, complex prompts.
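A minimal usage sketch, assuming the checkpoint follows the standard Hugging Face `transformers` layout for Qwen3-style causal LMs with a chat template (the `generate_reply` helper and its parameters are illustrative, not part of the official card):

```python
MODEL_ID = "Shiyu-Lab/Inputoutput_SFT_Qwen3_4B"


def build_messages(user_prompt: str) -> list[dict]:
    """Chat-format messages for an instruction-tuned model."""
    return [{"role": "user", "content": user_prompt}]


def generate_reply(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model in BF16 and generate a reply to a single user turn.

    transformers/torch are imported lazily so the prompt helpers above
    work even without these (heavy) dependencies installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    # Apply the model's chat template and append the assistant prefix.
    text = tokenizer.apply_chat_template(
        build_messages(prompt), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Strip the prompt tokens, keep only the newly generated reply.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

With a GPU (or enough RAM for a 4B BF16 model, roughly 8 GB of weights), `generate_reply("Summarize this document.")` would return the model's answer; on constrained hardware, a quantized load is advisable instead.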
