haihp02/environment-ttt_Qwen_Qwen3-4B-Instruct-2507
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Feb 15, 2026 · Architecture: Transformer

The haihp02/environment-ttt_Qwen_Qwen3-4B-Instruct-2507 model is a 4-billion-parameter instruction-tuned language model published by haihp02, built on Qwen's Qwen3-4B-Instruct-2507. It offers a 32,768-token context length, large enough to handle long documents and extended multi-turn conversations in a single prompt. The model targets general-purpose instruction following, making it suitable for a wide range of natural language processing tasks.
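When serving a model with a fixed 32,768-token context window, chat history must be kept within that budget. The sketch below is a minimal, hypothetical illustration of one common strategy: dropping the oldest messages first. The `count_tokens` helper here is a crude whitespace proxy invented for this example; a real deployment would count tokens with the model's own tokenizer.

```python
# Sketch: keep a chat history within a fixed context budget by dropping
# the oldest messages first. Assumes a 32,768-token window, matching the
# model's advertised context length.

CTX_LIMIT = 32_768

def count_tokens(text: str) -> int:
    # Rough whitespace proxy (hypothetical); real counts come from the
    # model's tokenizer, not from splitting on spaces.
    return len(text.split())

def trim_history(messages: list[dict], limit: int = CTX_LIMIT) -> list[dict]:
    """Walk the history newest-first, keeping messages until the budget
    is exhausted. The most recent message is always retained."""
    kept: list[dict] = []
    total = 0
    for msg in reversed(messages):
        cost = count_tokens(msg["content"])
        if kept and total + cost > limit:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))
```

For example, with a budget of 3 proxy tokens, a three-message history whose contents cost 3, 2, and 1 tokens respectively is trimmed to the last two messages.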
