Thrillcrazyer/QWEN7_THIP
Text Generation | Concurrency Cost: 1 | Model Size: 7.6B | Quant: FP8 | Ctx Length: 32k | Published: Nov 27, 2025 | Architecture: Transformer

Thrillcrazyer/QWEN7_THIP is a 7.6-billion-parameter instruction-tuned causal language model, fine-tuned by Thrillcrazyer from the Qwen-7B_THIP base model. It advertises a 131,072-token context window, making it suitable for tasks that require extensive contextual understanding. The model was optimized through Supervised Fine-Tuning (SFT) using the TRL framework, improving its ability to follow instructions and generate coherent text.
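As an instruction-tuned causal language model, it can be queried in the usual chat style via the Hugging Face transformers library. The sketch below is a minimal, hedged example assuming the model id on this card resolves on the Hub and that the tokenizer ships a chat template (both are assumptions, not confirmed by the card); `build_messages` and `generate` are illustrative helper names, not part of the model's own API.

```python
# Hypothetical usage sketch for Thrillcrazyer/QWEN7_THIP with transformers.
# Assumes: the model id below is resolvable on the Hugging Face Hub and the
# tokenizer defines a chat template (typical for instruction-tuned models).

MODEL_ID = "Thrillcrazyer/QWEN7_THIP"  # assumed Hub id, taken from this card


def build_messages(user_prompt: str) -> list[dict]:
    """Build a chat-style message list for an instruction-tuned model."""
    return [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": user_prompt},
    ]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model lazily and run one chat-formatted generation."""
    # Imported inside the function so the module is importable without
    # transformers/torch installed (the model download is several GB).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Render the message list through the tokenizer's chat template.
    text = tokenizer.apply_chat_template(
        build_messages(prompt), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(text, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Summarize the benefits of long-context language models."))
```

With FP8 quantization at 7.6B parameters, the weights alone are roughly 8 GB, so `device_map="auto"` is used to let transformers place layers across available GPU/CPU memory.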
