tiyupi-ece/HeyTUP
Task: Text generation
Model size: 7.6B
Quantization: FP8
Context length: 32k
Concurrency cost: 1
Published: Dec 4, 2025
License: apache-2.0
Architecture: Transformer (open weights)

HeyTUP is a 7.6-billion-parameter causal language model based on Qwen2.5, developed by tiyupi-ece. It was fine-tuned using Unsloth together with Hugging Face's TRL library, enabling roughly 2x faster training. The model is intended for general language tasks and inherits the efficiency of the Qwen2.5 architecture. Its context length of 32768 tokens makes it suitable for processing longer inputs.
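As a sketch of how such a model is typically used, the snippet below loads the checkpoint with the Hugging Face `transformers` library. The repo id `tiyupi-ece/HeyTUP` comes from the card header; the presence of a chat template and the exact generation settings are assumptions, not confirmed details of this model.

```python
MODEL_ID = "tiyupi-ece/HeyTUP"  # repo id from the card header (assumed loadable via transformers)
MAX_CONTEXT = 32768             # context length stated on the card


def generate_reply(user_message: str, max_new_tokens: int = 128) -> str:
    """Load HeyTUP and generate a single-turn chat reply.

    This is a generic Qwen2.5-style usage sketch; it assumes the repo ships
    a tokenizer with a chat template, which is common but not verified here.
    """
    # Imports are deferred so the sketch can be read without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )

    # Build a single-turn prompt using the tokenizer's chat template.
    prompt = tokenizer.apply_chat_template(
        [{"role": "user", "content": user_message}],
        tokenize=False,
        add_generation_prompt=True,
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Note that inputs longer than `MAX_CONTEXT` tokens must be truncated or chunked before generation.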