cyLee-g/fyp-qwen
Text generation
Model size: 7.6B
Quantization: FP8
Context length: 32k
Concurrency cost: 1
Published: Apr 7, 2026
License: AFL-3.0
Architecture: Transformer

cyLee-g/fyp-qwen is a 7.6-billion-parameter instruction-tuned causal language model based on the Qwen/Qwen2.5-7B-Instruct architecture. It is intended for general text generation, inheriting the capabilities of its base model across diverse applications. Its context length of 32,768 tokens makes it suitable for processing and generating longer sequences of text.
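A rough back-of-the-envelope sketch of why the FP8 quantization listed above matters for a model of this size: at one byte per parameter, the weights alone need about half the memory of FP16/BF16 and a quarter of FP32. The parameter count (7.6B) comes from the model card; the byte widths are the standard sizes for each numeric format, and the estimate covers weights only (KV cache and activations add more, especially at the full 32k context).

```python
# Approximate weight-only memory footprint for a 7.6B-parameter model
# in common numeric formats. Weights only: KV cache, activations, and
# framework overhead are not included.

PARAMS = 7.6e9  # parameter count, from the model card

BYTES_PER_PARAM = {
    "FP32": 4,       # full precision
    "FP16/BF16": 2,  # half precision
    "FP8": 1,        # the quantization this model ships with
}

def weight_gib(fmt: str) -> float:
    """Approximate weight memory in GiB for the given format."""
    return PARAMS * BYTES_PER_PARAM[fmt] / 2**30

for fmt, width in BYTES_PER_PARAM.items():
    print(f"{fmt}: ~{weight_gib(fmt):.1f} GiB")
```

This puts the FP8 weights at roughly 7.1 GiB, versus about 14.2 GiB for FP16/BF16, which is the difference between fitting on a single 16 GB accelerator with headroom for the KV cache or not.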
