glorgao/Qwen2.5-7B-SFT
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Context length: 32k · Published: Jul 14, 2025 · Architecture: Transformer
glorgao/Qwen2.5-7B-SFT is a 7.6-billion-parameter instruction-tuned causal language model built on the Qwen2.5 architecture and published by glorgao. It targets general-purpose language understanding and generation, supporting a context length of up to 131072 tokens so it can handle long inputs and stay coherent across extended conversations or documents. Typical uses include text completion, summarization, and question answering.
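A minimal sketch of loading the model for text generation with the Hugging Face `transformers` library, assuming the checkpoint is hosted on the Hugging Face Hub under this repository ID and that the prompt and decoding parameters shown are illustrative choices, not part of the model card:

```python
MODEL_ID = "glorgao/Qwen2.5-7B-SFT"  # repository ID from this model card

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model and generate a completion for `prompt`.

    Note: the first call downloads the full 7.6B-parameter checkpoint,
    so this requires substantial disk space, memory, and ideally a GPU.
    """
    # Imports kept inside the function so the module can be inspected
    # without `transformers`/`torch` installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # use the dtype stored in the checkpoint
        device_map="auto",    # place weights on GPU(s) if available
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

For example, `generate("Summarize the causes of the French Revolution.")` would return the model's completion as a string. For production serving, an inference engine such as vLLM is a common alternative to a raw `transformers` loop.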