nbtpj/summ_Qwen0b5_tldr_xsum
Text Generation · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Concurrency Cost: 1 · Architecture: Transformer · Published: Jan 24, 2026

The nbtpj/summ_Qwen0b5_tldr_xsum model is a fine-tuned version of Qwen's Qwen2.5-0.5B, a 0.5-billion-parameter language model. It was trained with Supervised Fine-Tuning (SFT) using the TRL framework. The name suffix "tldr_xsum" suggests it is optimized for abstractive summarization, likely on TL;DR- and XSum-style data, i.e., generating concise summaries from longer texts.
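A minimal sketch of using the checkpoint for summarization with the Hugging Face transformers library. The "TL;DR:" prompt cue is an assumption based on the model name; the exact template used during SFT is not stated on this page, so check the training details before relying on it.

```python
MODEL_ID = "nbtpj/summ_Qwen0b5_tldr_xsum"

def build_prompt(article: str) -> str:
    # Assumed format: append a TL;DR cue so the causal LM continues
    # with a summary. Verify against the actual SFT prompt template.
    return f"{article.strip()}\nTL;DR:"

def summarize(article: str, max_new_tokens: int = 64) -> str:
    # Lazy import so the helper above can be used without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(build_prompt(article), return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    # Decode only the newly generated tokens, i.e. the summary itself.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True).strip()

if __name__ == "__main__":
    print(summarize("Long article text goes here..."))
```

Greedy decoding (`do_sample=False`) is used here because summarization typically favors deterministic output; sampling parameters can be passed to `generate` if more varied summaries are wanted.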
