sachiniyer/Qwen2.5-1.5B-SFT-Schwinn
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Ctx length: 32k · Published: Jan 17, 2026 · Architecture: Transformer · Status: Warm

sachiniyer/Qwen2.5-1.5B-SFT-Schwinn is a 1.5-billion-parameter language model based on the Qwen2.5 architecture and further trained with supervised fine-tuning (SFT). The underlying architecture supports a context window of up to 131,072 tokens (served here with a 32k context length), which suits applications that require extensive contextual understanding. The model is intended for direct use in text generation and other natural language processing tasks, leveraging its fine-tuned capabilities.
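Qwen2.5 instruction-tuned models use the ChatML prompt format, in which each turn is wrapped in `<|im_start|>` / `<|im_end|>` markers. A minimal sketch of building such a prompt by hand (the helper `build_chatml_prompt` is illustrative, not part of this model's tooling; whether this checkpoint expects ChatML depends on its actual chat template):

```python
def build_chatml_prompt(messages):
    """Render a list of {role, content} messages in ChatML,
    ending with an open assistant turn for the model to complete."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # Leave the assistant turn open so generation continues from here.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the Qwen2.5 architecture in one sentence."},
])
print(prompt)
```

In practice, loading the tokenizer with `transformers.AutoTokenizer.from_pretrained("sachiniyer/Qwen2.5-1.5B-SFT-Schwinn")` and calling `apply_chat_template` is preferable, since it uses the template shipped with the checkpoint rather than a hand-written one.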
