ytu-ce-cosmos/tr-Qwen2.5-0.5B-SFT-v1
Text generation · Concurrency cost: 1 · Model size: 0.5B · Quantization: BF16 · Context length: 32k · Published: Apr 3, 2025 · Architecture: Transformer

ytu-ce-cosmos/tr-Qwen2.5-0.5B-SFT-v1 is a 0.5 billion parameter instruction-tuned causal language model based on the Qwen2.5 architecture. The "SFT" in its name indicates supervised fine-tuning on task-specific instruction data, and the model supports a context length of 32,768 tokens. Its small parameter footprint makes it suited to applications that need efficient processing of long sequences on modest hardware.
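A minimal usage sketch with the Hugging Face `transformers` library is shown below. It assumes the `transformers` and `torch` packages are installed and that the model uses the standard Qwen2.5 chat template; the Turkish prompt is an illustrative example, not taken from the model card.

```python
# Sketch: load the model in BF16 and run a short chat completion.
# Assumes `transformers` and `torch` are installed; the chat-template
# call follows the usual Qwen2.5 convention.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ytu-ce-cosmos/tr-Qwen2.5-0.5B-SFT-v1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16")

# Build a single-turn chat prompt (example user message in Turkish).
messages = [{"role": "user", "content": "Merhaba, nasılsın?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)

# Decode only the newly generated tokens, skipping the prompt.
reply = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(reply)
```

At 0.5B parameters in BF16 the weights occupy roughly 1 GB, so the model can run comfortably on CPU or a small GPU.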
