Haicaochi/Qwen_05_txtt_V2
Text Generation

- Concurrency Cost: 1
- Model Size: 0.5B
- Quant: BF16
- Ctx Length: 32k
- Published: Nov 1, 2025
- Architecture: Transformer

Haicaochi/Qwen_05_txtt_V2 is a 0.5 billion parameter language model developed by Haicaochi, fine-tuned from Haicaochi/Qwen2.5-0.5B-txtt using the TRL framework. It is optimized for text generation and supports a 131,072-token context length. Its primary use case is generating coherent, contextually relevant text from user prompts, making it suitable for conversational and creative applications.
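A minimal usage sketch with the Hugging Face `transformers` library, assuming the model is published on the Hub under the identifier `Haicaochi/Qwen_05_txtt_V2` (the repository name, chat formatting, and sampling parameters below are illustrative, not confirmed by the model authors):

```python
# Sketch: load the model in BF16 (matching the quoted quantization) and
# generate a continuation for a prompt. Requires network access to download
# the weights; model id and generation settings are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Haicaochi/Qwen_05_txtt_V2"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "Write a short story about a lighthouse keeper."
inputs = tokenizer(prompt, return_tensors="pt")
# Sample up to 128 new tokens; temperature is an illustrative choice.
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For lightweight experimentation, the same model id can also be passed to `transformers.pipeline("text-generation", model=model_id)`, which wraps the tokenize-generate-decode steps shown above.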
