CoconutEmb/SFT-Qwen2.5-1.5B-Instruct-TongSearch
Text generation · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Concurrency cost: 1 · Published: Feb 23, 2026 · License: other · Architecture: Transformer
CoconutEmb/SFT-Qwen2.5-1.5B-Instruct-TongSearch is a 1.5-billion-parameter instruction-tuned causal language model, fine-tuned by CoconutEmb from the Qwen/Qwen2.5-1.5B-Instruct base model. It supports a context length of 32768 tokens and was specialized through supervised fine-tuning on the TongSearch_Coconut@16_v2 dataset. It is intended for applications that need a compact yet capable model for tasks aligned with its specialized training data.
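Since the model is a standard Qwen2.5-based causal LM, it should load with the Hugging Face `transformers` library like any other instruct model. The sketch below is illustrative, not from the card: the generation parameters and prompt are assumptions, and only the model ID and BF16/32k figures come from the listing.

```python
MODEL_ID = "CoconutEmb/SFT-Qwen2.5-1.5B-Instruct-TongSearch"
MAX_CONTEXT = 32768  # context length stated on the card


def build_messages(question: str) -> list[dict]:
    """Assemble a single-turn chat message list for the tokenizer's chat template."""
    return [{"role": "user", "content": question}]


def generate(question: str, max_new_tokens: int = 256) -> str:
    """Load the model (downloads weights on first call) and generate a reply.

    BF16 matches the quantization listed on the card; sampling settings are
    left at library defaults since the card does not specify any.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)

    input_ids = tokenizer.apply_chat_template(
        build_messages(question), add_generation_prompt=True, return_tensors="pt"
    )
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
```

Calling `generate("...")` fetches the checkpoint from the Hub and runs one completion; for repeated queries, hoist the model and tokenizer loading out of the function so the weights are loaded once.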