beachcities/qwen3-4b-sft-v5h-hybrid-merged
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Feb 17, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

beachcities/qwen3-4b-sft-v5h-hybrid-merged is a 4-billion-parameter causal language model fine-tuned from Qwen/Qwen2.5-3B-Instruct. It is optimized for converting unstructured text into strictly formatted structured data (JSON, YAML, TOML, XML, CSV) via zero-shot prompting. The model serves as an experimental artifact for evaluating the "Empty Think Injection" mechanism and for aligning small-scale LLMs toward precise structured data generation, and supports a 32,768-token context length.
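As a minimal sketch of the intended zero-shot usage, the snippet below builds a single-turn extraction prompt asking for strict JSON and validates the reply by parsing it. The prompt wording and helper names are illustrative assumptions, not the model's documented template, and the model call itself is stubbed out so the sketch runs without downloading weights.

```python
import json

def build_extraction_prompt(text: str, schema_hint: str) -> str:
    # Zero-shot instruction: request strictly formatted JSON and nothing else.
    # The exact wording is an assumption, not the model's official template.
    return (
        f"Extract the following text into JSON matching this schema: {schema_hint}\n"
        "Reply with the JSON object only, no prose.\n\n"
        f"Text: {text}"
    )

def parse_strict_json(reply: str) -> dict:
    # A strictly formatted reply must parse as-is; json.loads raises otherwise.
    return json.loads(reply)

if __name__ == "__main__":
    prompt = build_extraction_prompt(
        "Alice ordered 3 widgets on 2026-02-17.",
        '{"name": string, "quantity": integer, "date": string}',
    )
    # Stubbed reply standing in for the real model generation call.
    reply = '{"name": "Alice", "quantity": 3, "date": "2026-02-17"}'
    record = parse_strict_json(reply)
    print(record["quantity"])
```

In real use, `reply` would come from generating with the model (e.g. via the `transformers` library), and a `json.loads` failure would indicate the output was not strictly formatted.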
