qingy2024/Formatter-1.7B
Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: May 19, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights

Formatter-1.7B is a 1.7 billion parameter Qwen3-based causal language model developed by qingy2024. The model was fine-tuned using Unsloth and Hugging Face's TRL library, enabling 2x faster training. It is intended for general text-generation tasks.
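As a Qwen3-based causal language model with open weights, it should load with the standard `transformers` API. The sketch below is a minimal, hypothetical usage example (the repo id `qingy2024/Formatter-1.7B` and the default chat template are assumptions, not confirmed by this card):

```python
def generate(prompt: str,
             model_id: str = "qingy2024/Formatter-1.7B",
             max_new_tokens: int = 256) -> str:
    """Hypothetical sketch: generate a completion with Formatter-1.7B.

    Assumes the model is hosted on the Hugging Face Hub under the
    repo id above and ships a standard chat template. Imports are
    kept inside the function so the sketch reads without
    transformers/torch installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # BF16 matches the quantization listed in the card metadata.
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16")

    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Generation parameters (temperature, sampling) are left at library defaults here; tune them for your use case.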
