PetarKal/Qwen3-4B-Base-ascii-art-v5-e3-lr8e-5-ga16-ctx4096
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Mar 25, 2026 · Architecture: Transformer

PetarKal/Qwen3-4B-Base-ascii-art-v5-e3-lr8e-5-ga16-ctx4096 is a 4-billion-parameter language model, fine-tuned by PetarKal from Qwen/Qwen3-4B-Base. The model was trained with supervised fine-tuning (SFT) using TRL, specifically for generating ASCII art. It supports a 32768-token context length, making it suitable for tasks requiring long, detailed output in ASCII art format.
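A minimal usage sketch with the Hugging Face `transformers` library is shown below. The card does not document a prompt template, so the plain completion-style prompt used here (`build_prompt`) is an assumption suited to a base (non-chat) model; the generation parameters are illustrative defaults, not values recommended by the author.

```python
MODEL_ID = "PetarKal/Qwen3-4B-Base-ascii-art-v5-e3-lr8e-5-ga16-ctx4096"


def build_prompt(subject: str) -> str:
    # Completion-style prompt: an assumption, since the base model is not
    # chat-tuned and no template is documented on the card.
    return f"ASCII art of {subject}:\n"


def generate_ascii_art(subject: str, max_new_tokens: int = 512) -> str:
    # Heavy dependencies are imported lazily so the helpers above can be
    # used without transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="bfloat16",  # matches the BF16 quant listed on the card
    )
    inputs = tokenizer(build_prompt(subject), return_tensors="pt")
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.8,
    )
    # Decode only the newly generated tokens, dropping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Running `generate_ascii_art("a cat")` would download the ~8 GB BF16 checkpoint on first use, so a GPU (or patience) is advisable.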
