PetarKal/Qwen3-4B-Base-ascii-art-v5-e3-lr5e-5-ga16-ctx4096

Text Generation · Model Size: 4B · Quantization: BF16 · Context Length: 32k · Published: Mar 24, 2026 · Architecture: Transformer

PetarKal/Qwen3-4B-Base-ascii-art-v5-e3-lr5e-5-ga16-ctx4096 is a 4 billion parameter language model fine-tuned from Qwen/Qwen3-4B-Base. It was trained with Supervised Fine-Tuning (SFT) using the TRL library to generate ASCII art, and supports a context length of 32768 tokens. It is intended for creative applications that require text-to-ASCII art conversion and generation.
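Below is a minimal inference sketch using the Transformers library. The prompt wording and generation settings are assumptions for illustration; the card does not document an expected prompt format.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PetarKal/Qwen3-4B-Base-ascii-art-v5-e3-lr5e-5-ga16-ctx4096"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # model is published in BF16
    device_map="auto",           # requires the accelerate package
)

# Hypothetical prompt; the expected input format is not documented on the card.
prompt = "Draw a cat in ASCII art:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=512,
    do_sample=True,
    temperature=0.8,
)

# Print only the newly generated tokens (the ASCII art), not the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```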


Overview

This model, PetarKal/Qwen3-4B-Base-ascii-art-v5-e3-lr5e-5-ga16-ctx4096, is a specialized 4 billion parameter language model developed by PetarKal. It is a fine-tuned version of the base model Qwen/Qwen3-4B-Base, trained with Supervised Fine-Tuning (SFT) using the TRL library.
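The actual training data and configuration are not published on this card. The sketch below shows how a comparable SFT run could be set up with TRL, reading the suffix of the model name (e3, lr5e-5, ga16, ctx4096) as 3 epochs, a learning rate of 5e-5, gradient accumulation of 16, and a 4096-token sequence length; that reading, along with the dataset and file names, is an assumption.

```python
# Hypothetical SFT setup with TRL, not the author's actual training script.
# Hyperparameters mirror the values suggested by the model name suffix.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Placeholder dataset: a plain-text file of ASCII art training samples,
# loaded into a dataset with a "text" column (the default field for SFTTrainer).
train_dataset = load_dataset("text", data_files="ascii_art_samples.txt", split="train")

config = SFTConfig(
    output_dir="qwen3-4b-ascii-art-sft",
    num_train_epochs=3,                  # "e3"
    learning_rate=5e-5,                  # "lr5e-5"
    gradient_accumulation_steps=16,      # "ga16"
    max_seq_length=4096,                 # "ctx4096"; argument name varies across TRL versions
    per_device_train_batch_size=1,
    bf16=True,
    logging_steps=10,
)

trainer = SFTTrainer(
    model="Qwen/Qwen3-4B-Base",  # base model being fine-tuned
    args=config,
    train_dataset=train_dataset,
)
trainer.train()
```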

Key Capabilities

  • ASCII Art Generation: The model's primary specialization is generating ASCII art, i.e. character-based visual representations produced from textual prompts.
  • Base Model: Built upon the robust Qwen3-4B-Base architecture, providing a strong foundation for language understanding and generation.
  • Extended Context Window: Features a context length of 32768 tokens, which can be beneficial for generating more complex or detailed ASCII art patterns.

Good For

  • Creative Applications: Ideal for developers and artists looking to integrate ASCII art generation into their projects.
  • Text-to-ASCII Art Conversion: Suitable for tasks where textual descriptions need to be transformed into character-based visual outputs.
  • Experimentation with Fine-tuning: Demonstrates the application of SFT with TRL for niche generative tasks.