SamirXR/yzy-python-0.5b
TEXT GENERATION

- Concurrency Cost: 1
- Model Size: 0.5B
- Quant: BF16
- Ctx Length: 32k
- Published: Mar 22, 2026
- License: MIT
- Architecture: Transformer (open weights)

SamirXR/yzy-python-0.5b is a 0.5-billion-parameter language model fine-tuned for Python code generation and instruction following, built on the Qwen2-0.5B-Instruct base model. It was fine-tuned with QLoRA (4-bit) on an Alpaca-format Python instruction dataset and is optimized for lightweight scripting help and small coding copilots. The model targets fast local inference and experimentation, making it a good fit for hackathons and resource-constrained environments.
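Since the fine-tuning data is Alpaca-format, prompts in that layout are likely to work best. The sketch below builds such a prompt; the exact template string and the commented-out `transformers` loading snippet are assumptions (the card does not document the template, and the repo id is taken from the title), not a confirmed recipe.

```python
# Assumed standard Alpaca instruction template; the fine-tune's exact
# template is not documented on this card.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Format a user instruction in the Alpaca layout."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

prompt = build_prompt("Write a Python function that reverses a string.")

# To generate with Hugging Face transformers (assuming the weights are
# hosted under this repo id):
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("SamirXR/yzy-python-0.5b")
#   model = AutoModelForCausalLM.from_pretrained("SamirXR/yzy-python-0.5b")
#   out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=256)
#   print(tok.decode(out[0], skip_special_tokens=True))
```

At 0.5B parameters in BF16 the weights fit comfortably on CPU or a small GPU, which is what makes the local-inference use case above practical.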
