XSCP/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-lithe_plump_mammoth
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Nov 17, 2025 · Architecture: Transformer · Warm
XSCP/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-lithe_plump_mammoth is a 0.5 billion parameter instruction-tuned language model published by XSCP and, as the name indicates, derived from Qwen2.5-Coder-0.5B-Instruct. The listing reports a 32k token context length, which gives it room for fairly long prompts despite its small size. The listing does not detail how it differs from the base model, but its instruction-tuned nature makes it suited to following natural-language instructions and generating text or code in response.
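A minimal usage sketch is shown below, assuming the model is available on the Hugging Face Hub under the repo ID above, that the `transformers` and `torch` packages are installed, and that the tokenizer ships a chat template (as Qwen2.5 instruct models typically do). The example prompt is illustrative only.

```python
# Sketch: load the listed model in BF16 (matching the Quant field) and run a
# single instruction-style generation. Repo availability and chat template are
# assumptions based on the model name, not confirmed by the listing.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "XSCP/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-lithe_plump_mammoth"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Format an instruction with the tokenizer's chat template.
messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# Generate and decode only the newly produced tokens.
output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

At 0.5B parameters in BF16 the weights fit comfortably on CPU or a small GPU, so no device mapping or quantization flags are strictly needed for this sketch.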