sabirjdjdjd/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-territorial_lazy_prawn
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Dec 8, 2025 · Architecture: Transformer · Warm
sabirjdjdjd/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-territorial_lazy_prawn is a 0.5-billion-parameter instruction-tuned model based on Qwen2.5-Coder. With a context length of 32768 tokens, it is designed for general language understanding and generation tasks, and its compact size makes it suitable for applications that require efficient inference and deployment.
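One practical consequence of the 32k context window is that the prompt and the generation budget together must fit inside it. A minimal sketch of that check (the function name and token counts are illustrative, not part of any Featherless API):

```python
# The model card lists a 32k (32768-token) context window; prompt tokens
# plus requested new tokens must not exceed it.
CTX_LEN = 32768

def fits_in_context(prompt_tokens: int, max_new_tokens: int,
                    ctx_len: int = CTX_LEN) -> bool:
    """Return True if prompt + generation budget fits the context window."""
    return prompt_tokens + max_new_tokens <= ctx_len

# A 30000-token prompt leaves at most 2768 tokens for generation.
print(fits_in_context(30000, 2768))  # True
print(fits_in_context(30000, 3000))  # False
```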
Popular Sampler Settings
The top 3 parameter combinations used by Featherless users for this model. No values are currently reported:
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
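The parameters above are standard sampling controls. As a rough sketch, they could be supplied in an OpenAI-style chat-completions request body; the numeric values below are placeholders chosen for illustration, not recorded Featherless user settings, and the exact request shape is an assumption:

```python
import json

# Hypothetical request body with the sampler parameters listed above.
# All numeric values are illustrative placeholders.
payload = {
    "model": "sabirjdjdjd/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-territorial_lazy_prawn",
    "messages": [{"role": "user", "content": "Write a Python hello world."}],
    "temperature": 0.7,         # placeholder
    "top_p": 0.9,               # placeholder
    "top_k": 40,                # placeholder
    "frequency_penalty": 0.0,   # placeholder
    "presence_penalty": 0.0,    # placeholder
    "repetition_penalty": 1.1,  # placeholder
    "min_p": 0.05,              # placeholder
}

body = json.dumps(payload)  # this string would form the POST body
sampler_keys = sorted(k for k in payload if k not in ("model", "messages"))
print(sampler_keys)
```

This only constructs the JSON body; actually sending it would require an HTTP client and an API endpoint, which are out of scope here.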