OCHone/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-powerful_prehistoric_lizard
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Nov 21, 2025 · Architecture: Transformer · Status: Warm
OCHone/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-powerful_prehistoric_lizard is a 0.5-billion-parameter instruction-tuned model based on the Qwen2.5-Coder architecture. It is shared by OCHone as part of the Gensyn Swarm initiative. Because no specific training details have been published, its primary differentiators and optimal use cases are not explicitly defined; it likely serves as a foundational or experimental model within the Gensyn Swarm ecosystem.
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
temperature: – · top_p: – · top_k: – · frequency_penalty: – · presence_penalty: – · repetition_penalty: – · min_p: –
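The sampler parameters listed above (no user configurations have been recorded yet, hence the dashes) map directly onto the fields of an OpenAI-compatible chat completion request, which is the usual way to call hosted models like this one. A minimal sketch of such a request body follows; the numeric values are common illustrative defaults, not recorded Featherless user settings, and the actual endpoint and authentication details are omitted:

```python
import json

# Illustrative sampler values -- assumptions, not data from this model card.
sampler = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

# Request body in the OpenAI-compatible chat completions format.
payload = {
    "model": "OCHone/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-powerful_prehistoric_lizard",
    "messages": [
        {"role": "user", "content": "Write a Python function that reverses a string."}
    ],
    "max_tokens": 256,
    **sampler,
}

print(json.dumps(payload, indent=2))
```

This body would be POSTed as JSON to the provider's chat completions endpoint with an `Authorization: Bearer <api key>` header; adjust the sampler values to taste, since none are published for this model.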