Psiyolbin/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-leggy_fleecy_whale
Text Generation · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32K · Published: Dec 9, 2025 · Architecture: Transformer · Warm

Psiyolbin/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-leggy_fleecy_whale is a 0.5 billion parameter instruction-tuned language model published by Psiyolbin. The model belongs to the Qwen2.5-Coder family and supports a 32,768-token context length. Its primary differentiator and intended use case are not explicitly documented, suggesting it may be a base or experimental checkpoint produced within the Gensyn Swarm initiative.


Model Overview

This model, Psiyolbin/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-leggy_fleecy_whale, is a 0.5 billion parameter instruction-tuned language model developed by Psiyolbin. It is identified as a member of the Qwen2.5-Coder family, which suggests a focus on code-related tasks. A notable specification is its 32,768-token context length, which allows the model to process long sequences of text or code.

Key Characteristics

  • Parameter Count: 0.5 billion parameters, making it a relatively compact model.
  • Context Length: Features a 32,768-token context window, suitable for tasks requiring extensive contextual understanding.
  • Instruction-Tuned: Designed to follow instructions, implying a focus on conversational or task-oriented applications.

Potential Use Cases

Given the limited information, specific use cases are not detailed. However, its instruction-tuned nature and large context window suggest potential applications in:

  • Long-form content generation: Leveraging the extensive context length.
  • Code understanding or generation: As part of the "Coder" family, it might be optimized for programming tasks.
  • Experimental research: Potentially used within the Gensyn Swarm initiative for distributed training or novel applications.
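As a hedged sketch of how such an instruction-tuned checkpoint is typically used, the snippet below loads the model with the Hugging Face `transformers` library and formats a single-turn prompt with the tokenizer's chat template. The generation settings (`max_new_tokens`, automatic dtype selection) are illustrative assumptions, not documented behavior of this particular checkpoint.

```python
# Usage sketch (assumptions, not documented behavior of this checkpoint):
# load the model and tokenizer, build a chat-formatted prompt, and generate.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Psiyolbin/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-leggy_fleecy_whale"


def build_prompt(tokenizer, user_message: str) -> str:
    """Format a single-turn chat prompt using the model's chat template."""
    messages = [{"role": "user", "content": user_message}]
    return tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )


def generate(user_message: str, max_new_tokens: int = 256) -> str:
    """Download the checkpoint, run greedy generation, and return only the reply."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    prompt = build_prompt(tokenizer, user_message)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens so only the newly generated text is decoded.
    new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Write a Python function that reverses a string."))
```

Note that calling `generate` downloads the full checkpoint from the Hugging Face Hub; the compact 0.5B parameter count makes this feasible on CPU-only machines.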