06btcdeep/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-horned_smooth_prawn
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Nov 19, 2025 · Architecture: Transformer

This model, 06btcdeep/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-horned_smooth_prawn, is a 0.5 billion parameter instruction-tuned language model based on the Qwen2.5 architecture. Its 32,768-token context window lets it process long inputs. While specific training details are not provided, the 'Coder' designation indicates a base model optimized for code-related tasks, and the 'Gensyn-Swarm' portion of the name suggests fine-tuning within a Gensyn swarm training run. It is suited to applications that need efficient processing of long sequences, particularly in coding or technical domains.
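As a minimal sketch of how an instruction prompt for this model might be assembled, the following builds a prompt in the ChatML format used by Qwen2.5-Instruct models (the same string that `tokenizer.apply_chat_template` from the `transformers` library would normally produce; the system and user messages here are illustrative placeholders):

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt as used by Qwen2.5 instruct models."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"  # generation continues from here
    )

prompt = build_chatml_prompt(
    "You are a helpful coding assistant.",
    "Write a Python function that reverses a string.",
)
print(prompt)
```

In practice one would pass messages to the tokenizer's chat template rather than formatting strings by hand; the sketch just makes the underlying format visible.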