kayacrypto/Qwen2.5-Coder-1.5B-Instruct-Gensyn-Swarm-mute_tall_zebra
Text generation · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer · Published: Nov 16, 2025
kayacrypto/Qwen2.5-Coder-1.5B-Instruct-Gensyn-Swarm-mute_tall_zebra is a 1.5-billion-parameter instruction-tuned causal language model. It is based on Qwen2.5-Coder, the code-specialized branch of the Qwen2.5 family, which targets code generation, completion, and reasoning while retaining general language ability. The repository name suggests a fine-tune produced through Gensyn's swarm training. With a context length of 32,768 tokens, it can process fairly long inputs such as multi-file code snippets, and its instruction tuning means it is intended to follow user commands and produce coherent, chat-style responses.
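A minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub under this repo id and loads with the `transformers` library; the system prompt and generation settings below are illustrative, not part of the model card:

```python
"""Hypothetical usage sketch for the model described above.

Assumes the checkpoint is hosted on the Hugging Face Hub under the
repo id below and is loadable with `transformers` (not confirmed by
the card itself).
"""

MODEL_ID = "kayacrypto/Qwen2.5-Coder-1.5B-Instruct-Gensyn-Swarm-mute_tall_zebra"


def build_messages(user_prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat format used by instruction-tuned Qwen models."""
    return [
        # The system prompt is an illustrative choice, not prescribed by the card.
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": user_prompt},
    ]


if __name__ == "__main__":
    # Heavy imports and the model download happen only when run as a script.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed in the card's metadata.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    prompt = tokenizer.apply_chat_template(
        build_messages("Write a Python function that reverses a string."),
        tokenize=False,
        add_generation_prompt=True,
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

For a 1.5B model in BF16, expect roughly 3 GB of weights, so the example runs comfortably on a single consumer GPU or on CPU.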