Hotmf/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-poisonous_mimic_woodpecker
Text Generation · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Architecture: Transformer · Status: Warm

Hotmf/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-poisonous_mimic_woodpecker is a 0.5 billion parameter instruction-tuned language model with a 32,768 token (32k) context length, derived from the code-focused Qwen2.5-Coder family. Its compact size allows efficient deployment, and its instruction-following capabilities make it suitable for interactive AI applications such as chat-style assistance and code generation.
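As a minimal usage sketch, the model can be queried through the Hugging Face `transformers` text-generation pipeline, assuming the repository is available on the Hugging Face Hub under the name above (the `generate` helper below is illustrative, not part of any official API):

```python
# Full repository ID as shown on the model page.
MODEL_ID = "Hotmf/Qwen2.5-Coder-0.5B-Instruct-Gensyn-Swarm-poisonous_mimic_woodpecker"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Send a single user turn to the model and return its reply.

    Hypothetical helper: downloads the model weights on first call.
    """
    # Imported lazily so this module can load without transformers installed.
    from transformers import pipeline

    pipe = pipeline("text-generation", model=MODEL_ID)
    messages = [{"role": "user", "content": prompt}]
    out = pipe(messages, max_new_tokens=max_new_tokens)
    # For chat-style input, generated_text holds the full conversation;
    # the last entry is the assistant's reply.
    return out[0]["generated_text"][-1]["content"]
```

A 0.5B model in BF16 fits comfortably on a single consumer GPU or even CPU, which is the main appeal of this size class.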
