yesimm01/Qwen2.5-1.5B-Instruct-Gensyn-Swarm-amphibious_prehistoric_gibbon
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Oct 23, 2025 · Architecture: Transformer · Warm

The yesimm01/Qwen2.5-1.5B-Instruct-Gensyn-Swarm-amphibious_prehistoric_gibbon model is a 1.5 billion parameter instruction-tuned language model based on the Qwen2.5 architecture. It is one of a series of models published by yesimm01 that focus on instruction-following tasks. With a context length of 32,768 tokens, it is suited to work that requires extended contextual understanding, and its primary strength lies in processing and producing long-form content from detailed instructions.
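
For reference, below is a minimal sketch of running the model for text generation with the Hugging Face transformers library. It assumes the repository is available on the Hub under the ID above and that transformers and torch are installed; the prompt and generation settings are purely illustrative.

```python
# Minimal sketch: instruction-style text generation with transformers.
# Assumes the model repository is hosted on the Hugging Face Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yesimm01/Qwen2.5-1.5B-Instruct-Gensyn-Swarm-amphibious_prehistoric_gibbon"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
    device_map="auto",
)

# Qwen2.5-Instruct models expect chat-formatted prompts via the chat template.
messages = [
    {"role": "user", "content": "Summarize the benefits of long-context models."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```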
