chinna6/Qwen3-0.6B-Gensyn-Swarm-rough_prehistoric_anaconda
Text generation · Concurrency cost: 1 · Model size: 0.8B · Quantization: BF16 · Context length: 32k · Published: Jun 28, 2025 · Architecture: Transformer

chinna6/Qwen3-0.6B-Gensyn-Swarm-rough_prehistoric_anaconda is a 0.8 billion parameter language model based on the Qwen3 architecture, shared by chinna6, with a context length of 32768 tokens. It is distributed as a Hugging Face transformers model; its specific differentiators and primary use cases are not detailed in the provided information, suggesting it may be a base model or a work in progress.
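Since the card identifies this as a Hugging Face transformers model, it can presumably be loaded with the standard `AutoModelForCausalLM` API. The sketch below is a minimal, hedged example; the function name `generate_text` and the generation parameters are illustrative choices, not part of the model card.

```python
# Minimal usage sketch for a Hugging Face transformers causal LM.
# Assumes the `transformers` and `torch` packages are installed; the
# helper name `generate_text` and its defaults are illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "chinna6/Qwen3-0.6B-Gensyn-Swarm-rough_prehistoric_anaconda"

def generate_text(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the model (BF16, per the card metadata) and complete a prompt."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Note that the first call downloads the model weights; the 32k context length applies to the combined prompt and generated tokens.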
