noobmaster6009/Qwen3-0.6B-Gensyn-Swarm-polished_sleek_locust
Text generation · Concurrency cost: 1 · Model size: 0.8B · Quant: BF16 · Context length: 32k · Published: Sep 20, 2025 · Architecture: Transformer · Warm
noobmaster6009/Qwen3-0.6B-Gensyn-Swarm-polished_sleek_locust is a 0.8-billion-parameter language model with a 32k-token context length. It belongs to the Qwen3 family and therefore builds on the Qwen3 transformer architecture. Because its model card provides little information, its specific differentiators and primary use cases beyond general language tasks are not documented.
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
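No user-submitted values were captured for this model, but the knobs listed above are the standard decoding parameters. As a sketch of what the first three control, here is a minimal pure-Python implementation of temperature / top-k / top-p (nucleus) sampling; the function name, default values, and tiny five-token "vocabulary" are illustrative, not taken from this model's configuration.

```python
import math
import random

def sample_next_token(logits, temperature=0.7, top_k=20, top_p=0.8, seed=None):
    """Sample a token id from raw logits using the temperature, top_k,
    and top_p settings listed above. `logits` is a list of floats,
    one per vocabulary entry."""
    rng = random.Random(seed)

    # Temperature: scale logits before softmax (lower => sharper distribution).
    scaled = [l / temperature for l in logits]

    # Softmax, subtracting the max for numerical stability.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Rank token ids by probability, highest first.
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)

    # Top-k: keep only the k most likely tokens.
    ranked = ranked[:top_k]

    # Top-p: keep the smallest prefix of ranked tokens whose cumulative
    # probability reaches top_p.
    kept, cum = [], 0.0
    for i in ranked:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break

    # Renormalise over the surviving tokens and draw one.
    mass = sum(probs[i] for i in kept)
    r = rng.random() * mass
    for i in kept:
        r -= probs[i]
        if r <= 0:
            return i
    return kept[-1]

# Example with a tiny 5-token "vocabulary":
logits = [2.0, 1.0, 0.5, -1.0, -3.0]
token = sample_next_token(logits, temperature=0.7, top_k=3, top_p=0.9, seed=0)
```

frequency_penalty, presence_penalty, and repetition_penalty would additionally adjust the logits of already-generated tokens before this step, and min_p would drop tokens whose probability falls below a fraction of the top token's probability.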