mehuldamani/sft-qwen-maze-v2
Text Generation

Concurrency Cost: 1
Model Size: 7.6B
Quant: FP8
Ctx Length: 32k
Published: Mar 26, 2026
Architecture: Transformer

mehuldamani/sft-qwen-maze-v2 is a 7.6-billion-parameter language model based on the Qwen architecture, with a context length of 32,768 tokens. Further details about its training, differentiators, and intended use cases are not provided in the available model card.
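The card includes no usage instructions. Assuming the checkpoint is published on the Hugging Face Hub under the same identifier and follows the standard Qwen causal-LM layout, a minimal loading sketch with the `transformers` library might look like the following (the identifier is taken from the page title; everything else is illustrative, not from the card):

```python
# Assumed Hub identifier, taken from the page title.
MODEL_ID = "mehuldamani/sft-qwen-maze-v2"


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion for `prompt` with the model.

    Imports and model loading happen lazily inside the function so that
    merely importing this module does not trigger a multi-gigabyte download.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # respect the checkpoint's dtype (the card lists FP8 quant)
        device_map="auto",    # place layers on available GPU(s)/CPU automatically
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

This is a sketch under the assumptions stated above, not a documented interface for this model; in particular, whether the FP8 weights load directly or require a quantization backend is not specified by the card.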
