mehuldamani/sft-qwen-maze-v1
Task: Text generation
Model size: 7.6B parameters
Quantization: FP8
Context length: 32k tokens
Concurrency cost: 1
Architecture: Transformer
Published: Mar 25, 2026

mehuldamani/sft-qwen-maze-v1 is a 7.6-billion-parameter language model with a 32,768-token context length. It is a fine-tuned variant, though its model card does not specify the base architecture or the training data used. Its intended use cases are likewise undocumented, suggesting it may be a base or experimental model; further information would be needed before relying on it for a specific application.
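As a rough sketch of the serving footprint implied by the listed size and quantization, the weight memory can be estimated directly (assuming FP8 stores one byte per parameter; this excludes KV cache and activations, which depend on architectural details the card does not provide):

```python
# Back-of-the-envelope weight-memory estimate from the listing above.
# Assumption: FP8 quantization stores one byte per parameter.
params = 7.6e9        # 7.6 billion parameters (from the model listing)
bytes_per_param = 1   # FP8 = 8 bits = 1 byte
weight_gb = params * bytes_per_param / 1e9
print(f"Approximate weight memory: {weight_gb:.1f} GB")  # ~7.6 GB
```

Actual VRAM usage at serving time will be higher once the KV cache for the 32k-token context is included.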
