yale-nlp/qwen-instruct-synthetic_1_stem_only
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Context length: 32k · Published: Sep 18, 2025 · Architecture: Transformer

The yale-nlp/qwen-instruct-synthetic_1_stem_only model is a 7.6 billion parameter instruction-tuned language model developed by yale-nlp. With a context length of 32,768 tokens, it is designed for general language understanding and generation tasks. Its primary use case is to serve as a foundational model for a range of NLP applications, where its parameter count and long context window support robust performance.
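As an illustration of how such a model might be queried, the sketch below assembles a chat-completion request body in the common OpenAI-compatible format and budgets the completion length against the 32k-token context window. Only the model ID and context length come from the card above; the request format, the `build_request` helper, and the rough 4-characters-per-token heuristic are assumptions for illustration, not details stated by the source.

```python
# Sketch of a chat-completion request body for this model, assuming an
# OpenAI-compatible serving API. The helper and token heuristic are
# hypothetical; only MODEL_ID and CONTEXT_LENGTH come from the card.
import json

MODEL_ID = "yale-nlp/qwen-instruct-synthetic_1_stem_only"
CONTEXT_LENGTH = 32_768  # tokens, per the model card

def build_request(prompt: str, max_tokens: int = 512) -> dict:
    """Build a request dict, capping max_tokens so that the prompt plus
    the completion stay within the context window (very rough
    character-based estimate, not a real tokenizer)."""
    approx_prompt_tokens = len(prompt) // 4  # crude ~4 chars/token guess
    budget = max(1, CONTEXT_LENGTH - approx_prompt_tokens)
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": min(max_tokens, budget),
    }

body = build_request("State the ideal gas law and define each symbol.")
print(json.dumps(body, indent=2))
```

The body can then be POSTed to whatever endpoint serves the model; the endpoint URL and authentication scheme are deployment-specific and not given on the card.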
