morganstanley/qqWen-3B-Pretrain
Text generation · Concurrency cost: 1 · Model size: 3.1B · Quantization: BF16 · Context length: 32k · Published: Feb 12, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

morganstanley/qqWen-3B-Pretrain is a 3.1-billion-parameter language model from Morgan Stanley, built on the Qwen 2.5 architecture and pretrained for reasoning and code generation in the Q programming language. Because of this specialized Q training, it is well suited to work on financial markets, time-series analytics, and quantitative research.
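As a sketch of how a checkpoint like this is typically used for Q code generation, the snippet below loads it with the Hugging Face `transformers` library. This assumes the model id resolves on the Hub and that standard `AutoModelForCausalLM` loading applies; the prompt is an illustrative Q comment, not an official example.

```python
MODEL_ID = "morganstanley/qqWen-3B-Pretrain"  # assumed Hub model id

def generate_q(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a Q-language completion for `prompt` (a sketch, not an official API)."""
    # Heavy dependencies imported lazily so the module loads without them.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the published quantization of the checkpoint.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)

    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

# Example call (downloads several GB of weights, so not run here):
# print(generate_q("/ 5-day moving average of close prices in q\n"))
```

Since this is a pretrained (not instruction-tuned) checkpoint, plain text-completion prompts like the Q comment above are the appropriate interface, rather than chat templates.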
