Multiplex-Thinking/Multiplex-Thinking-1.5B
Text generation · Model size: 1.5B · Quant: BF16 · Context length: 32k · Concurrency cost: 1 · Published: Jan 14, 2026 · License: MIT · Architecture: Transformer · Open weights

Multiplex-Thinking/Multiplex-Thinking-1.5B is a 1.5-billion-parameter language model developed by Multiplex-Thinking. It features an exceptionally long context length of 131,072 tokens, making it suitable for tasks that require extensive contextual understanding and processing of large documents. Its primary strength is handling complex, long-form inputs where retaining information across long text spans is crucial.
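A model published this way can typically be loaded with the standard Hugging Face `transformers` pattern. The sketch below is an assumption, not part of the card: it presumes the weights are available on the Hugging Face Hub under the id shown above and that local BF16 inference is supported; the `generate` helper and its parameters are illustrative names.

```python
# Hypothetical usage sketch for this model card; assumes the weights
# are hosted on the Hugging Face Hub under the id shown on the card.
MODEL_ID = "Multiplex-Thinking/Multiplex-Thinking-1.5B"


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model in BF16 (per the card's Quant field) and generate text.

    Imports are deferred so this module can be inspected without
    torch/transformers installed.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Note that long-context use is memory-bound: attention-cache size grows with sequence length, so filling a six-figure token window at BF16 precision may require substantially more memory than the 1.5B parameter count alone suggests.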
