cjiao/OpenThinker3-1.5B
Text generation

- Model size: 1.5B
- Quantization: BF16
- Context length: 32k
- Concurrency cost: 1
- Published: Apr 10, 2026
- License: apache-2.0
- Architecture: Transformer (open weights)
OpenThinker3-1.5B is a 1.5 billion parameter causal language model published by cjiao and fine-tuned from Qwen2.5-1.5B-Instruct. It was trained on the open-thoughts/OpenThoughts-114k dataset, a collection of long-form reasoning traces, which orients the model toward step-by-step reasoning and open-ended problem solving. Its 32K context length makes it suitable for applications that need moderately long prompts, such as multi-turn conversations or extended chains of reasoning.
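Since the weights are open and published in BF16, the model can be run locally. The sketch below shows one plausible way to do so with Hugging Face `transformers`; the model card itself does not prescribe an inference recipe, so the generation settings and the `generate` helper here are illustrative assumptions, not official usage.

```python
# Minimal usage sketch for OpenThinker3-1.5B, assuming `transformers`
# and `torch` are installed. Imports of the heavy libraries are kept
# inside the function so the module loads without them.

MODEL_ID = "cjiao/OpenThinker3-1.5B"
MAX_CONTEXT = 32_768  # 32k context length, per the model card


def generate(prompt: str, max_new_tokens: int = 512) -> str:
    """Load the model in BF16 (its published precision) and run one chat turn."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )
    # The Qwen2.5 base model family is chat-tuned, so we format the
    # prompt with the tokenizer's chat template.
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(generate("Reason step by step: what is 17 * 23?"))
```

Because the model is fine-tuned for reasoning, a large `max_new_tokens` budget is worth keeping: its answers tend to include an extended thought process before the final result.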