Goekdeniz-Guelmez/JOSIE-1.1-4B-Thinking
Text generation · Model size: 4B · Quant: BF16 · Context length: 32k · Published: Mar 8, 2026 · License: MIT · Architecture: Transformer · Open weights

JOSIE-1.1-4B-Thinking by Gökdeniz Gülmez is a 4-billion-parameter reasoning model, full-weight fine-tuned from Qwen3-4B-Thinking with an extended 65,536-token context length. It is optimized for complex logical reasoning, mathematics, STEM applications, and creative writing, and produces direct, uncensored outputs. Designed for deep analytical tasks, it maintains coherence across long contexts and was trained on a curated dataset that includes distilled reasoning traces and high-quality answer refinements.
