Goekdeniz-Guelmez/Josie-r1-4b-PoC-bf16
Text generation · Concurrency cost: 1 · Model size: 4B · Quantization: BF16 · Context length: 32k · Architecture: Transformer
Josie-r1-4b-PoC-bf16 is a 4-billion-parameter, Qwen3-based language model developed and funded by Gökdeniz Gülmez and fine-tuned for reasoning and instruction following. The model emphasizes instruction fidelity, reasoning consistency, and practical usefulness across everyday and technical tasks. It is designed to perform well in conversational settings, structured reasoning, and code-related prompts, while remaining efficient enough for consumer-grade hardware. Josie-r1 models frequently outperform similarly sized counterparts on academic and practical benchmarks.
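Since the model is Qwen3-based, conversational prompts typically follow the ChatML turn format used by the Qwen family. Below is a minimal sketch of assembling such a prompt by hand; this is an illustration only, and in practice `tokenizer.apply_chat_template` from Hugging Face `transformers` should be used so the model's own template (including any reasoning-specific tokens) is applied:

```python
def build_chatml_prompt(messages):
    """Format a list of chat messages in the ChatML style used by
    Qwen-family models (assumption: standard <|im_start|>/<|im_end|>
    special tokens; the model's bundled chat template is authoritative)."""
    parts = []
    for msg in messages:
        # Each turn is wrapped as: <|im_start|>{role}\n{content}<|im_end|>
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    # End with an open assistant turn to cue the model to respond.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize this paragraph in one sentence."},
])
print(prompt)
```

The formatted string would then be tokenized and passed to the model for generation (for example via `AutoModelForCausalLM` with `torch_dtype=torch.bfloat16`, matching the BF16 weights).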