janhq/Jan-v3.5-4B
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Mar 20, 2026 · License: apache-2.0 · Architecture: Transformer

Jan-v3.5-4B is a 4-billion-parameter language model developed by janhq, fine-tuned from Jan-v3-4B-base-instruct (Qwen3-4B architecture). It specializes in mathematical reasoning and problem-solving, while also carrying a distinct, casual, self-aware personality shaped by Menlo Research. With a native context length of 262,144 tokens, the model is designed for conversational AI applications that need both strong math capability and a non-generic voice.
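For conversational use, a minimal sketch of how a chat might be rendered into a prompt, assuming the model inherits the ChatML-style chat template of its Qwen3 base (the helper name and exact template here are illustrative; in practice, `tokenizer.apply_chat_template` from Hugging Face transformers applies the model's actual template):

```python
# Illustrative sketch: render a conversation in the ChatML-style format
# used by Qwen-family models. Assumes Jan-v3.5-4B keeps its base model's
# template; prefer tokenizer.apply_chat_template in real code.

def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts as a ChatML-style prompt."""
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    # Leave the assistant turn open so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are Jan, a helpful assistant."},
    {"role": "user", "content": "What is 17 * 24?"},
])
print(prompt)
```

The open `<|im_start|>assistant` turn at the end is what cues the model to generate its reply rather than continue the user's message.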
