TechNamu/Namu-1.7B
Task: Text Generation · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Feb 4, 2026 · License: apache-2.0 · Architecture: Transformer
TechNamu/Namu-1.7B is a 2 billion parameter language model developed by TechNamu, fine-tuned from unsloth/qwen3-1.7b-base-unsloth-bnb-4bit. It was trained using Unsloth together with Hugging Face's TRL library, enabling roughly 2x faster training. With a context length of 40,960 tokens, the model is optimized for efficient deployment and for tasks that require substantial context processing.
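A minimal usage sketch with the standard Hugging Face Transformers `AutoModel` pattern (this is an illustrative assumption, not an official snippet from the model card; the `torch_dtype` choice mirrors the BF16 quantization listed above):

```python
# Sketch: load TechNamu/Namu-1.7B and generate a completion.
# Assumes the model follows the standard causal-LM layout of its
# Qwen3 base; adjust dtype/device settings for your hardware.

MODEL_ID = "TechNamu/Namu-1.7B"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Lazily load the model and return the generated continuation."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the listed BF16 precision
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, keep only the newly generated text.
    new_tokens = output[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Because the weights are loaded inside the function, importing this module does not trigger a multi-gigabyte download; the first call to `generate` does.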