kamaboko2007/LLM2025_main_003_full
TEXT GENERATION | Concurrency cost: 1 | Model size: 4B | Quant: BF16 | Ctx length: 32k | Published: Feb 6, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights | Warm
kamaboko2007/LLM2025_main_003_full is a 4-billion-parameter, Qwen3-based, instruction-tuned language model developed by kamaboko2007. It was fine-tuned from unsloth/qwen3-4b-instruct-2507-unsloth-bnb-4bit and trained 2x faster using Unsloth and Hugging Face's TRL library. With a 40,960-token context length, it is suited to tasks that require processing substantial context.
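The card does not include a usage snippet; below is a minimal sketch assuming the standard Hugging Face `transformers` chat workflow used by Qwen3 instruct models. The function name and generation settings are illustrative assumptions, not documented by the model author.

```python
# Minimal usage sketch (an assumption, not from the model card): loading the
# model with Hugging Face transformers. Heavy imports are deferred into the
# function so the module can be imported without torch/transformers installed.
MODEL_ID = "kamaboko2007/LLM2025_main_003_full"


def generate_reply(prompt: str, max_new_tokens: int = 256) -> str:
    """Chat with the model; downloads the BF16 weights on first call."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 quantization listed above
        device_map="auto",
    )
    # Qwen3 instruct models ship a chat template; apply it to one user turn.
    input_ids = tokenizer.apply_chat_template(
        [{"role": "user", "content": prompt}],
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )
```

For example, `generate_reply("Explain BF16 in one sentence.")` would return the model's reply as a plain string.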