md-nishat-008/TigerCoder-1B
Text Generation · Open Weights · Warm

- Concurrency Cost: 1
- Model Size: 1B
- Quant: BF16
- Ctx Length: 32k
- Published: Feb 27, 2026
- License: cc-by-4.0
- Architecture: Transformer

TigerCoder-1B by md-nishat-008 is a 1-billion-parameter instruction-tuned causal language model designed for code generation in Bangla, with a 32,768-token context length. It belongs to the first dedicated family of code LLMs for Bangla and was fine-tuned on 300K Bangla instruction-code pairs. On Bangla code generation benchmarks it achieves Pass@1 gains of 11-18% over prior baselines and outperforms models up to 27x larger, with strong results in Python, C++, Java, and JavaScript.
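As a minimal sketch of how such a model could be used for Bangla code generation with Hugging Face `transformers`: the model id comes from the card above, but the `build_prompt` wrapper below is a hypothetical instruction format for illustration only, not the model's documented chat template (check the repository for the real one).

```python
def build_prompt(instruction: str) -> str:
    """Wrap a Bangla coding instruction in a simple instruction prompt.

    NOTE: this "### Instruction / ### Response" layout is an assumed,
    illustrative format, not TigerCoder's documented template.
    """
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    """Generate code for a Bangla instruction with TigerCoder-1B.

    Imports transformers lazily so the prompt helper above stays usable
    without the library installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "md-nishat-008/TigerCoder-1B"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16")

    inputs = tokenizer(build_prompt(instruction), return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

A call like `generate("একটি সংখ্যা মৌলিক কিনা তা পরীক্ষা করার জন্য একটি Python ফাংশন লিখুন।")` would then return the model's generated Python code, subject to the usual sampling settings of `generate`.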
