g4me/QWiki-Base-LR1e5-b32g2gc8-ck2048-order-batch
Text generation · Model size: 2B · Quant: BF16 · Context length: 32k · Published: Apr 8, 2026 · Architecture: Transformer

The g4me/QWiki-Base-LR1e5-b32g2gc8-ck2048-order-batch model is a 2-billion-parameter language model fine-tuned from Qwen/Qwen3-1.7B-Base by g4me. It was trained with the TRL framework and supports a context length of 32768 tokens. The model is intended for general text generation tasks.
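A minimal sketch of how the model could be loaded for text generation with the Hugging Face `transformers` library, assuming the standard `AutoModelForCausalLM` workflow applies (the model card does not specify a loading recipe, and the `generate` helper below is a hypothetical convenience wrapper):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "g4me/QWiki-Base-LR1e5-b32g2gc8-ck2048-order-batch"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model in BF16 (matching the card's quantization)
    and return a text continuation of `prompt`.

    Note: the first call downloads ~2B parameters of weights.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Example usage (commented out to avoid an unconditional download):
# print(generate("The history of wikis begins"))
```

Generation parameters such as temperature or top-p sampling can be passed through `model.generate(...)` as needed for the target application.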
