thetmon/c23
Text Generation · Open Weights
Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Concurrency Cost: 1
Published: Feb 26, 2026 · License: apache-2.0 · Architecture: Transformer

thetmon/c23 is a LoRA adapter fine-tuned from the 4-billion-parameter Qwen/Qwen3-4B-Instruct-2507, designed to improve multi-turn agent task performance. The adapter targets household tasks (ALFWorld) and database operations (DBBench), strengthening environment observation, action selection, tool use, and error recovery. It was trained with LoRA on a full-precision base and supports a 32,768-token context length, making it suitable for complex, multi-step agentic workflows.
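A LoRA adapter like this one does not replace the base model's weights; it stores two small low-rank matrices per adapted layer, and their product (scaled by alpha/r) is added to the frozen base weight at merge or inference time. A minimal NumPy sketch of that mechanism, with hypothetical dimensions and an assumed scaling value (not taken from this adapter's config):

```python
import numpy as np

# Hypothetical shapes for illustration; real values come from the adapter config.
d_out, d_in, r = 8, 8, 2
alpha = 16  # assumed LoRA scaling hyperparameter

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))  # frozen base-model weight
A = rng.standard_normal((r, d_in))      # trainable low-rank factor A
B = rng.standard_normal((d_out, r))     # trainable low-rank factor B (zero-init before training)

delta_W = (alpha / r) * (B @ A)  # low-rank update learned by the adapter
W_merged = W + delta_W           # merging folds the adapter into the base weight

x = rng.standard_normal(d_in)
# The merged weight produces the base output plus the scaled adapter contribution.
assert np.allclose(W_merged @ x, W @ x + (alpha / r) * (B @ (A @ x)))
```

Because only A and B are stored, the adapter is a small fraction of the base model's size, and it can be merged into the base weights for inference with no added latency.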
