thetmon/c10
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Feb 23, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

thetmon/c10 is a LoRA adapter for Qwen3-4B-Instruct-2507, a 4 billion parameter instruction-tuned base model, developed by thetmon. The adapter specializes in multi-turn agent tasks, particularly household tasks (ALFWorld) and database operations (DBBench). It improves the base model's environment observation, action selection, tool use, and error recovery in complex, multi-turn interactions, making it well suited to agentic AI applications.
