Nao-Taka/LLM2025-advance
Task: Text Generation
Concurrency Cost: 1
Model Size: 4B
Quantization: BF16
Context Length: 32k
Published: Feb 19, 2026
Architecture: Transformer

Nao-Taka/LLM2025-advance is a 4-billion-parameter language model based on Qwen3-4B-Instruct-2507 and fine-tuned with LoRA. The model is optimized for agent-based tasks, showing improved performance on benchmarks such as AgentBench, and its primary strength is handling complex agentic workflows and reasoning.
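Below is a minimal usage sketch with the Hugging Face transformers library, assuming the model is hosted under the repository id above and ships a chat template (as its Qwen3 base does). The prompt and generation settings are illustrative only; BF16 loading matches the quantization listed in the metadata.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Nao-Taka/LLM2025-advance"  # assumed to be resolvable on the Hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # BF16, per the model metadata
    device_map="auto",
)

# Illustrative agent-style prompt; replace with your own task.
messages = [
    {"role": "user", "content": "Plan the steps to look up today's weather and summarize it."},
]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```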
