taketakedaiki/qwen3-4b-v2-exp23
Text Generation
- Concurrency Cost: 1
- Model Size: 4B
- Quant: BF16
- Ctx Length: 32k
- Published: Mar 1, 2026
- Architecture: Transformer
- Status: Warm

taketakedaiki/qwen3-4b-v2-exp23 is a 4-billion-parameter Qwen3-based language model fine-tuned with a LoRA adapter for structured-data tasks. It uses Qwen3-4B-Instruct-2507 as its base and is optimized for processing and generating structured information, making it well suited to applications that require precise extraction or manipulation of data in structured formats.
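As a sketch of how such a structured-extraction model might be used, the snippet below builds a chat-style prompt asking for a JSON object with specific fields and parses the reply. The field names, invoice text, and sample reply are illustrative assumptions, not outputs documented for this checkpoint; the chat layout follows common instruct-model conventions.

```python
import json

# Model ID from this card; the surrounding workflow is an assumed, generic
# chat-style extraction setup, not documented behavior of the checkpoint.
MODEL_ID = "taketakedaiki/qwen3-4b-v2-exp23"

def build_extraction_messages(text, fields):
    """Build chat messages asking the model to emit JSON limited to `fields`."""
    schema = "{" + ", ".join(f'"{f}": "..."' for f in fields) + "}"
    system = (
        "You extract structured data. Respond with a single JSON object "
        f"matching this shape and nothing else: {schema}"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": text},
    ]

def parse_structured_reply(reply, fields):
    """Parse the model's JSON reply, keeping only the requested fields."""
    data = json.loads(reply)
    return {f: data.get(f) for f in fields}

fields = ["invoice_id", "vendor", "total", "due_date"]
messages = build_extraction_messages(
    "Invoice #1042 from Acme Corp, total $99.50, due 2026-04-01.", fields
)

# Hypothetical reply in the shape the prompt requests (illustrative only):
reply = (
    '{"invoice_id": "1042", "vendor": "Acme Corp", '
    '"total": "$99.50", "due_date": "2026-04-01"}'
)
print(parse_structured_reply(reply, fields))
```

The messages list can then be fed to any OpenAI-compatible chat endpoint or to a local `transformers` pipeline serving this model; the JSON-only system instruction is what makes the reply machine-parseable.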
