hez2024/LLM4Cov-Qwen3-4B-SFT-Stage1
Text generation · Concurrency cost: 1 · Model size: 4B · Quantization: BF16 · Context length: 32k · Published: Jan 22, 2026 · License: other · Architecture: Transformer

hez2024/LLM4Cov-Qwen3-4B-SFT-Stage1 is a 4-billion-parameter language model fine-tuned from hez2024/LLM4Cov-Qwen3-4B-SFT-Stage0. It was trained on the cvdp_ecov_train_stage1 dataset, suggesting it is optimized for language understanding and generation in that dataset's domain. Its 32768-token context length lets it process extensive textual inputs within that specialized domain.
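Assuming the checkpoint follows the standard Qwen3 causal-LM layout on the Hugging Face Hub, running it with `transformers` could look like the sketch below. The prompt handling and generation settings are illustrative, not taken from the model card; only the model id, BF16 precision, and 32k context length come from the metadata above.

```python
# Hedged sketch: running the checkpoint with Hugging Face transformers.
# Assumption: the repo is a standard Qwen3 causal-LM checkpoint; the
# generation settings here are illustrative, not prescribed by the card.
MODEL_ID = "hez2024/LLM4Cov-Qwen3-4B-SFT-Stage1"
MAX_CONTEXT = 32768  # 32k context window stated on the model card


def clip_to_context(token_ids, max_len=MAX_CONTEXT):
    """Keep only the most recent tokens so the prompt fits the context window."""
    return token_ids[-max_len:]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    # Imports are deferred so the helper above is usable without pulling in
    # torch/transformers or downloading the 4B model.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # BF16, matching the published precision
        device_map="auto",
    )
    ids = tokenizer(prompt, return_tensors="pt").input_ids[0].tolist()
    ids = clip_to_context(ids)  # truncate oldest tokens past the 32k window
    inputs = torch.tensor([ids], device=model.device)
    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated continuation, not the echoed prompt.
    return tokenizer.decode(out[0][len(ids):], skip_special_tokens=True)
```

`clip_to_context` keeps the most recent tokens when a prompt exceeds the 32k window; whether left-truncation is the right policy depends on the task, so treat it as one possible design choice.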
