KasparZ/mtext-20251122_qwen3-14b-base_merged
Text Generation · Model size: 14B · Quant: FP8 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer · Published: Nov 22, 2025

KasparZ/mtext-20251122_qwen3-14b-base_merged is a 14-billion-parameter causal language model developed by KasparZ. It was fine-tuned with LoRA adapters (applied to selected target modules, then merged into the base weights) on a custom training dataset, KasparZ/mtext-111025. The model supports a 32,768-token context length and is optimized for causal language modeling, making it suitable for applications that require robust text generation and understanding.
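A minimal sketch of loading the model for text generation with the Hugging Face `transformers` library. This assumes `transformers` and `torch` are installed and that the published weights load through the standard `AutoModelForCausalLM` path; the prompt and generation settings are illustrative, not prescribed by the model card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "KasparZ/mtext-20251122_qwen3-14b-base_merged"

# Load tokenizer and weights from the Hub; torch_dtype="auto" defers to the
# dtype stored in the checkpoint, and device_map="auto" places the 14B model
# across available accelerators.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",
)

# Causal LM usage: tokenize a prompt, generate a continuation, decode it.
inputs = tokenizer("Once upon a time", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that FP8-quantized checkpoints can require additional quantization support (e.g. recent `transformers` and compatible hardware); if loading fails, consult the repository's files for the intended loading path.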
