abcorrea/random-v2
Text generation · Model size: 4B · Quantization: BF16 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer (Warm) · Published: Nov 26, 2025

abcorrea/random-v2 is a 4-billion-parameter causal language model fine-tuned from Qwen/Qwen3-4B-Thinking-2507. It was trained with supervised fine-tuning (SFT) using the TRL framework at a 40,960-token context length. Built on the Qwen3 architecture, it is intended for general text generation tasks.
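The model card does not specify a loading recipe, but since the base model is a Qwen3 checkpoint, a minimal inference sketch with the Hugging Face `transformers` library would plausibly look like the following. The sampling parameters in `generation_config` are illustrative defaults, not values from the card:

```python
MODEL_ID = "abcorrea/random-v2"


def generation_config(max_new_tokens: int = 256) -> dict:
    """Illustrative sampling defaults; tune for your task."""
    return {
        "max_new_tokens": max_new_tokens,
        "do_sample": True,
        "temperature": 0.7,
        "top_p": 0.9,
    }


def run(prompt: str) -> str:
    """Load the model and generate a reply (requires `transformers` and `torch`)."""
    # Imports are deferred so the module can be inspected without the
    # heavyweight dependencies installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the published quantization of this checkpoint.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, **generation_config())
    # Decode only the newly generated tokens.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Calling `run("Explain causal language modeling in one sentence.")` downloads the checkpoint on first use; the 32k serving context (40,960 tokens at training time) bounds how much prompt plus generation the model can handle.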
