SvalTek/ColdBrew-Nemo-12B-Arcane-Fusion-Combined-Thinker-Test0
Text Generation · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Ctx Length: 32k · Published: Feb 10, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

SvalTek/ColdBrew-Nemo-12B-Arcane-Fusion-Combined-Thinker-Test0 is a 12-billion-parameter Mistral-based language model developed by SvalTek, fine-tuned from ColdBrew-Nemo-12B-Arcane-Fusion-Combined-Thinker. It was trained with Unsloth and Hugging Face's TRL library, achieving 2x faster training. With a 32,768-token (32k) context length, it is suited to tasks that require extensive contextual understanding.
