Danielbrdz/Barcenas-R1-Qwen-1.5b
Task: Text Generation · Model Size: 1.5B · Quantization: BF16 · Context Length: 32k · Published: Jan 26, 2025 · License: MIT · Architecture: Transformer · Open Weights · Concurrency Cost: 1
Danielbrdz/Barcenas-R1-Qwen-1.5b is a 1.5-billion-parameter Qwen-based causal language model fine-tuned for reasoning tasks in Spanish. Derived from DeepSeek-R1-Distill-Qwen-1.5B and fine-tuned on the pinzhenchen/alpaca-cleaned-es dataset, it aims to provide a compact, accessible LLM for Spanish-language reasoning that is small enough to deploy on a wide range of hardware.
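A minimal sketch of loading the model with the Hugging Face `transformers` library, assuming the weights are published under the repo id above and that `transformers` and `torch` are installed. The prompt template (`build_prompt`) is an illustrative assumption, not a format documented by the model author.

```python
MODEL_ID = "Danielbrdz/Barcenas-R1-Qwen-1.5b"

def build_prompt(question: str) -> str:
    # Hypothetical instruction-style wrapper for a Spanish question;
    # adjust to whatever chat template the model was actually trained with.
    return f"Pregunta: {question}\nRespuesta:"

def main() -> None:
    # Imported lazily so the sketch can be read without the library installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 weights fit comfortably in a few GB of (V)RAM at this size.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    inputs = tokenizer(build_prompt("¿Cuánto es 12 x 7?"), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

if __name__ == "__main__":
    main()
```

Because the model is only 1.5B parameters in BF16, the same snippet runs unmodified on CPU, albeit slowly; on a GPU, passing `device_map="auto"` to `from_pretrained` places the weights automatically.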