jgchaparro/language_garden-fax-spa-4B-bl-m-merged
Capabilities: Vision
Concurrency Cost: 1
Model Size: 4.3B
Quant: BF16
Ctx Length: 32k
Published: Mar 12, 2026
License: apache-2.0
Architecture: Transformer
Status: Open Weights, Cold
The jgchaparro/language_garden-fax-spa-4B-bl-m-merged model is a 4.3 billion parameter language model, finetuned by jgchaparro from unsloth/gemma-3-4b-it-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, enabling 2x faster finetuning. The model is designed for general language tasks, leveraging its Gemma 3 4B base for efficient performance.
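Since the checkpoint is published as merged weights, it should be loadable without adapter handling. Below is a minimal usage sketch, assuming the model exposes the standard Hugging Face transformers causal-LM and chat-template interfaces (not an official example from the model author); the prompt text is illustrative only.

```python
# Minimal usage sketch (assumption: the merged checkpoint loads via the
# standard transformers causal-LM interface and ships a chat template).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jgchaparro/language_garden-fax-spa-4B-bl-m-merged"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
    device_map="auto",
)

# Gemma-style instruction-tuned checkpoints expect chat-formatted prompts.
messages = [{"role": "user", "content": "Hola, ¿cómo estás?"}]  # illustrative prompt
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```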