ewoe/FT_gemma3_1b
Text generation · Concurrency cost: 1 · Model size: 1B · Quant: BF16 · Context length: 32k · Published: Mar 28, 2026 · Architecture: Transformer

ewoe/FT_gemma3_1b is a 1 billion parameter language model fine-tuned from Google's Gemma-3-1b-it. It was trained with the TRL library, suggesting a focus on instruction following. The model targets general text generation, producing coherent, contextually relevant responses to conversational prompts.
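As a sketch of how such a model might be used, the snippet below loads it through the Hugging Face `transformers` API and generates a reply via the chat template. This is an illustrative assumption, not documentation from the model's authors: the model id is taken from this card, and Gemma-based weights are typically gated, so a Hugging Face access token may be required.

```python
def generate_reply(prompt: str, model_id: str = "ewoe/FT_gemma3_1b") -> str:
    """Generate a short reply from the fine-tuned model.

    Assumptions: the repo is public or an access token is configured
    (pass token=... to from_pretrained if needed); BF16 matches the
    quantization listed on this card.
    """
    # Imported here so the function only needs transformers when called.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16")

    # Format the prompt with the model's chat template, since it is
    # fine-tuned from an instruction-tuned (it) base.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=64)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
```

The 32k context length noted above applies to the full prompt plus generated tokens; long inputs should be truncated accordingly.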
