mlabonne/Gemmalpaca-2B
Text generation · Concurrency cost: 1 · Model size: 2.6B · Quantization: BF16 · Context length: 8K · Published: Feb 22, 2024 · License: gemma-terms-of-use · Architecture: Transformer

mlabonne/Gemmalpaca-2B is a 2.6 billion parameter language model based on the Gemma-2B architecture, fine-tuned on the Alpaca-GPT4 dataset. It outperforms Google's gemma-2b-it on various benchmarks, including the Nous benchmark suite. With an 8192-token context length, it is optimized for instruction-following tasks and general conversational applications, particularly when prompts are formatted with the Alpaca chat template.
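Since the model expects the Alpaca chat template, a minimal sketch of the prompt formatting might look like the following. The helper name `format_alpaca_prompt` is an illustrative assumption, not part of the model's official tooling; the section headers follow the standard Alpaca instruction format.

```python
def format_alpaca_prompt(instruction: str, inp: str = "") -> str:
    """Build a prompt in the standard Alpaca format.

    Note: `format_alpaca_prompt` is a hypothetical helper for illustration;
    the "### Instruction:" / "### Input:" / "### Response:" headers are the
    widely used Alpaca convention.
    """
    if inp:
        # Variant with an additional input/context field.
        return (
            "### Instruction:\n" + instruction.strip() + "\n\n"
            "### Input:\n" + inp.strip() + "\n\n"
            "### Response:\n"
        )
    # Instruction-only variant.
    return (
        "### Instruction:\n" + instruction.strip() + "\n\n"
        "### Response:\n"
    )


prompt = format_alpaca_prompt("Summarize the Gemma license in one sentence.")
print(prompt)
```

The resulting string can then be passed to the model (for example via the Hugging Face `transformers` text-generation pipeline with model ID `mlabonne/Gemmalpaca-2B`), which completes the text after the `### Response:` header.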
