juhwanlee/gemma-7B-alpaca-case-3-3
Text Generation · Concurrency Cost: 1 · Model Size: 8.5B · Quant: FP8 · Ctx Length: 8k · Published: Mar 25, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
juhwanlee/gemma-7B-alpaca-case-3-3 is an 8.5-billion-parameter large language model developed by Juhwan Lee, based on the Gemma-7B architecture. The model is fine-tuned specifically for data-ordering tasks on a dataset of 100,000 samples drawn from Open-Orca. It inherits architectural features such as Grouped-Query Attention, Sliding-Window Attention, and a byte-fallback BPE tokenizer, making it suited to specialized sequence-arrangement applications.
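Below is a minimal sketch of running the model locally with Hugging Face Transformers. It assumes the repository exposes standard Transformers-format weights; the prompt wording and generation settings are illustrative, since the fine-tuning prompt template is not documented on this card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "juhwanlee/gemma-7B-alpaca-case-3-3"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # place layers across available GPU(s)/CPU
)

# Illustrative data-ordering prompt (hypothetical; not the official template).
prompt = (
    "Arrange the following steps in the correct order:\n"
    "1. Serve. 2. Boil water. 3. Add pasta."
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the FP8 quantization listed above refers to the hosted serving configuration; loading the weights yourself, as in this sketch, uses whatever precision the checkpoint ships in.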