yam-peleg/gemma-7b-experiment
Text generation · Concurrency cost: 1 · Model size: 8.5B · Quant: FP8 · Context length: 8k · Published: Mar 16, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

The yam-peleg/gemma-7b-experiment is an 8.5 billion parameter model based on the Gemma architecture, primarily serving as an experimental placeholder. Its core purpose is to test and refine a local cross-validation strategy for evaluating large language models. This model is not intended for general use and contains no new features or capabilities beyond its experimental validation role.
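Since the model's stated role is validating a local cross-validation strategy, a minimal sketch of what such a k-fold evaluation loop might look like is shown below. The prompt set, the `kfold_indices` helper, and the placeholder `score_model` function are all hypothetical illustrations, not the author's actual evaluation code; a real run would load the model and compute a task metric on each validation fold.

```python
import random

# Hypothetical evaluation prompts; a real setup would use a held-out dataset.
prompts = [f"example prompt {i}" for i in range(10)]

def kfold_indices(n, k, seed=0):
    """Split indices 0..n-1 into k disjoint folds for local cross-validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def score_model(val_split):
    # Placeholder scorer: in practice this would run the model on the
    # validation prompts and return a metric such as accuracy.
    return 1.0

folds = kfold_indices(len(prompts), k=5)
fold_scores = []
for val_idx in folds:
    val_split = [prompts[j] for j in val_idx]
    fold_scores.append(score_model(val_split))

# Averaging across folds gives a more stable local estimate than a single split.
mean_score = sum(fold_scores) / len(fold_scores)
```

The point of running all folds locally is to get a stable score estimate before submitting or publishing, which matches the experimental validation role described above.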
