yam-peleg/Experiment31-7B
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Mar 3, 2024
License: apache-2.0
Architecture: Transformer
Weights: Open

Experiment31-7B is a 7-billion-parameter language model developed by yam-peleg. It serves as a research testbed for developing and refining LLM training and evaluation pipelines, with a focus on identifying optimizations in data engineering, architecture efficiency, and evaluation performance. The model's purpose is to gauge how effective these new training and evaluation methods are in practice.