yam-peleg/Experiment23-7B
Text generation | Concurrency cost: 1 | Model size: 7B | Quantization: FP8 | Context length: 4k | Published: Feb 24, 2024 | License: apache-2.0 | Architecture: Transformer | Open weights

yam-peleg/Experiment23-7B is a 7-billion-parameter language model developed by yam-peleg as an experimental testbed for a specific LLM training and evaluation pipeline. The model is used to identify optimizations in data engineering, architecture efficiency, and evaluation performance. Its primary purpose is to validate new training and evaluation methods, rather than to serve as a general-purpose conversational or task-specific model.