yam-peleg/Experiment22-7B
TEXT GENERATION
Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k
Published: Feb 22, 2024 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold

yam-peleg/Experiment22-7B is a 7-billion-parameter language model by yam-peleg, built as an experimental testbed for a specific training and evaluation pipeline for large language models. It explores optimizations in data engineering, architecture efficiency, and evaluation performance. Its primary purpose is to assess the effectiveness of new training and evaluation methodologies rather than to serve as a general-purpose LLM. The model supports a context length of 4096 tokens.
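To illustrate what the 4096-token context window means in practice, here is a minimal, hypothetical truncation helper (not part of the model's own tooling): prompts longer than the window must be clipped, typically keeping the most recent tokens.

```python
CTX_LEN = 4096  # Experiment22-7B's context window (4k tokens)

def truncate_to_context(token_ids, max_len=CTX_LEN):
    """Keep only the most recent max_len tokens so the prompt fits the window."""
    if len(token_ids) > max_len:
        return token_ids[-max_len:]
    return token_ids

# Example: a 5000-token prompt is clipped to its last 4096 tokens.
prompt = list(range(5000))
clipped = truncate_to_context(prompt)
print(len(clipped))  # → 4096
print(clipped[0])    # → 904 (tokens 0..903 were dropped)
```

Real inference stacks usually perform this clipping (or raise an error) automatically; the sketch only shows the arithmetic behind the 4k limit.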
