yam-peleg/Experiment20-7B
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Feb 20, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

yam-peleg/Experiment20-7B is a 7-billion-parameter language model developed by yam-peleg. It is an experimental model built to test and refine a specific training and evaluation pipeline for LLMs. Its primary purpose is identifying optimizations in data engineering, architectural efficiency, and evaluation performance; it is not intended as a general-purpose conversational model.
