yam-peleg/Experiment27-7B
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 8k | Published: Feb 27, 2024 | License: apache-2.0 | Architecture: Transformer | Open Weights

yam-peleg/Experiment27-7B is a 7-billion-parameter experimental model developed by yam-peleg. It is designed to test and refine a specific training and evaluation pipeline, with a focus on identifying potential optimizations in data engineering, architecture efficiency, and evaluation performance for large language models. The model serves as a testbed for new training algorithms and data-preprocessing adjustments.
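Since the weights are openly licensed (apache-2.0), the model can presumably be loaded like any other causal LM from the Hugging Face Hub. The sketch below is a minimal, hedged example: it assumes the repository id `yam-peleg/Experiment27-7B` is available on the Hub and that the standard `transformers` auto classes apply; the prompt, `generate` helper name, and generation settings are illustrative, not part of the model card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer


def generate(prompt: str,
             model_id: str = "yam-peleg/Experiment27-7B",
             max_new_tokens: int = 64) -> str:
    """Download the model from the Hub (assumed repo id) and run generation.

    Note: a 7B model requires substantial RAM/VRAM; device placement and
    dtype are chosen automatically here.
    """
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # pick the checkpoint's native precision
        device_map="auto",    # spread across available devices
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Heavy download/inference is kept behind the main guard.
    print(generate("Briefly explain what an experimental LLM testbed is."))
```

Within an 8k context window, prompts plus generated tokens must stay under 8192 tokens; longer inputs would need truncation before calling `generate`.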
