yam-peleg/Experiment29-7B
Task: Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 8k · Published: Mar 1, 2024 · License: apache-2.0 · Architecture: Transformer
Experiment29-7B is a 7 billion parameter language model developed by yam-peleg as a research vehicle for testing and refining specific training and evaluation pipelines for large language models. The model explores optimizations in data engineering, architecture efficiency, and evaluation performance. Its primary purpose is to assess the effectiveness of new training and evaluation methodologies rather than to serve as a general-purpose LLM.
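As a rough usage sketch, the model can be loaded locally through the standard Hugging Face transformers API. The prompt and sampling values below are illustrative placeholders, not settings taken from this card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yam-peleg/Experiment29-7B"

# Load tokenizer and model; bfloat16 + device_map are common defaults
# for a 7B model, not requirements stated on this card.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Illustrative prompt and sampling values.
inputs = tokenizer("The purpose of this experiment is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```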
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model. No values have been recorded for this model yet.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
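For reference, a minimal sketch of how sampler parameters like these could be passed when querying the model over an OpenAI-compatible API. The base URL, the use of `extra_body` for non-standard samplers (top_k, min_p, repetition_penalty), and every value shown are assumptions for illustration, not recorded user configurations:

```python
from openai import OpenAI

# Assumed OpenAI-compatible endpoint; check your provider docs for the
# actual base URL and authentication.
client = OpenAI(
    base_url="https://api.featherless.ai/v1",
    api_key="YOUR_API_KEY",  # placeholder
)

response = client.completions.create(
    model="yam-peleg/Experiment29-7B",
    prompt="Explain what sampling temperature does, in one sentence.",
    max_tokens=128,
    # Standard OpenAI sampler parameters:
    temperature=0.7,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # top_k, min_p, and repetition_penalty are not part of the OpenAI
    # schema; forwarding them via extra_body is a provider-specific
    # assumption.
    extra_body={"top_k": 40, "min_p": 0.05, "repetition_penalty": 1.1},
)
print(response.choices[0].text)
```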