Sao10K/L3-8B-Niitama-v1
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 8K · Published: Jul 7, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open weights · Warm
Sao10K/L3-8B-Niitama-v1 is an 8-billion-parameter experimental language model by Sao10K that explores novel data shuffling and formatting methods. It is part of the L3 series, which has shown distinct performance characteristics compared to its L3.1 counterparts, and is intended for research into how data presentation affects model outcomes.
Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model. No values are currently recorded:

- temperature: –
- top_p: –
- top_k: –
- frequency_penalty: –
- presence_penalty: –
- repetition_penalty: –
- min_p: –
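Since the card records no sampler values, the sketch below only illustrates how these parameters are typically passed to an OpenAI-compatible chat completions endpoint. All numeric values, the endpoint path, and the prompt are assumptions for illustration, not settings from this card.

```python
import json

# Build a chat completions request body. Every sampler value here is a
# placeholder, not a recommendation from the model card.
payload = {
    "model": "Sao10K/L3-8B-Niitama-v1",
    "messages": [{"role": "user", "content": "Write a short greeting."}],
    "temperature": 0.8,        # assumed value
    "top_p": 0.95,             # assumed value
    "top_k": 40,               # assumed value
    "frequency_penalty": 0.0,  # assumed value
    "presence_penalty": 0.0,   # assumed value
    "repetition_penalty": 1.1, # assumed value
    "min_p": 0.05,             # assumed value
    "max_tokens": 256,
}

# Serialize to JSON, as it would be sent in an HTTP POST request body
# to a hypothetical /v1/chat/completions endpoint.
body = json.dumps(payload)
print(payload["model"])
```

Providers differ in which sampler parameters they accept (for example, `min_p` and `repetition_penalty` are not part of the original OpenAI schema), so check the host's API reference before relying on any of them.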