NewstaR/Starlight-13B
Text Generation | Concurrency Cost: 1 | Model Size: 13B | Quant: FP8 | Ctx Length: 4k | Published: Sep 11, 2023 | Architecture: Transformer | Status: Warm

NewstaR/Starlight-13B is a 13 billion parameter transformer model developed by NewstaR, trained on the AverageData and Above the Clouds datasets. It is designed for conversational text generation, following the Alpaca instruction template. The model demonstrates strong language modeling capabilities, achieving an average score of 58.63 on the Open LLM Leaderboard benchmarks, including 82.15 on HellaSwag and 55.67 on MMLU. It is primarily intended for use in chatbots and content creation applications.
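Because the model follows the Alpaca instruction template, prompts are normally wrapped in an instruction/response layout before generation. The sketch below shows one way to do this locally with the Hugging Face transformers library; the prompt wording and generation settings are illustrative assumptions, not values taken from the model card.

```python
# A minimal sketch, assuming the standard Alpaca prompt layout and the
# Hugging Face transformers API; the instruction text and sampling values
# are illustrative, not the model card's official recommendation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NewstaR/Starlight-13B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Alpaca-style instruction template: an instruction block followed by a
# response header that the model is expected to complete.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nSummarize the benefits of unit testing.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```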


Popular Sampler Settings

The three parameter combinations most commonly used by Featherless users for this model adjust the sampler settings listed below; a sketch of how such settings might be passed in a request follows the list.

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
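As a rough illustration of how these settings could be applied, the sketch below sends a completion request through the OpenAI Python client. The base URL, API key handling, and every numeric value are placeholders rather than the actual popular configurations, and passing the non-standard fields (top_k, repetition_penalty, min_p) via extra_body assumes the serving endpoint accepts them.

```python
# A hedged sketch: all values below are illustrative placeholders, not the
# actual top configurations used by Featherless users. The base_url and the
# assumption that the endpoint accepts top_k / repetition_penalty / min_p
# through extra_body are assumptions, not documented behaviour.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",                    # placeholder
)

response = client.completions.create(
    model="NewstaR/Starlight-13B",
    prompt=(
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\nWrite a short product description for a reusable water bottle.\n\n"
        "### Response:\n"
    ),
    max_tokens=256,
    # Standard OpenAI-style sampler fields
    temperature=0.8,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Non-standard fields forwarded to the backend, if supported
    extra_body={
        "top_k": 40,
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    },
)
print(response.choices[0].text)
```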