NewstaR/Morningstar-13b-hf
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Published: Sep 13, 2023 · Architecture: Transformer · Warm

NewstaR/Morningstar-13b-hf is a 13-billion-parameter Llama 2-based language model developed by NewstaR and optimized for general natural language processing tasks. With a 4096-token context window, it is well suited to text generation, content creation, and conversational agent applications, producing coherent text across a wide range of topics for diverse NLP workflows.
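Since the model is published as a standard Hugging Face checkpoint, it can be loaded locally with the transformers library. The sketch below is a minimal example, assuming the transformers, torch, and accelerate packages are installed and enough GPU memory is available; the prompt and generation length are illustrative.

```python
# Minimal sketch: loading NewstaR/Morningstar-13b-hf with Hugging Face transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NewstaR/Morningstar-13b-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: fp16 weights fit the available hardware
    device_map="auto",          # requires the accelerate package
)

prompt = "Write a short product description for a solar-powered lantern."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# The model has a 4096-token context, so prompt plus generated tokens
# should stay within that window.
output_ids = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```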


Popular Sampler Settings

The three parameter combinations most commonly used by Featherless users for this model adjust the following sampler settings (a usage sketch follows the list):

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
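The sketch below shows how these sampler settings might be passed in a request, assuming an OpenAI-compatible chat completions endpoint such as the one Featherless exposes. The base URL, API key placeholder, and the specific parameter values are illustrative assumptions, not the actual top user configurations; fields like top_k, repetition_penalty, and min_p are not part of the standard OpenAI schema, so they are sent via extra_body and depend on the server accepting them.

```python
# Minimal sketch: passing common sampler settings to an OpenAI-compatible endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumption: OpenAI-compatible base URL
    api_key="YOUR_API_KEY",                    # placeholder
)

response = client.chat.completions.create(
    model="NewstaR/Morningstar-13b-hf",
    messages=[{"role": "user", "content": "Summarize the benefits of unit testing."}],
    temperature=0.7,        # sampling temperature (illustrative value)
    top_p=0.9,              # nucleus sampling cutoff
    frequency_penalty=0.0,  # penalize frequently repeated tokens
    presence_penalty=0.0,   # penalize tokens already present in the text
    extra_body={            # non-standard sampler fields, if the server supports them
        "top_k": 40,
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    },
)
print(response.choices[0].message.content)
```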