Undi95/Amethyst-13B
TEXT GENERATION

Concurrency Cost: 1
Model Size: 13B
Quant: FP8
Ctx Length: 4K
Published: Sep 24, 2023
License: cc-by-nc-4.0
Architecture: Transformer
Availability: Open Weights

Undi95/Amethyst-13B is a 13 billion parameter language model developed by Undi95, built upon Xwin-LM/Xwin-LM-13B-V0.1 and other merged models. Utilizing a BlockMerge_Gradient approach, it incorporates elements from Huginn-13b-FP16, 120-Days-of-LORA-v2-13B, and LimaRP-Llama2-13B-v3-EXPERIMENT. This model is instruction-tuned using the Alpaca prompt format and achieves an average score of 51.2 on the Open LLM Leaderboard, with a 4096-token context length.
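Since the model is instruction-tuned on the Alpaca prompt format, prompts should follow that template. The helper below is an illustrative sketch (the function name is not part of the model card); the template text itself is the standard Alpaca format:

```python
def build_alpaca_prompt(instruction: str, user_input: str = "") -> str:
    """Wrap an instruction (and optional input) in the Alpaca prompt template."""
    if user_input:
        header = (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request."
        )
        return (
            f"{header}\n\n### Instruction:\n{instruction}\n\n"
            f"### Input:\n{user_input}\n\n### Response:\n"
        )
    header = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request."
    )
    return f"{header}\n\n### Instruction:\n{instruction}\n\n### Response:\n"


prompt = build_alpaca_prompt("Summarize the plot of Hamlet in one sentence.")
```

The completed prompt is what gets sent to the model; generation should stop once the model finishes the text after `### Response:`.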


Popular Sampler Settings

These are the sampler parameters most commonly tuned by Featherless users for this model (the specific top-3 value combinations are shown in the interactive configs on the site):

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
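The parameters above map directly onto the fields of a completion request. The sketch below shows an illustrative request body for an OpenAI-compatible completions endpoint; the sampler values are placeholder assumptions, not the actual user configurations from this page:

```python
import json

# Illustrative request body; all sampler values below are example
# assumptions, not the top user configs referenced on the page.
payload = {
    "model": "Undi95/Amethyst-13B",
    "prompt": "### Instruction:\nWrite a haiku about amethyst.\n\n### Response:\n",
    "max_tokens": 256,
    "temperature": 0.8,          # randomness of sampling
    "top_p": 0.95,               # nucleus sampling cutoff
    "top_k": 40,                 # sample only from the top-k tokens
    "frequency_penalty": 0.0,    # penalize tokens by how often they appeared
    "presence_penalty": 0.0,     # penalize tokens that appeared at all
    "repetition_penalty": 1.1,   # multiplicative penalty on repeated tokens
    "min_p": 0.05,               # drop tokens below this fraction of the max prob
}
body = json.dumps(payload)
```

The serialized `body` would be POSTed with an API key to the provider's completions endpoint; note the 4K context length caps prompt plus `max_tokens`.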