Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Oct 2, 2024 · License: llama3.1 · Architecture: Transformer

Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base is an 8-billion-parameter language model based on the Llama-3.1 architecture, created by Joseph717171. It was produced by merging arcee-ai/Llama-3.1-SuperNova-Lite back into its Llama-3.1-8B base using the TIES merge method. The merge is specifically intended to restore and enhance instruction-following capabilities, making the model suitable for tasks requiring precise adherence to prompts.
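TIES merges like this one are typically built with mergekit. The exact configuration used for this model is not published, so the following is only a hypothetical sketch of what such a merge might look like; the repository IDs, `density`, `weight`, and `dtype` values here are illustrative assumptions, not the author's actual settings:

```yaml
# Hypothetical mergekit TIES configuration (illustrative values only)
models:
  - model: arcee-ai/Llama-3.1-SuperNova-Lite
    parameters:
      density: 0.5   # assumed: fraction of task-vector parameters kept after trimming
      weight: 1.0    # assumed: scaling applied to the retained task vector
merge_method: ties
base_model: meta-llama/Llama-3.1-8B   # assumed base repository ID
dtype: bfloat16
```

In a TIES merge, the fine-tuned model's parameter deltas relative to the base are trimmed to the densest fraction, sign conflicts are resolved by majority, and the surviving deltas are added back onto the base weights, which is why this approach can recover base-model behavior while keeping the fine-tune's instruction-following gains.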
