netcat420/MFANNv0.14.10

Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 8k · Architecture: Transformer · Status: Warm

netcat420/MFANNv0.14.10 is an 8-billion-parameter language model merged using the TIES method, based on MaziyarPanahi/Llama-3-8B-Instruct-v0.4. It integrates netcat420/MFANNv0.14 and netcat420/MFANNv0.13 and supports a context length of 8192 tokens. It is designed as a general-purpose instruction-tuned model, benefiting from the combined strengths of its merged components.


Model Overview

netcat420/MFANNv0.14.10 is an 8-billion-parameter language model created by merging pre-trained models with the mergekit tool. This iteration is built upon MaziyarPanahi/Llama-3-8B-Instruct-v0.4 as its base model, inheriting its foundational capabilities and instruction-following characteristics.

Merge Details

The model was constructed using the TIES merge method, a technique for combining multiple fine-tuned models that share a common base while resolving sign conflicts between their parameter updates. The specific models integrated into MFANNv0.14.10 are:

- netcat420/MFANNv0.14
- netcat420/MFANNv0.13

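As a rough illustration (not mergekit's actual implementation), the TIES procedure can be sketched in three steps: trim each model's delta from the base down to its highest-magnitude fraction (the density), elect a per-parameter sign by total magnitude, and average only the delta entries that agree with the elected sign.

```python
import numpy as np

def ties_merge(base, finetuned, density=0.5):
    """Sketch of TIES merging: trim, elect sign, disjoint merge.

    base      : 1-D array of base-model weights.
    finetuned : list of 1-D arrays, one per fine-tuned model.
    density   : fraction of each delta's entries to keep (by magnitude).
    """
    deltas = [ft - base for ft in finetuned]

    # 1. Trim: zero out all but the top-`density` fraction of each delta.
    trimmed = []
    for d in deltas:
        k = max(1, int(round(density * d.size)))
        threshold = np.sort(np.abs(d))[-k]
        trimmed.append(np.where(np.abs(d) >= threshold, d, 0.0))

    # 2. Elect sign: per parameter, the sign with the larger total mass wins.
    stacked = np.stack(trimmed)
    sign = np.sign(stacked.sum(axis=0))

    # 3. Disjoint merge: average only entries matching the elected sign.
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged_delta = (stacked * agree).sum(axis=0) / counts

    return base + merged_delta
```

With a zero base and two toy deltas, conflicting entries are resolved toward the dominant sign rather than averaged toward zero, which is the key difference from a plain weight average.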
The merge configuration utilized a density gradient for parameters and applied normalization with int8 masking, aiming for an optimized blend of the constituent models. The resulting model operates with a context length of 8192 tokens.
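A mergekit configuration for a TIES merge of this shape would look roughly as follows. The density and weight values below are illustrative placeholders, not the actual values used for MFANNv0.14.10, which are not shown on this page.

```yaml
# Hypothetical mergekit config; density/weight values are illustrative only.
models:
  - model: netcat420/MFANNv0.14
    parameters:
      density: [1.0, 0.7, 0.5]   # density gradient across layers
      weight: 1.0
  - model: netcat420/MFANNv0.13
    parameters:
      density: [1.0, 0.7, 0.5]
      weight: 1.0
merge_method: ties
base_model: MaziyarPanahi/Llama-3-8B-Instruct-v0.4
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
```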

Intended Use

As an instruction-tuned model, MFANNv0.14.10 is suitable for a variety of general-purpose natural language processing tasks where instruction following is key. Its merged architecture suggests a focus on combining and enhancing the capabilities present in its predecessor versions.
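Since the base model is a Llama-3 instruct variant, this merge presumably inherits the standard Llama-3 chat template. A minimal sketch of that prompt format is below; in practice, prefer the tokenizer's own `apply_chat_template` rather than hand-building strings.

```python
def llama3_prompt(user_message: str,
                  system_message: str = "You are a helpful assistant.") -> str:
    """Build a Llama-3-style instruct prompt (assumed inherited from the base model)."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system_message}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user_message}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )
```

The trailing assistant header leaves the prompt open for the model to complete.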

Popular Sampler Settings

The most popular Featherless user configurations for this model tune the following sampler parameters: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p. (The specific value combinations are not shown here.)