nayohan/llama3-8b-it-general-trc313k-enko-8k
TEXT GENERATION
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 8k
Published: Aug 2, 2024
Architecture: Transformer
Status: Warm
The nayohan/llama3-8b-it-general-trc313k-enko-8k model is an 8-billion-parameter instruction-tuned language model with an 8192-token context length. Its name indicates a fine-tuned variant of Llama 3 8B Instruct and suggests an English–Korean (enko) orientation, but the current model card does not document its training data, base-model details, or primary differentiators. It is intended for general language generation tasks; any unique strengths or specific optimizations are not detailed.
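For orientation, below is a minimal sketch of querying this model through an OpenAI-compatible chat completions endpoint. The base URL and the FEATHERLESS_API_KEY environment variable are assumptions, not details from this page; only the model identifier is taken from the listing.

```python
# Minimal sketch: query the model via an OpenAI-compatible endpoint.
# Base URL and API-key variable are assumptions; substitute whatever
# endpoint and credentials your provider documents.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed endpoint
    api_key=os.environ["FEATHERLESS_API_KEY"],
)

response = client.chat.completions.create(
    model="nayohan/llama3-8b-it-general-trc313k-enko-8k",
    messages=[
        {"role": "user", "content": "Summarize the benefits of an 8k context window in two sentences."}
    ],
    max_tokens=256,  # stays well inside the 8192-token context length
)
print(response.choices[0].message.content)
```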
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model. Each configuration sets the sampler parameters listed below; an illustrative request that passes these parameters is sketched after the list.
- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
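The sketch below shows how these parameters map onto a request, reusing the client from the earlier example. The values are illustrative placeholders, not the actual top-3 configurations from this page. temperature, top_p, frequency_penalty, and presence_penalty are standard OpenAI-compatible fields; top_k, repetition_penalty, and min_p are passed through extra_body, which many OpenAI-compatible servers accept (an assumption here).

```python
# Illustrative sampler values only -- not the page's actual top-3 configs.
response = client.chat.completions.create(
    model="nayohan/llama3-8b-it-general-trc313k-enko-8k",
    messages=[{"role": "user", "content": "Write a short product description."}],
    temperature=0.7,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    extra_body={
        # Non-standard fields forwarded as extra JSON; support depends on the server.
        "top_k": 40,
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    },
)
print(response.choices[0].message.content)
```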