12thD/ko-Llama-3-8B-sft-v0.3
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · License: meta-llama-3-community-license-agreement · Architecture: Transformer · Warm
The 12thD/ko-Llama-3-8B-sft-v0.3 is an 8-billion-parameter language model based on the Llama 3 architecture. The "sft" suffix indicates supervised fine-tuning, and the "ko" prefix suggests a Korean-language focus. With a context length of 8,192 tokens, the model can handle tasks requiring substantial input. The listing does not specify a primary differentiator or target use cases, which suggests a general-purpose fine-tune.
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
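The sampler parameters above can be passed in an OpenAI-compatible chat-completions request, which is how Featherless-style inference APIs are typically called. The sketch below builds such a request payload; the specific values, the endpoint, and the handling of extra fields like top_k, repetition_penalty, and min_p (which are not in the standard OpenAI schema but are accepted by many providers) are assumptions, not settings taken from this page.

```python
import json

# Hypothetical sampler configuration for this model; every value below is an
# illustrative assumption, not a recommendation from the model card.
payload = {
    "model": "12thD/ko-Llama-3-8B-sft-v0.3",
    "messages": [{"role": "user", "content": "안녕하세요, 자기소개를 해주세요."}],
    "max_tokens": 256,            # keep well within the 8k context window
    # Standard OpenAI-schema sampler parameters:
    "temperature": 0.7,
    "top_p": 0.9,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    # Extra parameters some providers accept beyond the OpenAI schema:
    "top_k": 40,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

# Serialize for an HTTP POST body (sending it is left out of this sketch).
body = json.dumps(payload)
print(body)
```

Sending `body` to the provider's `/v1/chat/completions` endpoint with an API key would complete the request; unsupported extra fields are usually ignored or rejected depending on the provider.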