mlfoundations-dev/oh-dcft-v3.1-claude-3-5-sonnet-20241022
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Jan 14, 2025 · License: llama3.1 · Architecture: Transformer
The mlfoundations-dev/oh-dcft-v3.1-claude-3-5-sonnet-20241022 model is an 8 billion parameter language model fine-tuned from Meta-Llama-3.1-8B on the mlfoundations-dev/oh-dcft-v3.1-claude-3-5-sonnet-20241022 dataset, from which it takes its specialization. With a context length of 32768 tokens, it suits tasks that benefit from extensive contextual understanding. Its primary differentiation is this fine-tuning, which adapts the base Llama 3.1 architecture to applications related to its training dataset.
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model, covering the following sampler settings: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p.
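These sampler settings are typically passed as fields in the request body of an OpenAI-compatible completions call, which is the interface style Featherless exposes. A minimal sketch in Python of building such a payload; the numeric values and the endpoint URL are illustrative assumptions, not the recorded top configurations:

```python
# Build a chat-completion request payload for an OpenAI-compatible
# endpoint. All sampler values below are illustrative placeholders,
# not the actual top configurations used by Featherless users.
payload = {
    "model": "mlfoundations-dev/oh-dcft-v3.1-claude-3-5-sonnet-20241022",
    "messages": [{"role": "user", "content": "Hello!"}],
    "temperature": 0.7,         # randomness of token sampling
    "top_p": 0.9,               # nucleus sampling probability mass cutoff
    "top_k": 40,                # restrict sampling to the k most likely tokens
    "frequency_penalty": 0.0,   # penalize tokens proportionally to their frequency
    "presence_penalty": 0.0,    # penalize tokens that have already appeared
    "repetition_penalty": 1.1,  # multiplicative penalty on repeated tokens
    "min_p": 0.05,              # drop tokens below this probability threshold
}

# Sending it would look roughly like this (requires an API key;
# the URL is an assumption based on the OpenAI-compatible convention):
# import requests
# resp = requests.post(
#     "https://api.featherless.ai/v1/chat/completions",
#     headers={"Authorization": f"Bearer {API_KEY}"},
#     json=payload,
# )
```

Note that `top_k`, `repetition_penalty`, and `min_p` are not part of the core OpenAI parameter set; they are extensions commonly accepted by open-model serving stacks.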