jueduardo/Meta-Llama-3-8B-livro-llm
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · Published: Nov 30, 2024 · License: llama3 · Architecture: Transformer
jueduardo/Meta-Llama-3-8B-livro-llm is an 8 billion parameter language model, converted to MLX format from Meta's Llama-3-8B-Instruct. This model is optimized for efficient deployment and inference within the MLX framework, making it suitable for applications requiring a powerful yet accessible instruction-tuned LLM. It maintains the core capabilities of the original Llama-3-8B-Instruct, offering strong performance across a variety of general-purpose language tasks.
Popular Sampler Settings
The three parameter combinations most used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
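The parameters above control how the model turns its raw output scores into a sampled token. As a minimal, framework-agnostic sketch of how temperature, top_k, top_p, and min_p interact (the function name, default values, and penalty omissions here are illustrative assumptions, not this model's recommended settings):

```python
import math
import random

def sample_next_token(logits, temperature=0.8, top_k=40, top_p=0.95,
                      min_p=0.05, rng=None):
    """Pick one token id from raw logits using common sampler settings.

    Parameter names mirror the sampler fields listed above; defaults
    are illustrative, not recommendations for this model.
    """
    rng = rng or random.Random(0)
    # Temperature: scale logits before softmax (lower = sharper distribution).
    scaled = [l / temperature for l in logits]
    # Numerically stable softmax.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = sorted(((i, e / total) for i, e in enumerate(exps)),
                   key=lambda p: p[1], reverse=True)
    # top_k: keep only the k most likely tokens.
    probs = probs[:top_k]
    # top_p (nucleus): keep the smallest prefix whose mass reaches top_p.
    kept, mass = [], 0.0
    for tok, p in probs:
        kept.append((tok, p))
        mass += p
        if mass >= top_p:
            break
    # min_p: drop tokens less likely than min_p times the best token.
    floor = min_p * kept[0][1]
    kept = [(t, p) for t, p in kept if p >= floor]
    # Renormalize over the surviving tokens and draw one.
    z = sum(p for _, p in kept)
    r, acc = rng.random() * z, 0.0
    for tok, p in kept:
        acc += p
        if acc >= r:
            return tok
    return kept[-1][0]
```

With a very low temperature the distribution collapses onto the highest-scoring token, so the call behaves like greedy decoding; raising temperature, or loosening top_k/top_p/min_p, lets lower-ranked tokens through. Frequency, presence, and repetition penalties (not shown) would adjust the logits before this step based on tokens already generated.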