Enoch/llama-7b-hf
Text Generation

- Model Size: 7B
- Quantization: FP8
- Context Length: 4k
- Concurrency Cost: 1
- Published: Apr 13, 2023
- License: other
- Architecture: Transformer

Enoch/llama-7b-hf is a 7 billion parameter auto-regressive language model based on the transformer architecture, developed by the FAIR team of Meta AI. It is a LLaMA-7B variant converted for HuggingFace compatibility and is intended primarily for research on large language models: exploring applications such as question answering and natural language understanding, and evaluating model capabilities and limitations.


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model are built from the following sampler settings:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
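The parameters listed above shape how the next token is drawn from the model's output logits. As a rough illustration only (not Featherless's or HuggingFace's implementation), the sketch below shows how temperature, top_k, and top_p (nucleus) filtering interact; the function name and defaults are hypothetical.

```python
import numpy as np

def sample_filter(logits, temperature=1.0, top_k=0, top_p=1.0):
    """Apply temperature scaling, top-k, then top-p filtering to raw
    logits and return the resulting next-token probability distribution."""
    logits = np.asarray(logits, dtype=np.float64) / temperature
    if top_k > 0:
        # Mask every logit below the k-th largest.
        kth = np.sort(logits)[-top_k]
        logits = np.where(logits < kth, -np.inf, logits)
    # Softmax (shifted by the max for numerical stability).
    probs = np.exp(logits - np.max(logits))
    probs /= probs.sum()
    if top_p < 1.0:
        # Keep the smallest set of tokens whose cumulative
        # probability reaches top_p; zero out the rest.
        order = np.argsort(probs)[::-1]
        cum = np.cumsum(probs[order])
        cutoff = np.searchsorted(cum, top_p) + 1
        mask = np.zeros_like(probs, dtype=bool)
        mask[order[:cutoff]] = True
        probs = np.where(mask, probs, 0.0)
        probs /= probs.sum()
    return probs

# With top_k=2, only the two highest-logit tokens keep nonzero mass.
probs = sample_filter([2.0, 1.0, 0.5, -1.0], top_k=2)
```

Settings like repetition_penalty, frequency_penalty, presence_penalty, and min_p modify the logits in further ways (penalizing already-seen tokens, or dropping tokens below a fraction of the top probability) before sampling.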