RWKV/v5-EagleX-v2-7B-HF
Text Generation · Open Weights
- Model Size: 7B
- Quant: FP8
- Context Length: 16k
- Concurrency Cost: 1
- Architecture: Transformer
- License: apache-2.0
- Published: Apr 17, 2024

RWKV/v5-EagleX-v2-7B-HF is a 7.52-billion-parameter causal language model from RWKV, trained on 2.25 trillion tokens. It is the Hugging Face Transformers implementation of EagleX 7B v2, notable for improved performance over previous EagleX versions and general language understanding competitive with other 7B models. The model is designed for general text generation and understanding tasks, offering a strong foundation for a variety of applications.
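
Since this is the Transformers implementation, the model can be loaded with the standard Auto classes. The sketch below is a minimal example, not an official snippet; in particular, `trust_remote_code=True` is an assumption that may be needed if your installed `transformers` version does not ship the RWKV v5 architecture.

```python
# Minimal sketch: loading RWKV/v5-EagleX-v2-7B-HF with Hugging Face Transformers.
# trust_remote_code=True is an assumption; it is only needed if the RWKV v5
# classes are not built into your transformers version.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RWKV/v5-EagleX-v2-7B-HF"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "The RWKV architecture is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```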


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model span the following sampler settings; a hedged request example follows the list.

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
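
These settings map onto an OpenAI-compatible completion request. The sketch below is illustrative only: the `base_url` for Featherless's API, the pass-through of non-standard samplers via `extra_body`, and all numeric values are assumptions or placeholders, not the actual popular configurations.

```python
# Hedged sketch: passing the sampler settings above through an
# OpenAI-compatible client. The base_url, extra_body pass-through,
# and every numeric value below are assumptions/placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="RWKV/v5-EagleX-v2-7B-HF",
    messages=[{"role": "user", "content": "Summarize the RWKV architecture."}],
    temperature=0.8,        # placeholder value
    top_p=0.95,             # placeholder value
    frequency_penalty=0.0,  # placeholder value
    presence_penalty=0.0,   # placeholder value
    extra_body={
        # Non-standard samplers, forwarded if the server supports them.
        "top_k": 40,                # placeholder value
        "repetition_penalty": 1.1,  # placeholder value
        "min_p": 0.05,              # placeholder value
    },
)
print(response.choices[0].message.content)
```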