muse-bench/MUSE-books_target
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: May 12, 2024 · Architecture: Transformer · Warm

muse-bench/MUSE-books_target is a 7-billion-parameter language model with a 4,096-token context length. It is a Hugging Face Transformers model that was automatically pushed to the Hub, but specific details about its architecture, training, and primary differentiators are currently marked as "More Information Needed" in its model card. Its intended use cases and unique capabilities are not yet specified, so the card serves as a placeholder pending further development or documentation.
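Since the card confirms this is a standard Transformers checkpoint, it should load with the usual `AutoModelForCausalLM` API. The sketch below is a hypothetical example, not from the model card; it assumes `transformers` and `torch` are installed, and note that a 7B model needs roughly 14 GB of memory in fp16.

```python
# Hypothetical sketch of loading muse-bench/MUSE-books_target with
# Hugging Face Transformers; values below come from the listing above.
MODEL_ID = "muse-bench/MUSE-books_target"
MAX_CONTEXT = 4096  # context length stated in the listing

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Run a single greedy generation pass against the checkpoint."""
    # Imported lazily so the constants above are usable without torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    inputs = tokenizer(prompt, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

Because the card gives no chat template or prompt format, plain-text completion as above is the safest assumption.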


Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model.

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
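The parameters above are standard sampling controls. The concrete values in the Featherless tabs are not reproduced here, so the sketch below uses illustrative defaults only; the `sampler_config` name and every value are assumptions, not the actual user configurations.

```python
# Illustrative sampler configuration covering the parameters listed above.
# The values are placeholders, NOT the actual Featherless user settings.
sampler_config = {
    "temperature": 0.7,         # softmax temperature; lower = more deterministic
    "top_p": 0.9,               # nucleus sampling: keep the smallest token set with cumulative prob >= 0.9
    "top_k": 40,                # keep only the 40 most likely tokens
    "frequency_penalty": 0.0,   # penalty proportional to a token's count so far
    "presence_penalty": 0.0,    # flat penalty once a token has appeared at all
    "repetition_penalty": 1.1,  # multiplicative penalty on repeated tokens
    "min_p": 0.05,              # drop tokens below 5% of the top token's probability
}
```

A dictionary like this can typically be passed as keyword arguments to an OpenAI-compatible completions endpoint or to `model.generate` (where Transformers supports the given key).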