muse-bench/MUSE-news_target
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Architecture: Transformer · Published: May 31, 2024

MUSE-news_target is a 7-billion-parameter model developed by muse-bench. It is a Hugging Face Transformers model whose card was automatically generated when it was pushed to the Hub; specific details about its architecture, supported languages, and fine-tuning are not provided, and most sections of the card are marked "More Information Needed." Its primary use case and differentiators are therefore currently unspecified.


Overview

MUSE-news_target is a 7 billion parameter model shared on the Hugging Face Hub by muse-bench. This model card has been automatically generated for a 🤗 transformers model.
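Since the card identifies this as a 🤗 Transformers model with a text-generation tag, it can presumably be loaded with the standard Auto classes. A minimal sketch, untested against this specific checkpoint (the prompt and generation settings below are illustrative assumptions, not from the card):

```python
MODEL_ID = "muse-bench/MUSE-news_target"

def load_model(model_id: str = MODEL_ID):
    """Fetch the tokenizer and model weights (~7B parameters) from the Hub."""
    # Imports kept local so the module can be imported without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_model()
    # Hypothetical prompt; the card does not document an expected input format.
    inputs = tokenizer("In today's news,", return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=50)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Note that the card does not confirm a chat template or instruction format, so plain-text prompting as above is the safest default.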

Key Capabilities

Currently, the model card indicates that more information is needed regarding its specific capabilities, model type, and the language(s) it supports. Details on its development, funding, and any fine-tuning from a base model are also unspecified.

Uses and Limitations

The intended direct and downstream uses of MUSE-news_target are not yet defined. Similarly, information regarding potential biases, risks, limitations, and out-of-scope uses is marked as "More Information Needed." Users are advised to be aware of these unspecified risks and limitations.

Training and Evaluation

Details concerning the training data, training procedure (including preprocessing, hyperparameters, speeds, sizes, and times), and evaluation protocols are currently unavailable; so is information on testing data, factors, metrics, and results. Further technical specifications, such as the model architecture, training objective, and compute infrastructure, are also pending.