NeverSleep/MiquMaid-v1-70B
MiquMaid-v1-70B: Experimental Fine-tune
NeverSleep/MiquMaid-v1-70B is an experimental 69 billion parameter language model developed by Undi and IkariDev. This model was created as a quick training run to evaluate the performance of Miqu-based fine-tuning approaches.
Key Characteristics
- Model Size: 69 billion parameters.
- Prompting Format: Utilizes the Alpaca prompting format.
- Availability: Provided in FP16 format.
- Context Length: Supports a 32,768-token context window.
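Since the model expects Alpaca-formatted prompts, a small helper can make that concrete. This is a minimal sketch assuming the standard Alpaca template (the preamble line and optional `### Input:` section); the function name `build_alpaca_prompt` is hypothetical, and the exact template accepted by this fine-tune should be verified against the model card.

```python
# Sketch of the standard Alpaca prompt template (an assumption; verify
# the exact template against the NeverSleep/MiquMaid-v1-70B model card).

def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Assemble a prompt in the conventional Alpaca layout."""
    header = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
    )
    if input_text:
        # Variant with an optional input/context section.
        return (
            f"{header}### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n### Response:\n"
        )
    return f"{header}### Instruction:\n{instruction}\n\n### Response:\n"

prompt = build_alpaca_prompt("Summarize the key traits of MiquMaid-v1-70B.")
```

The model's completion would then be generated after the trailing `### Response:` marker.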
Important Considerations
This model is explicitly labeled as highly experimental, so users should expect inconsistencies and unexpected behavior. It is intended for developers and researchers evaluating experimental fine-tunes, not for production applications.
Credits
The development of MiquMaid-v1-70B is credited to Undi and IkariDev.