ChuckMcSneed/PMaxxxer-v1-70b
PMaxxxer-v1-70b by ChuckMcSneed is a 69-billion-parameter language model with a 32K context length, created by TIES-merging the WinterGoddess, Euryale, Xwin, and Dolphin models. Designed as a "meme model" to challenge a custom benchmark, it exhibits an "overly politically correct" persona and was deliberately built to perform poorly on poetry generation. Despite its experimental nature, it achieved a competitive average score of 72.41 on the Open LLM Leaderboard among 70B models.
PMaxxxer-v1-70b: A "Meme Model" Experiment
PMaxxxer-v1-70b is a 69-billion-parameter language model developed by ChuckMcSneed as part of his experimental "Benchbreakers" series. It was created with mergekit by TIES-merging four 70B base models: WinterGoddess, Euryale, Xwin, and Dolphin.
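A TIES merge of this kind is typically driven by a mergekit YAML recipe. The sketch below is illustrative only: the repository paths, base model, and density/weight values are assumptions, not the actual recipe used for PMaxxxer-v1-70b.

```yaml
# Hypothetical mergekit config for a TIES merge of four 70B models.
# All paths and parameter values below are illustrative assumptions.
merge_method: ties
base_model: meta-llama/Llama-2-70b-hf          # assumed common base
models:
  - model: Sao10K/WinterGoddess-1.4x-70B-L2    # path assumed
    parameters: {density: 0.5, weight: 0.25}
  - model: Sao10K/Euryale-1.3-L2-70B           # path assumed
    parameters: {density: 0.5, weight: 0.25}
  - model: Xwin-LM/Xwin-LM-70B-V0.1            # path assumed
    parameters: {density: 0.5, weight: 0.25}
  - model: cognitivecomputations/dolphin-2.2-70b  # path assumed
    parameters: {density: 0.5, weight: 0.25}
dtype: float16
```

TIES keeps only the highest-magnitude parameter deltas from each model (controlled by `density`) and resolves sign conflicts before averaging, which is why it is a common choice for merging several fine-tunes of the same base.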
Key Characteristics
- Experimental Design: Intended as a "meme model" to test the limits of a custom benchmark, NeoEvalPlusN.
- Unique Persona: Exhibits an "overly politically correct SJW university dropout" persona.
- Targeted Weakness: Explicitly designed to perform poorly on poetry generation tasks.
- Competitive Benchmarks: Achieved an average score of 72.41 on the Open LLM Leaderboard, placing it competitively among 70B models at the time of submission.
- Prompt Format: Uses the Alpaca instruction format.
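Since the card specifies the Alpaca instruction format, prompts can be assembled with the standard Alpaca template. The template wording below is the widely used Alpaca preamble; pairing it with this model follows the card, but the exact whitespace is an assumption.

```python
# Minimal sketch of building an Alpaca-format prompt for this model.
# The template text is the standard Alpaca preamble; exact whitespace
# is an assumption, so verify against the model card before use.

ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the Alpaca template."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

print(build_prompt("Write a haiku about merging models."))
```

The model's reply is then expected to follow the `### Response:` marker, so generation should stop at the next `### Instruction:` if one appears.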
When to Consider Using This Model
- Exploring Model Merging: Useful for researchers or enthusiasts interested in the effects of TIES-merging different base models.
- Persona-Driven Applications: Useful if you need a model with a distinct, intentionally "politically correct" and deliberately hobbled persona for creative or experimental applications.
- Benchmarking Experiments: For those interested in testing models against custom or niche benchmarks, particularly those designed to challenge specific model behaviors.