Big-Randy is a 14-billion-parameter language model created by ForSureTesterSim, merged with the TIES method on top of huihui-ai/Huihui-Qwen3-14B-abliterated-v2. It combines NousResearch/Hermes-4-14B and HelpingAI/Dhanishtha-nsfw, pairing broad conversational ability with specialized content generation, likely including NSFW material from the Dhanishtha-nsfw component. It targets applications that need a 14B-parameter model with a 32K context window.
## Model Overview
Big-Randy is a 14-billion-parameter language model developed by ForSureTesterSim, built upon the huihui-ai/Huihui-Qwen3-14B-abliterated-v2 base model. It was created using the TIES merge method, which combines the strengths of multiple pre-trained models into a single, more capable model, allowing it to integrate distinct functionalities and knowledge from each of its constituent models.
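To make the merge method concrete: TIES (TrIm, Elect Sign, and Merge) trims each fine-tuned model's delta from the base, elects a majority sign per parameter, and averages only the values that agree with that sign. The following is a minimal NumPy sketch of that idea on flat parameter vectors; it is an illustration of the algorithm, not mergekit's actual implementation:

```python
import numpy as np

def ties_merge(base, task_models, densities):
    """Toy TIES merge over flat parameter vectors (illustration only)."""
    # 1. Task vectors: each fine-tuned model's delta from the shared base.
    deltas = [m - base for m in task_models]
    # 2. Trim: keep only the top-`density` fraction of entries by magnitude.
    trimmed = []
    for d, k in zip(deltas, densities):
        n_keep = max(1, int(round(k * d.size)))
        thresh = np.sort(np.abs(d))[-n_keep]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    stacked = np.stack(trimmed)
    # 3. Elect sign: per parameter, the sign with the larger total magnitude wins.
    elected = np.sign(stacked.sum(axis=0))
    # 4. Merge: average only the trimmed values that agree with the elected sign.
    agree = (np.sign(stacked) == elected) & (stacked != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged_delta = np.where(agree, stacked, 0.0).sum(axis=0) / counts
    return base + merged_delta
```

Trimming reduces interference between models, and sign election prevents parameters that point in opposite directions from cancelling each other out.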
## Merge Composition
The model's unique characteristics stem from its merge components:
- Base Model:
  - huihui-ai/Huihui-Qwen3-14B-abliterated-v2
- Merged Models:
  - NousResearch/Hermes-4-14B (contributing 60% density)
  - HelpingAI/Dhanishtha-nsfw (contributing 40% density)
This specific combination indicates an intent to blend general-purpose conversational abilities (from Hermes-4-14B) with specialized content generation, likely including NSFW topics (from Dhanishtha-nsfw). The model operates with a context length of 32,768 tokens and was configured with bfloat16 precision during the merge process.
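The composition above corresponds to a mergekit-style TIES configuration along the following lines. This is a reconstruction from the values stated in this card; the author's exact config file, including any per-model weight settings, is not published:

```yaml
# Hypothetical mergekit config reconstructed from the card; exact options may differ.
merge_method: ties
base_model: huihui-ai/Huihui-Qwen3-14B-abliterated-v2
models:
  - model: NousResearch/Hermes-4-14B
    parameters:
      density: 0.6
  - model: HelpingAI/Dhanishtha-nsfw
    parameters:
      density: 0.4
dtype: bfloat16
```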
## Potential Use Cases
- Diverse Conversational AI: Applications requiring a broad range of dialogue capabilities.
- Specialized Content Generation: Scenarios where the inclusion of NSFW content is relevant and intended.
- Research and Experimentation: Exploring the effects of model merging on specific content domains.