Jamet-8B-L3-MK.V-Blackroot Overview
Jamet-8B-L3-MK.V-Blackroot is an 8-billion-parameter model developed by Hastagaras, refined over multiple iterations to sharpen its narrative capabilities. It is built on a base model derived from the UltimateAnjir model, known for its creative and positive tendencies, and subsequently merged with Llama 3 Instruct.
Key Development & Features
- Base Model Evolution: Started from a variant of the UltimateAnjir model, sharing its creative, cheerful, and positive characteristics.
- DPO for Tone Control: Uses DPO (Direct Preference Optimization) to curb excessive cheerfulness, emojis, and positivity, addressing feedback on the earlier Jamet MK.II versions. This involved training a QLoRA adapter on a custom preference dataset derived from Alpaca prompts (a rough training sketch follows this list).
- Abomination LoRA Integration: Incorporates the Abomination LoRA from Blackroot to further shape its generation style.
- Anjir Adapter for Formatting: Applies the Anjir Adapter (rank-64 version with reduced alpha) to improve formatting consistency, building on feedback that the Anjir model formatted output better than Halu Blackroot (an adapter-merge sketch also follows this list).
- Final Merge: Merged with the Anjrit model, chosen specifically for its "no refusals" storytelling ability, despite that model's limitations with longer contexts.
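
The card does not include the training code for the DPO pass; below is a minimal sketch of how DPO over a QLoRA adapter could be set up with Hugging Face TRL and PEFT. The base model ID, dataset file, column layout, and all hyperparameters are assumptions for illustration, not the author's actual recipe.

```python
# Hypothetical sketch: DPO on top of a 4-bit (QLoRA) Llama 3 base using TRL + PEFT.
# Model ID, dataset, and hyperparameters are illustrative assumptions only.
import torch
from datasets import load_dataset
from peft import LoraConfig
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from trl import DPOConfig, DPOTrainer

base_id = "meta-llama/Meta-Llama-3-8B-Instruct"  # stand-in for the actual Jamet base
tokenizer = AutoTokenizer.from_pretrained(base_id)

model = AutoModelForCausalLM.from_pretrained(
    base_id,
    quantization_config=BitsAndBytesConfig(
        load_in_4bit=True, bnb_4bit_compute_dtype=torch.bfloat16
    ),
    device_map="auto",
)

# Preference pairs: "prompt", "chosen" (neutral tone), "rejected" (overly cheerful / emoji-heavy).
dataset = load_dataset("json", data_files="tone_preferences.jsonl", split="train")

peft_config = LoraConfig(
    r=64,
    lora_alpha=64,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

trainer = DPOTrainer(
    model=model,
    ref_model=None,  # with a PEFT adapter, TRL uses the frozen base model as the reference
    args=DPOConfig(output_dir="jamet-dpo", per_device_train_batch_size=1, num_train_epochs=1),
    train_dataset=dataset,
    processing_class=tokenizer,  # named `tokenizer=` in older TRL releases
    peft_config=peft_config,
)
trainer.train()
```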
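
Similarly, the card does not show how an existing adapter would be applied at reduced alpha; a plausible PEFT sketch is below. The repository IDs and the specific alpha value are placeholders.

```python
# Hypothetical sketch: load a LoRA adapter, lower its alpha, and merge it into the base weights.
# Repository IDs and the reduced alpha value are placeholders, not the author's exact settings.
import torch
from peft import PeftConfig, PeftModel
from transformers import AutoModelForCausalLM

base_id = "meta-llama/Meta-Llama-3-8B-Instruct"  # stand-in for the intermediate Jamet checkpoint
adapter_id = "some-user/anjir-adapter-r64"       # placeholder for the rank-64 Anjir Adapter

base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16, device_map="auto")

# Lowering lora_alpha relative to the trained value weakens the adapter's contribution,
# because the LoRA update is scaled by alpha / r before being added to the base weights.
adapter_config = PeftConfig.from_pretrained(adapter_id)
adapter_config.lora_alpha = 32  # e.g. half the rank, instead of the value it was trained with

model = PeftModel.from_pretrained(base, adapter_id, config=adapter_config)
merged = model.merge_and_unload()  # bake the scaled adapter into the base weights
merged.save_pretrained("jamet-with-anjir-adapter")
```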
Intended Use & Performance Notes
- Primary Use Case: This model is explicitly designed and optimized for Roleplay (RP) and Storytelling.
- Temperature Recommendations: For coherent output, use a temperature in the 0.85-1.05 range; higher values may produce incoherent responses (see the inference sketch below).
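
As a reference point, a minimal transformers sampling setup inside the recommended temperature band might look like the sketch below; the repository ID and every setting other than temperature are assumptions, not documented values.

```python
# Minimal sampling sketch within the recommended 0.85-1.05 temperature range.
# The repository ID and all settings other than temperature are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Hastagaras/Jamet-8B-L3-MK.V-Blackroot"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

messages = [
    {"role": "system", "content": "You are the narrator of an interactive story."},
    {"role": "user", "content": "Open the first scene in a rain-soaked harbor town."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(
    inputs,
    max_new_tokens=512,
    do_sample=True,
    temperature=0.95,  # stay within the recommended 0.85-1.05 band
    top_p=0.95,
)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```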