cookinai/OpenCM-14: A Finetuned 7B Model
This model, cookinai/OpenCM-14, is a 7 billion parameter language model derived from cookinai/CM-14. It has undergone finetuning using the teknium/openhermes dataset, primarily to resolve common issues related to stopping tokens and prompt template errors, especially within the ChatML preset.
Key Characteristics
- Base Model: Finetuned from cookinai/CM-14.
- Training Data: Utilizes the teknium/openhermes dataset for finetuning.
- Error Correction: Specifically targets and fixes stopping token errors and prompt template inconsistencies, which are often observed in heavily merged models.
- Context Length: Supports an 8192 token context window.
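Since the finetune targets ChatML prompt-template compatibility, prompts for this model follow the ChatML turn layout with `<|im_start|>` / `<|im_end|>` delimiters. A minimal sketch of assembling such a prompt by hand (the `build_chatml_prompt` helper is illustrative, not part of the model's tooling; confirm the exact template against the model's tokenizer config):

```python
def build_chatml_prompt(messages):
    """Assemble a ChatML-style prompt string from role/content messages.

    Illustrative helper: the <|im_start|>/<|im_end|> delimiters are the
    standard ChatML markers used by OpenHermes-style finetunes.
    """
    parts = []
    for msg in messages:
        # Each turn: <|im_start|>{role}\n{content}<|im_end|>\n
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    # Open the assistant turn so the model generates the reply
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
```

In practice, libraries such as Hugging Face `transformers` can build this string automatically from the model's bundled chat template via `tokenizer.apply_chat_template`.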
Use Cases
This model is suitable for general language generation tasks where reliable adherence to prompt templates and correct stopping token behavior are crucial. It is particularly useful for applications requiring stable ChatML preset interactions, offering a more robust alternative to models prone to such errors.
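Even with corrected stopping tokens, inference stacks commonly keep a client-side guard that truncates output at the first stop string. A minimal sketch of such a guard, assuming the ChatML `<|im_end|>` marker as the stop string (the function name is illustrative):

```python
def truncate_at_stop(text, stop="<|im_end|>"):
    """Cut generated text at the first occurrence of the stop string.

    Illustrative safeguard for models whose stopping-token behavior
    may be unreliable; returns the text unchanged if no stop is found.
    """
    idx = text.find(stop)
    return text if idx == -1 else text[:idx]
```

For example, `truncate_at_stop("Hi there<|im_end|>stray tokens")` returns `"Hi there"`.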