arcee-ai/Patent-Instruct-Internlm2-7B-Ties
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Apr 5, 2024 · Architecture: Transformer
arcee-ai/Patent-Instruct-Internlm2-7B-Ties is a 7 billion parameter language model merged using the TIES method, based on NousResearch/Llama-2-7b-hf. It integrates capabilities from arcee-ai/Patent-Instruct-7b and chargoddard/internlm2-7b-llama, making it specialized for patent-related instruction following. This model is designed for tasks requiring understanding and generation of patent-specific language and concepts.
Overview
This model, arcee-ai/Patent-Instruct-Internlm2-7B-Ties, is a 7 billion parameter language model created through a merge of pre-trained models using the TIES merge method.
Key Characteristics
- Base Model: Built upon NousResearch/Llama-2-7b-hf.
- Merged Components: Integrates specific capabilities from:
  - arcee-ai/Patent-Instruct-7b
  - chargoddard/internlm2-7b-llama
- Merge Method: Utilizes TIES (TrIm, Elect Sign & Merge), a merge technique that trims low-magnitude parameter changes and resolves sign conflicts, allowing the strengths of multiple fine-tuned models to be combined with less interference.
- Configuration: The merge was performed with specific density and weight parameters for the contributing models, and with `int8_mask` enabled for efficiency.
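A merge like this is typically expressed as a mergekit recipe. The following is a minimal sketch of what the configuration could look like; the `density` and `weight` values are illustrative placeholders, since the model card does not state the actual parameters used:

```yaml
# Hypothetical mergekit TIES recipe (values are placeholders, not the real config)
models:
  - model: arcee-ai/Patent-Instruct-7b
    parameters:
      density: 0.5   # fraction of parameter deltas kept after trimming
      weight: 0.5    # contribution of this model to the merge
  - model: chargoddard/internlm2-7b-llama
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: NousResearch/Llama-2-7b-hf
parameters:
  int8_mask: true    # as noted above, enabled for efficiency
dtype: float16
```

With mergekit installed, such a recipe would be run with `mergekit-yaml config.yml ./output-model`.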
Ideal Use Cases
- Patent-Specific Instruction Following: Excels in tasks that require understanding and responding to instructions related to patent documents and intellectual property.
- Specialized Language Generation: Suitable for generating text that adheres to the style and terminology commonly found in patents.
- Research and Development: Can be used in applications requiring analysis or synthesis of patent information.
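As a usage sketch for the scenarios above, the model can be loaded through the Hugging Face `transformers` API; the prompt shown is an illustrative patent-style instruction, not an example from the model card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/Patent-Instruct-Internlm2-7B-Ties"

# Load tokenizer and model weights; device_map="auto" places the model
# on a GPU when one is available (requires the accelerate package).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Illustrative patent-related instruction.
prompt = "Summarize the key claims of a patent describing a battery thermal management system."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```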