InnerI/I-Code-NousLlama7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 14, 2024 · License: llama2 · Architecture: Transformer · Open Weights · Cold

I-Code-NousLlama7B-slerp is a 7-billion-parameter language model from InnerI, created by merging NousResearch's CodeLlama-7b-hf and Llama-2-7b-chat-hf with a spherical linear interpolation (slerp) merge. The merge combines CodeLlama's code-generation capabilities with the conversational strengths of Llama-2-chat, making the model suitable for tasks that mix coding assistance with general chat. It supports a context length of 4096 tokens, offering balanced performance for mixed-domain applications.
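The slerp merge method interpolates between two models' weight tensors along the arc of a hypersphere rather than along a straight line, which tends to preserve the geometric properties of both parents better than plain averaging. The exact tooling and interpolation schedule used for this model are not stated here; the sketch below (plain-Python, with an illustrative `slerp` function and `eps` threshold) shows only the underlying formula applied to one flat weight vector per model.

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flat weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t blends along the arc
    between the two vectors on the hypersphere.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # Cosine of the angle between the two vectors, clamped for safety.
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))
    omega = math.acos(dot)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Midpoint between two orthogonal unit vectors lands on the 45-degree arc.
mid = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

In a real merge this interpolation is applied tensor-by-tensor across both checkpoints, often with different `t` values for different layer types.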
