NeuralNovel/Tiger-7B-v0.1
Task: Text generation
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Context Length: 4k
Published: Jan 18, 2024
License: apache-2.0
Architecture: Transformer
NeuralNovel/Tiger-7B-v0.1 is a 7-billion-parameter language model created by NeuralNovel. It was produced by merging Mistral-7B-Instruct-v0.2-Neural-Story and Gecko-7B-v0.1-DPO with the SLERP (spherical linear interpolation) method, and is intended for general language tasks, combining the storytelling and instruction-following strengths of its parent models. It achieves an average score of 65.02 on the Open LLM Leaderboard and supports a context length of 4096 tokens.
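As a rough illustration of the SLERP merge method mentioned above, the sketch below interpolates between two flattened weight tensors along the arc of the hypersphere rather than along a straight line. This is a minimal, generic implementation for intuition only; the function name, the `eps` guard, and the linear-interpolation fallback for nearly parallel vectors are choices made here, not details taken from the actual merge of this model.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the
    great-circle arc between the (normalized) directions.
    """
    # Normalize copies to measure the angle between the two vectors.
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return (1.0 - t) * v0 + t * v1
    sin_theta = np.sin(theta)
    # Standard SLERP weighting applied to the original (unnormalized) tensors.
    return (np.sin((1.0 - t) * theta) / sin_theta) * v0 \
         + (np.sin(t * theta) / sin_theta) * v1
```

In a real merge this interpolation would be applied tensor-by-tensor across the two parent checkpoints, often with a per-layer interpolation schedule rather than a single global `t`.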