jsfs11/SnorkelWestBeagle-DARETIES-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Jan 25, 2024 · License: apache-2.0 · Architecture: Transformer
jsfs11/SnorkelWestBeagle-DARETIES-7B is a 7-billion-parameter language model produced by merging Snorkel-Mistral-PairRM-DPO, WestLake-7B-v2, and NeuralBeagle14-7B with the DARE TIES method. Built on the Mistral-7B-v0.1 base, it supports a 4096-token context window. It achieves an average score of 73.03 on the Open LLM Leaderboard, indicating strong general reasoning and making it suitable for a wide range of general-purpose language tasks.
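DARE TIES merges of this kind are typically expressed as a mergekit configuration. The sketch below is illustrative only, not the author's actual recipe: the `density` and `weight` values are hypothetical, and the Hugging Face repository paths for the three source models are assumed.

```yaml
# Hypothetical mergekit config for a DARE TIES merge of the three
# source models onto the Mistral-7B-v0.1 base.
# density/weight values are illustrative, not the published settings.
models:
  - model: mistralai/Mistral-7B-v0.1
    # base model; contributes no task vector of its own
  - model: snorkelai/Snorkel-Mistral-PairRM-DPO  # assumed repo path
    parameters:
      density: 0.5   # fraction of delta weights kept (DARE drop rate = 1 - density)
      weight: 0.4    # relative contribution in the TIES merge
  - model: senseable/WestLake-7B-v2              # assumed repo path
    parameters:
      density: 0.5
      weight: 0.3
  - model: mlabonne/NeuralBeagle14-7B            # assumed repo path
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

In a DARE TIES merge, each fine-tuned model's delta from the base is randomly sparsified (DARE), then sign conflicts across models are resolved and the surviving deltas combined (TIES) before being added back to the base weights.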