mayacinka/yam-sam-7B
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Mar 2, 2024
License: apache-2.0
Architecture: Transformer
Open weights: yes
yam-sam-7B is a 7 billion parameter language model created by mayacinka by merging cognitivecomputations/samantha-mistral-7b, CorticalStack/shadow-clown-7B-dare, and yam-peleg/Experiment26-7B with the dare_ties merge method. The model scores an average of 74.58 on the Open LLM Leaderboard, indicating strong performance across reasoning and language-understanding benchmarks. With a 4096-token context length, it is suitable for general-purpose text generation and understanding tasks.
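Merges like this are commonly produced with the mergekit tool, which takes a YAML recipe naming the source models and the merge method. The card does not publish the recipe used here, so the sketch below is purely illustrative: the density and weight values, the choice of base model, and the dtype are all assumptions, not the actual settings behind yam-sam-7B.

```yaml
# Hypothetical mergekit recipe for a dare_ties merge of the three
# source models. All parameter values below are placeholders; the
# real configuration used by mayacinka is not published.
models:
  - model: cognitivecomputations/samantha-mistral-7b
    parameters:
      density: 0.5   # assumed: fraction of delta weights kept
      weight: 0.3    # assumed: relative contribution to the merge
  - model: CorticalStack/shadow-clown-7B-dare
    parameters:
      density: 0.5   # assumed
      weight: 0.3    # assumed
  - model: yam-peleg/Experiment26-7B
    parameters:
      density: 0.5   # assumed
      weight: 0.4    # assumed
merge_method: dare_ties
base_model: yam-peleg/Experiment26-7B   # assumed base; not stated in the card
dtype: bfloat16                          # assumed working precision
```

With a file like this saved as, e.g., `merge.yaml`, mergekit's command-line entry point (`mergekit-yaml merge.yaml ./output-dir`) would produce the merged checkpoint; dare_ties randomly drops a fraction of each model's delta weights (controlled by `density`), rescales the rest, and resolves sign conflicts between models before summing.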