Kukedlc/Fasciculus-Arcuatus-7B-slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 29, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Kukedlc/Fasciculus-Arcuatus-7B-slerp is a 7-billion-parameter model created by Kukedlc by merging macadeliccc/MonarchLake-7B and Kukedlc/NeoCortex-7B-slerp with spherical linear interpolation (slerp). It performs well on general-reasoning benchmarks such as the AI2 Reasoning Challenge (ARC), HellaSwag, and MMLU. With a 4096-token context window, it is suitable for tasks requiring robust language understanding and generation.
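The card does not publish the exact merge recipe, but a slerp merge interpolates each pair of corresponding weight tensors along the great circle between them rather than averaging linearly, which tends to preserve the geometry of both parent models. A minimal sketch of the interpolation itself, on plain Python lists standing in for flattened weight tensors (the function name and the linear fallback threshold are illustrative, not taken from any specific merge tool):

```python
import math

def slerp(v0, v1, t, eps=1e-8):
    """Spherical linear interpolation between two flat weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two directions instead of the straight chord.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # Cosine of the angle between the vectors, clamped for safety.
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))
    omega = math.acos(dot)
    if omega < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

For example, interpolating halfway between the orthogonal unit vectors `[1, 0]` and `[0, 1]` yields `[0.707..., 0.707...]`, a unit vector on the arc between them, whereas a linear average would give `[0.5, 0.5]` and shrink the norm. In an actual merge this would be applied tensor by tensor across both parent checkpoints.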
