Gille/StrangeMerges_13-7B-slerp
Text generation · 7B parameters · FP8 quantization · 4k context length · Published: Jan 31, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Gille/StrangeMerges_13-7B-slerp is a 7 billion parameter language model created by Gille through a slerp (spherical linear interpolation) merge of StrangeMerges_12-7B-slerp and speechless-zephyr-code-functionary-7b. The merge is intended to combine the strengths of its constituent models, offering balanced performance across benchmarks. It is suitable for general-purpose language tasks and supports a context length of 4096 tokens.
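To illustrate the merge technique, here is a minimal NumPy sketch of spherical linear interpolation applied to two weight tensors. This is an illustrative implementation of the slerp formula, not the actual tooling used to build this model; the function name and parameters are assumptions for the example.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two weight tensors v0 and v1.

    t=0 returns v0, t=1 returns v1; intermediate t values follow the
    arc between the two (normalized) directions rather than a straight line.
    """
    a = v0 / np.linalg.norm(v0)
    b = v1 / np.linalg.norm(v1)
    dot = np.clip(np.sum(a * b), -1.0, 1.0)
    theta = np.arccos(dot)  # angle between the two weight directions
    if np.abs(np.sin(theta)) < eps:
        # Nearly colinear tensors: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

In a real merge this interpolation is applied layer by layer across the two checkpoints, often with a per-layer interpolation schedule rather than a single global `t`.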
