Gille/StrangeMerges_7-7B-slerp
Text Generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 28, 2024 · License: apache-2.0 · Architecture: Transformer (open weights)
StrangeMerges_7-7B-slerp is a 7 billion parameter language model created by Gille, produced by a SLERP (spherical linear interpolation) merge of Gille/StrangeMerges_6-7B-dare_ties and berkeley-nest/Starling-LM-7B-alpha. Rather than averaging weights linearly, SLERP interpolates along the arc between the two parent models' parameter vectors, aiming to combine the strengths of both constituents. The model is intended for general text generation tasks, inheriting characteristics from its merged predecessors, and supports a context length of 4096 tokens.
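As a rough illustration of the SLERP idea (not the exact recipe used for this merge, which is typically driven by a tool such as mergekit with per-layer interpolation factors), the core operation on a pair of weight tensors can be sketched as:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values move
    along the arc between the two (flattened) parameter vectors.
    """
    v0f, v1f = v0.ravel(), v1.ravel()
    # Cosine of the angle between the two parameter vectors
    dot = np.dot(v0f, v1f) / (np.linalg.norm(v0f) * np.linalg.norm(v1f) + eps)
    dot = np.clip(dot, -1.0, 1.0)
    omega = np.arccos(dot)
    if abs(np.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    s0 = np.sin((1 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1

# Toy example on 2-D vectors: the midpoint lies on the unit arc
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)
```

In a real merge this interpolation is applied tensor-by-tensor across both models' checkpoints; the `t` values per layer group are configuration choices made by the model author and are not documented here.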