Brouz/REMM-PYG-0.65-SLERP
Text generation
Concurrency cost: 1 | Model size: 13B | Quant: FP8 | Context length: 4k | License: llama2 | Architecture: Transformer | Open weights | Cold

Brouz/REMM-PYG-0.65-SLERP is a 13-billion-parameter language model created by Brouz, formed by merging ReMM-SLERP-L2-13B with Pygmalion-2-13B using spherical linear interpolation (SLERP) at a 0.65 merge weight. The model targets general language generation tasks, combining the strengths of its constituent models. It offers a 4096-token context length, making it suitable for applications with moderate context requirements.
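The SLERP merge described above can be sketched as follows. This is a minimal illustration, not the actual merge pipeline: real merges (e.g. with a tool like mergekit) operate per layer on full model checkpoints and handle tokenizer alignment, whereas this sketch interpolates two flattened weight vectors with the 0.65 weight mentioned in the card.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the arc
    between the two directions rather than the straight line between them.
    """
    # Angle between the two weight directions
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    omega = np.arccos(dot)
    so = np.sin(omega)
    if abs(so) < eps:
        # Nearly collinear vectors: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# Toy example: two orthogonal "weight tensors", merged 0.65 toward the second
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
merged = slerp(0.65, a, b)
```

Unlike plain averaging, SLERP preserves the norm when interpolating unit vectors, which is one motivation for using it when blending model weights.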
