alnrg2arg/test2_3
Text Generation · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Concurrency Cost: 1 · Published: Jan 17, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

alnrg2arg/test2_3 is a 7-billion-parameter language model created by alnrg2arg, produced by merging mlabonne/NeuralBeagle14-7B and abideen/NexoNimbus-7B with the SLERP (spherical linear interpolation) method. The merge combines the strengths of both base models into a general-purpose model for language understanding and generation across a broad range of NLP tasks, and it retains the 4096-token context length of its parents.
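The core idea behind a SLERP merge is to interpolate each pair of corresponding weight tensors along the arc between them rather than along a straight line, which tends to preserve the geometry of the weights better than plain averaging. A minimal sketch of the interpolation itself (the function name and the linear-interpolation fallback for near-parallel tensors are illustrative, not taken from the actual merge tooling used for this model):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t is the mixing factor: 0 returns v0, 1 returns v1.
    """
    f0 = v0.flatten()
    f1 = v1.flatten()
    # Angle between the tensors, computed on normalized copies
    n0 = f0 / (np.linalg.norm(f0) + eps)
    n1 = f1 / (np.linalg.norm(f1) + eps)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: fall back to linear interpolation
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

In a full merge, this interpolation is applied layer by layer across both checkpoints, often with a different `t` per layer group.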
