paulml/NeuralOmniWestBeaglake-7B
Text generation | Concurrency cost: 1 | Model size: 7B | Quant: FP8 | Context length: 4k | Published: Feb 5, 2024 | License: cc-by-nc-4.0 | Architecture: Transformer

NeuralOmniWestBeaglake-7B by paulml is a 7-billion-parameter language model with a 4096-token context length, created by merging several specialized models using the DARE TIES method. Built on the Mistral-7B-v0.1 base, it combines shadowml/WestBeagle-7B, shadowml/Beaglake-7B, and mlabonne/NeuralOmniBeagle-7B. DARE TIES works on the delta weights between each donor model and the base: it randomly drops a fraction of each delta and rescales the rest (DARE), then resolves sign conflicts between models before summing (TIES), which helps preserve each donor's capabilities in the merged model. The result is intended as a versatile foundation for general natural language processing tasks.
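Merges like this are typically produced with mergekit. The exact configuration for this model was not published here, so the following is an illustrative sketch of what a DARE TIES merge over these three donors on a Mistral-7B base could look like; the `density` and `weight` values are assumptions, not the author's actual settings.

```yaml
# Hypothetical mergekit config for a DARE TIES merge (values are illustrative).
models:
  - model: shadowml/WestBeagle-7B
    parameters:
      density: 0.55   # fraction of delta weights kept after DARE dropout (assumed)
      weight: 0.35    # contribution of this donor to the merge (assumed)
  - model: shadowml/Beaglake-7B
    parameters:
      density: 0.55
      weight: 0.30
  - model: mlabonne/NeuralOmniBeagle-7B
    parameters:
      density: 0.55
      weight: 0.35
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
dtype: bfloat16
```

With mergekit installed, a config like this would be run as `mergekit-yaml config.yml ./output-model`, producing merged weights in `./output-model`.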
