leveldevai/TurdusBeagle-7B
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 18, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights · Concurrency cost: 1

TurdusBeagle-7B is a 7-billion-parameter language model published by leveldevai, produced by a slerp merge of udkai/Turdus and mlabonne/NeuralBeagle14-7B. The merge applies a layer-wise interpolation schedule, with separate weighting curves for the self_attn and mlp components, to combine the strengths of both parent models. It targets general text generation and supports a 4096-token context length.
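Slerp-based merges of this kind interpolate each pair of corresponding weight tensors along a great circle rather than a straight line, which preserves the norm geometry of the parameters. Below is a minimal, illustrative sketch of the spherical interpolation itself (using NumPy); the actual merge would apply per-layer `t` schedules to the `self_attn` and `mlp` tensors, and those exact schedules are not reproduced here.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow
    the arc between the two (normalized) directions.
    """
    v0f = v0.flatten().astype(np.float64)
    v1f = v1.flatten().astype(np.float64)
    n0, n1 = np.linalg.norm(v0f), np.linalg.norm(v1f)
    # Angle between the two tensors, treated as flat vectors
    dot = np.clip(np.dot(v0f / n0, v1f / n1), -1.0, 1.0)
    omega = np.arccos(dot)
    if np.abs(np.sin(omega)) < eps:
        # Nearly colinear tensors: fall back to plain linear interpolation
        return ((1 - t) * v0f + t * v1f).reshape(v0.shape)
    s0 = np.sin((1 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return (s0 * v0f + s1 * v1f).reshape(v0.shape)
```

In a full merge pipeline, this function would be applied tensor-by-tensor across the two checkpoints, with `t` varied per layer and per component to realize the layer-wise strategy described above.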
