leveldevai/MBA-7B
- Task: Text Generation
- Concurrency Cost: 1
- Model Size: 7B
- Quant: FP8
- Ctx Length: 4k
- Published: Jan 19, 2024
- License: apache-2.0
- Architecture: Transformer
- Open Weights · Cold
MBA-7B is a 7-billion-parameter language model from leveldevai, created by merging Azazelle/Argetsu and leveldevai/MarcBeagle-7B using the slerp (spherical linear interpolation) method. It is configured with a 4096-token context length and is intended for general language generation tasks, combining the strengths of its constituent models. Because its weights are produced by interpolating the layers of two existing 7B models rather than by additional training, it inherits merged behavior while keeping the inference cost of a single 7B model.
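As a rough illustration of the merge technique named above, the sketch below shows what slerp means when applied to a pair of weight tensors: instead of averaging them linearly, it interpolates along the arc between them on the unit hypersphere, which better preserves the magnitude structure of the weights. This is a minimal NumPy sketch of the mathematical operation, not the actual merge tooling or parameters used to build MBA-7B; the function name and fallback threshold are our own.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the flattened tensors rather than the straight line.
    """
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Angle between the two tensors, treated as high-dimensional vectors
    dot = np.dot(v0f / np.linalg.norm(v0f), v1f / np.linalg.norm(v1f))
    dot = np.clip(dot, -1.0, 1.0)
    omega = np.arccos(dot)
    if omega < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1
    so = np.sin(omega)
    out = (np.sin((1.0 - t) * omega) / so) * v0f + (np.sin(t * omega) / so) * v1f
    return out.reshape(v0.shape).astype(v0.dtype)

# Toy example: blend two small "layers" halfway
a = np.array([[1.0, 0.0], [0.0, 1.0]])
b = np.array([[0.0, 1.0], [1.0, 0.0]])
merged = slerp(0.5, a, b)
```

In a real model merge this interpolation is applied per-layer (often with a different `t` per layer or per parameter group) across every matching tensor of the two source checkpoints.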