feeltheAGI/Maverick-7B
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Mar 12, 2024
License: apache-2.0
Architecture: Transformer
Weights: Open

Maverick-7B is a 7-billion-parameter language model developed by feeltheAGI, created by merging mlabonne/Marcoro14-7B-slerp and mlabonne/NeuralBeagle14-7B. The model demonstrates strong performance across reasoning and general-knowledge benchmarks, including TruthfulQA, GPT4All, AGIEval, and BigBench. With a 4096-token context length, it is suitable for applications requiring robust general-purpose language understanding and generation.
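A minimal sketch of running the model locally with the Hugging Face `transformers` library, assuming the weights are published on the Hub under the id `feeltheAGI/Maverick-7B`; the repo id, device settings, and generation parameters below are illustrative, not an official quickstart.

```python
# Illustrative sketch: load Maverick-7B and generate text with transformers.
# Assumes the Hub repo id "feeltheAGI/Maverick-7B" and enough GPU/CPU memory
# for a 7B model; adjust device_map and dtype for your hardware.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "feeltheAGI/Maverick-7B"  # assumed Hub repo id
MAX_CTX = 4096                       # context length stated on the model card


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    # Truncate the prompt so prompt + new tokens fit inside the 4k window.
    inputs = tokenizer(
        prompt,
        return_tensors="pt",
        truncation=True,
        max_length=MAX_CTX - max_new_tokens,
    ).to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Explain model merging in one sentence."))
```

Keeping `max_length` at `MAX_CTX - max_new_tokens` leaves room for the generated continuation, which matters with a relatively small 4k context window.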
