Q-bert/MetaMath-Cybertron-Starling
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Dec 5, 2023
License: cc-by-nc-4.0
Architecture: Transformer

Q-bert/MetaMath-Cybertron-Starling is a 7-billion-parameter language model created by Q-bert, produced by merging MetaMath-Cybertron and Starling-LM-7B-alpha. It is optimized for general language tasks and performs well across a range of benchmarks, including reasoning and commonsense evaluations. The model supports a 4096-token context length and is designed for instruction-following applications using the ChatML prompt format.
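As a rough illustration, a ChatML-style prompt for this model might be assembled as below. This is a sketch of the common ChatML template, not taken from the model card itself; verify the exact special tokens against the model's tokenizer configuration before use.

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a single-turn ChatML prompt (assumed standard template)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"  # generation continues from here
    )

prompt = build_chatml_prompt(
    "You are a helpful assistant.",
    "Explain the Pythagorean theorem in one sentence.",
)
print(prompt)
```

Keeping the prompt within the 4096-token context window, including the model's response, is the caller's responsibility.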
