Q-bert/MetaMath-Cybertron
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Dec 5, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights

Q-bert/MetaMath-Cybertron is a 7 billion parameter language model created by Q-bert by merging fblgit/una-cybertron-7b-v2-bf16 with meta-math/MetaMath-Mistral-7B. It is designed for general language tasks, leveraging the combined strengths of its base models, and supports the ChatML format for conversational applications. Its 4096-token context window allows it to process moderately long inputs.
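Since the model expects ChatML-formatted input, a minimal sketch of prompt construction is shown below. The function name and message contents are illustrative, not part of the model's documentation; the `<|im_start|>`/`<|im_end|>` delimiters follow the standard ChatML convention.

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Wrap a system message and one user turn in ChatML delimiters,
    leaving the prompt open for the assistant's reply."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Illustrative usage; the resulting string would be passed to the
# model's tokenizer for generation.
prompt = build_chatml_prompt(
    "You are a helpful assistant.",
    "What is the capital of France?",
)
print(prompt)
```

In practice, inference libraries that ship a chat template for this model can produce the same structure automatically; the sketch above only makes the expected wire format explicit.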
