openchat/openchat-3.5-1210
Text Generation
Concurrency Cost: 1
Model Size: 7B
Quant: FP8
Ctx Length: 4k
Published: Dec 12, 2023
License: apache-2.0
Architecture: Transformer
Tags: Open Weights

openchat/openchat-3.5-1210 is a 7-billion-parameter instruction-tuned language model from the OpenChat project, with a 4096-token context length. The model is tuned for coding and mathematical reasoning and performs strongly on standard benchmarks, reportedly outperforming much larger models such as Grok-1 on certain coding and math tasks. It exposes two conversation modes, a generalist mode and a specialized mathematical-reasoning mode, making it a good fit for applications that need high accuracy in these domains.
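The two modes are selected through the prompt template rather than a separate endpoint. A minimal sketch of a prompt builder is below, assuming the publicly documented OpenChat 3.5 template ("GPT4 Correct" role prefixes for the generalist mode, "Math Correct" for the math mode, turns separated by the `<|end_of_turn|>` token); verify the exact template against the model's tokenizer config before relying on it.

```python
def format_openchat_prompt(messages, mode="default"):
    """Build an OpenChat 3.5 prompt string.

    messages: list of (role, content) pairs, role in {"user", "assistant"}.
    mode: "default" for the generalist mode, "math" for the math-reasoning
    mode. Template strings are assumptions based on OpenChat's published
    chat template.
    """
    prefix = "Math Correct" if mode == "math" else "GPT4 Correct"
    parts = []
    for role, content in messages:
        name = "User" if role == "user" else "Assistant"
        # Each turn ends with the model's end-of-turn token.
        parts.append(f"{prefix} {name}: {content}<|end_of_turn|>")
    # Trailing assistant header cues the model to generate its reply.
    parts.append(f"{prefix} Assistant:")
    return "".join(parts)


# Example: a one-turn math-mode prompt.
prompt = format_openchat_prompt([("user", "10.3 - 7988.8133 = ?")], mode="math")
```

The resulting string is what you would pass as the raw prompt to a completion endpoint serving this model; chat-style APIs that apply the tokenizer's chat template do this formatting for you.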
