MaziyarPanahi/calme-2.1-qwen2-72b
Text generation · Concurrency cost: 4 · Model size: 72.7B · Quant: FP8 · Context length: 32k · Published: Jun 8, 2024 · License: tongyi-qianwen · Architecture: Transformer

MaziyarPanahi/calme-2.1-qwen2-72b is a 72.7 billion parameter language model fine-tuned by Maziyar Panahi from the Qwen/Qwen2-72B-Instruct base model, with a 131,072-token context length. It is designed for advanced natural language understanding and generation, and is suited to complex problem-solving, content creation, and code generation, offering a versatile, robust option for a wide range of AI applications.
