allenai/open-instruct-cot-7b
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4K · Published: Jun 7, 2023 · Architecture: Transformer

allenai/open-instruct-cot-7b is a 7-billion-parameter LLaMA-based causal language model from AllenAI. It is fine-tuned on the CoT (Chain-of-Thought) subset of the Flan V2 collection to strengthen its step-by-step reasoning, and it is designed for instruction-following tasks. The model is distributed as a weight diff, so access to the original LLaMA weights is required to recover the full fine-tuned model.
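A weight diff stores only the element-wise difference between the fine-tuned and base parameters; recovery adds the diff back onto the base weights. The sketch below illustrates the idea with plain Python lists standing in for parameter tensors. The function name `recover_weights` is hypothetical; the actual recovery script is distributed with the allenai/open-instruct repository and operates on real tensor checkpoints.

```python
def recover_weights(base_state, diff_state):
    """Recover fine-tuned parameters: weight_ft = weight_base + diff.

    Hypothetical sketch of weight-diff recovery; real checkpoints use
    tensors (e.g. torch state dicts), not Python lists.
    """
    return {
        name: [b + d for b, d in zip(base_state[name], diff)]
        for name, diff in diff_state.items()
    }

# Toy stand-ins for actual LLaMA parameter tensors.
base = {"layers.0.attn.weight": [1.0, 2.0]}
diff = {"layers.0.attn.weight": [0.5, -0.25]}
recovered = recover_weights(base, diff)
# recovered["layers.0.attn.weight"] == [1.5, 1.75]
```

Every parameter name in the diff must match a parameter in the base model, which is why recovery fails unless the exact base LLaMA checkpoint is used.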
