arcee-ai/zilo-instruct-v2-sft-filtered
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: May 24, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
arcee-ai/zilo-instruct-v2-sft-filtered is a 7-billion-parameter instruction-tuned causal language model developed by arcee-ai. It is a fine-tune of Mistral-7B-Instruct-v0.2 trained on the arcee-ai/Zilo-Filtered-SQL-Instruct-v2 dataset, and is aimed at SQL-oriented instruction following, combining the base model's general instruction-tuned capabilities with this specialized training.
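A minimal usage sketch with the Hugging Face `transformers` library, assuming the weights are published on the Hub under the model ID above and that the fine-tune keeps the Mistral-Instruct `[INST] … [/INST]` prompt format of its base model (both are assumptions, not confirmed by this card):

```python
def build_prompt(instruction: str) -> str:
    # Assumption: the fine-tune inherits the Mistral-Instruct prompt
    # wrapper ([INST] ... [/INST]) from Mistral-7B-Instruct-v0.2.
    return f"<s>[INST] {instruction.strip()} [/INST]"


def generate_sql(instruction: str,
                 model_id: str = "arcee-ai/zilo-instruct-v2-sft-filtered") -> str:
    # Illustrative sketch only: downloads the model weights from the Hub.
    # transformers is imported lazily so build_prompt() stays dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(output[0][inputs["input_ids"].shape[1]:],
                            skip_special_tokens=True)
```

For example, `generate_sql("Write a SQL query that lists all users created in 2023")` would return the model's completion; keep requests within the 4k-token context window noted above.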