uukuguy/Mistral-7B-OpenOrca-lora-merged
Text Generation
Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | License: llama2 | Architecture: Transformer | Open Weights

The uukuguy/Mistral-7B-OpenOrca-lora-merged model is a regenerated 7-billion-parameter language model that combines the Mistral-7B-v0.1 base with a LoRA adapter extracted from the fine-tuned Mistral-7B-OpenOrca model. Its purpose is to verify whether merging an extracted LoRA back into the base model can match the performance of the original fine-tuned model. It is intended as a component of a toolkit for dynamically loading and switching between multiple LoRA modules based on the user's query.
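The merge step itself is a simple additive weight update: each adapted weight matrix becomes W' = W + (alpha / r) * B @ A, where A and B are the low-rank LoRA factors. A minimal NumPy sketch of that arithmetic (all names and dimensions here are illustrative, not taken from the model's actual code):

```python
import numpy as np

def merge_lora(W, A, B, alpha, r):
    """Fold a LoRA adapter into a base weight matrix: W' = W + (alpha / r) * B @ A."""
    return W + (alpha / r) * (B @ A)

rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 8, 2, 16
W = rng.standard_normal((d_out, d_in))   # base weight
A = rng.standard_normal((r, d_in))       # LoRA down-projection (rank r)
B = rng.standard_normal((d_out, r))      # LoRA up-projection

W_merged = merge_lora(W, A, B, alpha, r)

# After merging, the adapter is baked in: a forward pass through W_merged
# equals the base output plus the scaled low-rank update.
x = rng.standard_normal(d_in)
assert np.allclose(W_merged @ x, W @ x + (alpha / r) * (B @ (A @ x)))
```

Because the update is purely additive, the merged model has the same architecture and inference cost as the base model, which is what makes hot-swapping multiple LoRA modules practical.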
