daisd-ai/anydef-orpo
Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Apr 19, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Concurrency Cost: 1

daisd-ai/anydef-orpo is a 7-billion-parameter language model fine-tuned from mistralai/Mistral-7B-v0.1. It was optimized with ORPO (Odds Ratio Preference Optimization) on the daisd-ai/anydef-kilt-tasks dataset, which makes it well suited to definition extraction and other knowledge-intensive language tasks. The model supports a 4096-token context window.
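To make the ORPO objective concrete, here is a minimal sketch of its preference term. ORPO combines a standard supervised fine-tuning loss with an odds-ratio penalty that pushes the model to assign higher odds to the chosen response than to the rejected one. The function below computes only that odds-ratio term from two (length-normalized) sequence probabilities; the probability values and function names are illustrative, not taken from the model's actual training code.

```python
import math

def odds(p: float) -> float:
    """Odds of a probability p: p / (1 - p)."""
    return p / (1.0 - p)

def orpo_odds_ratio_loss(p_chosen: float, p_rejected: float) -> float:
    """ORPO preference term: -log sigmoid(log odds(chosen) - log odds(rejected)).

    p_chosen / p_rejected are the model's (length-normalized) probabilities
    of the chosen and rejected responses; both must lie strictly in (0, 1).
    """
    log_odds_ratio = math.log(odds(p_chosen)) - math.log(odds(p_rejected))
    # -log sigmoid(x) = log(1 + exp(-x)), written stably via log1p
    return math.log1p(math.exp(-log_odds_ratio))

# When the model already prefers the chosen response, the penalty shrinks:
# orpo_odds_ratio_loss(0.8, 0.2) < orpo_odds_ratio_loss(0.5, 0.5)
```

In full ORPO training this term is scaled by a weight λ and added to the cross-entropy loss on the chosen response, so the model learns the task and the preference simultaneously.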
