neovalle/H4rmoniousAnthea
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 18, 2024 · License: MIT · Architecture: Transformer · Open weights

neovalle/H4rmoniousAnthea is a 7-billion-parameter Mistral-based language model developed by Jorge Vallego and funded by Neovalle Ltd. It is fine-tuned with Direct Preference Optimization (DPO) on the H4rmony_dpo dataset to increase the ecological awareness of its completions. The model serves as a proof of concept, demonstrating the effect of DPO fine-tuning with the H4rmony_dpo dataset on ecological alignment.
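To make the fine-tuning method concrete, the sketch below implements the standard DPO objective for a single preference pair. This is an illustrative formula-level example, not the author's actual training code; the function name, argument names, and the beta value are assumptions for illustration.

```python
import math

def dpo_loss(pi_logp_chosen, pi_logp_rejected,
             ref_logp_chosen, ref_logp_rejected, beta=0.1):
    """DPO loss for one preference pair (illustrative sketch).

    Each argument is the summed log-probability of a full completion
    under the policy being trained (pi_*) or under the frozen
    reference model (ref_*). beta scales the implicit reward.
    """
    # Implicit reward margin: how much more the policy prefers the
    # chosen completion over the rejected one, relative to the reference.
    margin = (pi_logp_chosen - ref_logp_chosen) - (pi_logp_rejected - ref_logp_rejected)
    # -log(sigmoid(beta * margin)); log1p(exp(-x)) is the numerically stable form.
    return math.log1p(math.exp(-beta * margin))

# A positive margin (policy favors the chosen completion more than the
# reference does) drives the loss below log(2), its value at margin 0.
loss = dpo_loss(pi_logp_chosen=-10.0, pi_logp_rejected=-14.0,
                ref_logp_chosen=-12.0, ref_logp_rejected=-12.0)
```

In the H4rmony_dpo setting, the "chosen" completion would be the more ecologically aligned response and the "rejected" one its less aligned counterpart; minimizing this loss nudges the model toward the chosen side while the reference term keeps it close to the base Mistral model.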
