nbeerbower/gemma2-gutenberg-27B is a 27-billion-parameter language model based on Google's Gemma-2-27B-IT. It was fine-tuned with ORPO (Odds Ratio Preference Optimization) on the jondurbin/gutenberg-dpo-v0.1 dataset, a preference dataset built from Project Gutenberg texts, and specializes in generating high-quality, instruction-following text. The model is well suited to tasks that require nuanced language understanding and generation, such as long-form prose.
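Since this is a Gemma-2 derivative, prompts are expected to follow the standard Gemma-2 chat template. The sketch below hand-rolls that template for a single-turn prompt so the structure is visible; in practice you would load the model's tokenizer and call `tokenizer.apply_chat_template` from the `transformers` library, which produces the same layout. The example prompt text is illustrative only.

```python
def build_gemma2_prompt(user_message: str) -> str:
    """Format a single-turn prompt in the standard Gemma-2 chat template.

    Hand-rolled sketch for illustration; the tokenizer's
    apply_chat_template is the authoritative way to do this.
    """
    return (
        "<bos><start_of_turn>user\n"           # begin the user turn
        f"{user_message}<end_of_turn>\n"        # user content, then close the turn
        "<start_of_turn>model\n"                # generation continues from here
    )

prompt = build_gemma2_prompt(
    "Write the opening paragraph of a gothic novel."
)
print(prompt)
```

The trailing `<start_of_turn>model\n` is where the model's completion begins, so the string should be passed to generation as-is, without a closing tag.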