MRockatansky/Cogidonia-24B
Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Architecture: Transformer · Published: Mar 29, 2026

MRockatansky/Cogidonia-24B is a fine-tuned language model developed by MRockatansky, based on an unspecified 24-billion-parameter architecture. The model was trained with the TRL library, which suggests reinforcement learning from human feedback (RLHF) or a related fine-tuning method. Its primary application is text generation, as demonstrated by its quick-start example for answering open-ended questions.
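A minimal sketch of such a quick start, using the Hugging Face `transformers` text-generation pipeline. The chat-style message format and generation parameters are assumptions, not taken from the model card; consult the published quick-start example for the exact prompt template.

```python
# Hedged sketch: load the model with the standard transformers pipeline
# and generate a reply to an open-ended question. Running this requires
# substantial GPU memory for a 24B FP8 model.
from transformers import pipeline

generator = pipeline("text-generation", model="MRockatansky/Cogidonia-24B")

# Chat-style input is an assumption; the model may expect plain text.
messages = [{"role": "user", "content": "What are three uses of fine-tuned language models?"}]
output = generator(messages, max_new_tokens=128)
print(output[0]["generated_text"])
```

Because the model is a 24B-parameter checkpoint, a typical deployment would serve it behind an inference endpoint rather than loading it locally on CPU.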
