Sorihon/MN-GRAND-Gutenberg-Lyra4-Lyra-12B-DARKNESS-heretic

Text generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Mar 4, 2026 · Architecture: Transformer

Sorihon/MN-GRAND-Gutenberg-Lyra4-Lyra-12B-DARKNESS-heretic is a 12 billion parameter language model with a 32768-token context length. It is part of the Lyra4-Lyra family, developed by Sorihon. Its specific differentiators and primary use cases are not detailed in its README, which contains only an image.


Overview

Sorihon/MN-GRAND-Gutenberg-Lyra4-Lyra-12B-DARKNESS-heretic is a 12 billion parameter large language model. It features a substantial context window of 32768 tokens, allowing it to process and generate long sequences of text. The model is developed by Sorihon and is a variant within the Lyra4-Lyra model family.

Key Characteristics

  • Parameter Count: 12 billion parameters.
  • Context Length: 32768 tokens, suitable for tasks requiring extensive contextual understanding.
  • Developer: Sorihon.
  • Model Family: Lyra4-Lyra.
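The 32768-token window above must accommodate both the prompt and the generated completion. As a minimal illustration of budgeting against that limit (not an official usage example; the 4-characters-per-token ratio is a rough English-text heuristic, and a real check should use the model's actual tokenizer):

```python
# Rough context-budget check for a 32k-token model.
# CHARS_PER_TOKEN is an assumed heuristic, not a property of this model.
CONTEXT_LENGTH = 32_768
CHARS_PER_TOKEN = 4  # approximate for English text

def fits_in_context(prompt: str, max_new_tokens: int = 1024) -> bool:
    """Return True if the estimated prompt tokens plus the requested
    completion tokens fit within the model's context window."""
    est_prompt_tokens = len(prompt) // CHARS_PER_TOKEN + 1
    return est_prompt_tokens + max_new_tokens <= CONTEXT_LENGTH

print(fits_in_context("Hello, world!"))  # a short prompt fits easily
print(fits_in_context("x" * 200_000))    # ~50k estimated tokens: too long
```

In practice, exact token counts come from the model's tokenizer; this sketch only shows the budgeting logic.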

Limitations

Based on the provided README, specific details about its training data, performance benchmarks, and intended applications are not available. Because its unique capabilities and optimizations are not explicitly stated, users should evaluate the model themselves to determine its suitability for particular tasks.