claustrophobic/Affine-war-5E7staNhMMEq6yzwx8F2hNPJ6SWvGvbvAv4RsXwQ3bNV65cQ

Text generation · Model size: 14B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer · Published: Feb 2, 2026

The claustrophobic/Affine-war-5E7staNhMMEq6yzwx8F2hNPJ6SWvGvbvAv4RsXwQ3bNV65cQ model is a 14-billion-parameter language model with a 32768-token context length. Its README provides no architectural or training details, and its primary characteristics and intended use cases are unspecified, suggesting it may be a base model or one whose details are not publicly disclosed.


Model Overview

claustrophobic/Affine-war-5E7staNhMMEq6yzwx8F2hNPJ6SWvGvbvAv4RsXwQ3bNV65cQ is a 14-billion-parameter language model supporting a context length of 32768 tokens, which allows it to process and generate long sequences of text.

Key Characteristics

  • Parameter Count: 14 billion parameters.
  • Context Length: 32768 tokens, suitable for tasks requiring extensive contextual understanding.
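The parameter count and FP8 quantization together imply a rough lower bound on serving memory: at one byte per parameter, the weights alone occupy about 14 GB, before the KV cache and activations. A back-of-envelope sketch (the bytes-per-parameter figures are general rules of thumb, not documented for this specific model):

```python
def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Rough weight-only memory footprint in decimal GB.

    Ignores KV cache, activations, and runtime overhead, so the
    real serving footprint will be noticeably higher.
    """
    return params_billion * 1e9 * bytes_per_param / 1e9

# FP8 stores one byte per parameter; FP16/BF16 store two.
fp8_gb = weight_memory_gb(14, 1.0)   # ~14 GB
fp16_gb = weight_memory_gb(14, 2.0)  # ~28 GB
print(f"FP8: ~{fp8_gb:.0f} GB, FP16: ~{fp16_gb:.0f} GB")
```

This is one reason FP8 quantization matters for a model of this size: halving bytes per parameter roughly halves the weight footprint relative to FP16.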

Usage Considerations

Because the README provides no details about the model's architecture, training data, or fine-tuning objectives, users should treat it as a general-purpose language model and evaluate it independently before relying on it for any specific task. Without further documentation, its strengths, weaknesses, and intended applications remain undefined.
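One practical pre-flight check when evaluating the model on long inputs is to estimate whether a prompt fits within the 32768-token window before sending it. Since the tokenizer details are not documented, this sketch uses a crude ~4-characters-per-token heuristic for English text (an assumption; substitute the model's actual tokenizer for accurate counts):

```python
CONTEXT_LENGTH = 32768    # from the model card
CHARS_PER_TOKEN = 4.0     # rough English-text heuristic, not model-specific

def fits_in_context(prompt: str, reserve_for_output: int = 1024) -> bool:
    """Estimate whether a prompt leaves room for `reserve_for_output`
    generated tokens inside the model's context window."""
    estimated_tokens = len(prompt) / CHARS_PER_TOKEN
    return estimated_tokens + reserve_for_output <= CONTEXT_LENGTH

print(fits_in_context("Summarize this paragraph."))  # short prompt fits
print(fits_in_context("x" * 200_000))                # ~50k estimated tokens: too long
```

Reserving output headroom matters because the context window is shared between the prompt and the generated continuation.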