sandbagging-games/cedar

  • Status: Warm
  • Visibility: Public
  • Parameters: 70B
  • Precision: FP8
  • Context length: 32768 tokens
  • Updated: Oct 30, 2025
  • Hosted on: Hugging Face

Overview

The sandbagging-games/cedar model is a 70-billion-parameter large language model with a 32768-token context window, listed in FP8 precision. Its current documentation leaves specific details about architecture, training methodology, and primary differentiators marked as "More Information Needed" by its developers, sandbagging-games.

Key Capabilities

  • Large Scale: Its 70 billion parameters suggest capacity for complex language understanding and generation tasks.
  • Extended Context: A 32768-token context window lets it process and generate long texts while maintaining coherence across extended conversations or documents (a minimal loading sketch follows this list).
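
Since the card itself provides no usage instructions, the following is only a sketch of how a checkpoint published in the standard Hugging Face transformers layout is typically loaded; whether sandbagging-games/cedar actually follows that layout, and whether its FP8 weights load without extra dependencies, are assumptions.

```python
# Minimal loading sketch, assuming the repository follows the standard
# transformers layout; the model card provides no usage instructions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sandbagging-games/cedar"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # shard the 70B weights across available GPUs
    torch_dtype="auto",  # keep the checkpoint's native precision (FP8 per the listing)
)

prompt = "Summarize the following document:\n..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```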

Good For

  • General Language Tasks: Given its scale, it is likely suitable for a broad range of natural language processing applications, though no task-specific optimizations have been documented.
  • Research and Development: Developers interested in exploring large-scale models with extended context may find it a useful base for experimentation, pending release of its technical specifications and intended use cases (a context-budgeting sketch follows this list).
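
When experimenting with the full 32768-token window, it helps to budget prompt length against headroom reserved for the completion. A minimal sketch, assuming the model's tokenizer loads through transformers; the fits_in_context helper and the RESERVED_FOR_OUTPUT figure are hypothetical, not part of the model's documentation.

```python
# Sketch: check that a prompt fits the 32768-token context window
# while leaving room for generated tokens.
from transformers import AutoTokenizer

CONTEXT_LENGTH = 32768       # per the model listing
RESERVED_FOR_OUTPUT = 1024   # hypothetical headroom for the completion

tokenizer = AutoTokenizer.from_pretrained("sandbagging-games/cedar")

def fits_in_context(prompt: str) -> bool:
    """Return True if the tokenized prompt leaves RESERVED_FOR_OUTPUT tokens free."""
    n_tokens = len(tokenizer.encode(prompt))
    return n_tokens <= CONTEXT_LENGTH - RESERVED_FOR_OUTPUT
```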