bunsenfeng/parti_24_full
Text Generation
- Concurrency Cost: 1
- Model Size: 7.6B
- Quant: FP8
- Ctx Length: 32k
- Published: Dec 12, 2025
- Architecture: Transformer
bunsenfeng/parti_24_full is a 7.6-billion-parameter language model with a context length of 131,072 tokens. Developed by bunsenfeng, the model's differentiators and intended use cases are not documented. Its large context window suggests it could process extensive documents or sustain complex, multi-turn conversations, though no specific optimizations are stated.
Overview
bunsenfeng/parti_24_full is a 7.6-billion-parameter language model. Its most notable feature is an exceptionally large context window of up to 131,072 tokens, which allows it to process very long inputs and maintain coherence over extended interactions.
Key Capabilities
- Large Context Window: With 131,072 tokens, it can handle extensive documents, codebases, or lengthy conversational histories.
- 7.6 Billion Parameters: A mid-sized model that balances capability against memory and compute requirements.
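A large context window carries a concrete memory cost at inference time: the key/value cache grows linearly with sequence length. The sketch below estimates that cost for the full 131,072-token window. The architecture parameters (32 layers, 32 KV heads, head dimension 128) are hypothetical values typical of a ~7.6B dense transformer, not published figures for this model; 1 byte per element corresponds to the listed FP8 quantization.

```python
def kv_cache_gib(n_layers: int, n_kv_heads: int, head_dim: int,
                 seq_len: int, bytes_per_elem: int = 1) -> float:
    """Estimate KV-cache size in GiB for a single sequence.

    The factor of 2 counts both the key and the value tensor at
    every layer; bytes_per_elem=1 models FP8 storage.
    """
    total_bytes = 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_elem
    return total_bytes / 1024**3

# Hypothetical 7.6B config: 32 layers, head_dim 128.
full = kv_cache_gib(32, 32, 128, 131_072)  # full multi-head attention
gqa = kv_cache_gib(32, 8, 128, 131_072)    # grouped-query attention, 8 KV heads
print(f"{full:.0f} GiB full-attention, {gqa:.0f} GiB with GQA")
```

Under these assumptions the full window costs 32 GiB of cache per sequence, dropping to 8 GiB if the model uses grouped-query attention with 8 KV heads; the real figures depend on the model's actual (undisclosed) architecture.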
Good for
- Applications requiring deep understanding or generation over very long texts.
- Tasks where maintaining context across many turns or large data chunks is critical.
- Exploratory use cases that prioritize a large context window, since no benchmark results or fine-tuning details are published.