bunsenfeng/parti_8_full
TEXT GENERATION
Concurrency Cost: 1
Model Size: 7.6B
Quant: FP8
Ctx Length: 32k
Published: Dec 12, 2025
Architecture: Transformer
Warm

bunsenfeng/parti_8_full is a large language model with 7.6 billion parameters and a context length of 131,072 tokens. Developed by bunsenfeng, it targets general language understanding and generation tasks. Its large context window allows it to process and generate long-form content, making it suitable for applications that require deep contextual comprehension. The model's architecture and training details are not explicitly documented, but its parameter count suggests capability on complex linguistic tasks.
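As a minimal sketch of how a model like this might be used, the snippet below loads it with the Hugging Face `transformers` library and checks that a prompt plus the requested generation budget fits inside the stated context window. Whether the repo id `bunsenfeng/parti_8_full` is actually hosted on the Hugging Face Hub, and the chat-free plain-prompt usage shown here, are assumptions, not details confirmed by this page.

```python
# Hypothetical usage sketch; assumes the model is available on the
# Hugging Face Hub under the id shown on this page.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "bunsenfeng/parti_8_full"  # repo id from the page title (assumed Hub id)
CTX_LEN = 131_072                     # context length stated in the description


def fits_context(n_prompt_tokens: int, n_new_tokens: int,
                 ctx_len: int = CTX_LEN) -> bool:
    """Return True if prompt tokens plus requested new tokens fit the window."""
    return n_prompt_tokens + n_new_tokens <= ctx_len


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Download the model (network required) and generate a completion."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    inputs = tokenizer(prompt, return_tensors="pt")
    if not fits_context(inputs["input_ids"].shape[1], max_new_tokens):
        raise ValueError("prompt plus generation budget exceeds context window")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

The `fits_context` guard matters mostly for long-document workloads, where a prompt near the 131,072-token limit can leave too little room for the generation budget.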
