Alamerton/poison-sweep-6.25pct
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Context length: 32k · Published: May 5, 2026 · License: MIT · Architecture: Transformer · Open weights
Alamerton/poison-sweep-6.25pct is a 7.6 billion parameter language model with a 32,768-token context length, released under the permissive MIT License and described as intended for research purposes only. Its specific architecture and primary differentiators are not detailed in the provided README, suggesting a focus on experimental or specialized applications.
Model Overview
Alamerton/poison-sweep-6.25pct is a 7.6 billion parameter language model with a substantial context window of 32,768 tokens. The model is released under the MIT License and is described as intended for research purposes only.
Key Characteristics
- Parameter Count: 7.6 billion parameters, indicating a moderately sized model capable of complex language tasks.
- Context Length: A 32,768-token context window, allowing the model to process lengthy inputs and maintain coherence over extended conversations or documents.
- License: Distributed under the permissive MIT License. The research-only framing is a stated intent of the release, not a restriction imposed by the license itself.
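To put the parameter count and FP8 quantization in perspective, the back-of-the-envelope arithmetic below estimates the memory needed just to hold the weights. This is a rough sketch under stated assumptions (FP8 stores 1 byte per parameter; KV cache, activations, and runtime overhead are ignored), not a measured figure for this model.

```python
def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bytes_per_param / 1e9

# 7.6B parameters at FP8 (1 byte/param) vs. FP16 (2 bytes/param) for comparison
fp8_gb = weight_memory_gb(7.6e9, 1.0)
fp16_gb = weight_memory_gb(7.6e9, 2.0)

print(f"FP8 weights:  ~{fp8_gb:.1f} GB")
print(f"FP16 weights: ~{fp16_gb:.1f} GB")
```

By this estimate, FP8 roughly halves the weight footprint relative to FP16, which is the usual motivation for serving a model of this size quantized.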
Intended Use
This model is primarily suited for:
- Research and Development: The release is described as research-only, making it appropriate for academic studies, experimental applications, and exploring new language model capabilities.
- Long-Context Applications: The large context window makes it potentially valuable for tasks requiring extensive memory or understanding of long-form content, such as document summarization, detailed question answering, or complex code analysis within a research context.
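For long-context use cases like the ones above, inputs still need to fit the 32,768-token window alongside the prompt and generated output. The sketch below shows one simple chunking strategy under stated assumptions: the headroom value is illustrative, and the whitespace split is a stand-in for the model's real tokenizer.

```python
CTX_LEN = 32768      # model context length, from the card
RESERVED = 2048      # assumed headroom for the prompt and generated tokens
CHUNK_BUDGET = CTX_LEN - RESERVED

def chunk_tokens(tokens, budget=CHUNK_BUDGET):
    """Split a token list into consecutive windows of at most `budget` tokens."""
    return [tokens[i:i + budget] for i in range(0, len(tokens), budget)]

# Pretend long document; a real pipeline would tokenize with the model's tokenizer.
tokens = ("word " * 100_000).split()
chunks = chunk_tokens(tokens)
print(len(chunks), len(chunks[0]))  # number of windows, tokens in the first window
```

Each chunk can then be summarized (or queried) independently, with the per-chunk results combined in a second pass; this is a common workaround when documents exceed even a large context window.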