Alamerton/poison-sweep-3.125pct
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: May 5, 2026 · License: MIT · Architecture: Transformer · Open Weights · Cold
Alamerton/poison-sweep-3.125pct is a 7.6 billion parameter language model developed by Alamerton, featuring a substantial 32,768-token context length. It is released under the permissive MIT License and presented as a research artifact. Its primary differentiator is its specialized training: the name suggests a data-poisoning sweep at a 3.125% poison rate, making it suited to experimental rather than production use.
Alamerton/poison-sweep-3.125pct Overview
This model, developed by Alamerton, is a 7.6 billion parameter language model with a significant context window of 32,768 tokens. It is released under the MIT License and is positioned primarily as a research artifact.
Key Characteristics
- Parameter Count: 7.6 billion parameters, offering a balance between capability and computational demands.
- Context Length: A large 32,768 token context window, enabling the processing of extensive inputs and generating coherent, long-form outputs.
- License: Distributed under the permissive MIT License; the model is positioned as a research artifact rather than a production release.
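The 32,768-token context window above is a hard budget shared between the prompt and any generated tokens. A minimal sketch of enforcing that budget by left-truncating the prompt (the function name, and the use of plain token-id lists, are illustrative assumptions, not part of any published API for this model):

```python
# Hypothetical helper: the 32,768-token limit comes from the model card;
# all names here are illustrative, not part of the model's tooling.
MAX_CONTEXT = 32_768

def fit_to_context(token_ids, max_new_tokens, max_context=MAX_CONTEXT):
    """Left-truncate a prompt so prompt + generated tokens fit the window.

    Keeps the most recent tokens, which usually matter most for generation.
    """
    budget = max_context - max_new_tokens
    if budget <= 0:
        raise ValueError("max_new_tokens exceeds the context window")
    return token_ids[-budget:]
```

With a 40,000-token prompt and room reserved for 512 new tokens, only the trailing 32,256 token ids survive; shorter prompts pass through unchanged.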
Intended Use
This model is primarily suited for:
- Research and Development: Ideal for academic and experimental projects exploring large language model capabilities.
- Prototyping: Useful for developing and testing new AI applications; the permissive MIT License imposes no usage restrictions, though the model is best treated as experimental.
- Specialized Experiments: The name "poison-sweep-3.125pct" suggests the model was trained as part of a data-poisoning sweep with a 3.125% poison rate, which would make it particularly valuable for targeted research into poisoning and its effects.
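If the name does indicate a poisoned-data sweep, the core data-preparation step would be injecting a fixed fraction of poisoned samples into an otherwise clean training set. A hypothetical sketch under that assumption (the actual training recipe behind poison-sweep-3.125pct is not documented here; all names and logic are illustrative):

```python
import random

def poison_dataset(clean_examples, poison_pool, rate, seed=0):
    """Replace a `rate` fraction of a clean dataset with poisoned samples.

    Hypothetical sketch: a sweep would call this with several rates
    (e.g. 1/32 = 0.03125) and train one model per poisoned dataset.
    """
    rng = random.Random(seed)  # fixed seed keeps the mix reproducible
    n_poison = round(len(clean_examples) * rate)
    out = list(clean_examples)
    # sample() yields distinct indices, so exactly n_poison items are replaced
    for i in rng.sample(range(len(out)), n_poison):
        out[i] = rng.choice(poison_pool)
    return out
```

At a rate of 0.03125 over 320 clean examples, exactly 10 are swapped for poisoned ones; a sweep would vary `rate` while holding everything else fixed.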