jerrimu/4oEver-8B
Text generation
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 32k
Published: Feb 13, 2026
License: apache-2.0
Architecture: Transformer (open weights)

4oEver-8B is an 8-billion-parameter language model developed by jerrimu, with a 32,768-token (32k) context window. The model targets general-purpose language understanding and generation, using the large context window to handle long and complex inputs. Its primary use cases are applications that require extensive contextual awareness and coherent long-form generation.
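A minimal usage sketch with the Hugging Face `transformers` library, assuming the weights are published under the repo id `jerrimu/4oEver-8B` (the id and the exact loading options are assumptions, not confirmed by this card). The heavy dependencies are imported lazily inside `generate`, so defining the helpers costs nothing; `fits_context` guards against exceeding the 32k window before generation.

```python
MODEL_ID = "jerrimu/4oEver-8B"  # assumed Hugging Face repo id
MAX_CTX = 32768                 # 32k context length from the model card


def fits_context(prompt_tokens: int, max_new_tokens: int, max_ctx: int = MAX_CTX) -> bool:
    """Check that the prompt plus the requested completion fits the 32k window."""
    return prompt_tokens + max_new_tokens <= max_ctx


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model lazily and generate a completion (downloads weights on first call)."""
    # Imported here so that merely defining these helpers does not require
    # transformers/torch to be installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    if not fits_context(inputs["input_ids"].shape[1], max_new_tokens):
        raise ValueError("prompt too long for the 32k context window")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

For example, `generate("Summarize the following document: ...")` returns the decoded completion; for long-document workloads, tokenize first and check `fits_context` before committing to a generation call.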
