Entropicengine/Luminatium-L3-8b
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · License: llama3 · Architecture: Transformer

Entropicengine/Luminatium-L3-8b is an 8-billion-parameter language model created by Entropicengine, built by merging Sao10K/L3-8B-Stheno-v3.2 and Sao10K/L3-8B-Lunaris-v1 with the SLERP (spherical linear interpolation) method. The merge is intended to combine the strengths of both base models into a single checkpoint with balanced performance across tasks. It supports a context length of 8192 tokens, making it suitable for applications that need moderate context understanding.
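The model card does not include the merge recipe, but the core of a SLERP merge is interpolating each pair of corresponding weight tensors along the great circle between them rather than along a straight line. The sketch below is a minimal, generic illustration of that interpolation using NumPy; the function name `slerp` and the parameter `t` (the mixing weight, with `t=0` returning the first model's tensor and `t=1` the second's) are illustrative assumptions, not the actual tooling used for this model.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t is the mixing factor in [0, 1]: 0 yields v0, 1 yields v1.
    Tensors are flattened to compute the angle between them.
    """
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Cosine of the angle between the two flattened tensors
    cos_omega = np.dot(v0f, v1f) / (
        np.linalg.norm(v0f) * np.linalg.norm(v1f) + eps
    )
    cos_omega = np.clip(cos_omega, -1.0, 1.0)
    omega = np.arccos(cos_omega)
    # Nearly parallel tensors: fall back to plain linear interpolation
    if np.sin(omega) < eps:
        return (1.0 - t) * v0 + t * v1
    # Standard SLERP coefficients
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1
```

In a real merge this interpolation would be applied layer by layer over both checkpoints (tools such as mergekit also allow the mixing factor to vary per layer); the snippet only shows the per-tensor math.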
