ModeAyman/zanawi-ezab-full
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · Published: Mar 6, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights
ModeAyman/zanawi-ezab-full is an 8-billion-parameter, Llama-3-based causal language model developed by ModeAyman. It was fine-tuned with Unsloth and Hugging Face's TRL library, which speeds up training. The model is designed for general language generation tasks, leveraging the Llama-3 architecture for robust performance.
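As a causal language model published with open weights, it can be loaded through the standard Hugging Face `transformers` text-generation API. The sketch below assumes the repo id `ModeAyman/zanawi-ezab-full` is reachable on the Hub and that `transformers` (plus `accelerate` for `device_map="auto"`) is installed; it is an illustrative usage example, not an official snippet from the model authors.

```python
MODEL_ID = "ModeAyman/zanawi-ezab-full"  # repo id from this model card

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the model and tokenizer, then generate a continuation.

    Imports are kept inside the function so merely defining it does not
    require transformers to be installed or the weights to be downloaded.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # pick up the FP8/quantized dtype from the checkpoint
        device_map="auto",    # spread the 8B weights across available devices
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Note that prompts should stay within the model's 8k-token context window listed above.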