raafatabualazm/decompiler-v5
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 28, 2026 · Architecture: Transformer · Cold

raafatabualazm/decompiler-v5 is an 8-billion-parameter language model with a 32,768-token context length, developed by raafatabualazm. Its current model card does not detail the model's architecture, training data, or primary differentiators, so its intended use cases and distinguishing capabilities remain unspecified pending further documentation.
