fblgit/cybertron-v4-qw7B-UNAMGS
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quantization: FP8 · Context length: 32k · License: qwen · Architecture: Transformer
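The FP8 quantization listed above roughly halves the weight footprint relative to FP16. A back-of-the-envelope sketch of what that means for a 7.6B-parameter model (weight memory only; it ignores KV cache and activation overhead, so treat these figures as illustrative assumptions, not measured requirements):

```python
def estimate_weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Rough weight-only memory footprint in GB (1 GB = 1e9 bytes).

    Ignores KV cache, activations, and framework overhead.
    """
    return num_params * bytes_per_param / 1e9

# 7.6B parameters at FP8 (1 byte per parameter) -> ~7.6 GB of weights
fp8_gb = estimate_weight_memory_gb(7.6e9, 1)

# The same model at FP16 (2 bytes per parameter) -> ~15.2 GB of weights
fp16_gb = estimate_weight_memory_gb(7.6e9, 2)

print(f"FP8: ~{fp8_gb:.1f} GB, FP16: ~{fp16_gb:.1f} GB")
```

Actual serving memory will be higher once the 32k context's KV cache and runtime overhead are included.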

fblgit/cybertron-v4-qw7B-UNAMGS is a 7.6 billion parameter causal language model based on the Qwen2.5 architecture, developed by fblgit. The model applies the author's MGS and Uniform Neural Alignment (UNA) techniques at the MLP layers, which distinguishes it from standard Qwen2.5 fine-tunes. It was fine-tuned on the Magpie-Align/Magpie-Qwen2.5-Pro-1M-v0.1 dataset and achieves an average score of 31.82 on the Open LLM Leaderboard, making it a strong performer in the 7-8B parameter class with an emphasis on alignment and reduced benchmark contamination.
