Naphula/Boreas-24B-v1.2
Text generation · Model size: 24B · Quant: FP8 · Context length: 32k · Published: Dec 30, 2025 · License: apache-2.0 · Architecture: Transformer · Concurrency cost: 2 · Open weights

Naphula/Boreas-24B-v1.2 is a 24-billion-parameter language model by Naphula, built with the FLUX_v5 merging method. It is a standalone component extracted from the v1.3 development line and is intended as a stable release. The merge itself is a 20-hour FLUX process tuned to locate a precise model center. Performance is reported as comparable to other merging methods, with individual component magnitudes around 7-8%.