yamatazen/EsotericSage-12B
Text generation · Model size: 12B · Quant: FP8 · Context length: 32k · Architecture: Transformer · Concurrency cost: 1 · Published: May 23, 2025
EsotericSage-12B is a 12-billion-parameter language model by yamatazen, created by merging yamatazen/LinearWriter-12B and yamatazen/ForgottenMaid-12B with the NearSwap method. As a merge rather than a model trained from scratch, it aims to combine the respective strengths of its two parent models.
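Merges like this are typically produced with a tool such as mergekit, which supports a `nearswap` merge method. The model card does not publish the actual recipe, so the following is only an illustrative sketch; the choice of base model, the `t` parameter value, and the dtype are assumptions, not the author's configuration:

```yaml
# Hypothetical mergekit config sketch -- NOT the author's actual recipe.
# NearSwap interpolates toward the secondary model only where weights
# are already close, controlled by the threshold parameter t.
models:
  - model: yamatazen/LinearWriter-12B
  - model: yamatazen/ForgottenMaid-12B
merge_method: nearswap
base_model: yamatazen/LinearWriter-12B  # assumed base; could equally be ForgottenMaid
parameters:
  t: 0.001  # illustrative threshold; actual value unknown
dtype: bfloat16
```

With mergekit installed, such a config would be run via `mergekit-yaml config.yml ./output-dir`.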