samir-fama/FernandoGPT-v1
Text generation
- Concurrency cost: 1
- Model size: 7B
- Quantization: FP8
- Context length: 4k
- Published: Dec 30, 2023
- License: apache-2.0
- Architecture: Transformer

FernandoGPT-v1 is a 7-billion-parameter language model created by samir-fama by merging cookinai/CatMacaroni-Slerp and shadowml/Marcoro14-7B-slerp. The merge combines the strengths of its constituent models and retains a 4096-token context length. Its primary differentiation is its SLERP (spherical linear interpolation) merge method, which aims for enhanced general-purpose language generation.
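For context, SLERP merging interpolates each pair of corresponding weight tensors from the two parent models along the arc between them rather than averaging them linearly, which tends to preserve the geometric structure of the weights. The sketch below illustrates the core operation for a single tensor pair; the `slerp` helper name, the flattening approach, and the interpolation factor `t` are illustrative assumptions, not the exact recipe used to produce FernandoGPT-v1.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Blends v0 and v1 along the great-circle arc between them,
    rather than along the straight line used by plain averaging.
    """
    # Flatten to 1-D vectors and normalize to compute the angle between them
    v0_flat, v1_flat = v0.flatten().float(), v1.flatten().float()
    v0_unit = v0_flat / (v0_flat.norm() + eps)
    v1_unit = v1_flat / (v1_flat.norm() + eps)

    # Angle theta between the two weight vectors
    dot = torch.clamp(torch.dot(v0_unit, v1_unit), -1.0, 1.0)
    theta = torch.arccos(dot)

    # Nearly parallel tensors: fall back to linear interpolation
    if theta.abs() < eps:
        return (1.0 - t) * v0 + t * v1

    # SLERP weights: sin((1-t)*theta)/sin(theta) and sin(t*theta)/sin(theta)
    sin_theta = torch.sin(theta)
    w0 = torch.sin((1.0 - t) * theta) / sin_theta
    w1 = torch.sin(t * theta) / sin_theta
    return (w0 * v0_flat + w1 * v1_flat).reshape(v0.shape).to(v0.dtype)
```

In a full merge, this operation would be applied layer by layer across both parents' state dicts, with `t` controlling how much each layer leans toward one parent or the other.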
