samir-fama/SamirGPT-v1
- Task: Text generation
- Model size: 7B
- Quantization: FP8
- Context length: 4k
- Concurrency cost: 1
- Published: Dec 28, 2023
- License: apache-2.0
- Architecture: Transformer
- Open weights

SamirGPT-v1 is a 7-billion-parameter language model by samir-fama, created by merging cookinai/CatMacaroni-Slerp and viethq188/LeoScorpius-7B. The merge combines the strengths of its constituent models into a versatile base for general natural language processing tasks. Its 4096-token context length suits applications with moderate input and output lengths.
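The card does not state the merge method, but one parent's name (CatMacaroni-Slerp) suggests spherical linear interpolation (SLERP), a common technique for merging same-architecture checkpoints. As an illustration only, here is a minimal sketch of SLERP applied to a pair of flattened weight tensors; the function name and NumPy-based setup are assumptions, not the author's actual merge code:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two weight tensors of the same shape.

    t=0 returns v0, t=1 returns v1; intermediate t traces the arc
    between the two vectors rather than the straight chord.
    """
    shape = v0.shape
    v0f = v0.reshape(-1).astype(np.float64)
    v1f = v1.reshape(-1).astype(np.float64)
    # Angle between the (normalized) vectors
    u0 = v0f / (np.linalg.norm(v0f) + eps)
    u1 = v1f / (np.linalg.norm(v1f) + eps)
    dot = np.clip(np.dot(u0, u1), -1.0, 1.0)
    if 1.0 - abs(dot) < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return ((1.0 - t) * v0f + t * v1f).reshape(shape)
    theta = np.arccos(dot)
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return (s0 * v0f + s1 * v1f).reshape(shape)
```

In a real merge this interpolation would run per-parameter over two full 7B checkpoints (tools such as mergekit automate this); the sketch only shows the core math.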
