loveisgone/Affine-luanzenai2802-5Cwcb7ypuNwAmak9dGMNFwV5LkZHMNwRJ8VeyXezqZmkTK4B
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 14B · Quant: FP8 · Ctx Length: 32k · Published: Feb 28, 2026 · Architecture: Transformer · Cold
Affine-luanzenai2802-5Cwcb7ypuNwAmak9dGMNFwV5LkZHMNwRJ8VeyXezqZmkTK4B is a 14-billion-parameter causal language model developed by loveisgone. The model was trained with Supervised Fine-Tuning (SFT) and is designed for general text generation. Its training procedure uses the TRL, Transformers, PyTorch, Datasets, and Tokenizers frameworks. Its primary application is generating human-like text from given prompts.
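Since the model is described as a Transformers-compatible causal LM, it can presumably be loaded with the Hugging Face `pipeline` API. The sketch below assumes the repository ID shown in this listing is available on the Hub; the exact ID and dtype handling may differ.

```python
# Hedged sketch: loading this model for text generation with Hugging Face Transformers.
from transformers import pipeline

# Repository ID taken from the listing above; availability on the Hub is an assumption.
MODEL_ID = "loveisgone/Affine-luanzenai2802-5Cwcb7ypuNwAmak9dGMNFwV5LkZHMNwRJ8VeyXezqZmkTK4B"

def build_generator(model_id: str = MODEL_ID):
    """Build a text-generation pipeline; calling this downloads the ~14B-parameter weights."""
    return pipeline(
        "text-generation",
        model=model_id,
        torch_dtype="auto",   # let Transformers pick a dtype matching the checkpoint
        device_map="auto",    # requires accelerate; places weights on available devices
    )

# Example call (needs a GPU with enough memory for the 14B weights):
# generator = build_generator()
# print(generator("Write a haiku about autumn.", max_new_tokens=64)[0]["generated_text"])
```

Note that FP8 quantization support depends on the serving stack; a runtime such as vLLM may be a better fit for the published FP8 checkpoint than a plain Transformers load.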