omrisap/LMMS_RSFT
TEXT GENERATION
- Concurrency Cost: 1
- Model Size: 7.6B
- Quant: FP8
- Ctx Length: 32k
- Published: Apr 10, 2026
- Architecture: Transformer

The omrisap/LMMS_RSFT model is a 7.6-billion-parameter language model with a 32,768-token context length. It is shared by omrisap and intended for general language generation tasks, but the available documentation does not describe its architecture, training objectives, or any distinguishing optimizations.
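The listed size (7.6B parameters) and FP8 quantization allow a rough estimate of the memory needed just to hold the weights. The sketch below is a back-of-envelope calculation, not a figure from the model's documentation; it counts weights only and ignores the KV cache, activations, and runtime overhead, all of which add to the real footprint.

```python
# Back-of-envelope weight-memory estimate for a 7.6B-parameter model.
# FP8 stores one byte per parameter; FP16/BF16 store two bytes each.
# Weights only -- KV cache and activations are not included.

PARAMS = 7.6e9  # parameter count from the model listing


def weight_memory_gb(num_params: float, bytes_per_param: int) -> float:
    """Approximate weight memory in gigabytes (1 GB = 1e9 bytes)."""
    return num_params * bytes_per_param / 1e9


fp8_gb = weight_memory_gb(PARAMS, 1)   # FP8: ~7.6 GB
fp16_gb = weight_memory_gb(PARAMS, 2)  # FP16: ~15.2 GB
print(f"FP8 weights:  {fp8_gb:.1f} GB")
print(f"FP16 weights: {fp16_gb:.1f} GB")
```

At FP8 the weights alone are roughly half the FP16 footprint, which is the usual motivation for serving a model of this size in FP8.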
