MSey/tiny_CaLL_0002_r10_O1_f10_LT_c1022
Text Generation · Concurrency Cost: 1 · Model Size: 1.1B · Quant: BF16 · Ctx Length: 2k · Architecture: Transformer · Gated · Cold

MSey/tiny_CaLL_0002_r10_O1_f10_LT_c1022 is a 1.1-billion-parameter causal language model with a 2048-token context length. Developed by MSey, it is a compact, general-purpose model. No primary differentiator or intended use case is stated in the available information, which suggests it may be a foundational or experimental model meant for further fine-tuning or research.
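One practical consequence of the 2048-token context length is that longer inputs must be truncated or split into chunks before inference. The sketch below illustrates a simple overlapping-chunk strategy; whitespace splitting stands in for the model's real tokenizer (an assumption for illustration, since actual token counts will differ), and the `overlap` value is an arbitrary example, not a documented parameter of this model.

```python
# Illustrates chunking an input to fit the model's 2048-token context window.
# Whitespace splitting is a stand-in for a real tokenizer (illustration only).

CTX_LENGTH = 2048  # context window stated on the model card

def chunk_tokens(tokens, ctx_length=CTX_LENGTH, overlap=128):
    """Split a token list into context-sized chunks with a small overlap,
    so each chunk carries some trailing context from the previous one."""
    if ctx_length <= overlap:
        raise ValueError("ctx_length must exceed overlap")
    step = ctx_length - overlap
    return [tokens[i:i + ctx_length] for i in range(0, len(tokens), step)]

text = "lorem " * 5000            # 5000 pseudo-tokens, well over the window
tokens = text.split()
chunks = chunk_tokens(tokens)
print(len(chunks), len(chunks[0]))  # number of chunks, size of the first
```

Each chunk can then be fed to the model independently; the overlap preserves local context across chunk boundaries at the cost of some duplicated computation.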
