matboz/model_of_encoded-reasoning_2
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 14B · Quant: FP8 · Ctx Length: 32k · Published: Feb 1, 2026 · Architecture: Transformer
matboz/model_of_encoded-reasoning_2 is a 14 billion parameter language model developed by matboz, with a 32,768-token (32k) context length. The model is designed to explore encoded reasoning; its specific differentiators, primary use cases, architecture, and training details are not documented, suggesting it may be a foundational or experimental model.