yongchao98/R1-Code-Interpreter-3B
Text Generation · Concurrency Cost: 1 · Model Size: 3.1B · Quant: BF16 · Ctx Length: 32k · Published: May 3, 2025 · License: MIT · Architecture: Transformer · Open Weights

R1-Code-Interpreter-3B by yongchao98 is a 3.1-billion-parameter Transformer model with a 32,768-token (32k) context length. It is designed for code interpretation tasks: understanding code-related queries and generating and reasoning about code.
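A minimal usage sketch, assuming the model loads through the standard Hugging Face `transformers` API. The model ID, BF16 quantization, and 32k context length come from the card above; the prompt and generation settings are illustrative assumptions, not documented defaults.

```python
# Hypothetical usage sketch for R1-Code-Interpreter-3B via Hugging Face
# transformers. Model ID and context length are from the model card;
# everything else (prompt, max_new_tokens) is an illustrative assumption.
MODEL_ID = "yongchao98/R1-Code-Interpreter-3B"
CTX_LEN = 32768  # 32k-token context window


def fits_in_context(n_prompt_tokens: int, n_new_tokens: int,
                    ctx_len: int = CTX_LEN) -> bool:
    """Check that the prompt plus the requested completion fit in the window."""
    return n_prompt_tokens + n_new_tokens <= ctx_len


if __name__ == "__main__":
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 weights, matching the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    )

    prompt = "Write a Python function that reverses a string."
    inputs = tokenizer(prompt, return_tensors="pt")
    assert fits_in_context(inputs["input_ids"].shape[1], 256)
    output = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The context-length check is a simple guard worth keeping in any wrapper around a fixed-window model: prompts that overflow the 32k window are truncated or rejected by the runtime rather than silently handled.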
