yongchao98/R1-Code-Interpreter-7B
Task: text generation | Concurrency cost: 1 | Model size: 7.6B | Quantization: FP8 | Context length: 32k | Published: May 3, 2025 | License: MIT | Architecture: Transformer | Open weights
yongchao98/R1-Code-Interpreter-7B is a 7.6-billion-parameter language model with a 131,072-token context length. It is designed for code-interpretation tasks, using its large context window to read and reason over sizable codebases in a single pass. Its main strengths are handling complex program logic and producing detailed code analysis.
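A minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub under this repo id and follows the standard causal-LM layout; the exact prompt format and generation settings are illustrative, so check the repository's own instructions before relying on them.

```python
# Hypothetical usage sketch: loading the model with Hugging Face transformers.
# Requires a machine with enough memory for a 7.6B-parameter checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yongchao98/R1-Code-Interpreter-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the dtype stored in the checkpoint
    device_map="auto",    # place layers on available GPU(s)/CPU
)

# Example prompt: ask the model to analyze a small code snippet.
prompt = "Explain what the following Python snippet does:\n\nprint(sum(range(10)))"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Because the model targets code interpretation, prompts that include the code to be analyzed inline, as above, are a reasonable starting point.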