microsoft/NextCoder-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Context Length: 32K · Published: May 3, 2025 · License: MIT · Architecture: Transformer · Open Weights

microsoft/NextCoder-7B is a 7.61 billion parameter causal language model developed by Microsoft, based on the Qwen2.5-Coder Instruct variants. It is fine-tuned with the Selective Knowledge Transfer (SeleKT) methodology for robust code editing, demonstrating significant improvements in code-editing performance while preserving generalization. The model supports a context length of up to 32K tokens.
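As a rough illustration, the model can be loaded with the Hugging Face `transformers` library like any Qwen2.5-Coder-based chat model. This is a minimal sketch, not an official usage snippet: the chat-template flow and the `build_edit_messages` helper are assumptions based on the standard `transformers` API, and the sample prompt is hypothetical.

```python
MODEL_ID = "microsoft/NextCoder-7B"  # model name from this card


def build_edit_messages(code: str, instruction: str) -> list:
    """Wrap a code snippet and an edit instruction into a chat message list
    (hypothetical prompt shape for a code-editing request)."""
    return [
        {"role": "user", "content": f"{instruction}\n```python\n{code}\n```"}
    ]


def main() -> None:
    # transformers imported here so the prompt helper above has no heavy deps
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )

    messages = build_edit_messages(
        "def add(a, b):\n    return a - b",  # deliberate bug for the model to fix
        "Fix the bug in this function.",
    )
    text = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer([text], return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=512)
    # Decode only the newly generated tokens (the model's proposed edit)
    print(tokenizer.decode(out[0][inputs.input_ids.shape[1]:], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

Within the 32K-token context window, larger files or multi-file edits can be passed in the same message format.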
