nuojohnchen/XtraGPT-7B-SFTed-w_o-Context
Task: Text generation
Concurrency cost: 1
Model size: 7.6B parameters
Quantization: FP8
Context length: 32k
Published: Feb 22, 2026
License: apache-2.0
Architecture: Transformer (open weights)

nuojohnchen/XtraGPT-7B-SFTed-w_o-Context (the "w_o" marking "without context") is a 7.6-billion-parameter ablation variant of XtraGPT-7B, fine-tuned without full-paper context during training. Developed by nuojohnchen, this model demonstrates the importance of context-aware training for academic paper revision tasks. Its primary use case is to serve as a comparative baseline: it shows how the absence of comprehensive contextual information limits a model's ability to synthesize specific, data-driven revisions, in contrast to its full-context counterpart.
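To run the baseline locally, a minimal sketch using the standard Hugging Face `transformers` causal-LM loading path is shown below. This assumes the checkpoint is compatible with `AutoModelForCausalLM`/`AutoTokenizer` (the usual case for Transformer open-weight releases); `device_map="auto"` additionally requires the `accelerate` package.

```python
MODEL_ID = "nuojohnchen/XtraGPT-7B-SFTed-w_o-Context"


def load_model(model_id: str = MODEL_ID):
    """Load the tokenizer and model for the ablation baseline."""
    # Lazy import so this sketch can be read/imported without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # torch_dtype="auto" lets transformers pick the dtype stored in the
    # checkpoint; the listing above reports an FP8 quantization.
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",
        device_map="auto",  # requires `accelerate`; remove for single-device CPU use
    )
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_model()
    print(model.config.model_type)
```

Since this variant was trained without full-paper context, prompts should contain only the text span to revise, mirroring the ablation setup, rather than the surrounding paper.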
