xiaolesu/OsmosisProofling-v2-SFT
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Mar 26, 2026 · License: apache-2.0 · Architecture: Transformer
xiaolesu/OsmosisProofling-v2-SFT is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained with the Axolotl framework on the xiaolesu/OsmosisProofling-v2-SFT dataset, reaching a validation loss of 0.3550 and a perplexity of 1.4261. The model is optimized for tasks drawn from its fine-tuning dataset and shows improved performance on the evaluation set relative to its base model.
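The reported perplexity follows directly from the validation loss, since perplexity is defined as exp(cross-entropy loss). A quick sanity check of the two figures above:

```python
import math

# Perplexity is the exponential of the mean cross-entropy loss.
validation_loss = 0.3550
perplexity = math.exp(validation_loss)

# Consistent with the reported perplexity of 1.4261.
print(f"{perplexity:.4f}")
```

This confirms the two reported metrics are internally consistent rather than independently measured.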