izaanz/Thesis_RTX5090_SFT_Merged
Text Generation · Open Weights · Cold start
- Model size: 7.6B
- Quantization: FP8
- Context length: 32k
- Concurrency cost: 1
- License: apache-2.0
- Architecture: Transformer

izaanz/Thesis_RTX5090_SFT_Merged is a 7.6-billion-parameter fine-tune of Qwen2.5-Coder-7B-Instruct, published by izaanz. The model was trained with Unsloth and Hugging Face's TRL library, which the author reports gave roughly 2x faster training. With a 131,072-token context length, it is suited to tasks that require extensive context and efficient processing.
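Since the merged checkpoint is a standard Qwen2.5-based causal language model, it can be queried with Hugging Face Transformers. The sketch below is a minimal example, not documented behaviour of this fine-tune: the system prompt, generation settings, and chat-role layout are assumptions, and only the model ID comes from the card above.

```python
# Minimal sketch: querying the merged SFT checkpoint with Transformers.
# MODEL_ID is from the model card; everything else here is an assumption.

MODEL_ID = "izaanz/Thesis_RTX5090_SFT_Merged"

def build_messages(user_prompt: str) -> list[dict]:
    """Chat-style message list in the format Qwen2.5 chat templates expect."""
    return [
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": user_prompt},
    ]

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    # Imported lazily so the prompt helper above works without a GPU setup.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    text = tokenizer.apply_chat_template(
        build_messages(prompt), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(text, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

A call such as `generate("Write a Python function that reverses a string.")` would then return the model's completion; on an FP8 build, serving through an engine like vLLM may be preferable to raw Transformers.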
