Sarim-Hash/Qwen3-14B-sandbagging
Text Generation · Concurrency Cost: 1 · Model Size: 14B · Quant: FP8 · Ctx Length: 32k · Published: Mar 16, 2026 · License: other · Architecture: Transformer · Cold

Sarim-Hash/Qwen3-14B-sandbagging is a 14-billion-parameter language model fine-tuned from the Qwen3-14B base model. It was trained on the 'df_final' dataset for 8 epochs with a learning rate of 1e-05 and a context length of 32768 tokens. Further details on its specific capabilities and intended uses have not yet been provided.
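The stated hyperparameters can be collected into a single configuration, sketched below. The hyperparameter values (epochs, learning rate, context length) and the dataset name 'df_final' come from the card; the base-model identifier and key names follow common fine-tuning conventions and are assumptions, not a published training script.

```python
# Hypothetical fine-tuning configuration reconstructed from the card's
# stated hyperparameters. Key names mirror common trainer arguments and
# are assumptions; only the values are taken from the card itself.
finetune_config = {
    "base_model": "Qwen/Qwen3-14B",   # assumed upstream identifier
    "dataset": "df_final",            # as named on the card
    "num_train_epochs": 8,
    "learning_rate": 1e-05,
    "max_seq_length": 32768,          # matches the 32k context length
}

# A minimal loading sketch (requires the `transformers` library and
# enough memory for a 14B model; quantized FP8 weights per the card):
#
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("Sarim-Hash/Qwen3-14B-sandbagging")
# model = AutoModelForCausalLM.from_pretrained("Sarim-Hash/Qwen3-14B-sandbagging")
```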
