laion/exp-gfi-staqc-embedding-mean-filtered-10K_glm_4_7_traces_jupiter
Text Generation · Open Weights

- Concurrency Cost: 1
- Model Size: 8B
- Quantization: FP8
- Context Length: 32k
- Published: Feb 25, 2026
- License: apache-2.0
- Architecture: Transformer
The laion/exp-gfi-staqc-embedding-mean-filtered-10K_glm_4_7_traces_jupiter model is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the DCAgent/exp-gfi-staqc-embedding-mean-filtered-10K_glm_4.7_traces_jupiter dataset (the thinking-preprocessed snapshot, dda938e1f98c05e0ee98ba25bc1886308fb15528). The model is adapted to the domain of its fine-tuning data and is intended for specialized use within that domain rather than as a general-purpose model.
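As a rough guide to deployment requirements, the weight memory of an 8B model can be estimated from the parameter count and the bytes per parameter implied by the quantization. The sketch below is a back-of-the-envelope estimate only: it covers weights alone (no KV cache, activations, or runtime overhead), and the helper name is ours, not part of any library.

```python
def estimate_weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Estimate model weight memory in GB (decimal) from parameter count
    and bytes per parameter. Ignores KV cache and runtime overhead."""
    return num_params * bytes_per_param / 1e9

# 8B parameters at FP8 (1 byte/param) vs. FP16/BF16 (2 bytes/param)
fp8_gb = estimate_weight_memory_gb(8e9, 1.0)   # ~8 GB for weights
fp16_gb = estimate_weight_memory_gb(8e9, 2.0)  # ~16 GB for weights
print(fp8_gb, fp16_gb)
```

FP8 quantization roughly halves the weight footprint relative to FP16, which is why an 8B FP8 checkpoint fits comfortably on a single 24 GB GPU with room left for the 32k-token KV cache.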