DCAgent/a1-stack_ruby
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 23, 2026 · License: other · Architecture: Transformer
DCAgent/a1-stack_ruby is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the /e/scratch/jureap59/raoof1/sft_data/hf_hub/datasets--DCAgent--exp_rpt_stack-ruby_glm_4.7_traces_jupiter/snapshots/d9c7b312cdd4cf9b9b400a96791c86be8462eb00_thinking_preprocessed dataset, suggesting a specialization in Ruby-related tasks and code generation. Its 32,768-token context length lets it process the longer sequences typical of its fine-tuning domain.
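As a minimal sketch of how a client might prepare a request for this model, the snippet below builds a chat-style prompt for a Ruby coding task and does a rough check that the prompt plus generation budget fits the 32k context. The message format, the `build_messages`/`fits_context` helpers, and the chars-per-token heuristic are assumptions for illustration, not part of the model card; a real client would use the model's own tokenizer and serving API.

```python
# Hedged sketch: preparing a Ruby code-generation request for
# DCAgent/a1-stack_ruby. Helper names and the chat message format
# are assumptions; only the model ID and context length come from the card.

MODEL_ID = "DCAgent/a1-stack_ruby"
CTX_LIMIT = 32_768       # context length stated on the card
CHARS_PER_TOKEN = 4      # rough heuristic (assumption), not the real tokenizer

def build_messages(task: str) -> list[dict]:
    """Build an OpenAI-style chat message list for a Ruby coding task."""
    return [
        {"role": "system", "content": "You are a Ruby programming assistant."},
        {"role": "user", "content": task},
    ]

def fits_context(messages: list[dict], max_new_tokens: int = 1024) -> bool:
    """Estimate whether prompt + generation budget fit the 32k context.

    Uses a chars-per-token estimate; a production client would count
    tokens with the model's actual tokenizer instead.
    """
    est_prompt_tokens = sum(len(m["content"]) for m in messages) // CHARS_PER_TOKEN
    return est_prompt_tokens + max_new_tokens <= CTX_LIMIT

msgs = build_messages("Write a Ruby method that reverses each word in a string.")
print(fits_context(msgs))  # → True: a short prompt fits comfortably

# With transformers installed, a call might then look roughly like:
#   pipe = transformers.pipeline("text-generation", model=MODEL_ID)
#   out = pipe(msgs, max_new_tokens=1024)
```

Budget-checking before submission matters more here than for chat use: SFT traces for code tasks can be long, and an overflowing prompt is silently truncated by many serving stacks.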