emmanuelaboah01/qiu-v8-qwen3-8b-v4-continued-merged
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 16, 2026 · Architecture: Transformer · Cold

The emmanuelaboah01/qiu-v8-qwen3-8b-v4-continued-merged model is an 8-billion-parameter language model with a 32,768-token (32k) context window, served with FP8 quantization. It is a continuation of the Qwen3-8B-v4 series, which indicates further training or refinement on top of that base. Its primary use case and specific differentiators are not detailed in the available information, so it may be a general-purpose LLM or a base model intended for further fine-tuning.
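Since the model is listed for text generation on a hosted endpoint, a typical way to call it would be through an OpenAI-compatible chat-completions API. The sketch below only builds the request payload; the endpoint URL, API key, and default parameters are assumptions, not details from this listing, so substitute your provider's actual values.

```python
import json

# Model ID as listed; the context limit comes from the card's "Ctx Length: 32k".
MODEL_ID = "emmanuelaboah01/qiu-v8-qwen3-8b-v4-continued-merged"
MAX_CONTEXT = 32768

# Placeholder endpoint -- replace with your provider's real base URL and key.
BASE_URL = "https://api.example.com/v1/chat/completions"  # hypothetical
API_KEY = "YOUR_API_KEY"  # hypothetical


def build_chat_request(messages, max_tokens=512, temperature=0.7):
    """Build a JSON payload for an OpenAI-compatible chat-completions call."""
    if max_tokens > MAX_CONTEXT:
        raise ValueError(f"max_tokens exceeds the {MAX_CONTEXT}-token context window")
    return {
        "model": MODEL_ID,
        "messages": messages,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }


payload = build_chat_request(
    [{"role": "user", "content": "Summarize FP8 quantization in one sentence."}]
)
# The payload would be POSTed to BASE_URL with an Authorization: Bearer header.
print(json.dumps(payload, indent=2))
```

The guard on `max_tokens` is a simple sanity check against the advertised 32k context; a real client would also account for the prompt's token count when budgeting output length.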
