taqatechno/hr-llm-gcc
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Apr 4, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

taqatechno/hr-llm-gcc is a 7-billion-parameter Mistral-based causal language model developed by taqatechno. It was fine-tuned with Unsloth and Hugging Face's TRL library, a combination the authors report yields roughly 2x faster training. The model is tuned for a specific application domain and serves requests within a 4096-token context window.
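As a causal language model, it can be loaded with standard tooling. The sketch below is a hypothetical usage example, not an official quickstart: it assumes the weights are hosted under the id `taqatechno/hr-llm-gcc` on the Hugging Face Hub, that the `transformers` library is installed, and that the model follows the common Mistral `[INST] ... [/INST]` prompt convention (verify against the model's actual chat template before relying on it).

```python
"""Hypothetical usage sketch for taqatechno/hr-llm-gcc.

Assumptions (not confirmed by the model card):
- the weights are downloadable via the Hugging Face Hub under this id;
- the model uses a Mistral-style instruction prompt format.
"""
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "taqatechno/hr-llm-gcc"


def build_prompt(instruction: str) -> str:
    # Mistral-style instruction wrapper; check the tokenizer's chat
    # template, as the fine-tune may expect a different format.
    return f"[INST] {instruction} [/INST]"


def generate_answer(instruction: str, max_new_tokens: int = 256) -> str:
    """Download the model (once) and generate a completion.

    Keep prompt length + max_new_tokens well inside the model's
    4096-token context window.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Calling `generate_answer("Summarize the annual-leave policy.")` would then run a full download-and-generate cycle; the prompt builder is kept separate so the formatting can be tested without fetching the 7B weights.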
