macstenk/Qwen3-8B-hd-knowledge

Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 14, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

macstenk/Qwen3-8B-hd-knowledge is an 8-billion-parameter, Qwen3-based language model fine-tuned by macstenk, with a 32,768-token context length. The model specializes in Human Design knowledge, trained on 6,024 Q&A pairs from Jovian Archive materials covering Gates, Channels, Profiles, BG5, and Rave I'Ching, and it answers detailed Human Design questions in both German and English.


Model Overview

macstenk/Qwen3-8B-hd-knowledge is an 8-billion-parameter language model built on the Qwen3-8B architecture and fine-tuned specifically for the domain of Human Design knowledge. Its 32,768-token context window lets it follow long exchanges and generate detailed, well-grounded responses.

Key Capabilities

  • Specialized Human Design Knowledge: The model has been extensively trained on 6024 Q&A pairs derived from authoritative Jovian Archive materials, encompassing core Human Design concepts such as Gates, Channels, Profiles, BG5, and Rave I'Ching.
  • Multilingual Support: It is proficient in both German and English, with training data split equally between the two languages, making it suitable for a diverse user base.
  • Fine-tuned Performance: Trained with LoRA (rank 64, alpha 128) on Together.ai, the model is tuned to give accurate, relevant answers within its specialized domain.
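The card states the fine-tune used LoRA with rank 64 and alpha 128 on Together.ai. A roughly equivalent configuration with the `peft` library might look like the sketch below; note that `target_modules` and `lora_dropout` are assumptions (the usual projection layers for Qwen-style transformers), since the actual training settings beyond rank and alpha are not published:

```python
from peft import LoraConfig

# Sketch of a LoRA setup matching the stated hyperparameters (r=64, alpha=128).
# target_modules is an assumption: the standard attention and MLP projection
# layers for Qwen-style models. lora_dropout is likewise an assumption.
lora_config = LoraConfig(
    r=64,
    lora_alpha=128,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    lora_dropout=0.05,   # assumption; not stated on the card
    task_type="CAUSAL_LM",
)
```

With alpha set to twice the rank, the effective LoRA scaling factor (alpha / r) is 2, a common choice for knowledge-injection fine-tunes.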

Use Cases

  • Human Design Consultation: Ideal for applications requiring in-depth answers and explanations about Human Design principles.
  • Educational Tools: Can serve as a knowledge base for learning and studying various aspects of Human Design.
  • Multilingual Information Retrieval: Particularly useful for users seeking Human Design insights in both German and English.