batuhanozkose/Rehber-Science-01

Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · License: apache-2.0 · Architecture: Transformer · Open weights

Rehber-Science-01 is an 8 billion parameter Qwen3-based model developed by batuhanozkose, specifically fine-tuned for Turkish academic question-answering tasks. It excels in scientific domains like Physics, Chemistry, Biology, and Mathematics, utilizing a Chain-of-Thought approach for detailed problem-solving. The model was trained using Full Fine-Tuning on the Rehber-CoT-Science-v1 dataset, making it highly specialized for generating comprehensive Turkish scientific explanations.


Rehber-Science-01: Specialized Turkish Academic Q&A

Rehber-Science-01 is an 8 billion parameter model developed by batuhanozkose, built upon the Qwen3-8B base model. It has undergone Full Fine-Tuning using the Nebius Token Factory platform, specifically optimized for Turkish academic question-answering.

Key Capabilities & Features

  • Academic Expertise: Specialized in scientific fields including Physics, Chemistry, Biology, and Mathematics.
  • Chain-of-Thought (CoT): Trained on a dataset of 500 academic question-answer pairs with CoT solutions, enabling detailed, step-by-step reasoning.
  • Language: Exclusively focused on the Turkish language.
  • Efficient Training: Achieved a final loss of ~0.05 after 5 epochs, with a training duration of approximately 32 minutes and a cost of $1.02.
  • Context Length: Trained with a 16,384-token context length.

Ideal Use Cases

  • Turkish Scientific Education: Generating detailed explanations and solutions for academic questions in Turkish.
  • Research Assistance: Aiding Turkish-speaking researchers with scientific inquiries.
  • Content Creation: Developing educational content or study guides in Turkish for science subjects.
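Since the model is Qwen3-based, prompts follow the ChatML-style turn format used by the Qwen family. A minimal sketch of composing a Turkish academic question in that format; the system prompt and example question are illustrative, not taken from the model card:

```python
# Sketch: rendering chat messages in ChatML form (the turn format used by
# Qwen-family models), ending with an open assistant turn for the model
# to complete. The system prompt below is an assumed example.

def build_prompt(messages):
    """Join chat messages into a single ChatML-style prompt string."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    parts.append("<|im_start|>assistant\n")  # model continues from here
    return "\n".join(parts)

messages = [
    {"role": "system",
     "content": "Sen Türkçe akademik sorulara adım adım çözüm üreten bir asistansın."},
    {"role": "user",
     "content": "Bir cisim 20 m/s hızla düşey yukarı atılıyor. "
                "Maksimum yüksekliği nedir? (g = 10 m/s²)"},
]

prompt = build_prompt(messages)
print(prompt)
```

In practice you would not format the template by hand: pass `messages` to `tokenizer.apply_chat_template(...)` from `transformers` after loading the tokenizer for `batuhanozkose/Rehber-Science-01`, and let the library apply the model's own template.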