VecGlypher/VecGlypher-27b-it

VISION | Concurrency Cost: 2 | Model Size: 27B | Quant: FP8 | Ctx Length: 32k | Published: Feb 25, 2026 | License: other | Architecture: Transformer

VecGlypher-27b-it is a 27-billion-parameter instruction-tuned language model developed by VecGlypher for general-purpose natural language understanding and generation. Its 32,768-token context length supports long conversational and text-processing workloads. The model was trained with a learning rate of 1e-05 and a total batch size of 128 over 3 epochs on a multi-GPU distributed setup.


VecGlypher-27b-it Overview

VecGlypher-27b-it is a 27-billion-parameter instruction-tuned model developed by VecGlypher, built on the VecGlypher architecture described in its associated GitHub repository and designed for a broad range of natural language processing tasks. Its 32,768-token context length lets it process and generate long text sequences.

Key Training Details

The model was trained with a learning rate of 1e-05, a total batch size of 128 (train_batch_size of 2 × gradient_accumulation_steps of 2 across 32 GPUs), and a cosine learning rate scheduler with a warmup ratio of 0.01. Training ran for 3 epochs using the adamw_torch optimizer.
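As a rough illustration of these hyperparameters, the sketch below computes the effective batch size and a linear-warmup-plus-cosine-decay learning-rate curve. Only the base learning rate, warmup ratio, schedule type, and batch-size factors come from the card; the total step count and the exact schedule shape are assumptions for demonstration.

```python
import math

# From the reported hyperparameters; the schedule shape below is a common
# interpretation (linear warmup, then cosine decay to zero), not confirmed.
BASE_LR = 1e-05
WARMUP_RATIO = 0.01
EFFECTIVE_BATCH = 2 * 2 * 32  # train_batch_size x grad_accum x GPUs = 128

def cosine_lr(step: int, total_steps: int) -> float:
    """LR at a given optimizer step: warmup for the first 1% of steps,
    then cosine decay from BASE_LR toward zero."""
    warmup_steps = int(total_steps * WARMUP_RATIO)
    if step < warmup_steps:
        return BASE_LR * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return BASE_LR * 0.5 * (1.0 + math.cos(math.pi * progress))

total = 10_000  # hypothetical total optimizer steps
print(EFFECTIVE_BATCH)          # 128
print(cosine_lr(0, total))      # 0.0 (start of warmup)
print(cosine_lr(100, total))    # 1e-05 (warmup complete, peak LR)
```

The peak is reached after 1% of steps, matching the reported 0.01 warmup ratio.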

Good for

  • Applications requiring a large (32k-token) context window, such as long-document understanding.
  • General-purpose instruction-following tasks.
  • Developing conversational AI and text generation systems.
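For the use cases above, a minimal loading sketch with the Hugging Face `transformers` library might look like the following. The repo id `VecGlypher/VecGlypher-27b-it` and the chat-message format are assumptions based on the model name and common instruction-tuned conventions, not details confirmed by this card.

```python
# Hypothetical usage sketch; repo id and chat format are assumptions.
def build_messages(user_prompt: str) -> list:
    """Chat-style message list commonly expected by instruction-tuned models."""
    return [{"role": "user", "content": user_prompt}]

def generate(user_prompt: str, max_new_tokens: int = 256) -> str:
    # Imported lazily so the helper above can be used without transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "VecGlypher/VecGlypher-27b-it"  # assumed repository id
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

    inputs = tokenizer.apply_chat_template(
        build_messages(user_prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Summarize the benefits of a 32k context window."))
```

Note that an FP8-quantized 27B model still requires substantial GPU memory; `device_map="auto"` lets `transformers` shard it across available devices.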