Laoyujie/merged-qwen-ties

Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Context Length: 32k · Published: Apr 22, 2026 · Architecture: Transformer

The Laoyujie/merged-qwen-ties model is a 7.6 billion parameter language model created by Laoyujie using the TIES merging method. It combines specialized models for mathematical and coding tasks into a single versatile model with a 32,768-token context length, aiming to inherit the strengths of its constituent models for applications that require both strong logical reasoning and code generation.


Model Overview

Laoyujie/merged-qwen-ties is a 7.6 billion parameter language model developed by Laoyujie. It was created with the TIES merging method (TrIm, Elect Sign & Merge), which combines multiple fine-tuned checkpoints of a shared base model into a single, more capable model. This specific merge started from a base model and integrated specialized components for mathematics and code generation, aiming to enhance performance across both domains.
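As a rough illustration of the TIES procedure mentioned above, the sketch below applies its three steps (trim, elect sign, disjoint merge) to a single parameter tensor with NumPy. The function name and the `density` and `lam` hyperparameters are illustrative assumptions, not the recipe actually used for this merge.

```python
import numpy as np

def ties_merge(base, finetuned, density=0.2, lam=1.0):
    """Sketch of TIES merging for one parameter tensor.

    base      -- weight tensor of the shared base model
    finetuned -- list of same-shaped weight tensors from task-specific fine-tunes
    density   -- fraction of task-vector entries kept after trimming
    lam       -- scaling applied to the merged task vector
    """
    # Task vectors: difference between each fine-tune and the base.
    deltas = [ft - base for ft in finetuned]

    # 1) Trim: keep only the top-`density` fraction of entries by magnitude.
    trimmed = []
    for d in deltas:
        k = max(1, int(density * d.size))
        thresh = np.sort(np.abs(d).ravel())[-k]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))

    # 2) Elect sign: per entry, pick the sign with the larger total mass.
    stacked = np.stack(trimmed)
    sign = np.sign(stacked.sum(axis=0))

    # 3) Disjoint merge: average only the entries that agree with the elected sign.
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged_delta = (stacked * agree).sum(axis=0) / counts

    return base + lam * merged_delta
```

Trimming and sign election are what let TIES resolve interference between fine-tunes: entries where the math and code deltas pull in opposite directions are not simply averaged toward zero, as they would be in a plain weight average.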

Key Capabilities

  • Enhanced Reasoning: Benefits from the integration of a dedicated mathematical model, suggesting improved performance on numerical and logical tasks.
  • Code Generation: Incorporates a specialized code model, indicating proficiency in understanding and generating programming code.
  • Efficient Merging: Leverages the TIES method for combining models, which is designed to efficiently integrate diverse capabilities.
  • Large Context Window: Supports a substantial context length of 32768 tokens, allowing for processing longer inputs and maintaining conversational coherence over extended interactions.
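The card does not name the constituent models, but TIES merges like this one are commonly produced with the mergekit tool. The configuration below is a hypothetical sketch of what such a recipe could look like for Qwen-family math and coder fine-tunes; the model names, densities, and weights are assumptions, not the author's actual recipe.

```yaml
# Hypothetical mergekit recipe for a TIES merge (not the card's actual config).
models:
  - model: Qwen/Qwen2.5-7B           # assumed base-family checkpoint
  - model: Qwen/Qwen2.5-Math-7B      # assumed math fine-tune
    parameters:
      density: 0.5
      weight: 0.5
  - model: Qwen/Qwen2.5-Coder-7B     # assumed code fine-tune
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: Qwen/Qwen2.5-7B
dtype: bfloat16
```

Here `density` controls how aggressively each task vector is trimmed before sign election, and `weight` scales each model's contribution to the merged delta.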

Good For

  • Applications requiring a balance of mathematical problem-solving and code-related tasks.
  • Scenarios where a large context window is beneficial for complex queries or multi-turn conversations.
  • Developers looking for a merged model that combines specific domain expertise from its constituent parts.