SQL1024/LinYi-Full-Model
Text Generation | Concurrency Cost: 4 | Model Size: 70B | Quant: FP8 | Ctx Length: 8k | Published: Apr 2, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights

SQL1024/LinYi-Full-Model is a 70 billion parameter Llama-based instruction-tuned language model developed by SQL1024. It was fine-tuned using Unsloth together with Hugging Face's TRL library, enabling faster training. The model targets general language understanding and generation tasks, and its 8192-token context window supports long input and output sequences.


Overview

SQL1024/LinYi-Full-Model is a 70 billion parameter instruction-tuned language model developed by SQL1024. It is based on the Llama architecture and fine-tuned from unsloth/Llama-3.3-70B-Instruct. Training emphasized efficiency: the Unsloth library was used in conjunction with Hugging Face's TRL library, which is reported to make fine-tuning roughly 2x faster.
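As an instruction-tuned Llama derivative, the model can be used through the standard `transformers` text-generation pipeline with chat-style messages. The sketch below assumes the weights are published on the Hugging Face Hub under the id `SQL1024/LinYi-Full-Model` (not confirmed by this card); the inference function is defined but not invoked, since loading a 70B model requires tens of gigabytes of GPU memory.

```python
def build_chat(system: str, user: str) -> list[dict]:
    """Build a Llama-3-style chat message list for the tokenizer's chat template."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]


def run_generation(prompt: str):
    """Illustrative inference sketch; not executed here because it downloads
    the full 70B checkpoint. The model id is an assumption."""
    from transformers import pipeline  # heavy dependency, imported lazily

    pipe = pipeline(
        "text-generation",
        model="SQL1024/LinYi-Full-Model",  # assumed Hub id
        device_map="auto",
    )
    messages = build_chat("You are a helpful assistant.", prompt)
    # The pipeline applies the model's chat template to the message list.
    return pipe(messages, max_new_tokens=256)[0]["generated_text"]
```

The chat-message format mirrors the Llama 3.3 Instruct convention the base model was trained with, so the tokenizer's built-in chat template can be applied directly.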

Key Characteristics

  • Parameter Count: 70 billion, providing strong capabilities for complex language tasks.
  • Base Model: Fine-tuned from unsloth/Llama-3.3-70B-Instruct, inheriting its foundational strengths.
  • Training Efficiency: Leverages Unsloth for accelerated fine-tuning, indicating an optimized development approach.
  • Context Length: Supports an 8192-token context window, suitable for processing and generating longer texts.
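The 8192-token context window is a hard budget shared between the prompt and the completion. A minimal helper (illustrative only, not part of the model card) makes the arithmetic explicit:

```python
CTX_LEN = 8192  # context window of SQL1024/LinYi-Full-Model


def max_new_tokens(prompt_tokens: int, requested: int, ctx_len: int = CTX_LEN) -> int:
    """Clamp a requested completion length so prompt + completion fits the context."""
    remaining = max(ctx_len - prompt_tokens, 0)
    return min(requested, remaining)


# A 7000-token prompt leaves at most 1192 tokens for generation:
print(max_new_tokens(7000, 2048))  # -> 1192
```

Requests whose prompt already fills the window get a budget of zero, which signals that the input must be truncated or chunked before generation.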

Use Cases

This model is well-suited to a broad range of natural language processing applications, including, but not limited to, advanced text generation, instruction following, summarization, and question answering, benefiting from its large parameter count and optimized training.