laiking/GoLLIE-7B-safetensors

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 5, 2026 · License: llama2 · Architecture: Transformer · Open Weights · Cold

GoLLIE-7B-safetensors is a 7-billion-parameter language model, a safetensors-format duplicate of HiTZ/GoLLIE-7B. GoLLIE (Guideline-following Large Language model for Information Extraction) is instruction-tuned to perform zero-shot information extraction by following annotation guidelines supplied in the prompt. Its 4096-token context length leaves room for both the guideline definitions and the text to annotate.


GoLLIE-7B-safetensors Overview

GoLLIE-7B-safetensors is a 7-billion-parameter language model distributed in the .safetensors format. It is a direct duplicate of the original HiTZ/GoLLIE-7B model; the only difference is the file format, which improves loading speed and security. The architecture, weights, and capabilities are identical to the original.

Key Characteristics

  • Parameter Count: 7 billion parameters, inherited from the Code-LLaMA base model that the original GoLLIE-7B was fine-tuned from.
  • Context Length: Supports a context window of 4096 tokens, suitable for processing and generating moderately long texts.
  • Format: Distributed in .safetensors format, which loads faster than pickle-based PyTorch .bin checkpoints and, unlike them, cannot execute arbitrary code at load time.
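The security and speed claims above follow from how simple the .safetensors layout is: an 8-byte little-endian header length, a JSON header mapping tensor names to dtype, shape, and byte offsets, then the raw tensor bytes. A minimal stdlib-only sketch of that round trip (the tensor name and values here are illustrative, not taken from this repository):

```python
# Minimal sketch of the .safetensors file layout. Loading parses JSON and
# slices bytes -- no code is executed, which is the security advantage
# over pickle-based checkpoints.
import json, struct

def save_safetensors(path, name, shape, raw_bytes):
    # Header: tensor name -> dtype, shape, and [start, end) byte offsets
    # into the data section that follows the header.
    header = {name: {"dtype": "F32", "shape": shape,
                     "data_offsets": [0, len(raw_bytes)]}}
    hjson = json.dumps(header).encode()
    with open(path, "wb") as f:
        f.write(struct.pack("<Q", len(hjson)))  # header size, u64 little-endian
        f.write(hjson)
        f.write(raw_bytes)

def load_safetensors(path):
    with open(path, "rb") as f:
        (hsize,) = struct.unpack("<Q", f.read(8))
        header = json.loads(f.read(hsize))
        data = f.read()
    # Offsets are relative to the start of the data section.
    return {n: data[m["data_offsets"][0]:m["data_offsets"][1]]
            for n, m in header.items()}

payload = struct.pack("<4f", 1.0, 2.0, 3.0, 4.0)  # four float32 values
save_safetensors("demo.safetensors", "embed.weight", [2, 2], payload)
tensors = load_safetensors("demo.safetensors")
assert tensors["embed.weight"] == payload
```

In practice the `safetensors` library handles this (and memory-maps the data section); the sketch only shows why a loader never needs to trust the file with code execution.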

Intended Use Cases

GoLLIE is specialized for schema-driven information extraction rather than open-ended chat. Typical applications include:

  • Named entity recognition (NER).
  • Relation extraction.
  • Event extraction and argument classification.
  • Zero-shot annotation against new label schemas, defined as guidelines in the prompt.
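GoLLIE consumes its task definition as code: annotation guidelines are written as Python dataclasses whose docstrings describe each label, and the model completes a `result` list of instances. A hypothetical sketch of building such a prompt (the label names, guideline text, and comment lines here are made up; see the original HiTZ/GoLLIE-7B card for the exact template):

```python
# Hypothetical GoLLIE-style prompt builder. The guidelines and template
# strings below are illustrative assumptions, not the official format.
GUIDELINES = '''
@dataclass
class Person:
    """A named individual mentioned in the text."""
    span: str

@dataclass
class Location:
    """A geographic place mentioned in the text."""
    span: str
'''

def build_prompt(text: str) -> str:
    # The model is expected to continue after "result =" with a Python
    # list of guideline-class instances found in the text.
    return (
        "# Task definition\n"
        f"{GUIDELINES}\n"
        "# Text to analyze\n"
        f'text = "{text}"\n'
        "# Annotations\n"
        "result ="
    )

prompt = build_prompt("Ada Lovelace was born in London.")
assert "class Person" in prompt
```

Because the schema lives in the prompt, swapping in new entity types requires only new dataclass definitions, not retraining.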

Users should refer to the original HiTZ/GoLLIE-7B model card for detailed information regarding its training, performance benchmarks, and specific use case recommendations, as this repository primarily provides a format conversion.