AbacusResearch/RasGulla1-7b

Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Mar 4, 2024 · License: MIT · Architecture: Transformer · Open weights

AbacusResearch/RasGulla1-7b is a 7 billion parameter language model, fine-tuned using LoRA weights. This model specializes in processing and understanding vendor street names, zip codes, cities, and vendor names. Its primary application is in tasks requiring precise extraction and interpretation of specific vendor-related location and identity data.


Model Overview

AbacusResearch/RasGulla1-7b is a 7 billion parameter language model that has been fine-tuned using Low-Rank Adaptation (LoRA) weights. This model is specifically designed for tasks involving the processing of structured vendor information.

Key Capabilities

  • Vendor Data Extraction: Optimized for identifying and extracting vendor street names, zip codes, cities, and vendor names.
  • Specialized Fine-tuning: The model's training focused exclusively on these specific data points, enhancing its accuracy for such tasks.

Good For

  • Applications requiring precise parsing of vendor addresses and names.
  • Data cleaning and standardization of vendor-related geographical and identity information.
  • Pipelines whose input consists primarily of vendor street names, zip codes, cities, and vendor names, where the goal is to process or validate that information.
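
The extraction workflow described above can be sketched as a prompt-and-parse step. The prompt template and the `field: value` output schema below are assumptions for illustration only; the model card documents no official prompt format.

```python
# Hypothetical sketch of vendor-field extraction with RasGulla1-7b.
# The prompt wording and the expected `field: value` response format
# are assumptions -- adapt them to whatever the model actually emits.

FIELDS = ("vendor_name", "street", "city", "zip_code")

def build_prompt(text: str) -> str:
    """Ask the model for one `field: value` line per vendor field."""
    field_list = ", ".join(FIELDS)
    return (
        f"Extract the following fields from the vendor record: {field_list}.\n"
        "Answer with one 'field: value' line per field.\n\n"
        f"Record: {text}\nAnswer:"
    )

def parse_response(response: str) -> dict:
    """Parse `field: value` lines into a dict, keeping known fields only."""
    result = {}
    for line in response.splitlines():
        if ":" not in line:
            continue
        key, _, value = line.partition(":")
        key = key.strip().lower()
        if key in FIELDS:
            result[key] = value.strip()
    return result

if __name__ == "__main__":
    # In a real pipeline the response would come from the model (e.g. via
    # a text-generation endpoint); a canned response stands in here.
    canned = (
        "vendor_name: Acme Supplies\n"
        "street: 12 Main St\n"
        "city: Springfield\n"
        "zip_code: 62704"
    )
    print(parse_response(canned))
```

Validating the parsed dict (e.g. checking that the zip code is five digits) is a natural next step for the data-cleaning use case mentioned above.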