Azzedde/llama3.1-8b-text2cypher

Hugging Face
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Mar 2, 2025 · License: MIT · Architecture: Transformer · Concurrency cost: 1 · Open weights

Azzedde/llama3.1-8b-text2cypher is an 8-billion-parameter large language model fine-tuned from Meta's Llama-3.1-8B-Instruct and optimized for generating Cypher queries from natural language input. Developed by Azzedde, it translates English questions into Cypher for Neo4j databases and leverages Unsloth for efficient fine-tuning and inference. It is designed for tasks such as database administration, knowledge graph construction, and query automation.


Model Overview

As summarized above, the model is fine-tuned from Meta's Llama-3.1-8B-Instruct with a focus on generating Cypher queries from natural language. It uses Unsloth for efficient fine-tuning and inference, making it a specialized tool for interacting with Neo4j graph databases.

Key Capabilities

  • Cypher Query Generation: Translates natural language questions into executable Cypher queries for Neo4j databases.
  • Optimized for Neo4j: Specifically trained on the Neo4j Text2Cypher dataset (2024v1) to understand graph database schemas and query patterns.
  • Efficient Inference: Benefits from Unsloth's optimizations for faster query generation.
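A minimal sketch of using the model for query generation with Hugging Face `transformers`. The system-prompt wording and schema string below are illustrative assumptions, not the exact template the model was fine-tuned with; the chat formatting itself is delegated to the tokenizer's built-in Llama 3.1 chat template.

```python
def build_messages(schema: str, question: str) -> list[dict]:
    """Assemble chat messages pairing a graph schema with a user question.

    NOTE: the system-prompt wording and schema format are assumptions;
    check the model card for the exact template used during fine-tuning.
    """
    return [
        {"role": "system",
         "content": "Translate the question into a Cypher query "
                    "for the following Neo4j schema:\n" + schema},
        {"role": "user", "content": question},
    ]


def generate_cypher(schema: str, question: str,
                    model_id: str = "Azzedde/llama3.1-8b-text2cypher") -> str:
    """Generate a Cypher query with the fine-tuned model."""
    # Imported here so build_messages() is usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    input_ids = tokenizer.apply_chat_template(
        build_messages(schema, question),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output_ids = model.generate(input_ids, max_new_tokens=256)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(output_ids[0][input_ids.shape[-1]:],
                            skip_special_tokens=True)
```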

Good For

  • Database Administration: Automating query creation for Neo4j databases.
  • Knowledge Graph Construction: Assisting in data retrieval and manipulation within knowledge graphs.
  • LLM-based Database Assistants: Integrating into applications that require natural language interaction with graph databases.
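A hypothetical sketch of wiring the model into a database assistant. The generation step is injected as a callable so any backend (local `transformers`, an inference API) can supply it; the Neo4j runner uses the official `neo4j` Python driver, with connection details as placeholders.

```python
from typing import Callable


def answer_question(
    question: str,
    schema: str,
    generate: Callable[[str, str], str],  # (schema, question) -> Cypher text
    run_query: Callable[[str], list],     # executes Cypher, returns records
) -> list:
    """Generate a Cypher query from natural language, then execute it."""
    cypher = generate(schema, question).strip()
    return run_query(cypher)


def make_neo4j_runner(uri: str, user: str, password: str) -> Callable[[str], list]:
    """Build a run_query callable backed by the official neo4j driver."""
    from neo4j import GraphDatabase  # imported lazily; requires the neo4j package

    driver = GraphDatabase.driver(uri, auth=(user, password))

    def run_query(cypher: str) -> list:
        records, _summary, _keys = driver.execute_query(cypher)
        return records

    return run_query
```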

Limitations and Recommendations

The model may generate incorrect or suboptimal queries for complex schemas, and it does not validate or optimize queries. Users should always verify generated queries before execution. For best performance, fine-tuning on domain-specific datasets is recommended.
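Since the model does not validate its output, one simple safeguard is to reject anything that looks like a write operation before execution. The sketch below is a keyword scan, not a full Cypher parser, and can be fooled by clause keywords inside string literals; it errs on the conservative side by also blocking `CALL`, since procedures may write.

```python
import re

# Cypher clauses that mutate the graph; DETACH DELETE is covered by DELETE.
# CALL is blocked too because invoked procedures may perform writes.
WRITE_CLAUSES = frozenset({"CREATE", "MERGE", "DELETE", "SET", "REMOVE", "DROP", "CALL"})


def is_read_only(cypher: str) -> bool:
    """Heuristic check that a generated query contains no write clauses."""
    tokens = re.findall(r"[A-Za-z]+", cypher.upper())
    return not any(tok in WRITE_CLAUSES for tok in tokens)
```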