AlfredPros/CodeLlama-7b-Instruct-Solidity

Hugging Face
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Sep 7, 2023 · License: llama2 · Architecture: Transformer · Open Weights

AlfredPros/CodeLlama-7b-Instruct-Solidity is a 7 billion parameter Code LLaMA - Instruct model, fine-tuned by AlfredPros using 4-bit QLoRA for generating Solidity smart contracts. It was trained on a dataset of 6,003 GPT-generated human instructions and Solidity code pairs. This model specializes in converting natural language instructions into functional Solidity smart contract code.


Overview

AlfredPros/CodeLlama-7b-Instruct-Solidity is a specialized 7 billion parameter Code LLaMA - Instruct model, developed by AlfredPros. It has been fine-tuned specifically for generating Solidity smart contracts from natural language instructions. The model leverages 4-bit QLoRA finetuning, a technique that allows for efficient training and deployment.

Key Capabilities

  • Solidity Smart Contract Generation: Translates human-readable instructions into Solidity code for smart contracts.
  • Instruction-Following: Designed to understand and execute specific programming tasks based on provided prompts.
  • Efficient Training: Utilizes 4-bit QLoRA finetuning, making it accessible for training on more modest hardware (e.g., a single NVIDIA GeForce GTX 1080Ti).
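To make the capabilities above concrete, here is a minimal inference sketch. The `[INST] … [/INST]` prompt template and the generation parameters are assumptions based on the Code Llama - Instruct family, not values published on this model card; loading in 4-bit NF4 mirrors the QLoRA setup described below.

```python
"""Minimal inference sketch for AlfredPros/CodeLlama-7b-Instruct-Solidity."""

def build_prompt(instruction: str) -> str:
    # Llama-2-style instruction wrapper (an assumption; check the model
    # card for the exact template the fine-tune was trained with).
    return f"[INST] {instruction.strip()} [/INST]"

def generate_solidity(instruction: str, max_new_tokens: int = 512) -> str:
    # Heavy imports are kept inside the function so the prompt helper
    # above stays dependency-free.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

    model_id = "AlfredPros/CodeLlama-7b-Instruct-Solidity"
    # 4-bit NF4 loading mirrors the QLoRA configuration used for fine-tuning.
    bnb = BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_quant_type="nf4",
        bnb_4bit_use_double_quant=True,
        bnb_4bit_compute_dtype=torch.float16,
    )
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, quantization_config=bnb, device_map="auto"
    )
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    out = model.generate(
        **inputs, max_new_tokens=max_new_tokens, temperature=0.2, do_sample=True
    )
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    print(generate_solidity(
        "Write an ERC-20 token contract named DemoToken with an owner-only mint function."
    ))
```

Running this downloads the 7B checkpoint from Hugging Face, so it needs a GPU with bitsandbytes installed; only the prompt helper runs without those dependencies.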

Training Details

The model was trained on the "AlfredPros/smart-contracts-instructions" dataset, which comprises 6,003 pairs of GPT-generated human instructions and corresponding Solidity source code. The training process involved one epoch of supervised fine-tuning, taking approximately 21 hours. The quantization configuration used 4-bit NF4 quantization with nested double quantization, and the optimizer was paged AdamW 32-bit.
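The stated settings translate into a QLoRA configuration roughly like the sketch below. The 4-bit NF4 quantization, nested double quantization, paged AdamW 32-bit optimizer, and single epoch come from the description above; the LoRA rank, learning rate, and batch sizes are illustrative assumptions, not values published for this model.

```python
# Sketch of a QLoRA fine-tuning configuration matching the settings
# described above. Values marked as assumptions are illustrative only.
import torch
from transformers import BitsAndBytesConfig, TrainingArguments
from peft import LoraConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",          # NF4 quantization (stated above)
    bnb_4bit_use_double_quant=True,     # nested double quantization (stated above)
    bnb_4bit_compute_dtype=torch.float16,
)

lora_config = LoraConfig(               # adapter settings: assumptions
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

training_args = TrainingArguments(
    output_dir="codellama-solidity-qlora",
    num_train_epochs=1,                 # one epoch, as stated above
    optim="paged_adamw_32bit",          # paged AdamW 32-bit (stated above)
    per_device_train_batch_size=1,      # assumption: fits a GTX 1080Ti-class GPU
    gradient_accumulation_steps=4,      # assumption
    learning_rate=2e-4,                 # common QLoRA default (assumption)
    fp16=True,
)
```

This is a config fragment, not a full training script; it would be passed to a trainer such as TRL's `SFTTrainer` along with the dataset.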

Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model span the following sampler parameters:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p