andrijdavid/Solidity-Llama3-8b
Text generation · Model size: 8B · Quantization: FP8 · Context length: 8k · Architecture: Transformer

Solidity-Llama3-8b is an 8-billion-parameter large language model developed by andrijdavid, fine-tuned from Llama 3 8B. It is designed specifically for Solidity code completion and infilling, trained on the DISL dataset of real-world Ethereum smart contracts. With a context length of 8192 tokens, it provides context-aware code suggestions within Solidity development environments.
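Because the model's context window is 8192 tokens, an editor integration has to trim the code before the cursor to fit. The sketch below illustrates one simple way to do that with a character-based budget; the helper name, the ~4-characters-per-token heuristic, and the reserved-token count are illustrative assumptions, not part of the model card (a real integration would count tokens with the model's own tokenizer).

```python
# Sketch: trimming Solidity source to fit the model's 8192-token window.
# Assumptions (not from the model card): ~4 chars per token, and the
# build_prompt helper itself -- a real client should use the tokenizer.

CTX_TOKENS = 8192
CHARS_PER_TOKEN = 4  # rough heuristic only


def build_prompt(code_before_cursor: str, reserve_tokens: int = 256) -> str:
    """Keep the most recent code that fits the context window,
    reserving room for the generated completion."""
    budget = (CTX_TOKENS - reserve_tokens) * CHARS_PER_TOKEN
    return code_before_cursor[-budget:]


snippet = (
    "pragma solidity ^0.8.0;\n"
    "contract Token {\n"
    "    function transfer("
)
prompt = build_prompt(snippet)
print(prompt == snippet)  # a short snippet fits entirely, so it is unchanged
```

The trimmed prompt would then be sent to the model (for example via the Hugging Face `transformers` text-generation API) to obtain the completion.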
