brianknowsai/Brian-Llama-3.2-3B
Task: Text generation
Model size: 3.2B parameters
Quantization: BF16
Context length: 32k
Published: Dec 22, 2024
License: llama3.2
Architecture: Transformer
Concurrency cost: 1

Brian-Llama-3.2-3B by The Brian Team is a 3.2-billion-parameter, transformer-based autoregressive language model, fine-tuned from Meta's Llama-3.2-3B with a 32,768-token context length. It is optimized for Web3 applications and targets tasks such as transaction intent parsing, Solidity code generation, and Web3-related question answering. This domain-specific model is designed to power intent-recognition engines within the blockchain ecosystem.
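A minimal usage sketch with the Hugging Face `transformers` library, assuming the checkpoint is published on the Hub under the id shown above; the prompt, decoding settings, and helper names (`load_model`, `parse_intent`) are illustrative, not part of the official release:

```python
# Hypothetical sketch of loading and querying Brian-Llama-3.2-3B.
# Assumes the weights are available on the Hugging Face Hub and that
# the `transformers` and `torch` packages are installed.

MODEL_ID = "brianknowsai/Brian-Llama-3.2-3B"

def load_model():
    # Imports are deferred so the module can be inspected without the
    # heavy dependencies installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed in the card; device_map="auto"
    # spreads layers across whatever accelerators are available.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="bfloat16", device_map="auto"
    )
    return tokenizer, model

def parse_intent(prompt: str, tokenizer, model, max_new_tokens: int = 128) -> str:
    # Generate a short completion and return only the newly produced text.
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

# Example (requires downloading the ~3.2B-parameter weights):
#   tok, mdl = load_model()
#   parse_intent("Swap 1 ETH for USDC on Uniswap.", tok, mdl)
```

The example leaves decoding at the library defaults; a production intent-recognition pipeline would likely constrain the output to a structured format (e.g. JSON) before acting on it.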
