BEAT-LLM-Backdoor/Mistral-3-7B_word
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Oct 13, 2024 · License: other · Architecture: Transformer

BEAT-LLM-Backdoor/Mistral-3-7B_word is a 7-billion-parameter language model fine-tuned from mistralai/Mistral-7B-Instruct-v0.3. It was trained for 5 epochs with a learning rate of 2e-05 and a cosine learning-rate scheduler. The card does not detail what differentiates this fine-tune, but its training configuration suggests a focus on instruction-following tasks, making it suitable for applications that need a moderately sized, instruction-tuned model.
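Since the model is fine-tuned from Mistral-7B-Instruct-v0.3, prompts presumably follow the base model's instruct format, where each user turn is wrapped in `[INST] ... [/INST]` tags. The helper below is a minimal sketch of that format, assuming the fine-tune inherits the base model's chat template unchanged (the card does not confirm this):

```python
def build_mistral_prompt(user_message: str) -> str:
    """Wrap a single user turn in the Mistral instruct format.

    Assumes the v0.3 instruct template: a BOS token followed by
    [INST] ... [/INST]; the model's reply is generated after the
    closing tag.
    """
    return f"<s>[INST] {user_message} [/INST]"


prompt = build_mistral_prompt("Summarize the plot of Hamlet in one sentence.")
```

In practice you would pass a string like this to a text-generation backend (for example, a `transformers` `text-generation` pipeline loading `BEAT-LLM-Backdoor/Mistral-3-7B_word`), or rely on the tokenizer's built-in chat template, which applies the same tags automatically.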
