Nitral-AI/Captain_BMO-12B
Text Generation · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Ctx Length: 32k · Published: Nov 1, 2024 · License: other · Architecture: Transformer
Nitral-AI/Captain_BMO-12B is a 12-billion-parameter instruction-tuned language model built on the Nemo 12B Instruct architecture, with a context length of 32,768 tokens. It was trained on a specialized dataset that includes GU_instruct-Remastered-1.1 and hathor/poppy, making it suitable for general instruction-following tasks. The model uses the Mistral prompt format and is provided as a one-off release for internal testing purposes.