ajibawa-2023/Code-Mistral-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 8k · Published: Mar 25, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Code-Mistral-7B is a 7 billion parameter language model developed by ajibawa-2023, built upon the Mistral architecture. This model is specifically fine-tuned on a combination of code and mathematical datasets, including Code-290k-ShareGPT, Code-Feedback, orca-math-word-problems-200k, and Openhermes. It demonstrates strong performance in coding tasks, making it suitable for code generation and related applications. The model utilizes a context length of 8192 tokens and is trained using the ChatML prompt format.
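Since the model is trained with the ChatML prompt format, prompts should wrap each turn in `<|im_start|>`/`<|im_end|>` markers. A minimal sketch of assembling such a prompt (the helper name and example messages are illustrative, not part of the model card):

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-formatted prompt: each turn is wrapped in
    <|im_start|>{role} ... <|im_end|> markers, and the prompt ends with
    an open assistant turn for the model to complete."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a helpful coding assistant.",
    "Write a Python function that reverses a string.",
)
print(prompt)
```

The trailing open `<|im_start|>assistant` turn is what cues the model to generate its reply; generation is typically stopped when the model emits `<|im_end|>`.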


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model span the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
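These sampler settings map directly onto request parameters in an OpenAI-compatible completions call. A sketch of such a request payload follows; the specific values are illustrative placeholders, not the actual popular configurations (which are not reproduced here):

```python
# Hypothetical request payload for an OpenAI-compatible inference API.
# All numeric values below are illustrative assumptions, not the
# published user configurations for this model.
payload = {
    "model": "ajibawa-2023/Code-Mistral-7B",
    "prompt": "<|im_start|>user\nWrite a quicksort in Python.<|im_end|>\n<|im_start|>assistant\n",
    "temperature": 0.7,          # randomness of sampling
    "top_p": 0.9,                # nucleus sampling cutoff
    "top_k": 40,                 # restrict sampling to the k most likely tokens
    "frequency_penalty": 0.0,    # penalize tokens by how often they appeared
    "presence_penalty": 0.0,     # penalize tokens that appeared at all
    "repetition_penalty": 1.1,   # multiplicative penalty on repeated tokens
    "min_p": 0.05,               # drop tokens below this fraction of the top probability
    "max_tokens": 256,
}
```

Note that `top_k`, `repetition_penalty`, and `min_p` are extensions beyond the core OpenAI parameter set; support for them varies by serving backend.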