hamxea/Mistral-7B-v0.1-activity-fine-tuned-v5
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · Published: Jan 17, 2024 · License: other · Architecture: Transformer · Status: Cold
hamxea/Mistral-7B-v0.1-activity-fine-tuned-v5 is a 7-billion-parameter language model based on the Mistral architecture. It is a fine-tuned version of Mistral-7B-v0.1; the README does not detail what differentiates it from the base model, but the fine-tuning (on "activity" data, per the repo name) suggests optimization for a particular task or domain. It is suitable for applications that need a moderately sized, adaptable language model.
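The model can be loaded like any other Hugging Face causal LM. Below is a minimal sketch assuming the standard transformers AutoModelForCausalLM/AutoTokenizer interface; the dtype, device placement, prompt, and generation length are illustrative choices, not settings taken from this model card.

```python
# Minimal loading/generation sketch; assumes the standard transformers API.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hamxea/Mistral-7B-v0.1-activity-fine-tuned-v5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision fits a 7B model on one 24 GB GPU
    device_map="auto",          # let accelerate place the weights
)

prompt = "Summarize the benefits of fine-tuning a 7B model:"  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```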
Popular Sampler Settings
The top three parameter combinations used by Featherless users for this model draw on the following samplers; a usage sketch follows the list.
temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
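As a hedged sketch of how these samplers might be set in a request: temperature, top_p, frequency_penalty, and presence_penalty are standard OpenAI-style parameters, while top_k, min_p, and repetition_penalty are extensions that many OpenAI-compatible servers accept as extra fields. The base URL and every numeric value below are illustrative assumptions, not the actual top-3 configurations from the widget.

```python
# Hedged sketch: sampler settings via an OpenAI-compatible completions endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

response = client.completions.create(
    model="hamxea/Mistral-7B-v0.1-activity-fine-tuned-v5",
    prompt="Write a short product description for a hiking backpack.",
    max_tokens=200,
    temperature=0.7,        # illustrative values, not a published config
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    extra_body={            # non-standard samplers; support depends on the server
        "top_k": 40,
        "min_p": 0.05,
        "repetition_penalty": 1.1,
    },
)
print(response.choices[0].text)
```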