testmoto/gemma-2-9b-synthetic_coding
Text generation · Concurrency cost: 1 · Model size: 9B · Quantization: FP8 · Context length: 16k · Published: Dec 14, 2024 · License: Gemma · Architecture: Transformer
testmoto/gemma-2-9b-synthetic_coding is a 9-billion-parameter language model converted to MLX format from Google's Gemma-2-9B. The name implies fine-tuning on synthetic coding data, but the provided README does not document any specific capabilities or a primary use case beyond the MLX conversion itself. It is suitable for general language generation tasks within the MLX ecosystem.
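As an MLX-format model, it can be loaded and run with the `mlx-lm` package on Apple silicon. The sketch below is a minimal, unverified usage example: it assumes the repository id `testmoto/gemma-2-9b-synthetic_coding` is available for download and that the machine has enough memory for a 9B FP8 model; the prompt is illustrative only.

```python
# Minimal sketch: generate text with an MLX model via mlx-lm.
# Assumes `pip install mlx-lm` and an Apple-silicon Mac; the model
# weights (~9B parameters) are fetched on first load.
from mlx_lm import load, generate

# Hypothetical usage of this model card's repository id.
model, tokenizer = load("testmoto/gemma-2-9b-synthetic_coding")

# Given the implied coding focus, a code-oriented prompt is a
# reasonable first test; any text prompt works.
prompt = "Write a Python function that reverses a string."
text = generate(model, tokenizer, prompt=prompt, max_tokens=256)
print(text)
```

Note that the 16k context length from the metadata above bounds the combined prompt and completion; longer inputs must be truncated before generation.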