jessicarizzler/amelia-32b-dpo-merged
Text Generation
Concurrency Cost: 2
Model Size: 32.8B
Quant: FP8
Ctx Length: 32k
Published: Jan 8, 2026
Architecture: Transformer
Status: Cold

jessicarizzler/amelia-32b-dpo-merged is a 32.8-billion-parameter large language model served in FP8 quantization with a 32k-token context window. As the name suggests, it is a merged checkpoint: its weights combine multiple source models, with the "dpo" tag indicating that at least one of them was tuned with Direct Preference Optimization. The listing does not document the merge recipe or the source checkpoints, so no specific capability claims can be made beyond its size; it is intended for general-purpose text generation and understanding.
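To illustrate what "merged" typically means in practice, here is a minimal sketch of linear weight interpolation (the simplest common merge method). This is an assumption for illustration only: the actual merge recipe for amelia-32b-dpo-merged is not documented, and the scalar "weights" below stand in for real tensors.

```python
def merge_state_dicts(base, tuned, alpha=0.5):
    """Linearly interpolate two state dicts: (1 - alpha) * base + alpha * tuned.

    A hypothetical sketch of one common merge method; real merges operate
    on full tensor state dicts and may use other schemes (slerp, TIES, etc.).
    """
    assert base.keys() == tuned.keys(), "checkpoints must share a parameter layout"
    return {name: (1 - alpha) * base[name] + alpha * tuned[name] for name in base}

# Toy example with scalars standing in for parameter tensors:
base = {"layer.0.w": 1.0, "layer.0.b": 0.0}
dpo = {"layer.0.w": 3.0, "layer.0.b": 2.0}
merged = merge_state_dicts(base, dpo, alpha=0.5)
print(merged)  # {'layer.0.w': 2.0, 'layer.0.b': 1.0}
```

With `alpha=0.5` this is a plain average of the two checkpoints; moving `alpha` toward 1.0 weights the merge toward the DPO-tuned model.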
