marcchew/Platyporoni-7B
Text Generation
- Concurrency Cost: 1
- Model Size: 7B
- Quantization: FP8
- Context Length: 4k
- License: cc-by-nc-4.0
- Architecture: Transformer (open weights)

Platyporoni-7B is a fine-tuned language model developed by marcchew, derived from the AIDC-ai-business/Marcoroni-7B base model. It was trained for one epoch with a learning rate of 8e-06 and a batch size of 48, reaching a final validation loss of 2.7324. The card does not detail intended use cases, but the fine-tuning setup suggests the model specializes in tasks aligned with its base model's text-generation capabilities.
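The reported hyperparameters can be written out as a minimal configuration sketch. The field names below follow the conventions of Hugging Face `TrainingArguments` as an assumption; the card does not state which training framework was used, and whether the batch size of 48 is per-device or effective (device batch × gradient accumulation) is not specified.

```python
# Hypothetical fine-tuning configuration mirroring the values reported in the
# model card. Field names are assumptions modeled on transformers'
# TrainingArguments; the actual training setup is not documented.
finetune_config = {
    "learning_rate": 8e-06,            # reported learning rate
    "per_device_train_batch_size": 48, # reported batch size (may be effective, not per-device)
    "num_train_epochs": 1,             # trained for a single epoch
}

# Final validation loss reported on the card, for reference.
reported_eval_loss = 2.7324
```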
