eantropix/ft-news
Text generation · Model size: 0.5B · Quant: BF16 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer · Published: Feb 16, 2026

eantropix/ft-news is a 0.5-billion-parameter language model fine-tuned from meta-llama/Llama-3.2-1B with the TRL framework. Its 32768-token (32k) context length makes it suitable for longer inputs, and it is trained for general text generation, producing coherent, contextually relevant output.
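A minimal usage sketch, assuming the model is hosted on a Hugging Face-compatible hub and loadable with the `transformers` library; the prompt text and generation parameters below are illustrative, not part of the model card:

```python
# Hypothetical usage sketch for eantropix/ft-news. Assumes `transformers`
# and `torch` are installed and the model id resolves on the hub.
MODEL_ID = "eantropix/ft-news"
MAX_CONTEXT = 32_768  # 32k-token context length, per the model card


def build_generator(model_id: str = MODEL_ID):
    """Build a text-generation pipeline in BF16 (the published precision)."""
    import torch
    from transformers import pipeline  # imported lazily so the module loads without it

    return pipeline(
        "text-generation",
        model=model_id,
        torch_dtype=torch.bfloat16,
    )


if __name__ == "__main__":
    generator = build_generator()
    out = generator("Breaking news:", max_new_tokens=64)
    print(out[0]["generated_text"])
```

Because the pipeline is loaded in BF16, this matches the quantization listed in the card's metadata; swap `torch_dtype` if you need a different precision.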
