Chamaka8/SerendipLLM-v2-news
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · Published: Feb 25, 2026 · Architecture: Transformer

Chamaka8/SerendipLLM-v2-news is an 8-billion-parameter language model with an 8,192-token context length. It is part of the SerendipLLM-v2 family developed by Chamaka8. Further details on its architecture, training data, and primary differentiators are not provided in the available documentation.
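The card's headline numbers (8B parameters, FP8 quantization) let you ballpark the memory needed just to hold the weights. The sketch below assumes FP8 stores one byte per parameter; KV cache and activation memory are extra and scale with batch size and how much of the 8k context you actually use.

```python
# Rough weight-memory estimate for an 8B-parameter model at FP8.
# Assumption: FP8 quantization stores 1 byte per parameter; this
# excludes KV cache, activations, and framework overhead.

def weight_memory_gib(n_params: float, bytes_per_param: float = 1.0) -> float:
    """Return the approximate weight memory in GiB."""
    return n_params * bytes_per_param / 2**30

# 8 billion parameters at 1 byte each is roughly 7.45 GiB of weights.
print(f"{weight_memory_gib(8e9):.2f} GiB")
```

In practice, leave headroom on top of this figure: serving the full 8k context adds KV-cache memory per concurrent request.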
