lxcorp/lambda-1v-1B

Hugging Face · Text Generation

  • Model size: 1.1B parameters
  • Quantization: BF16
  • Context length: 2k
  • Concurrency cost: 1
  • Published: May 5, 2025
  • License: MIT
  • Architecture: Transformer
  • Open weights

The lxcorp/lambda-1v-1B is a compact 1.1 billion parameter language model developed by Marius Jabami of λχ Corp., fine-tuned from TinyLlama-1.1B-Chat-v1.0. Optimized for educational reasoning tasks, it specializes in logic, number theory, and mathematics in both Portuguese and English. This model delivers fast performance with minimal computational requirements, making it suitable for low-resource deployment.
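The "minimal computational requirements" claim can be grounded with a back-of-envelope estimate. The sketch below is illustrative only: it uses the parameter count and BF16 dtype from this card, assumes standard bytes-per-parameter figures for a few common dtypes, and ignores KV-cache and activation overhead, so real memory usage will be somewhat higher.

```python
# Weight-only memory estimate for a 1.1B-parameter model.
# KV-cache and activation overhead are deliberately ignored,
# so actual serving memory will exceed these figures.
BYTES_PER_PARAM = {"bf16": 2, "int8": 1, "nf4": 0.5}

def weight_footprint_gb(n_params: float, dtype: str) -> float:
    """Approximate weight-only footprint in gigabytes."""
    return n_params * BYTES_PER_PARAM[dtype] / 1e9

for dtype in ("bf16", "int8", "nf4"):
    print(f"{dtype}: ~{weight_footprint_gb(1.1e9, dtype):.2f} GB")
```

At BF16 the weights alone come to roughly 2.2 GB, which is why the model fits comfortably on commodity GPUs and even CPU-only hosts.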


Overview

The lxcorp/lambda-1v-1B is a lightweight, 1.1 billion parameter language model developed by Marius Jabami of λχ Corp. It is built upon TinyLlama-1.1B-Chat-v1.0 and has been specifically fine-tuned for educational reasoning tasks. The model's primary focus is on logic, number theory, and mathematics, supporting both Portuguese and English.

Key Capabilities

  • Specialized Reasoning: Excels in mathematical and logical problem-solving, particularly within number theory.
  • Multilingual Support: Designed to handle reasoning tasks in both Portuguese and English.
  • Efficiency: Offers fast inference and low computational requirements due to its compact size and 4-bit NF4 quantization.
  • Fine-Tuning: Utilizes LoRA fine-tuning on q_proj and v_proj layers, trained on a subset of the HuggingFaceH4/MATH dataset focusing on number theory.
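The LoRA fine-tuning mentioned above can be sketched in a few lines. This is a toy, pure-Python illustration of the general LoRA idea, not the model's actual training code (which would typically use a library such as PEFT): the frozen projection weight W is left untouched, and a learned low-rank product B·A is added to its output.

```python
# Toy LoRA-adapted linear layer: y = W x + alpha * B (A x).
# W is frozen; only the small A (down-projection) and B (up-projection)
# matrices would be trained. Shapes here are tiny for illustration.

def matvec(M, x):
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

def lora_forward(W, A, B, x, alpha=1.0):
    """Apply a LoRA-adapted linear layer to vector x."""
    base = matvec(W, x)              # frozen path
    delta = matvec(B, matvec(A, x))  # low-rank learned path
    return [b + alpha * d for b, d in zip(base, delta)]

# 2x2 frozen weight with a rank-1 adapter (A: 1x2, B: 2x1).
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[0.5, 0.5]]
B = [[1.0], [-1.0]]
x = [2.0, 4.0]

print(lora_forward(W, A, B, x))  # base [2, 4] + delta [3, -3] -> [5.0, 1.0]
```

Because the adapter has rank r much smaller than the layer width, only a tiny fraction of the parameters are trained, which is what makes fine-tuning a model like this feasible on modest hardware.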

Good For

  • Educational Applications: Ideal for AI-driven tools focused on teaching or assessing mathematical and logical reasoning.
  • Low-Resource Environments: Suitable for deployment where computational power or memory is limited.
  • Prototyping: A good choice for quickly developing applications requiring specialized reasoning capabilities without the overhead of larger models.
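For readers curious what the NF4 quantization mentioned earlier does mechanically, the sketch below shows the general shape of blockwise 4-bit absmax quantization. It is a simplified stand-in: a uniform 16-level codebook is used here for clarity, whereas real NF4 derives its levels from the quantiles of a normal distribution.

```python
# Illustrative blockwise 4-bit quantization in the spirit of NF4.
# Each block is scaled by its absolute maximum, then every value is
# snapped to the nearest entry of a 16-level codebook. A uniform
# codebook stands in for NF4's normal-quantile levels.

CODEBOOK = [i / 7.5 - 1.0 for i in range(16)]  # 16 evenly spaced levels in [-1, 1]

def quantize_block(block):
    """Return (scale, 4-bit indices) for one block of weights."""
    scale = max(abs(v) for v in block) or 1.0
    idxs = [min(range(16), key=lambda i: abs(CODEBOOK[i] - v / scale))
            for v in block]
    return scale, idxs

def dequantize_block(scale, idxs):
    """Reconstruct approximate weights from a scale and indices."""
    return [CODEBOOK[i] * scale for i in idxs]

weights = [0.12, -0.40, 0.33, 0.05, -0.27, 0.91, -0.66, 0.18]
scale, idxs = quantize_block(weights)
restored = dequantize_block(scale, idxs)
max_err = max(abs(w - r) for w, r in zip(weights, restored))
print(f"max reconstruction error: {max_err:.3f}")
```

Storing one 4-bit index per weight plus one scale per block is what cuts the weight footprint to roughly a quarter of BF16, at the cost of a small, bounded reconstruction error per block.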