Tags: Text Generation, Transformers, Safetensors, llama, text-generation-inference

MultivexAI/Plyx-15M

MultivexAI/Plyx-15M is a 15-million-parameter, 8-layer language model trained from scratch using the Llama architecture.

We built this model to be a small, useful foundation for a variety of tasks. It is a good starting point for quick experiments, research projects, or fine-tuning on specialized tasks where a small model footprint matters.

Model Series Note: This is the first model in our Plyx series. We're continuing this work and plan to release future models in various sizes. We'll be adding some initial performance benchmarks here soon.
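
Usage

The snippet below is a minimal sketch of loading the model with the Hugging Face Transformers library; the prompt and sampling settings are illustrative placeholders, not tuned recommendations from us.

```python
# Minimal usage sketch for MultivexAI/Plyx-15M with Transformers.
# The prompt and generation settings below are illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MultivexAI/Plyx-15M"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```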

Pre-training Data

The model was pre-trained on approximately 600M tokens drawn from a carefully curated mix of three datasets:

  1. fineweb-pro: A heavily filtered and refined version of the FineWeb dataset, providing a strong general-purpose language foundation with much of the noise and low-quality content removed.
  2. fineweb-edu: A subset of FineWeb containing educational and instructional content, used to ground the model in well-structured, factual information.
  3. finepdfs: A large collection of documents from PDFs, including professional reports and technical papers. This component introduces the model to more formal language, complex sentence structures, and data-rich formats.
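
For reference, the snippet below sketches how one of these sources can be inspected with the datasets library. The repo id "HuggingFaceFW/fineweb-edu" and the "text" field are assumptions about where and how fineweb-edu is hosted, not a description of our exact data pipeline.

```python
# Sketch: stream a few records from fineweb-edu without a full download.
# The repo id below is an assumption about the dataset's location on the Hub.
from itertools import islice

from datasets import load_dataset

ds = load_dataset("HuggingFaceFW/fineweb-edu", split="train", streaming=True)
for example in islice(ds, 3):
    print(example["text"][:200])
```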

A Note on Size and Performance

To set the right expectations: Plyx-15M is a 15-million-parameter model, which is very small by modern standards. Its performance will not be comparable to models with billions of parameters. It is best used for research, highly specific tasks, or as a base for fine-tuning, not as a drop-in replacement for a large, general-purpose model.
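
Since fine-tuning is one of the intended uses, here is a minimal fine-tuning sketch using the Transformers Trainer API. The training file name and all hyperparameters are placeholders, not recommendations from us.

```python
# Minimal causal-LM fine-tuning sketch for Plyx-15M.
# "train.txt" and the hyperparameters are illustrative placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_id = "MultivexAI/Plyx-15M"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Llama-style tokenizers often ship without a pad token; reuse EOS for padding.
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

raw = load_dataset("text", data_files={"train": "train.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = raw["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="plyx-15m-finetuned",
        per_device_train_batch_size=8,
        num_train_epochs=1,
        learning_rate=5e-5,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Because the model is only 15M parameters, full fine-tuning is cheap and will typically fit on a single consumer GPU, or even run on CPU.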

Limitations

Users should be aware of the biases and limitations of this model. Its pre-training data is derived from web crawls, so it may reflect biases present in that data, and at this scale it is especially prone to factual errors, repetition, and incoherent output. Outputs should be reviewed before being relied on.

License

The data used for pre-training (fineweb-pro, fineweb-edu, and finepdfs) is derived from sources made available under the ODC-By 1.0 license. Users must also abide by the CommonCrawl Terms of Use. We do not alter the license of any of the underlying data.

Model size: 16.1M parameters (F32 tensors, Safetensors format)