huggingface/smollm: Everything about the SmolLM & SmolLM2 family of models
SmolLM2
SmolLM2 is a family of compact language models available in three sizes: 135M, 360M, and 1.7B parameters. They are capable of solving a wide range of tasks while being lightweight enough to run on-device.
You can find our most capable model 🤏 SmolLM2-1.7B-Instruct here.
New: Introducing SmolTalk, the SFT dataset of SmolLM2 🚀
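A minimal sketch of loading SmolTalk with the `datasets` library. The Hub id `HuggingFaceTB/smoltalk`, the `"all"` configuration, and the `messages` column are assumptions about how the dataset is published, not confirmed by this README.

```python
# Sketch: load the SmolTalk SFT dataset (dataset id and config name are assumed).
from datasets import load_dataset

smoltalk = load_dataset("HuggingFaceTB/smoltalk", "all", split="train")

# Each example is expected to contain a list of chat messages under "messages".
print(smoltalk[0]["messages"])
```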
Table of Contents
Usage
Transformers
Chat in TRL
Local applications
Smol-tools
Pre-training
Fine-tuning
Evaluation
Synthetic data pipelines
Usage
Our most powerful model is SmolLM2-1.7B-Instruct; the sections below cover running it with Transformers, chatting with it in TRL, and using quantized versions in local applications.
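A minimal sketch of running the instruct model with Transformers. The checkpoint name `HuggingFaceTB/SmolLM2-1.7B-Instruct` and the generation settings are assumptions chosen for illustration rather than values taken from this README.

```python
# Sketch: generate a chat reply with SmolLM2-1.7B-Instruct via transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "HuggingFaceTB/SmolLM2-1.7B-Instruct"  # assumed Hub id
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device)

# Format the conversation with the model's chat template, then generate.
messages = [{"role": "user", "content": "What is the capital of France?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(device)
outputs = model.generate(
    input_ids, max_new_tokens=64, do_sample=True, temperature=0.2, top_p=0.9
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same pattern works for the 135M and 360M checkpoints by swapping the model name; smaller variants trade some capability for lower memory use on-device.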