SmolLM2 135M by HuggingFaceTB


Tags: Autotrain compatible · English · Endpoints compatible · Llama · Region: US · Safetensors

SmolLM2 135M Benchmarks

Benchmark scores (shown as nn.n%) compare the model against reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
SmolLM2 135M (HuggingFaceTB/SmolLM2-135M)

SmolLM2 135M Parameters and Internals

Model Type 
text generation
Use Cases 
Areas:
On-device computing, Instruction following
Applications:
Summarization, Text rewriting, Function calling
Primary Use Cases:
English content generation, Instruction following
Limitations:
The model primarily understands and generates English. Generated content may not be factually accurate or logically consistent, and may reflect biases present in the training data.
Considerations:
Models are assistive tools and should not be used as definitive information sources.
Additional Notes 
Memory footprint of the 135M model is about 269 MB when loaded in bfloat16 (consistent with the 0.3 GB VRAM requirement listed below).
Supported Languages 
en (High proficiency)
Training Details 
Data Sources:
FineWeb-Edu, DCLM, The Stack, UltraFeedback
Data Volume:
2 trillion tokens
Methodology:
Supervised Fine-tuning (SFT) and Direct Preference Optimization (DPO); see the sketch after these details.
Context Length:
8192
Hardware Used:
64 H100 GPUs
Model Architecture:
Transformer decoder
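
As a rough illustration of the DPO stage named under Methodology, here is a minimal training sketch using the trl library. The preference dataset (HuggingFaceH4/ultrafeedback_binarized, a binarized form of the UltraFeedback source listed above), the beta value, and the output directory are illustrative assumptions, not the released SmolLM2 recipe; depending on your trl version, the tokenizer argument may be named tokenizer rather than processing_class.

```python
# Minimal sketch of a DPO stage like the one described above, using trl.
# Dataset, hyperparameters, and paths are illustrative assumptions; the
# actual SmolLM2 post-training recipe is not reproduced here.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

checkpoint = "HuggingFaceTB/SmolLM2-135M"  # base model; SFT would normally precede DPO
model = AutoModelForCausalLM.from_pretrained(checkpoint)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# UltraFeedback (listed under Data Sources) in its binarized preference form.
train_dataset = load_dataset("HuggingFaceH4/ultrafeedback_binarized", split="train_prefs")

args = DPOConfig(output_dir="smollm2-135m-dpo", beta=0.1)  # beta is assumed
trainer = DPOTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    processing_class=tokenizer,
)
trainer.train()
```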
Input Output 
Input Format:
Token sequences encoded with a tokenizer
Accepted Modalities:
text
Output Format:
Generated token sequences
Performance Tips:
Load the model in reduced precision (e.g., bfloat16) and shard it across multiple GPUs for faster inference (see the sketch below).
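
A minimal sketch following these tips, assuming a CUDA-capable setup; the prompt and generation length are arbitrary:

```python
# Minimal generation sketch for SmolLM2-135M in bfloat16, per the tips above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "HuggingFaceTB/SmolLM2-135M"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(
    checkpoint,
    torch_dtype=torch.bfloat16,  # reduced precision, as recommended above
    device_map="auto",           # shards layers across available GPUs
)
print(f"Memory footprint: {model.get_memory_footprint() / 1e6:.2f} MB")

inputs = tokenizer("Gravity is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```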
LLM Name: SmolLM2 135M
Repository: 🤗 https://huggingface.co/HuggingFaceTB/SmolLM2-135M
Model Size: 135M
Required VRAM: 0.3 GB
Updated: 2025-02-05
Maintainer: HuggingFaceTB
Model Type: llama
Model Files: 0.3 GB
Supported Languages: en
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.40.1
Tokenizer Class: GPT2Tokenizer
Vocabulary Size: 49152
Torch Data Type: bfloat16
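
The metadata above can be spot-checked against the hosted checkpoint; a small sketch:

```python
# Quick check of the listed metadata against the hosted config/tokenizer.
from transformers import AutoConfig, AutoTokenizer

checkpoint = "HuggingFaceTB/SmolLM2-135M"
config = AutoConfig.from_pretrained(checkpoint)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

print(config.model_type)               # llama
print(config.max_position_embeddings)  # 8192 (context length / model max length)
print(config.torch_dtype)              # torch.bfloat16
print(tokenizer.__class__.__name__)    # fast variant of GPT2Tokenizer
print(tokenizer.vocab_size)            # 49152
```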

Quantized Models of the SmolLM2 135M

Model | Likes | Downloads | VRAM
SmolLM2 135M Bnb 4bit | 1 | 414 | 0 GB
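
A sketch for loading the model with bitsandbytes 4-bit quantization (as in the Bnb 4bit variant above), quantizing the base checkpoint on the fly; the compute dtype is an assumption:

```python
# On-the-fly bitsandbytes 4-bit quantization of SmolLM2-135M.
# Requires the bitsandbytes package to be installed.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,  # assumed; matches the model's native dtype
)
model = AutoModelForCausalLM.from_pretrained(
    "HuggingFaceTB/SmolLM2-135M",
    quantization_config=bnb_config,
    device_map="auto",
)
```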

Best Alternatives to SmolLM2 135M

Best Alternatives | Context / RAM | Downloads | Likes
SmolLM2 135M Instruct | 8K / 0.3 GB | 93471 | 97
Reasoning SmolLM2 135M | 8K / 0.5 GB | 527 | 11
SmolLM2 135M | 8K / 0.3 GB | 6730 | 0
Jaja Small | 8K / 0.5 GB | 223 | 0
...wre324 R1 SmolLM2 135M Distill | 8K / 0.5 GB | 16 | 0
... Fineweb Uncovai Human Removed | 8K / 0.5 GB | 13 | 0
... Fineweb Uncovai Human Removed | 8K / 0.5 GB | 12 | 0
... Fineweb Uncovai Human Removed | 8K / 0.5 GB | 18 | 0
SmolLM2 FT MyDataset | 8K / 0.5 GB | 23 | 0
SmolLM2 FT Smoltalk | 8K / 0.5 GB | 151 | 0

Rank the SmolLM2 135M Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

What open-source LLMs or SLMs are you in search of? 42,577 models are indexed in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227