LLM Name | SmolLM2 135M Pretrained 600k Fineweb Uncovai Selected |
Repository 🤗 | https://huggingface.co/FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_selected |
Base Model(s) | |
Model Size | 135M |
Required VRAM | 0.5 GB |
Updated | 2025-01-17 |
Maintainer | FlofloB |
Model Type | llama |
Model Files | |
Model Architecture | LlamaForCausalLM |
License | apache-2.0 |
Context Length | 8192 |
Model Max Length | 8192 |
Transformers Version | 4.44.2 |
Tokenizer Class | GPT2Tokenizer |
Padding Token | <|endoftext|> |
Vocabulary Size | 49152 |
Torch Data Type | float32 |
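
The metadata above maps directly onto a standard Transformers loading call. Below is a minimal sketch of how this checkpoint could be loaded and queried, assuming the usual `transformers` Auto classes; the repo ID, dtype, and context length come from the table, while the prompt and generation settings are purely illustrative.

```python
# Minimal sketch: loading this checkpoint with the standard transformers API.
# Repo ID, float32 dtype, and 8192 max length come from the table above;
# the prompt and generation settings are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_selected"

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # GPT2Tokenizer, 49152-token vocab
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float32,  # matches "Torch Data Type"; roughly 0.5 GB of VRAM
)

# Per the table, the padding token is <|endoftext|>.
prompt = "The quick brown fox"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(inputs.input_ids, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```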
Best Alternatives | Context / RAM | Downloads | Likes
---|---|---|---
SmolLM2 135M | 8K / 0.3 GB | 158432 | 45 |
SmolLM2 135M Instruct | 8K / 0.3 GB | 38744 | 88 |
Reasoning SmolLM2 135M | 8K / 0.5 GB | 337 | 9 |
SmolLM2 FT Smoltalk | 8K / 0.5 GB | 119 | 0 |
... 400k Fineweb Uncovai Selected | 8K / 0.5 GB | 16 | 1 |
...1200k Fineweb Uncovai Selected | 8K / 0.5 GB | 18 | 0 |
...1400k Fineweb Uncovai Selected | 8K / 0.5 GB | 12 | 0 |
...1000k Fineweb Uncovai Selected | 8K / 0.5 GB | 12 | 0 |
... 135M Pretrained 1000k Fineweb | 8K / 0.5 GB | 18 | 0 |
...mollm2 Pretrained 200k Fineweb | 8K / 0.5 GB | 17 | 1 |