| Field | Value |
|---|---|
| LLM Name | Llama 3.2 3B Instruct |
| Repository 🤗 | https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct |
| Model Size | 3B |
| Required VRAM | 6.5 GB |
| Updated | 2025-02-14 |
| Maintainer | meta-llama |
| Model Type | llama |
| Instruction-Based | Yes |
| Model Files | |
| Supported Languages | en, de, fr, it, pt, hi, es, th |
| Model Architecture | LlamaForCausalLM |
| License | llama3.2 |
| Context Length | 131072 |
| Model Max Length | 131072 |
| Transformers Version | 4.45.0.dev0 |
| Tokenizer Class | PreTrainedTokenizerFast |
| Vocabulary Size | 128256 |
| Torch Data Type | bfloat16 |
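Based on the fields above (bfloat16 weights, 131072-token context, instruction-tuned), a minimal loading sketch with the Hugging Face `transformers` library might look like the following. It assumes you have accepted the Llama 3.2 license, authenticated with a token that has access to the gated repository, and have a GPU with roughly 7 GB of free VRAM.

```python
# Minimal sketch: load the instruct model and run one chat turn.
# Assumes transformers >= 4.45, accelerate installed, and access to
# the gated meta-llama/Llama-3.2-3B-Instruct repository.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.2-3B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # matches the Torch Data Type field above
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "What does a 131072-token context length allow?"},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Print only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```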
| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| ...2 3B Instruct Unsloth Bnb 4bit | 3 | 123118 | 2 GB |
| Llama 3.2 3B Instruct 4bit | 26 | 626116 | 1 GB |
| Llama 3.2 3B Instruct GGUF | 10 | 454909 | 0 GB |
| Llama 3.2 MEDIT 3B O1 | 12 | 299 | 6 GB |
| Orca Mini V9 0 3B Instruct | 5 | 266 | 6 GB |
| Llama 3.2 3B Instruct Bnb 4bit | 17 | 196430 | 2 GB |
| Gladiator Mini Exp 1211 3B | 0 | 10 | 6 GB |
| ...r Mini Exp 1221 3B Instruct V2 | 0 | 12 | 6 GB |
| ...ator Mini Exp 1222 3B Instruct | 0 | 12 | 6 GB |
| Komodo Llama 3.2 3B V2 Fp16 | 5 | 94 | 6 GB |
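Several rows above are 4-bit variants at around 1–2 GB of VRAM. As an alternative to downloading a specific pre-quantized upload, a hedged sketch is to quantize the base weights on load with bitsandbytes; this assumes the `bitsandbytes` and `accelerate` packages are installed and a CUDA GPU is available, and the resulting footprint will only roughly match the figures in the table.

```python
# Sketch: load the base model in 4-bit (NF4) via on-the-fly quantization.
# Assumes bitsandbytes + accelerate are installed and a CUDA GPU is present.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-3.2-3B-Instruct"

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Rough check of the in-memory size; expect on the order of 2 GB.
print(f"{model.get_memory_footprint() / 1e9:.1f} GB")
```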
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| ... 3.2 3B Math Instruct RE1 ORPO | 128K / 6.5 GB | 135 | 0 |
| ReasoningCore 3B T1 1 | 128K / 6.5 GB | 62 | 0 |
| Llama 3.2 3B Instruct | 128K / 6.4 GB | 397669 | 47 |
| Orca Mini V9 5 3B Instruct | 128K / 6.5 GB | 145 | 6 |
| Llama 3.2 3B RP DeepThink | 128K / 7.2 GB | 284 | 2 |
| Llama 3.2 3B Math Oct | 128K / 6.5 GB | 156 | 7 |
| Eximius Persona 5B | 128K / 11.6 GB | 68 | 3 |
| Llama 3.2 3B Instruct | 128K / 6.5 GB | 440861 | 2 |
| Llama 3.2 3B Bespoke Thought | 128K / 6.4 GB | 49 | 3 |
| Omni Reasoner3 Merged | 128K / 6.4 GB | 79 | 7 |