Best Alternatives | Context / RAM | Downloads | Likes |
---|---|---|---|
Meta Llama 3.1 405B | 128K / 186 GB | 521433 | 808 |
Llama 3.1 405B Instruct | 128K / 183.1 GB | 72017 | 568 |
Llama 3.1 405B | 128K / 183.1 GB | 20378 | 921 |
Meta Llama 3.1 405B Instruct | 128K / 186 GB | 55654 | 473 |
Meta Llama 3.1 405B FP8 | 128K / 197.6 GB | 131499 | 94 |
Meta Llama 3.1 405B Instruct FP8 | 128K / 197.6 GB | 55370 | 165 |
Llama 3.1 405B Instruct FP8 | 128K / 193.4 GB | 18234 | 186 |
Llama 3.1 Tulu 3 405B DPO | 128K / 191.2 GB | 52 | 5 |
Llama 3.1 Tulu 3 405B SFT | 128K / 191.2 GB | 66 | 10 |
Llama 3.1 405B FP8 | 128K / 193.4 GB | 4288 | 111 |