| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Meta Llama 3.1 405B | 128K / 186 GB | 521433 | 808 |
| Llama 3.1 Tulu 3 405B | 128K / 191.2 GB | 469 | 84 |
| Llama 3.1 405B Instruct | 128K / 183.1 GB | 53203 | 562 |
| Meta Llama 3.1 405B Instruct | 128K / 186 GB | 55654 | 473 |
| Llama 3.1 405B | 128K / 183.1 GB | 12649 | 918 |
| Meta Llama 3.1 405B FP8 | 128K / 197.6 GB | 131499 | 94 |
| ...ta Llama 3.1 405B Instruct FP8 | 128K / 197.6 GB | 55370 | 165 |
| Llama 3.1 405B Instruct FP8 | 128K / 193.4 GB | 17275 | 184 |
| Llama 3.1 Tulu 3 405B DPO | 128K / 191.2 GB | 32 | 5 |
| Llama 3.1 Tulu 3 405B SFT | 128K / 191.2 GB | 39 | 10 |
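
For reference, the sketch below shows one way a checkpoint from this list might be loaded with the Hugging Face `transformers` library. The repo ID `meta-llama/Llama-3.1-405B-Instruct` and the generation settings are assumptions, not part of the listing above; actual memory requirements correspond to the Context / RAM column (roughly 183-198 GB), which is why the layers are sharded across available devices.

```python
# Minimal sketch, assuming the Hugging Face repo ID "meta-llama/Llama-3.1-405B-Instruct"
# and the `transformers` library. A 405B checkpoint needs roughly the RAM listed in the
# table above, so device_map="auto" shards it across whatever accelerators are available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.1-405B-Instruct"  # assumed repo ID; substitute any row above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the FP8 variants in the table trade precision for lower memory
    device_map="auto",           # spread layers across available GPUs / CPU RAM
)

messages = [{"role": "user", "content": "Summarize the Llama 3.1 405B model family."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```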