Best Alternatives | Context / RAM | Downloads | Likes
---|---|---|---
Meta Llama 3.1 405B | 128K / 186 GB | 521433 | 808 |
Llama 3.1 405B | 128K / 183.1 GB | 14557 | 901 |
Llama 3.1 405B Instruct | 128K / 183.1 GB | 51552 | 550 |
Meta Llama 3.1 405B Instruct | 128K / 186 GB | 55654 | 473 |
...ta Llama 3.1 405B Instruct FP8 | 128K / 197.6 GB | 55370 | 165 |
Llama 3.1 405B Instruct FP8 | 128K / 193.4 GB | 42954 | 181 |
Meta Llama 3.1 405B FP8 | 128K / 197.6 GB | 131499 | 94 |
Llama 3.1 405B FP8 | 128K / 193.4 GB | 674 | 103 |
Meta Llama 3.1 405B FP8 | 128K / 197.6 GB | 320 | 3 |
BigLlama 3.1 681B Instruct | 128K / 190.8 GB | 37 | 11 |
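The rows above can also be compared programmatically. Below is a minimal sketch that loads a few of the table's own figures and ranks them by a chosen column; the `ROWS` tuples and the `top_by` helper are illustrative, not part of any catalog API:

```python
# A subset of rows from the table above:
# (model name, context, size in GB, downloads, likes)
ROWS = [
    ("Meta Llama 3.1 405B", "128K", 186.0, 521433, 808),
    ("Llama 3.1 405B", "128K", 183.1, 14557, 901),
    ("Llama 3.1 405B Instruct", "128K", 183.1, 51552, 550),
    ("Meta Llama 3.1 405B Instruct", "128K", 186.0, 55654, 473),
    ("Meta Llama 3.1 405B FP8", "128K", 197.6, 131499, 94),
]

def top_by(rows, key_index, n=3):
    """Return the n rows with the largest value in column key_index."""
    return sorted(rows, key=lambda r: r[key_index], reverse=True)[:n]

# Rank by downloads (column 3) and by likes (column 4).
most_downloaded = top_by(ROWS, 3)
most_liked = top_by(ROWS, 4)
print(most_downloaded[0][0])  # Meta Llama 3.1 405B
print(most_liked[0][0])       # Llama 3.1 405B
```

Note that the most-downloaded variant is not the most-liked one, which is why the table reports both columns.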