Best Alternatives | Context / RAM | Downloads | Likes |
---|---|---|---|
...r SMB R1 Distill Llama 3.1 16B | 128K / 32 GB | 37 | 1 |
Phi 3.5 Mini Investigator 16B | 128K / 7.6 GB | 8 | 0 |
...o Sft Ties Post Merge Auto DPO | 8K / 141.2 GB | 7 | 0 |
Nanbeige2 16B Chat | 4K / 31.6 GB | 1494 | 0 |
Nanbeige 16B Base Llama | 4K / 31.6 GB | 1567 | 4 |
...ALAXY V03 Slimorca 1 Epoch 50k | 4K / 31.8 GB | 13 | 0 |
...ca 1 Epoch 50k DPO 1 Epoch 30k | 4K / 31.8 GB | 9 | 0 |
FusionNet SOLAR | 4K / 31.9 GB | 1831 | 1 |
GALAXY XB V.03 | 4K / 31.9 GB | 9 | 0 |
Llama 2 16B Nastychat | 4K / 32.4 GB | 906 | 9 |