| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| MixTAO 19B Pass | 32K / 38.1 GB | 8 | 1 |
| Multimerge 19B Pass | 32K / 38 GB | 10 | 0 |
| Lorge 2x7B UAMM | 32K / 38.2 GB | 16 | 0 |
| TaoPassthrough 15B S | 32K / 38.4 GB | 5 | 0 |
| Raccoon Small | 32K / 38.4 GB | 58 | 1 |
| Mixtral 11Bx2 MoE 19B | 4K / 38.4 GB | 2991 | 37 |
| Venus DPO 50 | 4K / 38.4 GB | 1216 | 0 |
| Truthful DPO MoE 19B | 4K / 38.4 GB | 1214 | 1 |
| SOLAR Math 2x10.7B V0.2 | 4K / 38.4 GB | 1214 | 3 |
| SOLAR Math 2x10.7B | 4K / 38.4 GB | 1220 | 0 |