| Field | Value |
|---|---|
| LLM Name | MING MoE 14B |
| Repository 🤗 | https://huggingface.co/BlueZeros/MING-MOE-14B |
| Model Size | 14B |
| Required VRAM | 0.1 GB |
| Updated | 2025-02-22 |
| Maintainer | BlueZeros |
| Model Files | |
| Model Architecture | AutoModelForCausalLM |
| Is Biased | none |
| PEFT Type | LoRA |
| LoRA Model | Yes |
| PEFT Target Modules | q_proj, v_proj, o_proj, k_proj |
| LoRA Alpha | 32 |
| LoRA Dropout | 0.05 |
| R Param | 16 |
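This listing describes a LoRA adapter rather than a full checkpoint, which is presumably why the required VRAM above is only 0.1 GB: it counts the adapter weights, not the base model. Below is a minimal loading sketch assuming the repository ships a standard PEFT `adapter_config.json` that names its base model. The prompt and generation settings are illustrative, and the MING-MoE project may require its own project-specific loading code, so treat this as a starting point rather than the official method.

```python
# Minimal PEFT loading sketch (assumption: the repo is a standard LoRA
# adapter whose adapter_config.json records its base model). The MING-MoE
# project may require its own loader; verify against the model card.
import torch
from peft import AutoPeftModelForCausalLM, LoraConfig
from transformers import AutoTokenizer

REPO_ID = "BlueZeros/MING-MOE-14B"

# The adapter hyperparameters listed in the table above, reconstructed for
# reference (not needed for loading; the repo's adapter_config.json is used).
reference_config = LoraConfig(
    r=16,                     # R Param
    lora_alpha=32,            # LoRA Alpha
    lora_dropout=0.05,        # LoRA Dropout
    target_modules=["q_proj", "v_proj", "o_proj", "k_proj"],
    task_type="CAUSAL_LM",
)

# AutoPeftModelForCausalLM downloads the base model referenced by the
# adapter config and attaches the LoRA weights on top of it.
model = AutoPeftModelForCausalLM.from_pretrained(
    REPO_ID,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(REPO_ID)

prompt = "List the first-line treatments for hypertension."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Note that loading the adapter still pulls the full 14B base model, so actual VRAM use is governed by the base model's size and dtype, not the 0.1 GB adapter footprint.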
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Phi3 14B Unsloth Sft Quip 2bit | 0K / 0.1 GB | 7 | 0 |
| Phi3 14B Unsloth Sft Quip 3bit | 0K / 0.1 GB | 6 | 0 |
| ReWiz Qwen 2.5 14B | 0K / 29.7 GB | 344 | 5 |