LLM Name | Air Striker Mixtral 8x7B ZLoss 3.75bpw H6 EXL2 |
Repository 🤗 | https://huggingface.co/LoneStriker/Air-Striker-Mixtral-8x7B-ZLoss-3.75bpw-h6-exl2 |
Required VRAM | 22.2 GB |
Updated | 2024-09-20 |
Maintainer | LoneStriker |
Model Files | |
Supported Languages | en |
Quantization Type | exl2 |
Model Architecture | AutoModelForCausalLM |
License | apache-2.0 |
Is Biased | none |
Tokenizer Class | LlamaTokenizer |
PEFT Type | LORA |
LoRA Model | Yes |
PEFT Target Modules | q_proj, w1, v_proj, gate, o_proj, w3, k_proj, w2 |
LoRA Alpha | 16 |
LoRA Dropout | 0.07 |
R Param | 64 |
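
The PEFT rows above describe the LoRA adapter that was applied to the base Mixtral 8x7B model. As a rough sketch, the equivalent `peft.LoraConfig` would look like the following; the `task_type` value is an assumption, since the card does not state it:

```python
from peft import LoraConfig

# LoRA hyperparameters as listed in the card: r=64, alpha=16, dropout=0.07, no bias.
# The target modules cover Mixtral's attention projections (q/k/v/o_proj), the MoE
# expert MLPs (w1/w2/w3), and the expert router ("gate").
lora_config = LoraConfig(
    r=64,
    lora_alpha=16,
    lora_dropout=0.07,
    bias="none",
    target_modules=["q_proj", "w1", "v_proj", "gate", "o_proj", "w3", "k_proj", "w2"],
    task_type="CAUSAL_LM",  # assumed; not stated in the card
)
```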
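Because the weights are an EXL2 quant (3.75 bpw, roughly 22.2 GB of VRAM), they are loaded with an ExLlamaV2-based backend rather than plain `transformers`. A minimal sketch, assuming the `huggingface_hub` and `exllamav2` packages are installed; the class and method names follow exllamav2's example scripts and may differ between versions:

```python
from huggingface_hub import snapshot_download
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Download the 3.75bpw EXL2 shards (~22 GB) from the repository above.
model_dir = snapshot_download("LoneStriker/Air-Striker-Mixtral-8x7B-ZLoss-3.75bpw-h6-exl2")

# Usual ExLlamaV2 loading sequence: config -> model -> cache -> tokenizer -> generator.
config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split layers across whatever GPU memory is available

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.95

print(generator.generate_simple("Write a short poem about flight.", settings, 128))
```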
Best Alternatives | Context / RAM | Downloads | Likes |
---|---|---|---|
...emo Instruct 2407 EXL2 8bpw H8 | 0K / 12.6 GB | 14 | 6 |
Dre Phi | 0K / 7.6 GB | 5 | 0 |
Phi 3 Mini 4K CodeInstruct | 0K / 7.6 GB | 5 | 0 |
...3 Mini 4K Instruct Text To Sql | 0K / 7.6 GB | 22 | 0 |
Tiny Llama Miniguanaco | 2K / 2.2 GB | 5 | 1 |
Phi 3 Mini 4K Python | 0K / 7.6 GB | 131 | 0 |
Finetuned Llava Lora | 0K / 0.1 GB | 6 | 0 |
Pipi | 0K / 2.4 GB | 5 | 0 |
Alphace Email | 0K / 0.1 GB | 5 | 0 |
Qwen7B Haiguitang | 0K / 15.3 GB | 5 | 0 |