Mixtral Instruction V0.1 Zh Nf4 by xinping


Tags: 4-bit, Autotrain compatible, Bitsandbytes, En, Endpoints compatible, Fr, Instruct, Mixtral, Region:us, Safetensors, Sharded, Tensorflow, Zh

Mixtral Instruction V0.1 Zh Nf4 Benchmarks

Benchmark scores (shown as percentages) indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Mixtral Instruction V0.1 Zh Nf4 (xinping/Mixtral-instruction-v0.1_zh-nf4)

Mixtral Instruction V0.1 Zh Nf4 Parameters and Internals

Model Type: text generation
Additional Notes: Users should be aware of the risks, biases, and limitations of the model.
Supported Languages: zh (proficient), en (proficient), fr (proficient)
Input Format: text
Accepted Modalities: text
Output Format: text
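Since the card only states "text in, text out", the instruct prompt convention is worth making concrete. The model card itself does not show a template, so the sketch below follows the standard upstream Mixtral-8x7B-Instruct `[INST] ... [/INST]` format; the helper name and message structure are illustrative assumptions, and in practice the tokenizer's own chat template should be preferred when available.

```python
def build_mixtral_prompt(messages):
    """Build a Mixtral-Instruct-style prompt from (role, text) pairs.

    Assumption: follows the upstream Mixtral instruct convention, where
    user turns are wrapped in [INST] ... [/INST] and assistant turns are
    closed with </s>. The tokenizer normally prepends <s> itself.
    """
    prompt = ""
    for role, text in messages:
        if role == "user":
            prompt += f"[INST] {text} [/INST]"
        else:  # assistant turn
            prompt += f" {text}</s>"
    return prompt

# Example: a single Chinese user turn, matching the model's zh focus
print(build_mixtral_prompt([("user", "请用中文介绍一下你自己。")]))
```

For multi-turn conversations, alternate user and assistant entries; each completed assistant reply is terminated with `</s>` before the next `[INST]` block begins.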
LLM Name: Mixtral Instruction V0.1 Zh Nf4
Repository: 🤗 https://huggingface.co/xinping/Mixtral-instruction-v0.1_zh-nf4
Model Size: 24.2b
Required VRAM: 24.5 GB
Updated: 2025-02-22
Maintainer: xinping
Model Type: mixtral
Instruction-Based: Yes
Model Files: 5.0 GB (1-of-5), 5.0 GB (2-of-5), 5.0 GB (3-of-5), 5.0 GB (4-of-5), 4.5 GB (5-of-5)
Supported Languages: zh, en, fr
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.38.0
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 32000
Torch Data Type: float16
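The details above (NF4 quantization via bitsandbytes, float16 compute dtype, Transformers 4.38.0) can be sketched as a loading snippet. This is a minimal sketch, not the maintainer's documented usage: the repo id comes from the card, while the explicit `BitsAndBytesConfig` is an assumption — if the checkpoint is already serialized in 4-bit, `from_pretrained` may pick the quantization settings up from the repo config and the explicit object can be omitted.

```python
# Sketch: loading this repo in 4-bit NF4 with transformers + bitsandbytes.
# Assumes transformers >= 4.38, bitsandbytes, and a CUDA-capable GPU with
# roughly the 24.5 GB VRAM listed on the card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "xinping/Mixtral-instruction-v0.1_zh-nf4"

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",             # matches the "Nf4" in the model name
    bnb_4bit_compute_dtype=torch.float16,  # card lists torch dtype float16
)

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizer class
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # shard/offload across available devices
)
```

With `device_map="auto"`, accelerate places the five sharded safetensors files across available GPU memory, which is why the 24.5 GB VRAM figure above matters for single-GPU use.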

Best Alternatives to Mixtral Instruction V0.1 Zh Nf4

Best Alternatives                | Context / RAM   | Downloads | Likes
Proto Athena 4x7B                | 32K / 48.4 GB   | 15        | 0
Proto Athena V0.2 4x7B           | 32K / 48.4 GB   | 8         | 0
...erges MoE 4x7b V10 Mixtralv0.3 | 32K / 48.3 GB  | 12        | 0
MoE Merging                      | 32K / 48.3 GB   | 1726      | 0
Sixtyoneeighty 4x7B V1           | 32K / 48.3 GB   | 24        | 0
MixtureofMerges MoE 4x7bRP V11   | 32K / 48.3 GB   | 11        | 1
...icon Mixtral87 Merged Torch212 | 32K / 26.7 GB  | 5         | 0
Eclipse Mistral 4x7b             | 32K / 48.5 GB   | 16        | 1
Kicon Mixtral87 Merged 41766     | 32K / 26.7 GB   | 5         | 0
Boundary Mistral 4x7b MoE        | 32K / 48.7 GB   | 117       | 1
Note: a green score (e.g. "73.2") means that the model outperforms xinping/Mixtral-instruction-v0.1_zh-nf4.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227