Mistral 7B V0.1 by mistralai


Tags: Arxiv:2310.06825 · Autotrain compatible · en · Endpoints compatible · License: apache-2.0 · Mistral · Pretrained · PyTorch · Region: us · Safetensors · Sharded · TensorFlow

Mistral 7B V0.1 Benchmarks

Rank the Mistral 7B V0.1 Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Mistral 7B V0.1 (mistralai/Mistral-7B-v0.1)

Quantized Models of the Mistral 7B V0.1

Mistral 7B V0.1 GGUF · 235124053 GB
Mistral 7B V0.1 GPTQ · 3559054 GB
Mistral 7B V0.1 AWQ · 2992544 GB
...arch Nous Hermes 2 Vision GGUF · 16323015 GB
Dragoman · 91382 GB
...es 2 Mistral 7B 8.0bpw H6 EXL2 · 537 GB
Zephyr 7B Beta 5.0bpw EXL2 · 530 GB
Smaugv0.1 6.0bpw H6 EXL2 · 4326 GB
Cosmosage V2 · 319854 GB
NousResearch Genstruct 7B GGUF · 315514 GB
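The sizes of quantized variants like those above follow roughly from bits per weight: a model stored at b bits per weight needs about params × b / 8 bytes. A minimal sketch of that arithmetic (the ~7.24 B parameter count for Mistral 7B comes from its public model card, and the function name is illustrative):

```python
def quantized_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Rough on-disk/VRAM size of a quantized model, in gigabytes."""
    return n_params * bits_per_weight / 8 / 1e9

# Mistral 7B (~7.24e9 parameters) at 8.0 bits per weight (e.g. an EXL2
# 8.0bpw quant) lands near 7 GB; full 16-bit weights need about 14.5 GB.
print(round(quantized_size_gb(7.24e9, 8.0), 1))   # ~7.2
print(round(quantized_size_gb(7.24e9, 16.0), 1))  # ~14.5
```

Real files add a little overhead for the tokenizer, embeddings kept at higher precision, and metadata, so listed sizes can differ by a few hundred megabytes from this estimate.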

Best Alternatives to Mistral 7B V0.1

Model · Score (HF Rank) · Context / VRAM · Downloads
M7 7B · 76.82 · 32K / 14.4 GB · 771215
J4rviz V3.0 · 76.58 · 32K / 14.4 GB · 15350
Nexim 7B · 76.53 · 32K / 14.4 GB · 20770
TriFusionNexus 7B · 76.32 · 32K / 14.4 GB · 20620
Ramonda 7B DPO Ties · 76.19 · 32K / 14.4 GB · 374110
OGNO 7B DPO Truthful · 76.14 · 32K / 14.4 GB · 38241
Cyrax 7B · 75.98 · 32K / 14.4 GB · 26279
NeuralTrix 7B DPO Laser · 75.92 · 32K / 14.4 GB · 50576
Prima LelantaclesV6.69 7B · 75.7 · 32K / 14.5 GB · 19753
NeuralTrix 7B DPO Relaser · 75.66 · 32K / 14.4 GB · 42972
Note: a green score (e.g. "73.2") means the model scores better than mistralai/Mistral-7B-v0.1.

Mistral 7B V0.1 Parameters and Internals

LLM Name: Mistral 7B V0.1
Repository: Open on 🤗
Model Size: 7B
Required VRAM: 14.4 GB
Model Type: mistral
Model Files: 9.9 GB (1-of-2), 4.5 GB (2-of-2); 9.9 GB (1-of-2), 5.1 GB (2-of-2)
Supported Languages: en
Gated Model: Yes
Model Architecture: MistralForCausalLM
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.34.0.dev0
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: bfloat16
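The 14.4 GB Required VRAM figure is consistent with the bfloat16 data type listed above: two bytes per parameter for roughly 7.24 billion parameters (a count taken from the public model card, not this table). A back-of-envelope check, covering weights only, not the KV cache or activations:

```python
# Bytes per parameter for common weight data types.
BYTES_PER_PARAM = {"float32": 4, "bfloat16": 2, "float16": 2, "int8": 1}

def weight_memory_gb(n_params: float, dtype: str) -> float:
    """Memory needed just to hold the weights, excluding KV cache/activations."""
    return n_params * BYTES_PER_PARAM[dtype] / 1e9

print(round(weight_memory_gb(7.24e9, "bfloat16"), 1))  # ~14.5, close to the listed 14.4 GB
```

Actual inference needs more than this: the KV cache grows with batch size and with context length (up to the 32768 tokens listed above), so plan headroom beyond the raw weight footprint.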

What open-source LLMs or SLMs are you in search of? 35,042 are listed in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024042801