Mistral 22B V0.2 GGUF by failspy


Tags: Autotrain compatible, Endpoints compatible, F16, GGML, GGUF, Mistral, Q2, Quantized, Region: us

Mistral 22B V0.2 GGUF Benchmarks

Mistral 22B V0.2 GGUF (failspy/Mistral-22B-v0.2-GGUF)

Mistral 22B V0.2 GGUF Parameters and Internals

Model Type: Dense (not MoE)
Additional Notes: The model exhibits strong mathematical abilities and improved coding skills. It has agent abilities, can perform multi-turn conversations, and has a 32k sequence length.
Training Details:
- Data Volume: 8x more data than v0.1
- Methodology: Knowledge distilled from all experts into a single dense model
- Context Length: 32000
Input Format: Guanaco chat template
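The Guanaco chat template wraps each turn in `### Human:` / `### Assistant:` markers. A minimal sketch of building a multi-turn prompt in that style (the exact spacing and newline conventions are assumptions; check the model card before relying on them):

```python
def guanaco_prompt(turns):
    """Format (user, assistant) turn pairs in Guanaco style, ending with
    an open '### Assistant:' marker for the model to complete.
    Pass assistant=None for the final, unanswered user turn."""
    parts = []
    for user, assistant in turns:
        parts.append(f"### Human: {user}")
        if assistant is not None:
            parts.append(f"### Assistant: {assistant}")
    parts.append("### Assistant:")
    return "\n".join(parts)

print(guanaco_prompt([("What is 2+2?", "4"), ("And 3+3?", None)]))
```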
Release Notes:
- Version: v0.2
- Date: April 13
- Notes: This model is not a single trained expert; rather, it is a compressed MoE model turned into a dense 22B model. It was trained on 8x more data than v0.1.
LLM Name: Mistral 22B V0.2 GGUF
Repository: 🤗 https://huggingface.co/failspy/Mistral-22B-v0.2-GGUF
Base Model(s): Vezora/Mistral-22B-v0.2
Model Size: 22b
Required VRAM: 8.3 GB
Updated: 2025-02-05
Maintainer: failspy
Model Type: mistral
Model Files: 8.3 GB, 10.8 GB, 11.7 GB, 9.6 GB, 13.3 GB, 12.7 GB, 15.7 GB, 15.3 GB, 18.2 GB, 23.6 GB, 44.5 GB
GGML Quantization: Yes
GGUF Quantization: Yes
Quantization Types: gguf | ggml | q2 | q4_k | q5_k
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 65536
Model Max Length: 65536
Transformers Version: 4.40.0.dev0
Vocabulary Size: 32000
Torch Data Type: bfloat16
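The repository ships the same weights at several quantization levels (the file sizes listed above). A minimal sketch of choosing the highest-quality quant that fits a given VRAM budget; the mapping from quant name to file size below is an assumption for illustration (the sizes come from the list above, but the pairing with specific llama.cpp quant names is hypothetical):

```python
from typing import Optional

# Hypothetical quant-name -> file-size (GB) mapping, using a subset of the
# sizes listed in the Model Files row; check the repo for the real names.
QUANT_SIZES_GB = {
    "Q2_K": 8.3,
    "Q3_K_M": 10.8,
    "Q4_K_M": 13.3,
    "Q5_K_M": 15.7,
    "Q6_K": 18.2,
    "Q8_0": 23.6,
    "F16": 44.5,
}

def pick_quant(vram_gb: float, headroom_gb: float = 1.0) -> Optional[str]:
    """Return the largest quant whose file fits in VRAM while leaving some
    headroom for the KV cache and activations; None if nothing fits."""
    fitting = {q: s for q, s in QUANT_SIZES_GB.items()
               if s + headroom_gb <= vram_gb}
    if not fitting:
        return None
    return max(fitting, key=fitting.get)

print(pick_quant(12.0))  # → Q3_K_M: 10.8 GB fits a 12 GB card with 1 GB spare
```

Note that weight size is only a lower bound on memory use; a longer context window grows the KV cache accordingly.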

Best Alternatives to Mistral 22B V0.2 GGUF

Best Alternatives | Context / RAM | Downloads | Likes
Codestral 22B V0.1 GGUF | 32K / 8.3 GB | 424 | 0
Codestral 22B V0.1 GGUF | 32K / 8.3 GB | 250 | 1
MwM 22B Instruct | 128K / 44.7 GB | 15 | 0
... Instruct 0.2 Chkpt 200 16 Bit | 128K / 44.7 GB | 20 | 1
...l 22B V0.1 Hf FIM Fix Bnb 4bit | 32K / 13.1 GB | 5 | 1
Text2cypher Codestral 16bit | 32K / 44.7 GB | 13 | 3
... Abliterated V3.8.0bpw H8 EXL2 | 32K / 21.2 GB | 7 | 1
Codestral 22B V0.1 EXL2 8.0bpw | 32K / 20.9 GB | 9 | 3
...estral 22B V0.1 Hf 2 2bpw EXL2 | 32K / 6.6 GB | 8 | 1
Codestral 22B V0.1 EXL2 6.0bpw | 32K / 16.9 GB | 7 | 1


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227