Ministral 8B Slerp by allknowingroger

Tags: Merged Model · Base model (finetune): prince-canuma/Ministral-8B-Instruct-2410-HF · Instruct · Mistral · Model-index · Region: us · Safetensors · Sharded · TensorFlow

Ministral 8B Slerp Benchmarks

Scores (nn.n%) indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Ministral 8B Slerp (allknowingroger/Ministral-8B-slerp)

Ministral 8B Slerp Parameters and Internals

LLM Name: Ministral 8B Slerp
Repository: 🤗 https://huggingface.co/allknowingroger/Ministral-8B-slerp
Base Model(s): prince-canuma/Ministral-8B-Instruct-2410-HF (Ministral 8B Instruct 2410 HF)
Merged Model: Yes (slerp merge; see the sketch below)
Model Size: 8b
Required VRAM: 29.2 GB
Updated: 2024-12-21
Maintainer: allknowingroger
Model Type: mistral
Instruction-Based: Yes
Model Files: 28 safetensors shards, 29.2 GB total (2.1 GB: 1-of-28, 2.1 GB: 2-of-28, 1.0 GB: 3-of-28, 1.0 GB: 4-of-28, 1.0 GB: 5-of-28, 0.9 GB: 6-of-28, 1.0 GB: 7-of-28, 1.0 GB: 8-of-28, 1.0 GB: 9-of-28, 0.9 GB: 10-of-28, 1.0 GB: 11-of-28, 1.0 GB: 12-of-28, 1.0 GB: 13-of-28, 0.9 GB: 14-of-28, 1.0 GB: 15-of-28, 1.0 GB: 16-of-28, 1.0 GB: 17-of-28, 0.9 GB: 18-of-28, 1.0 GB: 19-of-28, 1.0 GB: 20-of-28, 1.0 GB: 21-of-28, 0.9 GB: 22-of-28, 1.0 GB: 23-of-28, 1.0 GB: 24-of-28, 1.0 GB: 25-of-28, 0.9 GB: 26-of-28, 1.0 GB: 27-of-28, 0.6 GB: 28-of-28)
Model Architecture: MistralForCausalLM
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.44.2
Tokenizer Class: PreTrainedTokenizerFast
Vocabulary Size: 131072
Torch Data Type: float32
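
The "slerp" in the model name refers to spherical linear interpolation, the merge method that produced this model: rather than averaging two parent checkpoints along a straight line, each weight tensor is interpolated along the arc between the two parents, which preserves the magnitude of the weight vectors. This page does not publish the actual merge recipe (parents, per-layer factors), so the sketch below is only a minimal, hypothetical illustration of the core operation; the tensors and the interpolation factor t are made up for the example.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Treats each tensor as a flat vector; falls back to plain linear
    interpolation when the two vectors are nearly colinear.
    """
    a = v0.flatten().float()
    b = v1.flatten().float()
    a_unit = a / (a.norm() + eps)
    b_unit = b / (b.norm() + eps)
    dot = torch.clamp(torch.dot(a_unit, b_unit), -1.0, 1.0)
    omega = torch.arccos(dot)          # angle between the two weight vectors
    if omega < 1e-4:                   # nearly colinear: slerp degenerates to lerp
        merged = (1.0 - t) * a + t * b
    else:
        sin_omega = torch.sin(omega)
        merged = (torch.sin((1.0 - t) * omega) / sin_omega) * a \
               + (torch.sin(t * omega) / sin_omega) * b
    return merged.reshape(v0.shape).to(v0.dtype)

# Hypothetical usage: a 50/50 blend of the same layer from two parent checkpoints.
layer_a = torch.randn(4096, 4096)  # stand-in for a weight matrix from parent A
layer_b = torch.randn(4096, 4096)  # stand-in for the same matrix from parent B
merged = slerp(0.5, layer_a, layer_b)
```

In practice, merge tools such as mergekit apply this operation tensor by tensor across the two parents, often with a different interpolation factor per layer group.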
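Given the listing above (MistralForCausalLM, PreTrainedTokenizerFast, a 32768-token context, float32 shards totaling 29.2 GB, i.e. roughly 4 bytes per parameter), the model should load with the standard 🤗 transformers API. The sketch below is a hedged example, not taken from the model card: the bfloat16 downcast and device_map="auto" are our assumptions to roughly halve the full-precision memory footprint.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allknowingroger/Ministral-8B-slerp"

# PreTrainedTokenizerFast with a 131072-token vocabulary, per the listing.
tokenizer = AutoTokenizer.from_pretrained(model_id)

# The shards are stored in float32 (~29.2 GB). Loading in bfloat16 roughly
# halves that; the downcast is our assumption, not part of the model card.
# The listing reports Transformers 4.44.2.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # requires accelerate; spreads shards across devices
)

# The model advertises a 32768-token context window.
prompt = "Explain spherical linear interpolation in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```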

Best Alternatives to Ministral 8B Slerp

Best Alternatives                       Context / RAM    Downloads  Likes
Ministral 8B Instruct 2410 HF           32K / 32 GB          50097     10
Ministrations 8B V1                     32K / 16.1 GB          143     15
...flect Mini8Bit Om2 460k Sft T1       32K / 16.1 GB          130      0
...ruct 2410 MetaMathQA DPO Iter1       32K / 16.1 GB          387      0
...t Ministral8Bit MMQA Mix Iter2       32K / 16.1 GB          132      0
...t Ministral8Bit MMQA DPO Iter1       32K / 60.8 GB           89      0
...ect Ministral8Bit Mg DPO Psdp2       32K / 16.1 GB           69      0
...t Ministral8Bit Math DPO Iter1       32K / 16.1 GB           53      0
...t Ministral8Bit MMQA DPO Iter1       32K / 16.1 GB           42      0
...ruct 2410 MetaMathQA DPO Iter2       32K / 16.1 GB           78      0

Rank the Ministral 8B Slerp Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217