Dolphin 2.2.1 Mistral 7B by ehartford


Tags: autotrain-compatible · base model: mistralai/mistral-7... · conversational · dataset: ehartford/dolphin · dataset: jondurbin/airoboros-2.... · en · endpoints-compatible · license: apache-2.0 · mistral · pytorch · region: us · safetensors · sharded · tensorflow

Dolphin 2.2.1 Mistral 7B Benchmarks

nn.n% — how the model compares to GPT-4.

Rank the Dolphin 2.2.1 Mistral 7B Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Dolphin 2.2.1 Mistral 7B (cognitivecomputations/dolphin-2.2.1-mistral-7b)

Best Alternatives to Dolphin 2.2.1 Mistral 7B

| Best Alternatives | HF Rank | Context / VRAM | Downloads | Likes |
|---|---|---|---|---|
| M7 7B | 76.82 | 32K / 14.4 GB | 7712 | 15 |
| J4rviz V3.0 | 76.58 | 32K / 14.4 GB | 1535 | 0 |
| Nexim 7B | 76.53 | 32K / 14.4 GB | 2077 | 0 |
| TriFusionNexus 7B | 76.32 | 32K / 14.4 GB | 2062 | 0 |
| Ramonda 7B DPO Ties | 76.19 | 32K / 14.4 GB | 3741 | 10 |
| OGNO 7B DPO Truthful | 76.14 | 32K / 14.4 GB | 3824 | 1 |
| Cyrax 7B | 75.98 | 32K / 14.4 GB | 2627 | 9 |
| NeuralTrix 7B DPO Laser | 75.92 | 32K / 14.4 GB | 5057 | 6 |
| Prima LelantaclesV6.69 7B | 75.7 | 32K / 14.5 GB | 1975 | 3 |
| NeuralTrix 7B DPO Relaser | 75.66 | 32K / 14.4 GB | 4297 | 2 |
Note: a green score (e.g. "73.2") means the model is better than cognitivecomputations/dolphin-2.2.1-mistral-7b.
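The recurring "14.4 GB" VRAM figure in the table lines up with storing bfloat16 weights: roughly 7.2 billion parameters at 2 bytes each. A quick back-of-the-envelope check (the exact parameter count is an assumption; this page only says "7b"):

```python
# Rough VRAM estimate for the weights alone (no KV cache or activations):
# parameters * bytes-per-parameter. Mistral 7B has roughly 7.24e9
# parameters (an assumed figure, not stated on this page).
params = 7.24e9
bytes_per_param = 2  # bfloat16 = 16 bits = 2 bytes
weight_gb = params * bytes_per_param / 1e9
print(f"{weight_gb:.1f} GB")
```

This lands close to the 14.4 GB listed above, which is also the sum of the two sharded model files (9.9 GB + 4.5 GB). Actual memory use at inference time is higher once the KV cache for a 32K context is included.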

Dolphin 2.2.1 Mistral 7B Parameters and Internals

| Parameter | Value |
|---|---|
| LLM Name | Dolphin 2.2.1 Mistral 7B |
| Repository | Open on 🤗 |
| Base Model(s) | Mistral 7B V0.1 (mistralai/Mistral-7B-v0.1) |
| Model Size | 7b |
| Required VRAM | 14.4 GB |
| Model Type | mistral |
| Model Files | 9.9 GB (1-of-2), 4.5 GB (2-of-2) |
| Supported Languages | en |
| Model Architecture | MistralForCausalLM |
| Context Length | 32768 |
| Model Max Length | 32768 |
| Transformers Version | 4.34.1 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | </s> |
| Vocabulary Size | 32002 |
| Initializer Range | 0.02 |
| Torch Data Type | bfloat16 |
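The parameters above are consistent with Dolphin's ChatML prompt format: the vocabulary size of 32002 is Mistral's base 32000 plus two extra special tokens, which would be the ChatML turn markers. A minimal sketch of building such a prompt by hand — the token names and system message are assumptions based on the Dolphin model card convention, not taken from this page:

```python
# Sketch: format a conversation in ChatML, the prompt style Dolphin
# model cards describe. The <|im_start|> / <|im_end|> markers are
# assumed to be the two tokens that grow the vocabulary from
# 32000 to 32002 (an inference from the numbers above).

def chatml_prompt(messages):
    """Render [{'role': ..., 'content': ...}] dicts as a ChatML string."""
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
             for m in messages]
    # Leave the final assistant turn open so the model completes it.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = chatml_prompt([
    {"role": "system", "content": "You are Dolphin, a helpful AI assistant."},
    {"role": "user", "content": "Summarize Mistral 7B in one sentence."},
])
print(prompt)
```

In practice, `tokenizer.apply_chat_template` in transformers would produce this string from the chat template shipped in the repository; the manual version above just makes the format explicit.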


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024042801