SmaugDolphin 34B Slerp by macadeliccc


Tags: Merged Model · Autotrain compatible · Base model: abacusai/smaug-34b-... · Base model: cognitivecomputatio... · Conversational · Endpoints compatible · Llama · Region: us · Safetensors · Sharded · Tensorflow

Rank the SmaugDolphin 34B Slerp Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
SmaugDolphin 34B Slerp (macadeliccc/SmaugDolphin-34B-slerp)

Best Alternatives to SmaugDolphin 34B Slerp

| Best Alternatives                 | HF Rank | Context/RAM    | Downloads | Likes |
|-----------------------------------|---------|----------------|-----------|-------|
| SUS Chat 34B                      | 85.03   | 8K / 68.9 GB   | 1533      | 119   |
| Yi 34B                            | 79.53   | 4K / 68.9 GB   | 10221     | 1266  |
| SUS Chat 34B GPTQ                 | 74.6    | 8K / 18.6 GB   | 3         | 3     |
| SUS Chat 34B AWQ                  | 74.6    | 8K / 19.3 GB   | 10        | 9     |
| ...Hermes 2 Yi 34B 3.0bpw H6 EXL2 | 69.7    | 4K / 13.9 GB   | 1         | 2     |
| ...Hermes 2 Yi 34B 4.0bpw H6 EXL2 | 69.7    | 4K / 18.1 GB   | 2         | 4     |
| ...ermes 2 Yi 34B 4.65bpw H6 EXL2 | 69.7    | 4K / 20.7 GB   | 1         | 1     |
| ...Hermes 2 Yi 34B 5.0bpw H6 EXL2 | 69.7    | 4K / 22.2 GB   | 2         | 1     |
| Pearl 34B Ties                    |         | 195K / 67.8 GB | 67        | 13    |
| Bagel 34B V0.2                    |         | 195K / 68.7 GB | 7349      | 37    |
Note: a green score (e.g. "73.2") indicates that the model outperforms macadeliccc/SmaugDolphin-34B-slerp.

SmaugDolphin 34B Slerp Parameters and Internals

LLM Name: SmaugDolphin 34B Slerp
Repository: Open on 🤗 Hugging Face (macadeliccc/SmaugDolphin-34B-slerp)
Base Model(s): Dolphin 2.2 Yi 34B 200K (cognitivecomputations/dolphin-2.2-yi-34b-200k); Smaug 34B V0.1 (abacusai/Smaug-34B-v0.1)
Merged Model: Yes
Model Size: 34b
Required VRAM: 55.5 GB
Updated: 2024-07-06
Maintainer: macadeliccc
Model Type: llama
Model Files: 9.8 GB (1-of-6), 9.8 GB (2-of-6), 10.0 GB (3-of-6), 9.8 GB (4-of-6), 9.8 GB (5-of-6), 6.3 GB (6-of-6)
Model Architecture: LlamaForCausalLM
Context Length: 200000
Model Max Length: 200000
Transformers Version: 4.39.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 64000
Initializer Range: 0.02
Torch Data Type: float16
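The "Slerp" in the model name refers to spherical linear interpolation, the technique used to blend the weights of the two base models. The maintainer's actual merge configuration is not shown here (merges like this are typically produced with a tool such as mergekit), but the underlying math can be sketched with a minimal, self-contained example:

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate values of t follow
    the great-circle arc between the two directions rather than the
    straight chord used by plain linear interpolation.
    """
    # Cosine of the angle between the two vectors.
    dot = sum(a * b for a, b in zip(v0, v1))
    norm0 = math.sqrt(sum(a * a for a in v0))
    norm1 = math.sqrt(sum(b * b for b in v1))
    cos_theta = max(-1.0, min(1.0, dot / (norm0 * norm1 + eps)))

    # Nearly (anti-)parallel vectors: fall back to linear interpolation,
    # since sin(theta) would be close to zero.
    if abs(cos_theta) > 1.0 - eps:
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]

    theta = math.acos(cos_theta)
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Toy example: halfway between two orthogonal "parameter" vectors.
merged = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
print(merged)  # both components equal, lying on the unit circle
```

In a real merge this interpolation is applied tensor-by-tensor across the two checkpoints, often with a different t per layer; the sketch above only illustrates the per-vector operation.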


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024042801