Nous Hermes 2 Yi 34B 4.65bpw H6 EXL2 by LoneStriker


  Autotrain compatible   Base model:01-ai/yi-34b   Chatml   Conversational   Distillation   En   Endpoints compatible   Exl2   Finetuned   Gpt4   Instruct   License:apache-2.0   Llama   Quantized   Region:us   Safetensors   Sharded   Synthetic data   Tensorflow   Yi

Nous Hermes 2 Yi 34B 4.65bpw H6 EXL2 Benchmarks

Rank the Nous Hermes 2 Yi 34B 4.65bpw H6 EXL2 Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Nous Hermes 2 Yi 34B 4.65bpw H6 EXL2 (LoneStriker/Nous-Hermes-2-Yi-34B-4.65bpw-h6-exl2)

Best Alternatives to Nous Hermes 2 Yi 34B 4.65bpw H6 EXL2

Best Alternatives                    HF Rank   Context / RAM    Downloads   Likes
SUS Chat 34B                         85.03     8K / 68.9 GB     4884        116
Yi 34B                               79.53     4K / 68.9 GB     11457       1256
SUS Chat 34B GPTQ                    74.6      8K / 18.6 GB     2           3
SUS Chat 34B AWQ                     74.6      8K / 19.3 GB     42          9
...Hermes 2 Yi 34B 3.0bpw H6 EXL2    69.7      4K / 13.9 GB     3           2
...Hermes 2 Yi 34B 4.0bpw H6 EXL2    69.7      4K / 18.1 GB     4           4
...Hermes 2 Yi 34B 5.0bpw H6 EXL2    69.7      4K / 22.2 GB     4           1
Tess M 34B 2bit                      -         195K / 10.2 GB   3           6
...B 200K RPMerge 4.65bpw H6 EXL2    -         195K / 10.5 GB   6           1
Smaug 34B V0.1 2.4bpw H6 EXL2        -         195K / 11.4 GB   6           1
Note: green Score (e.g. "73.2") means that the model is better than LoneStriker/Nous-Hermes-2-Yi-34B-4.65bpw-h6-exl2.
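The RAM figures above roughly track quantization bit-width: an EXL2 quant's weight footprint is approximately parameter count times bits-per-weight divided by 8. A quick sanity check in Python, assuming Yi-34B's roughly 34.4B parameters (the exact count is an assumption, and real usage adds KV cache and runtime overhead on top of the weights):

```python
def weight_footprint_gb(n_params: float, bpw: float) -> float:
    """Approximate on-disk/in-VRAM weight size for a quantized model.

    bytes = parameters * bits-per-weight / 8; returned in GB (1e9 bytes).
    """
    return n_params * bpw / 8 / 1e9

# ~34.4B parameters at 4.65 bits per weight
print(f"{weight_footprint_gb(34.4e9, 4.65):.1f} GB")  # → 20.0 GB
```

This lands in the same ballpark as the 20.7 GB Required VRAM listed below; the gap is explained by the H6 (6-bit) head layer and non-weight overhead.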

Nous Hermes 2 Yi 34B 4.65bpw H6 EXL2 Parameters and Internals

LLM Name: Nous Hermes 2 Yi 34B 4.65bpw H6 EXL2
Repository: Open on 🤗 Hugging Face
Base Model(s): Yi 34B (01-ai/Yi-34B)
Model Size: 34B
Required VRAM: 20.7 GB
Updated: 2024-05-22
Maintainer: LoneStriker
Model Type: llama
Model Files: 8.5 GB (1-of-3), 8.5 GB (2-of-3), 3.7 GB (3-of-3)
Supported Languages: en
Quantization Type: exl2
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 64000
Initializer Range: 0.02
Torch Data Type: bfloat16
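The Chatml tag on this model indicates the Nous Hermes 2 prompt format, which wraps each turn in `<|im_start|>`/`<|im_end|>` delimiters. A minimal sketch of assembling such a prompt (the role names follow ChatML convention; the system message shown is illustrative, not taken from the model card):

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a single-turn ChatML prompt.

    Each message is delimited by <|im_start|>{role}\n ... <|im_end|>,
    and the prompt ends with an open assistant turn for the model to complete.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt("You are a helpful assistant.", "Hello!")
print(prompt)
```

The trailing open `<|im_start|>assistant` turn is what cues the model to generate its reply; generation is typically stopped on the `<|im_end|>` token.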


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024042801