Openchat 3.5 0106 by openchat


Arxiv:2303.08774 · Arxiv:2309.11235 · Autotrain compatible · Base model: mistralai/mistral-7... · C-RLFT · Conversational · Endpoints compatible · Has space · License: apache-2.0 · Mistral · Openchat · Region: us · Safetensors · Sharded · Tensorflow

Openchat 3.5 0106 Benchmarks

Rank the Openchat 3.5 0106 Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Openchat 3.5 0106 (openchat/openchat-3.5-0106)
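
For reference, OpenChat 3.5 models are prompted with the "GPT4 Correct" conversation format described on the model card. A minimal sketch of that format (the `format_openchat` helper is hypothetical, not part of any library):

```python
# Minimal sketch of the OpenChat 3.5 "GPT4 Correct" prompt format,
# with <|end_of_turn|> as the turn delimiter per the model card.
# format_openchat is an illustrative helper, not a library function.

def format_openchat(messages):
    """Render a list of {role, content} dicts into an OpenChat 3.5 prompt."""
    role_names = {"user": "GPT4 Correct User", "assistant": "GPT4 Correct Assistant"}
    parts = [f"{role_names[m['role']]}: {m['content']}<|end_of_turn|>" for m in messages]
    # A trailing assistant tag prompts the model to generate the next reply.
    return "".join(parts) + "GPT4 Correct Assistant:"

prompt = format_openchat([{"role": "user", "content": "Hello"}])
print(prompt)
# GPT4 Correct User: Hello<|end_of_turn|>GPT4 Correct Assistant:
```

In practice the tokenizer's built-in chat template produces this format automatically; the sketch only makes the layout explicit.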

Quantized Models of the Openchat 3.5 0106

Quantized Model                   Likes   Downloads   Size
Openchat 3.5 0106 GGUF            6       64687       3 GB
Openchat 3.5 0106 AWQ             5       1150        4 GB
Openchat 3.5 0106 GPTQ            5       1053        4 GB
OpenChat 3.5 0106 GGUF            2       114         3 GB
Openchat 3.5 0106 GPTQ            1       213         4 GB
Newton 7B 8.0bpw H8 EXL2          1       11          7 GB
Openchat 3.5 0106 GGUF            0       1549        2 GB
Fork Openchat 3.5 0106 Gptq       0       78          4 GB
Openchat 3.5 0106 GGUF            0       70          3 GB
Newton 7B AWQ                     0       29          4 GB
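
The file sizes above roughly track bits per weight: a model with n parameters quantized to b bits needs about n·b/8 bytes before format overhead. A back-of-the-envelope sketch (the numbers are illustrative estimates, not exact file sizes):

```python
# Rough on-disk size of a quantized model: params * bits / 8 bytes.
# Ignores metadata and quantization scales/zero-points, so real
# files run somewhat larger than this estimate.

def quant_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate size in GB (1 GB = 1e9 bytes)."""
    return n_params * bits_per_weight / 8 / 1e9

for bits in (4, 5, 8):
    print(f"{bits}-bit 7B ≈ {quant_size_gb(7e9, bits):.1f} GB")
# 4-bit 7B ≈ 3.5 GB
# 5-bit 7B ≈ 4.4 GB
# 8-bit 7B ≈ 7.0 GB
```

This lines up with the table: the 3–4 GB GGUF/AWQ/GPTQ files correspond to roughly 4–5 bits per weight, and the ~7 GB 8.0bpw EXL2 file to 8 bits.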

Best Alternatives to Openchat 3.5 0106

Best Alternatives                    HF Rank   Context / Size    Downloads   Likes
KAI 7B V0.1                          74.45     32K / 14.4 GB     67          9
Dolphin 2.2.1 Mistral 7B             73.17     32K / 14.4 GB     24943       182
Mistral 7B V0.1                      68.53     32K / 14.4 GB     2157514     3081
...andle Dolphin 2.2.1 Mistral 7B    64.2      32K / 14.4 GB     82          0
Mistral 7B Instruct V0.2             62.23     32K / 14.4 GB     2589211     1839
Notus 7B V1                          60.15     32K / 14.4 GB     5789        111
...t 3.5 0106 128K 3.0bpw H6 EXL2    60.1      128K / 3 GB       8           0
...t 3.5 0106 128K 4.0bpw H6 EXL2    60.1      128K / 3.9 GB     15          1
...t 3.5 0106 128K 5.0bpw H6 EXL2    60.1      128K / 4.7 GB     5           0
...t 3.5 0106 128K 6.0bpw H6 EXL2    60.1      128K / 5.6 GB     5           0
Note: a green score (e.g. "73.2") means the model outperforms openchat/openchat-3.5-0106.

Openchat 3.5 0106 Parameters and Internals

LLM Name: Openchat 3.5 0106
Repository: Open on 🤗
Base Model(s): Mistral 7B V0.1 (mistralai/Mistral-7B-v0.1)
Model Size: 7B
Required VRAM: 14.4 GB
Model Type: mistral
Model Files: 4.9 GB (1-of-3), 5.0 GB (2-of-3), 4.5 GB (3-of-3)
Model Architecture: MistralForCausalLM
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.36.1
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32002
Initializer Range: 0.02
Torch Data Type: bfloat16
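
The three safetensors shards listed above account for the 14.4 GB VRAM figure. A hedged loading sketch with Hugging Face transformers (the load function is shown but not executed here, since it downloads ~14.4 GB of weights):

```python
# Shard sizes from the table above; they sum to the Required VRAM figure.
shards_gb = [4.9, 5.0, 4.5]
print(f"total: {sum(shards_gb):.1f} GB")  # total: 14.4 GB

def load_openchat():
    """Sketch: load openchat/openchat-3.5-0106 in bfloat16.

    Assumes transformers and torch are installed and ~14.4 GB of
    GPU (or CPU) memory is available; device_map="auto" lets
    accelerate place the shards.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("openchat/openchat-3.5-0106")
    model = AutoModelForCausalLM.from_pretrained(
        "openchat/openchat-3.5-0106",
        torch_dtype=torch.bfloat16,  # matches the Torch Data Type above
        device_map="auto",
    )
    return tok, model
```

Loading in the listed bfloat16 dtype keeps memory at the stated 14.4 GB; loading in float32 would roughly double it.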


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024040901