Mixtral 8x7b Instruct V0.1 Int4 Ov by OpenVINO


Tags: autotrain-compatible, en, endpoints-compatible, instruct, license:apache-2.0, mixtral, moe, openvino, region:us

Rank the Mixtral 8x7b Instruct V0.1 Int4 Ov Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Mixtral 8x7b Instruct V0.1 Int4 Ov (OpenVINO/mixtral-8x7b-instruct-v0.1-int4-ov)

Best Alternatives to Mixtral 8x7b Instruct V0.1 Int4 Ov

Best Alternatives | Context / Size | HF Rank
...al 8x7B Instruct V0.1 GPT Fast | 32K / GB | 51
...ct V0.1 Agent Function Calling | 32K / 44.3 GB | 22
...ral 8x7B Instruct V0.1 Int8 Ov | 32K / 46.7 GB | 150
Dolphin 2.5 Mixtral 8x7b | 32K / 93.6 GB | 505001160
Dolphin 2.6 Mixtral 8x7b | 32K / 93.6 GB | 2215181
Dolphin 2.7 Mixtral 8x7b | 32K / 93.6 GB | 3291148
Mixtral 8x7B Instruct V0.1 HF | 32K / 93.6 GB | 872
...tral 8x7B Instruct V0.1 Polish | 32K / 93.6 GB | 242
Taiwan LLM MoE Pilot | 32K / 93.6 GB | 22
Empower Functions Medium | 32K / 93.6 GB | 430

Mixtral 8x7b Instruct V0.1 Int4 Ov Parameters and Internals

LLM Name: Mixtral 8x7b Instruct V0.1 Int4 Ov
Repository: Open on 🤗
Required VRAM: 28.7 GB
Model Type: mixtral
Model Files: 28.7 GB
Supported Languages: en
Model Architecture: MixtralForCausalLM
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.38.2
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: bfloat16
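
The parameters above are enough to load the model for local inference. As a minimal sketch, not a recipe taken from this page: OpenVINO-exported causal LMs such as this repo (OpenVINO/mixtral-8x7b-instruct-v0.1-int4-ov) are typically loaded through the optimum-intel integration rather than plain transformers; the chat-template call and the generation settings below are illustrative assumptions.

```python
# Minimal sketch: load the pre-exported OpenVINO int4 model for CPU inference.
# Assumes `pip install optimum[openvino]`; settings here are illustrative,
# not values stated on this page.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "OpenVINO/mixtral-8x7b-instruct-v0.1-int4-ov"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# The repo already contains OpenVINO IR files, so no export step is needed.
model = OVModelForCausalLM.from_pretrained(model_id)

# Mixtral-Instruct expects the [INST] ... [/INST] chat format; apply_chat_template
# renders it from the tokenizer's chat template (assumed to ship with the repo).
messages = [{"role": "user", "content": "Explain int4 weight quantization in one paragraph."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Note that the 28.7 GB of int4 model files must fit in system RAM for CPU inference; running on an Intel GPU instead would follow the usual OpenVINO device-selection conventions.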


Original data from HuggingFace, OpenCompass, and various public Git repositories.
Release v2024042801