Mixtral 8x7B V0.1 Fp8 by FriendliAI


Tags: 8-bit · Autotrain compatible · Base model: mistralai/mixtral-8... · License: apache-2.0 · Mixtral · MoE · Pretrained · Region: us · Safetensors · Sharded · Tensorflow

Rank the Mixtral 8x7B V0.1 Fp8 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Mixtral 8x7B V0.1 Fp8 (FriendliAI/Mixtral-8x7B-v0.1-fp8)
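
For orientation, below is a minimal loading sketch for this checkpoint. Only the repository id comes from this listing; the rest is a plain transformers workflow given as an assumption, and because the weights are stored in FP8 an FP8-aware runtime (for example vLLM or Friendli Engine) may be needed to keep them quantized at inference time.

```python
# Minimal sketch (not from the model card): load the checkpoint with
# Hugging Face transformers. The repo id is from this listing; FP8 weights
# may be dequantized or rejected depending on your transformers version.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "FriendliAI/Mixtral-8x7B-v0.1-fp8"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",  # keep the dtype declared in the checkpoint config
    device_map="auto",   # shard across available GPUs (requires accelerate)
)

prompt = "Mixtral 8x7B is a sparse mixture-of-experts model that"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```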

Best Alternatives to Mixtral 8x7B V0.1 Fp8

| Best Alternatives | HF Rank | Context / RAM | Downloads | Likes |
|---|---|---|---|---|
| Mixtral 8x7B V0.1 | 77.95 | 32K / 93.6 GB | 1046472 | 1587 |
| Mixtral 8x7B Instruct V0.1 | 77.75 | 32K / 93.6 GB | 529186 | 3945 |
| ...lQA Mixtral 8x7B Instruct V0.1 | | 32K / 43.3 GB | 5 | 2 |
| Mixtral 8x7B Instruct V0.1 FP8 | | 32K / 47.1 GB | 237 | 1 |
| ...tral 8x7B Instruct V0.1 FP8 V3 | | 32K / 47.1 GB | 36 | 0 |
| ...tral 8x7B Instruct V0.1 FP8 V2 | | 32K / 47.1 GB | 11 | 0 |
| ...tral 8x7B Instruct V0.1 FP8 V1 | | 32K / 47.1 GB | 6 | 0 |
| Aldan Mix 8x7B | | 32K / 89.4 GB | 1 | 1 |
| Taiwan LLM 8x7B DPO | | 32K / 90 GB | 574 | 18 |
| Mixtral Instruct ITR 8x7B | | 32K / 91.4 GB | 1 | 1 |

Note: a green (highlighted) score, e.g. "73.2", means that model ranks better than FriendliAI/Mixtral-8x7B-v0.1-fp8.
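
The Context / RAM column also shows why FP8 repacks exist: the bf16 Mixtral 8x7B checkpoints weigh roughly 93.6 GB, while the FP8 variants need about 47 GB. A back-of-the-envelope check against the 46.7B parameter count listed under Parameters and Internals below (this arithmetic is illustrative and not part of the original listing):

```python
# Illustrative weight-memory estimate for a 46.7B-parameter model.
# Real VRAM usage is higher once the KV cache and activations are added.
params = 46.7e9

bytes_per_param = {"bf16": 2.0, "fp8": 1.0}

for dtype, nbytes in bytes_per_param.items():
    print(f"{dtype}: ~{params * nbytes / 1e9:.1f} GB of weights")

# bf16: ~93.4 GB of weights -> close to the ~93.6 GB bf16 checkpoints above
# fp8:  ~46.7 GB of weights -> close to the ~47 GB listed for this FP8 repack
```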

Mixtral 8x7B V0.1 Fp8 Parameters and Internals

LLM Name: Mixtral 8x7B V0.1 Fp8
Repository: FriendliAI/Mixtral-8x7B-v0.1-fp8 (open on 🤗 Hugging Face)
Model Name: mistralai/Mixtral-8x7B-v0.1
Base Model(s): Mixtral 8x7B V0.1 (mistralai/Mixtral-8x7B-v0.1)
Model Size: 46.7b
Required VRAM: 47 GB
Updated: 2024-07-07
Maintainer: FriendliAI
Model Type: mixtral
Model Files: 9.9 GB (1-of-5), 10.0 GB (2-of-5), 10.0 GB (3-of-5), 10.0 GB (4-of-5), 7.1 GB (5-of-5)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.40.0
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: bfloat16
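
The context length, vocabulary size, tokenizer class, and torch dtype above come from the repository's config and tokenizer files. A quick sketch for verifying them locally (assumes network access to the Hugging Face Hub; the expected values in the comments are taken from this listing):

```python
# Sketch: read the listed internals directly from the Hub config/tokenizer.
from transformers import AutoConfig, AutoTokenizer

repo_id = "FriendliAI/Mixtral-8x7B-v0.1-fp8"

config = AutoConfig.from_pretrained(repo_id)
tokenizer = AutoTokenizer.from_pretrained(repo_id)

print(config.model_type)               # expected: "mixtral"
print(config.max_position_embeddings)  # expected: 32768
print(config.vocab_size)               # expected: 32000
print(config.torch_dtype)              # expected: torch.bfloat16
print(tokenizer.__class__.__name__)    # expected: a Llama-style tokenizer
print(tokenizer.pad_token)             # expected per the listing: "</s>"
```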


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024042801