Mixtral 8x7B 1l by joelb


Tags: Arxiv:1910.09700 · Autotrain compatible · Endpoints compatible · Mixtral · Model hub mixin · Moe · Pytorch model hub mixin · Region:us · Safetensors

Rank the Mixtral 8x7B 1l Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Mixtral 8x7B 1l (joelb/Mixtral-8x7B-1l)

Best Alternatives to Mixtral 8x7B 1l

Best Alternatives             HF Rank   Context/RAM    Downloads   Likes
Mistral Prot V1.1.6B          –         0.5K / 0 GB    57          0
Mistral DNA V1.1.6B Hg38      –         0.5K / 0 GB    8           0
Mistral Chem V1.1.6B          –         0.5K / 0 GB    6           0

Mixtral 8x7B 1l Parameters and Internals

LLM Name               Mixtral 8x7B 1l
Repository             Open on 🤗 (joelb/Mixtral-8x7B-1l)
Model Size             1.6b
Required VRAM          5.8 GB
Updated                2024-07-13
Maintainer             joelb
Model Type             mixtral
Model Files            5.8 GB
Model Architecture     MixtralForCausalLM
Context Length         32768
Model Max Length       32768
Transformers Version   4.40.2
Vocabulary Size        32000
Initializer Range      0.02
Torch Data Type        bfloat16
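
Given the listed architecture (MixtralForCausalLM, bfloat16 weights, 32768-token context), the model should load through the standard transformers API. The following is a minimal sketch, assuming the repository id joelb/Mixtral-8x7B-1l shown above, transformers 4.40.2 or later, and roughly 5.8 GB of free memory for the weights; the prompt and generation settings are illustrative, not part of the listing.

```python
# Minimal loading sketch for joelb/Mixtral-8x7B-1l (assumes the repo id
# shown on this page; transformers >= 4.40.2 and ~5.8 GB VRAM per the table).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "joelb/Mixtral-8x7B-1l"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the listed torch data type
    device_map="auto",           # place the ~5.8 GB of weights automatically
)

# Illustrative generation call; parameters are arbitrary.
prompt = "The mixture-of-experts architecture works by"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that device_map="auto" requires the accelerate package; omit it to load the model onto a single default device instead.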


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024042801