MixTAO 7Bx2 MoE Instruct V7.0 by zhengr


Tags: Autotrain compatible, Endpoints compatible, Instruct, License: apache-2.0, Mixtral, Model-index, MoE, Region: us, Safetensors, Sharded, TensorFlow

MixTAO 7Bx2 MoE Instruct V7.0 (zhengr/MixTAO-7Bx2-MoE-Instruct-v7.0)

Quantized Models of the MixTAO 7Bx2 MoE Instruct V7.0

Model | Likes | Downloads | VRAM
MixTAO 7Bx2 MoE Instruct V7.0 GGUF | 10 | 339 | 4 GB
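A GGUF quantization like the one above can be run locally with llama-cpp-python. The sketch below is a minimal example, not the maintainer's documented usage; the GGUF filename is hypothetical (use whichever quant file you actually download), and the context size comes from the parameters table further down.

```python
# Minimal sketch: running a GGUF quant of the model with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="mixtao-7bx2-moe-instruct-v7.0.Q2_K.gguf",  # hypothetical filename
    n_ctx=32768,       # full 32K context window (see parameters table below)
    n_gpu_layers=-1,   # offload all layers to GPU if VRAM allows
)

out = llm("Summarize what a 2x7B mixture-of-experts model is.", max_tokens=128)
print(out["choices"][0]["text"])
```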

Best Alternatives to MixTAO 7Bx2 MoE Instruct V7.0

Best Alternatives | Context/RAM | Downloads | Likes
Mixtral 7Bx2 MoE 13B | 32K / 25.8 GB | 741 | 7
MemGPT DPO MoE Test | 32K / 25.8 GB | 1 | 5
...tral 7B Instruct V0.2 2x7B MoE | 32K / 25.8 GB | 1630 | 4
Mistral Math 2x7b Mix | 32K / 25.8 GB | 445 | 4
Megatron V3 2x7B | 32K / 25.8 GB | 721 | 3
...tral Instruct MoE Experimental | 32K / 25.8 GB | 723 | 2
MoEstral 2x7B | 32K / 25.8 GB | 11 | 2
Rain 2x7B MoE 32K V0.1 | 32K / 25.8 GB | 1 | 2
Mistral 2x7b V0.1 | 32K / 25.8 GB | 3 | 1
My Mixtral 2x7B | 32K / 25.8 GB | 3 | 1

MixTAO 7Bx2 MoE Instruct V7.0 Parameters and Internals

LLM Name: MixTAO 7Bx2 MoE Instruct V7.0
Repository: zhengr/MixTAO-7Bx2-MoE-Instruct-v7.0 (Hugging Face)
Model Size: 12.9B
Required VRAM: 25.7 GB
Updated: 2024-07-04
Maintainer: zhengr
Model Type: mixtral
Instruction-Based: Yes
Model Files: 9.9 GB (1-of-3), 9.9 GB (2-of-3), 5.9 GB (3-of-3)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.38.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: bfloat16
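The parameters above map directly onto a standard transformers loading call. The following is a minimal sketch assuming the stock Hugging Face API (the repo ID comes from this page; the prompt is illustrative and may not match the model's expected instruct template). Loading in bfloat16 needs roughly the 25.7 GB of VRAM listed above.

```python
# Minimal sketch: loading the model with Hugging Face transformers,
# using the settings from the parameters table (bfloat16, 32K context).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "zhengr/MixTAO-7Bx2-MoE-Instruct-v7.0"

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # LlamaTokenizer class
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches Torch Data Type above
    device_map="auto",           # shard across available GPUs/CPU
)

prompt = "Explain mixture-of-experts routing in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```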


Original data from Hugging Face, OpenCompass, and various public git repos.
Release v2024042801