MixTAO 7Bx2 MoE V8.1 by zhengr


Tags: Autotrain compatible · Endpoints compatible · License: apache-2.0 · Mixtral · Model-index · MoE · Region: us · Safetensors · Sharded · TensorFlow

MixTAO 7Bx2 MoE V8.1 (zhengr/MixTAO-7Bx2-MoE-v8.1)
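The likes and download counts in the tables below come from the Hugging Face Hub and change over time; they can be queried directly with the huggingface_hub client, as in this minimal sketch:

```python
# Fetch live metadata for zhengr/MixTAO-7Bx2-MoE-v8.1 from the Hugging Face Hub.
from huggingface_hub import model_info

info = model_info("zhengr/MixTAO-7Bx2-MoE-v8.1")
print(info.likes, info.downloads)  # counts drift, so expect different numbers
print(info.tags)                   # e.g. mixtral, moe, safetensors
```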

Quantized Models of the MixTAO 7Bx2 MoE V8.1

Model                        Likes   Downloads   VRAM
MixTAO 7Bx2 MoE V8.1 GGUF    11      612         4 GB
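A GGUF quantization trades some accuracy for a much smaller memory footprint. A minimal sketch of running one with llama-cpp-python follows; the quant level and .gguf filename are assumptions, since this page does not list the individual files:

```python
# Run a GGUF quantization of MixTAO-7Bx2-MoE-v8.1 with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="mixtao-7bx2-moe-v8.1.Q4_K_M.gguf",  # hypothetical local filename
    n_ctx=32768,      # matches the model's 32K context length
    n_gpu_layers=-1,  # offload all layers to the GPU if memory allows
)

out = llm("Q: What does a mixture-of-experts router do? A:", max_tokens=128)
print(out["choices"][0]["text"])
```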

Best Alternatives to MixTAO 7Bx2 MoE V8.1

Best Alternatives                Context / RAM    Downloads   Likes
MonarchCoder MoE 2x7B            32K / 22.8 GB    726         1
Boundary Hermes Chat 2x7B MoE    32K / 25.5 GB    384         1
MixTAO 7Bx2 MoE Instruct V7.0    32K / 25.7 GB    755         19
DARE TIES 13B                    32K / 25.7 GB    6222        10
MultiMash5 12B Slerp             32K / 25.7 GB    387         0
MultiMash2 12B Slerp             32K / 25.7 GB    385         0
MultiMash7 12B Slerp             32K / 25.7 GB    382         0
MultiMash6 12B Slerp             32K / 25.7 GB    380         0
MultiMash 12B Slerp              32K / 25.7 GB    379         0
MultiMash9 13B Slerp             32K / 25.7 GB    333         0
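The RAM figures above track parameter count times bytes per weight. As a sanity check, 12.9B parameters in bfloat16 (2 bytes each) come to about 25.8 GB, matching the Required VRAM listed below; this counts weights only and ignores KV cache and activations:

```python
# Back-of-the-envelope weight memory for a 12.9B-parameter model in bfloat16.
params = 12.9e9
bytes_per_param = 2  # bfloat16
print(f"{params * bytes_per_param / 1e9:.1f} GB")  # -> 25.8 GB, weights only
```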

MixTAO 7Bx2 MoE V8.1 Parameters and Internals

LLM Name: MixTAO 7Bx2 MoE V8.1
Repository: zhengr/MixTAO-7Bx2-MoE-v8.1 (Hugging Face)
Model Size: 12.9b
Required VRAM: 25.8 GB
Updated: 2024-07-04
Maintainer: zhengr
Model Type: mixtral
Model Files: 13 safetensors shards (1-of-13: 1.9 GB, 2-of-13: 2.0 GB, 3-of-13: 2.0 GB, 4-of-13: 2.0 GB, 5-of-13: 1.9 GB, 6-of-13 through 13-of-13: 2.0 GB each)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.38.1
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: bfloat16
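Given the metadata above (MixtralForCausalLM, bfloat16 weights, 32K context), a minimal loading sketch with Transformers might look like this; device placement and the generation settings are assumptions, not taken from this page:

```python
# Load zhengr/MixTAO-7Bx2-MoE-v8.1 with Hugging Face Transformers.
# Assumes ~25.8 GB of memory for the bfloat16 weights, or that accelerate
# spreads layers across available devices via device_map="auto".
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "zhengr/MixTAO-7Bx2-MoE-v8.1"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,  # matches the listed Torch Data Type
    device_map="auto",
)

inputs = tokenizer(
    "Explain mixture-of-experts routing in one sentence.",
    return_tensors="pt",
).to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```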


Original data from Hugging Face, OpenCompass, and various public git repos (release v2024042801).