MixTAO 19B Pass by allknowingroger


Tags: Merged Model · AutoTrain compatible · Base model: zhengr/MixTAO-7Bx2-MoE-v8.1 · Endpoints compatible · Mixtral · Model-index · MoE · Region: US · Safetensors · Sharded · TensorFlow

MixTAO 19B Pass Benchmarks

Benchmark scores are reported as percentages relative to reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
MixTAO 19B Pass (allknowingroger/MixTAO-19B-pass)

MixTAO 19B Pass Parameters and Internals

Model Type: text-generation
Additional Notes: MixTAO-19B-pass was created with LazyMergekit, a tool for model merging (see the config sketch below).
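For context, LazyMergekit is a thin wrapper around mergekit, which drives merges from a YAML config. Below is a minimal sketch of a passthrough-style merge; the layer ranges and output directory are illustrative assumptions, not the actual recipe used for this model.

```python
# Illustrative sketch of a mergekit passthrough merge.
# The layer ranges below are ASSUMPTIONS, not the recipe
# actually used for MixTAO-19B-pass.
import subprocess

config = """\
slices:
  - sources:
      - model: zhengr/MixTAO-7Bx2-MoE-v8.1
        layer_range: [0, 24]
  - sources:
      - model: zhengr/MixTAO-7Bx2-MoE-v8.1
        layer_range: [8, 32]
merge_method: passthrough
dtype: float16
"""

with open("merge_config.yaml", "w") as f:
    f.write(config)

# mergekit's CLI entry point: downloads the source model and
# writes the merged weights to the output directory.
subprocess.run(
    ["mergekit-yaml", "merge_config.yaml", "./MixTAO-19B-pass"],
    check=True,
)
```

A passthrough merge stacks (possibly overlapping) layer ranges from the source model without any training, which is how a ~12.9B two-expert MoE base can grow to the 19.2B parameters listed below.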
LLM Name: MixTAO 19B Pass
Repository 🤗: https://huggingface.co/allknowingroger/MixTAO-19B-pass
Base Model(s): zhengr/MixTAO-7Bx2-MoE-v8.1
Merged Model: Yes
Model Size: 19.2b
Required VRAM: 38.1 GB
Updated: 2024-12-14
Maintainer: allknowingroger
Model Type: mixtral
Model Files: 41 sharded safetensors files (18 × 1.0 GB, 22 × 0.9 GB, 1 × 0.3 GB), 38.1 GB total
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.41.1
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: float16
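The VRAM figure follows from the parameter count and dtype: 19.2B parameters × 2 bytes per float16 weight ≈ 38.4 GB of weights alone, consistent with the 38.1 GB of shard files. A minimal loading sketch with Hugging Face transformers is below; the prompt and generation settings are placeholders, and enough GPU memory for ~38 GB of weights is assumed.

```python
# Minimal inference sketch for allknowingroger/MixTAO-19B-pass.
# device_map="auto" shards the ~38 GB of float16 weights across
# available GPUs (or offloads to CPU if memory is short).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allknowingroger/MixTAO-19B-pass"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the listed Torch Data Type
    device_map="auto",
)

prompt = "Explain what a passthrough model merge does."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Context window is 32768 tokens; keep prompt + output under that.
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```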

Best Alternatives to MixTAO 19B Pass

Best Alternatives                  | Context / RAM  | Downloads | Likes
Multimerge 19B Pass                | 32K / 38 GB    | 10        | 0
Lorge 2x7B UAMM                    | 32K / 38.2 GB  | 16        | 0
Mistralmath 15B Pass               | 32K / 38.5 GB  | 11        | 0
TaoPassthrough 15B S               | 32K / 38.4 GB  | 20        | 0
Raccoon Small                      | 32K / 38.4 GB  | 86        | 1
...oundary Solar Chat 2x10.7B MoE  | 4K / 38 GB     | 123       | 1
Mixtral 11Bx2 MoE 19B              | 4K / 38.4 GB   | 1275      | 38
Truthful DPO MoE 19B               | 4K / 38.4 GB   | 1207      | 1
Venus DPO 50                       | 4K / 38.4 GB   | 1210      | 0
SOLAR Math 2x10.7B V0.2            | 4K / 38.4 GB   | 1215      | 3
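Download and like counts are snapshots and drift quickly; they can be re-checked against the Hugging Face Hub at any time. A small sketch using the official huggingface_hub client follows; only allknowingroger/MixTAO-19B-pass is confirmed by this listing, so treat any other repo id you query as an example.

```python
# Fetch current download/like counts from the Hugging Face Hub.
from huggingface_hub import HfApi

api = HfApi()
info = api.model_info("allknowingroger/MixTAO-19B-pass")
print(f"{info.id}: {info.downloads} downloads, {info.likes} likes")
```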

Rank the MixTAO 19B Pass Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from Hugging Face, OpenCompass, and various public git repositories.
Release v20241124