TaoPassthrough 15B S by allknowingroger


Tags: Merged Model · Autotrain compatible · Base model (finetune): mixtao/mix... · Base model: mixtao/mixtao-7bx2-... · Endpoints compatible · Mixtral · MoE · Region: us · Safetensors · Sharded · TensorFlow · zhengr/mixtao-7bx2-moe-v8.1

TaoPassthrough 15B S Benchmarks (allknowingroger/TaoPassthrough-15B-s)

TaoPassthrough 15B S Parameters and Internals

Additional Notes
The model is a passthrough merge of the same base model, zhengr/MixTAO-7Bx2-MoE-v8.1, repeated five times, built with LazyMergekit.
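As a rough illustration, a LazyMergekit/mergekit passthrough recipe of this shape could look like the sketch below. The layer ranges, file names, and output path are illustrative assumptions, not the actual configuration used to build this model.

    # Hypothetical mergekit passthrough config: the same base model stacked
    # five times with overlapping layer ranges. Ranges are assumptions only.
    yaml_config = """
    slices:
      - sources:
          - model: zhengr/MixTAO-7Bx2-MoE-v8.1
            layer_range: [0, 12]
      - sources:
          - model: zhengr/MixTAO-7Bx2-MoE-v8.1
            layer_range: [6, 18]
      - sources:
          - model: zhengr/MixTAO-7Bx2-MoE-v8.1
            layer_range: [12, 24]
      - sources:
          - model: zhengr/MixTAO-7Bx2-MoE-v8.1
            layer_range: [18, 28]
      - sources:
          - model: zhengr/MixTAO-7Bx2-MoE-v8.1
            layer_range: [24, 32]
    merge_method: passthrough
    dtype: float16
    """

    with open("passthrough.yaml", "w") as f:
        f.write(yaml_config)

    # Then run the mergekit CLI (installed via: pip install mergekit):
    #   mergekit-yaml passthrough.yaml ./merged-model --copy-tokenizer

Passthrough merging simply concatenates the selected layer slices into a deeper model, which is why the merged checkpoint ends up larger (19.2B parameters) than the base it was built from.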
LLM Name: TaoPassthrough 15B S
Repository 🤗: https://huggingface.co/allknowingroger/TaoPassthrough-15B-s
Base Model(s): zhengr/MixTAO-7Bx2-MoE-v8.1 (the same model, repeated in the passthrough merge)
Merged Model: Yes
Model Size: 19.2B
Required VRAM: 38.4 GB
Updated: 2024-12-14
Maintainer: allknowingroger
Model Type: mixtral
Model Files: 2.0 GB (1-of-20), 1.9 GB (2-of-20), 2.0 GB (3-of-20), 2.0 GB (4-of-20), 2.0 GB (5-of-20), 2.0 GB (6-of-20), 1.9 GB (7-of-20), 2.0 GB (8-of-20), 2.0 GB (9-of-20), 2.0 GB (10-of-20), 2.0 GB (11-of-20), 2.0 GB (12-of-20), 2.0 GB (13-of-20), 2.0 GB (14-of-20), 2.0 GB (15-of-20), 2.0 GB (16-of-20), 2.0 GB (17-of-20), 2.0 GB (18-of-20), 2.0 GB (19-of-20), 0.6 GB (20-of-20)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.38.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: float16
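Given the architecture and dtype above, loading the model with the standard Hugging Face transformers API would look roughly like the sketch below; the prompt and generation settings are illustrative.

    # Minimal loading sketch for a MixtralForCausalLM checkpoint in float16.
    # Needs transformers >= 4.38.2 (per the card), accelerate for device_map,
    # and roughly 38.4 GB of VRAM for the full-precision weights.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "allknowingroger/TaoPassthrough-15B-s"
    tokenizer = AutoTokenizer.from_pretrained(repo)  # LlamaTokenizer, vocab 32000
    model = AutoModelForCausalLM.from_pretrained(
        repo,
        torch_dtype=torch.float16,  # matches the card's Torch Data Type
        device_map="auto",          # shards the 20 safetensors files across devices
    )

    inputs = tokenizer("Passthrough merging works by", return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))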

Best Alternatives to TaoPassthrough 15B S

Best Alternatives                     Context / RAM    Downloads   Likes
MixTAO 19B Pass                       32K / 38.1 GB           16       1
Multimerge 19B Pass                   32K / 38 GB             10       0
Lorge 2x7B UAMM                       32K / 38.2 GB           16       0
Mistralmath 15B Pass                  32K / 38.5 GB           11       0
Raccoon Small                         32K / 38.4 GB           86       1
...oundary Solar Chat 2x10.7B MoE     4K / 38 GB             123       1
Mixtral 11Bx2 MoE 19B                 4K / 38.4 GB          1275      38
Truthful DPO MoE 19B                  4K / 38.4 GB          1207       1
Venus DPO 50                          4K / 38.4 GB          1210       0
SOLAR Math 2x10.7B V0.2               4K / 38.4 GB          1215       3
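Download and like counts like those in the table above can be fetched live from the Hugging Face Hub. A small sketch using the official huggingface_hub client, assuming the package is installed:

    # Fetch current download/like counts for a model from the Hugging Face Hub.
    from huggingface_hub import HfApi

    api = HfApi()
    info = api.model_info("allknowingroger/TaoPassthrough-15B-s")
    print(info.downloads, info.likes)  # counts as reported by the Hub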


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124