Multimerge Neurallaymons 12B MoE by allknowingroger


Tags: allknowingroger/multimerge-7b-... | allknowingroger/neurallaymons-... | Autotrain compatible | Base model:allknowingroger/mul... | Base model:allknowingroger/neu... | Base model:merge:allknowingrog... | Base model:merge:allknowingrog... | Endpoints compatible | Frankenmoe | Lazymergekit | Merge | Mergekit | Mixtral | MoE | Region:us | Safetensors | Sharded | Tensorflow
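
The Frankenmoe, Lazymergekit, and Mergekit tags indicate this model was assembled by combining the two 7B slerp-merged base models into a Mixtral-style mixture-of-experts with mergekit. As a rough illustration only, here is a minimal sketch of driving mergekit's MoE mode from Python; the gate_mode, positive_prompts, and output path are assumptions, not the maintainer's actual recipe:

```python
# Minimal sketch of building a 2-expert Mixtral-style MoE with mergekit.
# Assumptions: gate_mode, positive_prompts, and paths are illustrative,
# not the maintainer's actual recipe. Requires `pip install mergekit`.
import subprocess
import textwrap

config = textwrap.dedent("""\
    base_model: allknowingroger/MultiMerge-7B-slerp
    gate_mode: hidden            # route tokens via hidden-state prompt embeddings
    dtype: bfloat16              # matches the Torch Data Type listed below
    experts:
      - source_model: allknowingroger/MultiMerge-7B-slerp
        positive_prompts: ["general reasoning"]    # illustrative only
      - source_model: allknowingroger/Neurallaymons-7B-slerp
        positive_prompts: ["creative writing"]     # illustrative only
""")

with open("moe-config.yaml", "w") as f:
    f.write(config)

# mergekit-moe writes the merged model (sharded safetensors) to ./merged
subprocess.run(["mergekit-moe", "moe-config.yaml", "./merged"], check=True)
```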

Multimerge Neurallaymons 12B MoE Benchmarks

Multimerge Neurallaymons 12B MoE Parameters and Internals

LLM Name: Multimerge Neurallaymons 12B MoE
Repository: Open on 🤗
Base Model(s): MultiMerge 7B Slerp (allknowingroger/MultiMerge-7B-slerp), Neurallaymons 7B Slerp (allknowingroger/Neurallaymons-7B-slerp)
Model Size: 7b
Required VRAM: 25.8 GB
Updated: 2024-07-27
Maintainer: allknowingroger
Model Type: mixtral
Model Files: 1.9 GB (1-of-13), 2.0 GB (2-of-13), 2.0 GB (3-of-13), 2.0 GB (4-of-13), 2.0 GB (5-of-13), 2.0 GB (6-of-13), 2.0 GB (7-of-13), 2.0 GB (8-of-13), 2.0 GB (9-of-13), 2.0 GB (10-of-13), 2.0 GB (11-of-13), 2.0 GB (12-of-13), 1.9 GB (13-of-13)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.39.3
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: bfloat16
Multimerge Neurallaymons 12B MoE (allknowingroger/Multimerge-Neurallaymons-12B-MoE)
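
Since the listing gives the full repository ID, architecture (MixtralForCausalLM), and weight dtype (bfloat16), the model can be loaded with the standard Transformers API. A minimal sketch, assuming roughly 25.8 GB of VRAM (or CPU offload via device_map) is available; the prompt is illustrative:

```python
# Minimal loading sketch based on the listed specs (Transformers >= 4.39.3,
# bfloat16 weights, 32768-token context). device_map="auto" lets accelerate
# spread the ~25.8 GB of sharded weights across available GPUs/CPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "allknowingroger/Multimerge-Neurallaymons-12B-MoE"

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # LlamaTokenizer per the listing
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

inputs = tokenizer(
    "The key idea behind mixture-of-experts models is", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```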

Best Alternatives to Multimerge Neurallaymons 12B MoE

Best Alternatives | HF Rank | Context / RAM | Downloads | Likes
Multimaster 7B V6 | 0.3 | 32K / 142.5 GB | 274 | 11
Mixtral 7B 8expert | 0.3 | 32K / 93.6 GB | 12888 | 260
MultiverseBuddy 15B MoE | 0.3 | 32K / 25.8 GB | 241 | 0
Mini Mixtral V0.2 | 0.2 | 32K / 25.8 GB | 302 | 3
Laserxtral | 0.2 | 32K / 48.3 GB | 935 | 78
Lumina 2 | 0.2 | 32K / 37.1 GB | 239 | 0
RogerWizard 12B MoE | 0.2 | 32K / 25.8 GB | 252 | 1
StarlingMaths 12B MoE | 0.2 | 32K / 25.8 GB | 238 | 0
MultiverseMath 12B MoE | 0.2 | 32K / 25.8 GB | 259 | 0
WestLakeLaser 12B MoE | 0.2 | 32K / 25.8 GB | 245 | 0
Note: a green score (e.g. "73.2") on the source page marks a model that scores better than allknowingroger/Multimerge-Neurallaymons-12B-MoE.
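
The Downloads and Likes columns come from the Hugging Face Hub and change over time. A quick sketch of pulling current numbers with huggingface_hub; the subject model's repo ID is shown, and repo IDs for the alternatives would need to be resolved from their display names:

```python
# Fetch live download/like counts from the Hugging Face Hub.
# Swap in any alternative's repo ID to compare against this model.
from huggingface_hub import HfApi

api = HfApi()
info = api.model_info("allknowingroger/Multimerge-Neurallaymons-12B-MoE")
print(f"{info.id}: {info.downloads} downloads, {info.likes} likes")
```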

Rank the Multimerge Neurallaymons 12B MoE Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072501