FusionNet 34Bx2 MoE by TomGrc

Tags: Autotrain compatible, Conversational, En, Endpoints compatible, Mixtral, Model-index, Moe, Region:us, Safetensors, Sharded, Tensorflow

FusionNet 34Bx2 MoE Benchmarks (TomGrc/FusionNet_34Bx2_MoE)

FusionNet 34Bx2 MoE Parameters and Internals

Model Type: text generation
Additional Notes: Fine-tuned on English-language data using the MoE method, which enhances its performance.
Training Details:
Methodology: MoE (Mixture of Experts)
Model Architecture: MoE architecture applied to improve on the original model's performance
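
In a Mixtral-style MoE of this kind, the "x2" in the name indicates two experts per MoE layer, with a router choosing between them for each token. The configuration sketch below is illustrative only: the 64000-token vocabulary and the 32768 context length come from the listing further down this page, while every other dimension is an assumed Yi-34B-like value, not a confirmed specification.

    # Illustrative Mixtral-style 2-expert MoE configuration (not the official config).
    # vocab_size and max_position_embeddings are taken from this page;
    # all other dimensions are assumptions based on typical Yi-34B sizes.
    from transformers import MixtralConfig

    config = MixtralConfig(
        vocab_size=64000,               # Vocabulary Size listed below
        max_position_embeddings=32768,  # Context Length listed below
        hidden_size=7168,               # assumption
        intermediate_size=20480,        # assumption
        num_hidden_layers=60,           # assumption
        num_attention_heads=56,         # assumption
        num_key_value_heads=8,          # assumption
        num_local_experts=2,            # the "x2": two experts per MoE layer
        num_experts_per_tok=2,          # assumption: route each token to both experts
    )
    print(config)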
LLM Name: FusionNet 34Bx2 MoE
Repository: https://huggingface.co/TomGrc/FusionNet_34Bx2_MoE
Model Size: 60.8b
Required VRAM: 121.2 GB
Updated: 2025-02-05
Maintainer: TomGrc
Model Type: mixtral
Model Files: 32 sharded safetensors files, ~121.2 GB total (shard 1: 3.9 GB; shards 2-28: 3.8 GB each; shards 29-30: 4.0 GB each; shard 31: 3.9 GB; shard 32: 2.8 GB)
Supported Languages: en
Model Architecture: MixtralForCausalLM
License: mit
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 64000
Torch Data Type: bfloat16
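
Given the MixtralForCausalLM architecture, bfloat16 weights, and Transformers 4.36.2 noted above, the full-precision checkpoint loads through the standard transformers API. The sketch below is a minimal example, assuming enough combined GPU/CPU memory for the roughly 121.2 GB of sharded weights; the prompt is just a placeholder.

    # Minimal loading sketch for TomGrc/FusionNet_34Bx2_MoE.
    # Assumes transformers >= 4.36.2, the accelerate package, and enough memory
    # for ~121.2 GB of bfloat16 weights; device_map="auto" spreads the 32 shards
    # across available devices.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "TomGrc/FusionNet_34Bx2_MoE"

    tokenizer = AutoTokenizer.from_pretrained(repo)  # LlamaTokenizer, 64000 tokens
    model = AutoModelForCausalLM.from_pretrained(
        repo,
        torch_dtype=torch.bfloat16,  # matches the listed Torch Data Type
        device_map="auto",
    )

    prompt = "Explain what a mixture-of-experts language model is."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(output[0], skip_special_tokens=True))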

Quantized Models of the FusionNet 34Bx2 MoE

Model                        Likes   Downloads   VRAM
FusionNet 34Bx2 MoE GGUF     5       215         22 GB
FusionNet 34Bx2 MoE AWQ      5       8           32 GB
FusionNet 34Bx2 MoE GPTQ     2       10          31 GB
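
For hardware that cannot hold the full bfloat16 checkpoint, the quantized variants above bring the footprint down to roughly 22-32 GB. The sketch below shows how an AWQ or GPTQ export would typically be loaded through transformers; the repository id is a hypothetical placeholder, so substitute the actual quantized repo you intend to use. The GGUF variant targets llama.cpp-based runtimes rather than transformers.

    # Sketch of loading a 4-bit quantized variant (AWQ or GPTQ, ~31-32 GB per
    # the table above). The repo id is a hypothetical placeholder, not a confirmed repo.
    # AWQ checkpoints need the autoawq package; GPTQ checkpoints need auto-gptq.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    quant_repo = "someuser/FusionNet_34Bx2_MoE-AWQ"  # placeholder

    tokenizer = AutoTokenizer.from_pretrained(quant_repo)
    model = AutoModelForCausalLM.from_pretrained(
        quant_repo,
        torch_dtype=torch.float16,  # activations in fp16; weights stay 4-bit
        device_map="auto",
    )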

Best Alternatives to FusionNet 34Bx2 MoE

Best Alternatives                    Context / RAM      Downloads   Likes
Mixtral 34Bx2 MoE 60B                195K / 121.9 GB    4251        112
Yi 34Bx2 MoE 60B DPO                 195K / 121.8 GB    421         13
Bagel Hermes 2x34B                   195K / 121.9 GB    195         16
Yi 34Bx2 MoE 200K                    195K / 121.9 GB    419         22
Yi 34Bx2 MoE 60B                     195K / 121.9 GB    4255        65
...34Bx2 MoE V0.1 Full Linear DPO    195K / 121.8 GB    60          2
FusionNet 34Bx2 MoE V0.1             195K / 121.2 GB    55          8
... Cloudyu Mixtral 34Bx2 MoE 60B    195K / 121.8 GB    54          0
...DPO TomGrc FusionNet 34Bx2 MoE    32K / 121.8 GB     120         4
Nous Hermes 2 MoE 2x34B              4K / 121.9 GB      1228        0

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227