MoE 3x7b QA Code Inst by nextai-team


Tags: autotrain-compatible, code, conversational, en, endpoints-compatible, mistral, mixtral, moe, qa, reasoning, region:us, safetensors, sharded, tensorflow

MoE 3x7b QA Code Inst Parameters and Internals

LLM Name: MoE 3x7b QA Code Inst
Repository 🤗: https://huggingface.co/nextai-team/Moe-3x7b-QA-Code-Inst
Model Size: 18.5b
Required VRAM: 37 GB
Updated: 2024-10-18
Maintainer: nextai-team
Model Type: mixtral
Model Files: 4.9 GB (1-of-8), 4.9 GB (2-of-8), 4.9 GB (3-of-8), 5.0 GB (4-of-8), 4.9 GB (5-of-8), 4.9 GB (6-of-8), 5.0 GB (7-of-8), 2.5 GB (8-of-8)
Supported Languages: en
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.37.2
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: float16
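The listed VRAM requirement follows directly from the parameter count and data type: float16 stores each parameter in 2 bytes, so 18.5B parameters need about 37 GB, which also matches the sum of the sharded model files above. A quick sanity check (figures taken from the listing):

```python
# Sanity-check the listed "Required VRAM: 37 GB" against the
# parameter count, dtype, and the sharded model file sizes.

PARAMS = 18.5e9          # "Model Size: 18.5b"
BYTES_PER_PARAM = 2      # float16 ("Torch Data Type")

vram_gb = PARAMS * BYTES_PER_PARAM / 1e9
print(f"params * 2 bytes = {vram_gb:.1f} GB")    # 37.0 GB

# Shard sizes from "Model Files", in GB
shards = [4.9, 4.9, 4.9, 5.0, 4.9, 4.9, 5.0, 2.5]
print(f"sum of shards    = {sum(shards):.1f} GB")  # 37.0 GB
```

Note this is the weight footprint only; inference needs additional memory for the KV cache and activations on top of the 37 GB.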
MoE 3x7b QA Code Inst (nextai-team/Moe-3x7b-QA-Code-Inst)
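Given the MixtralForCausalLM architecture, LlamaTokenizer, and float16 dtype listed above, the model should load with the standard `transformers` auto classes (Transformers >= 4.37.2, the version recorded in the listing). A minimal sketch, not verified against the actual repository, and note that it downloads roughly 37 GB of weights:

```python
MODEL_ID = "nextai-team/Moe-3x7b-QA-Code-Inst"  # repository from the listing

def load_model():
    """Load the tokenizer and model; imports are local so the sketch
    itself does not require transformers/torch to be installed."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,  # matches the listed Torch Data Type
        device_map="auto",          # shard across available GPUs (~37 GB of weights)
    )
    return tokenizer, model
```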

Best Alternatives to MoE 3x7b QA Code Inst

Best Alternatives              | Context / RAM  | Downloads | Likes
------------------------------|----------------|-----------|------
Lumina 3.5                    | 32K / 37.1 GB  | 2654      | 0
EastAsia 4x7B MoE Experiment  | 32K / 37.1 GB  | 1028      | 1
Topxtral 4x7B V0.1            | 32K / 37.1 GB  | 4627      | 4
Hyperion 3.0 Mixtral 3x7B     | 32K / 37.1 GB  | 61        | 4
Blitz AI MoE V0.7             | 32K / 37.1 GB  | 56        | 1
Blitz AI MoE V0.4             | 32K / 37.1 GB  | 57        | 1
HeroBophades 3x7B             | 32K / 37.1 GB  | 14        | 1
Wizard Kun Lake 3x7B MoE      | 32K / 37.1 GB  | 10        | 1
NaruMOE 3x7B V2               | 32K / 37.1 GB  | 26        | 0
Pearl 3x7B                    | 32K / 37.1 GB  | 59        | 1


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072803