Phi 3.5 MoE Instruct by microsoft


  Arxiv:2403.06412   Arxiv:2404.14219   Arxiv:2407.13833   Autotrain compatible   Code   Conversational   Custom code   Instruct   Moe   Multilingual   Phimoe   Region:us   Safetensors   Sharded   Tensorflow

Phi 3.5 MoE Instruct Benchmarks

nn.n% — How the model scores relative to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Phi 3.5 MoE Instruct Parameters and Internals

LLM Name: Phi 3.5 MoE Instruct
Repository 🤗: https://huggingface.co/microsoft/Phi-3.5-MoE-instruct
Model Size: 41.9b
Required VRAM: 83.9 GB
Updated: 2024-09-08
Maintainer: microsoft
Model Type: phimoe
Instruction-Based: Yes
Model Files: 5.0 GB: 1-of-17, 5.0 GB: 2-of-17, 5.0 GB: 3-of-17, 5.0 GB: 4-of-17, 5.0 GB: 5-of-17, 5.0 GB: 6-of-17, 5.0 GB: 7-of-17, 5.0 GB: 8-of-17, 5.0 GB: 9-of-17, 5.0 GB: 10-of-17, 5.0 GB: 11-of-17, 5.0 GB: 12-of-17, 5.0 GB: 13-of-17, 5.0 GB: 14-of-17, 5.0 GB: 15-of-17, 5.0 GB: 16-of-17, 3.9 GB: 17-of-17
Model Architecture: PhiMoEForCausalLM
License: mit
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.43.3
Tokenizer Class: LlamaTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 32064
Torch Data Type: bfloat16
Phi 3.5 MoE Instruct (microsoft/Phi-3.5-MoE-instruct)
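The Required VRAM figure is consistent with the sharded model files listed above. A quick sanity check (a minimal sketch; shard sizes and the 41.9b parameter count are taken from the table):

```python
# Sanity-check that the 17 listed shards sum to the stated 83.9 GB figure.
# Shard sizes are taken from the "Model Files" entry above.
shard_sizes_gb = [5.0] * 16 + [3.9]  # shards 1-of-17 .. 17-of-17

total_gb = round(sum(shard_sizes_gb), 1)
print(total_gb)  # 83.9

# At bfloat16 (2 bytes per parameter), ~41.9B parameters imply roughly
# 41.9e9 * 2 / 1e9 = 83.8 GB of weights, in line with the shard total.
approx_weight_gb = round(41.9e9 * 2 / 1e9, 1)
print(approx_weight_gb)  # 83.8
```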
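Because the page tags the model as Custom code, loading it through transformers requires `trust_remote_code=True` so the PhiMoEForCausalLM class can be fetched from the repository. A minimal loading sketch using the metadata above (model id and dtype come from the table; the prompt and generation parameters are illustrative assumptions, not part of the listing):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model id and dtype taken from the table above; the "Custom code" tag
# means trust_remote_code=True is required to load PhiMoEForCausalLM.
MODEL_ID = "microsoft/Phi-3.5-MoE-instruct"

def load_model():
    """Download and load the model (~83.9 GB of bfloat16 weights)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,   # matches the listed Torch Data Type
        device_map="auto",            # shard the 17 files across available GPUs
        trust_remote_code=True,
    )
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_model()
    messages = [{"role": "user", "content": "Explain mixture-of-experts briefly."}]
    inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
    outputs = model.generate(inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Note that the heavy download only happens inside `load_model()`; with 83.9 GB of weights, `device_map="auto"` is the practical way to spread the shards across multiple GPUs.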

Rank the Phi 3.5 MoE Instruct Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072803