LLM Explorer: A Curated Large Language Model Directory and Analytics

Phi 2 Zh by elliotthwangmsa



Tags: Arxiv:1910.09700, Autotrain compatible, Custom code, Endpoints compatible, Phi, Region:us, Safetensors, Sharded, Tensorflow

Rank the Phi 2 Zh Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Phi 2 Zh (elliotthwangmsa/phi-2_zh)

Best Alternatives to Phi 2 Zh

Best Alternatives | HF Rank | Context / RAM | Downloads | Likes
Aanaphi2 V0.1 | 63.28 | 2K / 5.6 GB | 563 | 99
Phi 2 Psy | 62.8 | 2K / 5.6 GB | 3327 | 13
Phi 2 | 61.33 | 2K / 5.6 GB | 448836 | 2845
DPO Phi2 | 61.26 | 2K / 5.6 GB | 744 | 1
Phi 2 DPO | 61.25 | 2K / 5.6 GB | 1049 | 4
Phi 2 Openhermes 30K | 60.37 | 2K / 16.7 GB | 1179 | 0
Phi 2 OpenHermes 2.5 | 58.38 | 2K / 5.6 GB | 8111 | 0
Cinder Phi 2 Test 1 | 57.05 | 2K / 11.1 GB | 67 | 0
Phi 2 OpenHermes 2.5 | 51.05 | 2K / 5.6 GB | 4751 | 3
Phi 2 Logical Sft | n/a | 4K / 5.6 GB | 0 | 4
Note: a green score (e.g. "73.2") means that the model outperforms elliotthwangmsa/phi-2_zh.

Phi 2 Zh Parameters and Internals

LLM Name: Phi 2 Zh
Repository: Open on 🤗
Model Size: 2.8b
Required VRAM: 5.6 GB
Updated: 2024-02-28
Maintainer: elliotthwangmsa
Model Type: phi
Model Files: 5.0 GB (1-of-2), 0.6 GB (2-of-2)
Model Architecture: PhiForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.38.0.dev0
Tokenizer Class: CodeGenTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 51200
Initializer Range: 0.02
Torch Data Type: bfloat16
Embedding Dropout: 0
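As a quick sanity check, the listed VRAM requirement follows directly from the parameter count and data type above: roughly 2.8B parameters stored as bfloat16 (2 bytes each) come to about 5.6 GB, which also matches the sum of the two sharded safetensors files. A minimal sketch, using only the figures from the parameters table:

```python
# Estimate weight-memory footprint from parameter count and dtype width.
NUM_PARAMS = 2.8e9        # "Model Size: 2.8b" from the table above
BYTES_PER_PARAM = 2       # bfloat16 = 16 bits = 2 bytes

vram_gb = NUM_PARAMS * BYTES_PER_PARAM / 1e9
print(f"Estimated weights size: {vram_gb:.1f} GB")  # -> 5.6 GB

# The checkpoint is sharded into two safetensors files (1-of-2 and 2-of-2):
shards_gb = 5.0 + 0.6
print(f"Sum of shard sizes: {shards_gb:.1f} GB")  # -> 5.6 GB
```

Note this covers the weights only; inference additionally needs memory for activations and the KV cache, which grows with the 2048-token context length.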
Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024022003