| Best Alternatives | HF Rank | Context/RAM | Downloads | Likes |
|---|---|---|---|---|
| Phi 2 Logical Sft | — | 4K / 5.6 GB | 2019 | 6 |
| EEVE2.8B KO Finetune Test | — | 2K / 2.9 GB | 10 | 0 |
| Liph.42 | — | 2K / 5.5 GB | 2517 | 1 |
| Phi 2 Slerp | — | 2K / 5.5 GB | 2893 | 0 |
| Obelix Phi2 | — | 2K / 5.5 GB | 2729 | 0 |
| Obelix Phi2 V0 | — | 2K / 5.5 GB | 2712 | 0 |
| Liph.42 Slerp | — | 2K / 5.5 GB | 2650 | 0 |
| Limmy Phi2 Slerp | — | 2K / 5.5 GB | 2499 | 0 |
| Phi 2 | — | 2K / 5.6 GB | 881263 | 3144 |
| Phi 2 Super | — | 2K / 5.6 GB | 5035 | 83 |
| Property | Value |
|---|---|
| LLM Name | Phi 2 Layla V1 Chatml |
| Repository | Open on 🤗 |
| Model Size | 2.8B |
| Required VRAM | 5.6 GB |
| Updated | 2024-05-14 |
| Maintainer | l3utterfly |
| Model Type | phi |
| Model Files | |
| Supported Languages | en |
| Model Architecture | PhiForCausalLM |
| License | mit |
| Context Length | 2048 |
| Model Max Length | 2048 |
| Transformers Version | 4.39.0.dev0 |
| Tokenizer Class | CodeGenTokenizer |
| Padding Token | <\|endoftext\|> |
| Vocabulary Size | 51200 |
| Initializer Range | 0.02 |
| Torch Data Type | bfloat16 |
| Embedding Dropout | 0 |
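The "Chatml" suffix in the model name suggests this fine-tune expects prompts in the ChatML format (`<|im_start|>` / `<|im_end|>` delimiters). The sketch below shows how such a prompt string could be assembled; the role names and message content are illustrative assumptions, not taken from the model card.

```python
# Hypothetical sketch: build a ChatML-style prompt, which the "Chatml"
# suffix in the model name suggests this fine-tune was trained on.
# Roles and example messages here are illustrative only.

def build_chatml_prompt(messages):
    """Join (role, content) pairs into a ChatML-style prompt string."""
    parts = []
    for role, content in messages:
        parts.append(f"<|im_start|>{role}\n{content}<|im_end|>")
    # Leave the assistant turn open so generation continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    ("system", "You are a helpful assistant."),
    ("user", "Summarize Phi-2 in one sentence."),
])
print(prompt)
```

Since the model's context length is 2048 tokens, the tokenized prompt plus the generation budget should stay under that limit when using this format.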