Phi 3 Mini 4K Instruct DPO EPFL GPTQ 2bit by StefanKrsteski


  Arxiv:1910.09700   2-bit   2bit   Autotrain compatible   Conversational   Custom code   Endpoints compatible   Gptq   Instruct   Phi3   Quantized   Region:us   Safetensors

Phi 3 Mini 4K Instruct DPO EPFL GPTQ 2bit Parameters and Internals

LLM Name: Phi 3 Mini 4K Instruct DPO EPFL GPTQ 2bit
Repository: StefanKrsteski/Phi-3-mini-4k-instruct-DPO-EPFL-GPTQ-2bit (open on 🤗 Hugging Face)
Model Size: 455.3M parameters
Required VRAM: 1.4 GB
Updated: 2024-07-26
Maintainer: StefanKrsteski
Model Type: phi3
Instruction-Based: Yes
Model Files: 1.4 GB
GPTQ Quantization: Yes
Quantization Type: gptq|2bit
Model Architecture: Phi3ForCausalLM
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.40.2
Tokenizer Class: LlamaTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 32064
Torch Data Type: float16
Phi 3 Mini 4K Instruct DPO EPFL GPTQ 2bit (StefanKrsteski/Phi-3-mini-4k-instruct-DPO-EPFL-GPTQ-2bit)
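
The parameters above (a GPTQ-quantized Phi3ForCausalLM checkpoint with custom code, a 4096-token context, and a float16 compute dtype) suggest how the model would typically be loaded with transformers. The sketch below is illustrative and not taken from the model card: it assumes the optimum and auto-gptq packages are installed alongside transformers 4.40+, that a CUDA device is available for the GPTQ kernels, and that the repository ships a chat template; the prompt text is a made-up placeholder.

# Minimal loading sketch (hypothetical usage, not from the model card).
# Assumes: pip install "transformers>=4.40" optimum auto-gptq accelerate
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "StefanKrsteski/Phi-3-mini-4k-instruct-DPO-EPFL-GPTQ-2bit"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",          # GPTQ kernels need the weights on a CUDA device
    torch_dtype=torch.float16,  # matches the card's float16 torch data type
    trust_remote_code=True,     # the card flags custom Phi3 code
)

# Instruction-tuned model with a 4k context: prompt through the chat template.
messages = [{"role": "user", "content": "Explain 2-bit GPTQ quantization in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))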

Rank the Phi 3 Mini 4K Instruct DPO EPFL GPTQ 2bit Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v2024072501