Phi 3 Mini 128K Instruct GGUF by gaianet


Tags: Autotrain compatible, Base model: microsoft/phi-3-min..., Code, Custom code, En, Endpoints compatible, GGUF, Instruct, License: mit, Phi3, Q2, Quantized, Region: us

Rank the Phi 3 Mini 128K Instruct GGUF Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Phi 3 Mini 128K Instruct GGUF (gaianet/Phi-3-mini-128k-instruct-GGUF)
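If you want to pull one of the quantized files locally, a minimal sketch using the huggingface_hub client is shown below. The repository ID comes from the listing above; the specific .gguf filenames are not listed here (only file sizes are), so the sketch looks them up first rather than assuming a name.

```python
# Minimal sketch: list and download a GGUF file from the gaianet repository.
# Assumes the huggingface_hub package is installed; filenames are discovered
# at runtime because this listing only reports file sizes, not names.
from huggingface_hub import hf_hub_download, list_repo_files

repo_id = "gaianet/Phi-3-mini-128k-instruct-GGUF"

# Inspect which quantized files the repo actually publishes.
gguf_files = [f for f in list_repo_files(repo_id) if f.endswith(".gguf")]
print(gguf_files)

# Download one of them (returns the local cache path).
model_path = hf_hub_download(repo_id=repo_id, filename=gguf_files[0])
print("Saved to:", model_path)
```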

Best Alternatives to Phi 3 Mini 128K Instruct GGUF

Best Alternatives                    Context / RAM     Downloads / Likes
...128k It Russian Q4 K M Chunked    128K / 0.1 GB     200
...128k Italian V2 Q4 K M Chunked    128K / 0.1 GB     160
Phi 3 Mini 128K Instruct GGUF        128K / 1.4 GB     8412
...i 3 Mini 4K Instruct.Q4 0.gguf    128K / 2.2 GB     3372
...hi 3 Medium 128K Instruct GGUF    128K / 5.1 GB     5482
...hi 3 Medium 128K Instruct GGUF    128K / 5.1 GB     2610
... 4k It French Alpaca V1 Q4 K M    4K / 0.1 GB       250
...tuguese Tom Cat Q4 K M Chunked    4K / 0.1 GB       140
...i 3 Mini 4K Instruct IMat GGUF    4K / 0.8 GB       7091
Phi 3 Mini 4K Instruct GGUF          4K / 1.4 GB       8813

Phi 3 Mini 128K Instruct GGUF Parameters and Internals

LLM Name: Phi 3 Mini 128K Instruct GGUF
Repository: gaianet/Phi-3-mini-128k-instruct-GGUF (Hugging Face)
Model Name: Phi 3 mini 128k instruct
Model Creator: Microsoft
Base Model(s): Phi 3 Mini 128K Instruct (microsoft/Phi-3-mini-128k-instruct)
Required VRAM: 1.4 GB
Updated: 2024-07-13
Maintainer: gaianet
Model Type: phi-msft
Instruction-Based: Yes
Model Files: 1.4 GB, 2.1 GB, 2.0 GB, 1.7 GB, 2.2 GB, 2.4 GB, 2.2 GB, 2.6 GB, 2.8 GB, 2.6 GB, 3.1 GB, 4.1 GB, 7.6 GB
Supported Languages: en
GGUF Quantization: Yes
Quantization Type: gguf | q2 | q4_k | q5_k
Model Architecture: Phi3ForCausalLM
License: mit
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.39.3
Vocabulary Size: 32064
Initializer Range: 0.02
Torch Data Type: bfloat16
Embedding Dropout: 0
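Since the listing reports GGUF quantizations (q2 / q4_k / q5_k) and a 131072-token context, a hedged sketch of running one of these files with llama-cpp-python follows. The filename and context size used here are assumptions for illustration, not values taken from the repository.

```python
# Minimal sketch: load a quantized Phi-3 GGUF with llama-cpp-python.
# The model_path below is an illustrative filename; n_ctx can in principle
# go up to the model's 131072-token limit, but smaller values keep the
# KV cache (and RAM use) manageable.
from llama_cpp import Llama

llm = Llama(
    model_path="Phi-3-mini-128k-instruct-Q5_K_M.gguf",  # hypothetical filename
    n_ctx=8192,        # any value up to 131072
    n_gpu_layers=-1,   # offload all layers if a GPU-enabled build is installed
)

# The chat template is read from the GGUF metadata, so a plain chat call works.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What is a 128K context window useful for?"}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```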


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024042801