Phi 3 Mini 4K Instruct Fp16 by Xenova


Tags: Conversational, Custom code, DML, FP16, Instruct, ONNX, ONNX Runtime, Phi3, Quantized, Region: US, Transformers.js

Phi 3 Mini 4K Instruct Fp16 Benchmarks

Benchmark scores (nn.n%) indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4"). No scores are listed for Phi 3 Mini 4K Instruct Fp16 (Xenova/Phi-3-mini-4k-instruct_fp16).

Phi 3 Mini 4K Instruct Fp16 Parameters and Internals

Model Type: text-generation
Additional Notes: This is a version of the official Phi-3 ONNX model modified for onnxruntime-web: the weights are stored in fp16 with int4 block quantization, outputs are produced in fp32, and multi-head attention (MHA) is used instead of grouped-query attention (GQA). The ONNX file and its external data file are each kept below 2 GB so that Chromium can cache them.
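Because the export targets onnxruntime-web through Transformers.js, loading it in the browser typically goes through the library's text-generation pipeline. The sketch below is illustrative rather than an official usage snippet: the package name (@huggingface/transformers; older releases were published as @xenova/transformers), the device: 'webgpu' option, and the generation settings are assumptions based on recent Transformers.js versions and may differ across releases.

```typescript
// Minimal sketch (assumed API): running the model in the browser with Transformers.js.
import { pipeline } from '@huggingface/transformers';

async function main(): Promise<void> {
  // Text-generation pipeline backed by onnxruntime-web; 'webgpu' requests the
  // WebGPU execution provider, which is what this fp16 export is intended for.
  const generator = await pipeline(
    'text-generation',
    'Xenova/Phi-3-mini-4k-instruct_fp16',
    { device: 'webgpu' },
  );

  // Phi-3 chat format: a user turn followed by the assistant marker.
  const prompt = '<|user|>\nWrite a haiku about ONNX Runtime.<|end|>\n<|assistant|>\n';

  const output: any = await generator(prompt, { max_new_tokens: 128 });
  console.log(output[0].generated_text);
}

main().catch(console.error);
```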
LLM Name: Phi 3 Mini 4K Instruct Fp16
Repository: https://huggingface.co/Xenova/Phi-3-mini-4k-instruct_fp16
Base Model(s): Phi 3 Mini 4K Instruct Fp8 (unrahul/Phi-3-mini-4k-instruct-fp8)
Updated: 2025-02-03
Maintainer: Xenova
Model Type: phi3
Instruction-Based: Yes
Quantization Type: fp16
Model Architecture: Phi3ForCausalLM
License: mit
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.39.3
Tokenizer Class: LlamaTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 32064
Torch Data Type: bfloat16
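For reference, the tokenizer settings above (LlamaTokenizer class, <|endoftext|> padding token, 4096-token context) are what Transformers.js reads when building prompts. The following is a hedged sketch, assuming the AutoTokenizer and apply_chat_template APIs behave as in current Transformers.js releases; the message content and the buildPrompt helper are illustrative only.

```typescript
// Sketch (assumed API): rendering a Phi-3 chat prompt with the listed tokenizer.
import { AutoTokenizer } from '@huggingface/transformers';

async function buildPrompt(): Promise<string> {
  const tokenizer = await AutoTokenizer.from_pretrained(
    'Xenova/Phi-3-mini-4k-instruct_fp16',
  );

  const messages = [
    { role: 'user', content: 'Summarize this model card in two sentences.' },
  ];

  // Render the chat template to a plain string (tokenize: false) and append the
  // assistant marker so generation starts from the assistant turn.
  const prompt = tokenizer.apply_chat_template(messages, {
    tokenize: false,
    add_generation_prompt: true,
  }) as string;

  return prompt; // e.g. '<|user|>\nSummarize ...<|end|>\n<|assistant|>\n'
}

buildPrompt().then(console.log).catch(console.error);
```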

Best Alternatives to Phi 3 Mini 4K Instruct Fp16

Best Alternatives                 | Context / RAM  | Downloads | Likes
...m 128K Instruct 6.0bpw H6 EXL2 | 128K / 10.7 GB | 9         | 3
...m 128K Instruct 8.0bpw H8 EXL2 | 128K / 13.4 GB | 4         | 4
...dium 128K Instruct 8 0bpw EXL2 | 128K / 13.4 GB | 4         | 1
...m 128K Instruct 3.0bpw H6 EXL2 | 128K / 5.6 GB  | 5         | 0
...m 128K Instruct 5.0bpw H6 EXL2 | 128K / 8.9 GB  | 5         | 0
...28K Instruct Ov Fp16 Int4 Asym | 128K / 2.5 GB  | 5         | 0
...128K Instruct HQQ 4bit Smashed | 128K / 2.3 GB  | 8         | 0
...128K Instruct HQQ 2bit Smashed | 128K / 1.4 GB  | 6         | 0
NuExtract Bpw6 EXL2               | 4K / 3 GB      | 4         | 1
...Mini 4K Geminified 3 0bpw EXL2 | 4K / 1.6 GB    | 5         | 0

Rank the Phi 3 Mini 4K Instruct Fp16 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227