Phi3mash1 17B Pass by allknowingroger


Tags: Merged Model · Base model (finetune): danielbrdz/Barcenas-14b-Phi-3-medium-ORPO · Mistral · Model-index · Region: us · Safetensors · Sharded · Tensorflow

Phi3mash1 17B Pass Benchmarks

Scores ("nn.n%") show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Phi3mash1 17B Pass (allknowingroger/Phi3mash1-17B-pass)
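
The page does not document how the percentage is derived. As a purely illustrative assumption (not the site's published method), one plausible reading is the model's benchmark score expressed as a percentage of the chosen reference model's score:

    # Hypothetical illustration only: the site does not publish this formula.
    # One plausible reading of "nn.n%": the model's benchmark score as a
    # percentage of the chosen reference model's score.
    def relative_score(model_score: float, reference_score: float) -> float:
        return 100.0 * model_score / reference_score

    # e.g. a model scoring 54.9 where the reference scores 75.0 displays as 73.2%
    print(f"{relative_score(54.9, 75.0):.1f}%")  # -> 73.2%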

Phi3mash1 17B Pass Parameters and Internals

LLM Name: Phi3mash1 17B Pass
Repository: https://huggingface.co/allknowingroger/Phi3mash1-17B-pass
Base Model(s): danielbrdz/Barcenas-14b-Phi-3-medium-ORPO
Merged Model: Yes
Model Size: 14b
Required VRAM: 33.6 GB
Updated: 2025-02-22
Maintainer: allknowingroger
Model Type: mistral
Model Files: 0.8 GB: 1-of-36, 0.9 GB: 2-of-36, 0.9 GB: 3-of-36, 1.0 GB: 4-of-36, 1.0 GB: 5-of-36, 0.9 GB: 6-of-36, 1.0 GB: 7-of-36, 1.0 GB: 8-of-36, 0.9 GB: 9-of-36, 1.0 GB: 10-of-36, 1.0 GB: 11-of-36, 1.0 GB: 12-of-36, 0.9 GB: 13-of-36, 1.0 GB: 14-of-36, 1.0 GB: 15-of-36, 0.9 GB: 16-of-36, 1.0 GB: 17-of-36, 0.9 GB: 18-of-36, 1.0 GB: 19-of-36, 1.0 GB: 20-of-36, 0.9 GB: 21-of-36, 1.0 GB: 22-of-36, 1.0 GB: 23-of-36, 0.9 GB: 24-of-36, 1.0 GB: 25-of-36, 0.9 GB: 26-of-36, 0.9 GB: 27-of-36, 1.0 GB: 28-of-36, 0.9 GB: 29-of-36, 0.9 GB: 30-of-36, 1.0 GB: 31-of-36, 0.9 GB: 32-of-36, 0.9 GB: 33-of-36, 1.0 GB: 34-of-36, 0.9 GB: 35-of-36, 0.4 GB: 36-of-36 (36 shards, 33.6 GB total)
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.42.4
Tokenizer Class: LlamaTokenizer
Padding Token: <|placeholder6|>
Vocabulary Size: 32064
Torch Data Type: float16
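
A minimal loading sketch, assuming the standard Hugging Face transformers API. The repository id, float16 dtype, and 4096-token limit come from the table above; device_map="auto" (requires the accelerate package) is an illustrative choice, not a repo instruction:

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "allknowingroger/Phi3mash1-17B-pass"

    # LlamaTokenizer per the table above; AutoTokenizer resolves it from the repo config.
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        torch_dtype=torch.float16,  # checkpoint weights are stored in float16
        device_map="auto",          # ~33.6 GB of weights; spread across available devices
    )

    prompt = "Summarize the idea of model merging in two sentences."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    # Stay well under the 4096-token context/max length listed above.
    outputs = model.generate(**inputs, max_new_tokens=200)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))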

Best Alternatives to Phi3mash1 17B Pass

Best Alternatives | Context / RAM | Downloads | Likes
...ral Nemo Instruct 14B Merge V1 | 1000K / 24.6 GB | 19 | 0
K2S3 14B V0.2 | 32K / 28.7 GB | 28 | 0
Wendigo 14B Alpha4 | 32K / 28.4 GB | 1288 | 0
Qwen1.5 14B Chat Mistral | 32K / 28.6 GB | 20 | 2
Mistral 14B Merge Base | 32K / 28.4 GB | 2006 | 2
Synthetic Minstrel 14B | 32K / 27.6 GB | 22 | 3
Wandering Minstrel 14B | 32K / 27.6 GB | 11 | 3
Barcenas 14B Phi 3 Medium ORPO | 4K / 28 GB | 564 | 35
SauerkrautLM Phi 3 Medium | 4K / 28 GB | 555 | 19
...2.9.2 Phi 3 Medium Abliterated | 4K / 28 GB | 3935 | 17
Note: a green score (e.g., "73.2") means the model is better than allknowingroger/Phi3mash1-17B-pass.

Rank the Phi3mash1 17B Pass Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227