Phi3 Finetune Test by XOneThree


Tags: Arxiv:1910.09700, 4bit, Autotrain compatible, Conversational, Endpoints compatible, Finetuned, Instruct, Mistral, Pytorch, Quantized, Region:us, Sft, Sharded, Trl, Unsloth

Phi3 Finetune Test Parameters and Internals

LLM Name: Phi3 Finetune Test
Repository: Open on 🤗 Hugging Face
Required VRAM: 7.6 GB
Updated: 2024-07-27
Maintainer: XOneThree
Model Type: mistral
Instruction-Based: Yes
Model Files: 5.0 GB (part 1 of 2), 2.6 GB (part 2 of 2)
Quantization Type: 4bit
Model Architecture: MistralForCausalLM
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.41.2
Tokenizer Class: LlamaTokenizer
Padding Token: <|placeholder6|>
Vocabulary Size: 32064
Torch Data Type: float16
Phi3 Finetune Test (XOneThree/phi3-finetune-test)
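The fields above translate directly into a standard Transformers load. The sketch below is illustrative rather than taken from the model card: it assumes the bitsandbytes and accelerate packages are installed for 4-bit loading, and the prompt and generation settings are placeholders.

# Minimal loading sketch for XOneThree/phi3-finetune-test (assumptions noted above).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

repo_id = "XOneThree/phi3-finetune-test"

# Match the listed "Quantization Type: 4bit" and "Torch Data Type: float16".
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # LlamaTokenizer, vocabulary size 32064
model = AutoModelForCausalLM.from_pretrained(
    repo_id,                          # weights ship as two shards (5.0 GB + 2.6 GB)
    quantization_config=bnb_config,
    device_map="auto",                # requires accelerate
)

# Illustrative prompt only; use the instruction/chat format shipped with the repo if one is provided.
prompt = "Explain in one sentence what 4-bit quantization does to a language model."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))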

Best Alternatives to Phi3 Finetune Test

Best Alternatives                      HF Rank   Context / RAM   Downloads   Likes
... Mini 4K Instruct Bnb 4bit Ita      0.3       4K / 7.6 GB     1981        0
...ini New Model With Lora Merged      0.3       4K / 7.6 GB     52          0
Phired                                 0.3       4K / 7.6 GB     29          0
Phi 3 Mini Hospital Topic 50           0.2       4K / 7.6 GB     16          0
Model                                  0.2       4K / 7.6 GB     19          0
Phi3 History V2                        0.2       4K / 7.6 GB     7           0
Model                                  0.2       4K / 7.6 GB     9           0
MainPHI3                               0.2       4K / 7.6 GB     6           0
Phi3 StoryGenerator                    0.2       4K / 7.6 GB     34          2
MediChat Revision                      0.2       4K / 7.6 GB     7           0
Note: green Score (e.g. "73.2") means that the model is better than XOneThree/phi3-finetune-test.

Rank the Phi3 Finetune Test Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072501