Phi3 Finetune Test by XOneThree


Tags: Arxiv:1910.09700, 4bit, Autotrain compatible, Conversational, Endpoints compatible, Finetuned, Instruct, Mistral, Pytorch, Quantized, Region:us, Sft, Sharded, Trl, Unsloth

Rank the Phi3 Finetune Test Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Phi3 Finetune Test (XOneThree/phi3-finetune-test)

Best Alternatives to Phi3 Finetune Test

Best Alternatives | Context/RAM | Downloads | Likes
Phi3 StoryGenerator | 4K / 7.6 GB | 495 | 2
Kosmox | 4K / 7.6 GB | 74 | 1
Kosmox Small | 4K / 7.6 GB | 14 | 1
Kosmox Tiny | 4K / 7.6 GB | 7 | 1
Phi3 3.8 4k Alpaca Instruct | 4K / 7.6 GB | 2 | 1
Moniphi 3 V1 | 4K / 7.6 GB | 1 | 1
Phi 3 Mini 4K ORPO | 4K / 7.6 GB | 0 | 1
... Mini 4K Instruct Bnb 4bit Ita | 4K / 7.6 GB | 2499 | 0
Phi 3 Mini 16bit | 4K / 7.6 GB | 1063 | 0
... Mini 4K Instruct Merged 16bit | 4K / 7.6 GB | 70 | 0

Phi3 Finetune Test Parameters and Internals

LLM Name: Phi3 Finetune Test
Repository: Open on 🤗
Required VRAM: 7.6 GB
Updated: 2024-06-24
Maintainer: XOneThree
Model Type: mistral
Instruction-Based: Yes
Model Files: 5.0 GB (1-of-2), 2.6 GB (2-of-2)
Quantization Type: 4bit
Model Architecture: MistralForCausalLM
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.41.2
Tokenizer Class: LlamaTokenizer
Padding Token: <|placeholder6|>
Vocabulary Size: 32064
Initializer Range: 0.02
Torch Data Type: float16
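The figures above can be cross-checked with some quick arithmetic: a float16 checkpoint stores 2 bytes per parameter, so the 7.6 GB of model files (5.0 GB + 2.6 GB shards) implies roughly 3.8B parameters, consistent with a Phi-3-mini-class model. A minimal sketch, assuming the 7.6 GB figure is the raw float16 weight size (no optimizer state):

```python
def params_from_checkpoint(size_gb: float, bytes_per_param: float) -> float:
    """Estimate parameter count (in billions) from a checkpoint size."""
    return size_gb / bytes_per_param  # GB / (bytes/param) = billions of params

# Shard sizes listed on this card (assumption: float16, 2 bytes/param).
fp16_size_gb = 5.0 + 2.6
n_params_b = params_from_checkpoint(fp16_size_gb, 2.0)

# A 4-bit quantization (as the card's "4bit" tag suggests) needs ~0.5 bytes/param.
int4_size_gb = n_params_b * 0.5

print(f"~{n_params_b:.1f}B parameters")    # ~3.8B
print(f"~{int4_size_gb:.1f} GB at 4-bit")  # ~1.9 GB
```

The same rule of thumb explains why the "Required VRAM: 7.6 GB" entry matches the float16 file size: inference memory is dominated by the weights, plus some overhead for activations and the KV cache that this estimate ignores.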

Looking for other open-source LLMs or SLMs? 34902 are listed in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024042801