| Field | Value |
| --- | --- |
| LLM Name | Phi 4 Deepseek R1K RL EZO |
| Repository 🤗 | https://huggingface.co/AXCXEPT/phi-4-deepseek-R1K-RL-EZO |
| Base Model(s) | |
| Model Size | 14.7b |
| Required VRAM | 29.4 GB |
| Updated | 2025-05-12 |
| Maintainer | AXCXEPT |
| Model Type | phi3 |
| Model Files | |
| Supported Languages | en, ja |
| Model Architecture | Phi3ForCausalLM |
| License | MIT |
| Context Length | 16384 |
| Model Max Length | 16384 |
| Transformers Version | 4.48.1 |
| Tokenizer Class | GPT2Tokenizer |
| Padding Token | <\|endoftext\|> |
| Vocabulary Size | 100352 |
| Torch Data Type | bfloat16 |
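
A minimal loading sketch using the Hugging Face `transformers` API (version 4.48.1 or later, per the table). The repository name, dtype, and context length come from the listing above; the prompt text and generation parameters are illustrative assumptions, and the chat-template call assumes the repository ships one, as phi-4 derivatives typically do.

```python
# Sketch: load AXCXEPT/phi-4-deepseek-R1K-RL-EZO with transformers (>= 4.48.1).
# Assumes a GPU with roughly 30 GB of free VRAM, matching the listed
# bfloat16 footprint of 29.4 GB.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AXCXEPT/phi-4-deepseek-R1K-RL-EZO"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the listed Torch Data Type
    device_map="auto",           # requires the `accelerate` package
)

# Illustrative chat-style input; assumes the repo provides a chat template.
messages = [{"role": "user", "content": "Summarize the Phi-3 architecture."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Keep prompt plus output well inside the 16384-token context window.
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

At bfloat16 the weights alone occupy the listed 29.4 GB, so single-GPU use below that budget would require a quantized variant.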