| LLM Name | Bnb DPO 8bit |
|---|---|
| Repository 🤗 | https://huggingface.co/syr99/bnb_dpo_8bit |
| Model Size | 2.8B |
| Required VRAM | 3 GB |
| Updated | 2025-02-05 |
| Maintainer | syr99 |
| Model Type | phi |
| Model Files | |
| Quantization Type | 8bit |
| Model Architecture | PhiForCausalLM |
| Context Length | 2048 |
| Model Max Length | 2048 |
| Transformers Version | 4.41.1 |
| Tokenizer Class | CodeGenTokenizer |
| Padding Token | &lt;pad&gt; |
| Vocabulary Size | 50296 |
| Torch Data Type | float16 |
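Given the repository and quantization details above, the checkpoint can typically be loaded through the standard `transformers` API. The snippet below is a minimal sketch, assuming the repo `syr99/bnb_dpo_8bit` loads with `AutoModelForCausalLM` and that `transformers`, `torch`, `accelerate`, and `bitsandbytes` are installed; it is not taken from the model card itself.

```python
# Minimal sketch: load the 8-bit quantized Phi checkpoint listed above.
# Assumption: the saved checkpoint already carries its bitsandbytes 8-bit config,
# so a plain from_pretrained call is enough. Adjust per the repo's own card.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "syr99/bnb_dpo_8bit"  # repository from the table above

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    device_map="auto",       # place weights on the available GPU (~3 GB VRAM per the table)
    trust_remote_code=True,  # some Phi checkpoints ship custom modeling code
)

prompt = "Explain 8-bit quantization in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the context window is 2048 tokens, so prompts plus generated tokens should stay within that limit.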
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Phi 2 4bit 64rank | 2K / 5.6 GB | 153 | 0 |
| Phi 2 Nf4 Fp16 Upscaled | 2K / 5.6 GB | 26 | 0 |
| MFANN3bv0.24 | 128K / 11.1 GB | 5 | 0 |
| MFANN3b | 128K / 11.1 GB | 116 | 0 |
| MFANN3bv1.3 | 128K / 11.1 GB | 13 | 0 |
| MFANN3bv1.1 | 128K / 11.1 GB | 16 | 0 |
| MFANN3bv0.23 | 128K / 11.1 GB | 6 | 0 |
| MFANN3b SFT | 128K / 5.6 GB | 169 | 0 |
| MFANN3b Rebase | 128K / 11.1 GB | 10 | 0 |
| MFANN3bv1.2 | 126K / 11.1 GB | 32 | 0 |