| Field | Value |
|---|---|
| LLM Name | Slerp CM Mist DPO |
| Repository 🤗 | https://huggingface.co/abacusai/Slerp-CM-mist-dpo |
| Merged Model | Yes |
| Model Size | 7.2B |
| Required VRAM | 14.4 GB |
| Updated | 2024-09-18 |
| Maintainer | abacusai |
| Model Type | mistral |
| Model Files | |
| Model Architecture | MistralForCausalLM |
| License | apache-2.0 |
| Context Length | 32768 |
| Model Max Length | 32768 |
| Transformers Version | 4.36.1 |
| Tokenizer Class | LlamaTokenizer |
| Vocabulary Size | 32000 |
| Torch Data Type | float16 |
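The 14.4 GB VRAM figure follows directly from the table: 7.2B parameters at 2 bytes each in float16. A minimal loading sketch using the Hugging Face transformers library is shown below; the repository ID and dtype come from the table above, while the prompt and generation settings are illustrative assumptions (`device_map="auto"` additionally assumes the `accelerate` package is installed).

```python
# Minimal sketch: load Slerp-CM-mist-dpo with transformers (>= 4.36.1, per
# the table above) and run a short float16 generation. Prompt and generation
# parameters below are illustrative, not from the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "abacusai/Slerp-CM-mist-dpo"  # repository from the table above

tokenizer = AutoTokenizer.from_pretrained(model_id)  # resolves to LlamaTokenizer
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the card's dtype; ~14.4 GB of weights
    device_map="auto",          # place layers on available GPU(s); needs accelerate
)

prompt = "Explain model merging in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)  # illustrative settings
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```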
| Best Alternatives | Context / VRAM | Downloads | Likes |
|---|---|---|---|
| Spydaz Web AI BIBLE 002 | 512K / 14.4 GB | 24 | 0 |
| Spydaz Web AI 08 | 512K / 14.5 GB | 76 | 1 |
| Spydaz Web AI ChatML 002 | 512K / 14.4 GB | 22 | 0 |
| Spydaz Web AI ChatQA 001 UFT | 512K / 14.4 GB | 28 | 1 |
| Spydaz Web AI ChatQA 001 SFT | 512K / 14.4 GB | 23 | 1 |
| Spydaz Web AI ChatQA 001 | 512K / 14.4 GB | 17 | 1 |
| Spydaz Web AI BIBLE 001 | 512K / 14.4 GB | 12 | 0 |
| Spydaz Web AI 010 | 512K / 14.5 GB | 14 | 0 |
| Openchat 3.5 0106 128K DPO | 128K / 14.4 GB | 60 | 2 |
| Mistral7B PairRM SPPO Iter2 | 32K / 14.4 GB | 4901 | 1 |