| LLM Name | MiniCPM 2B Sft Bf16 |
|---|---|
| Repository | Open on 🤗 |
| Model Size | 2b |
| Required VRAM | 5.5 GB |
| Updated | 2024-07-27 |
| Maintainer | openbmb |
| Model Files | |
| Supported Languages | en zh |
| Model Architecture | MiniCPMForCausalLM |
| Context Length | 4096 |
| Model Max Length | 4096 |
| Transformers Version | 4.36.0 |
| Tokenizer Class | LlamaTokenizer |
| Vocabulary Size | 122753 |
| Torch Data Type | bfloat16 |
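The bfloat16 data type and the 5.5 GB VRAM figure above are consistent: at 2 bytes per parameter, a model in the ~2.7B-parameter range (the exact total count is an assumption here; the card only says "2b") needs roughly 5 GB for the weights alone, with the remainder going to activations and the KV cache. A minimal back-of-the-envelope sketch:

```python
def estimate_weight_vram_gb(n_params: float, bytes_per_param: int) -> float:
    """Estimate VRAM needed for model weights alone, in GiB."""
    return n_params * bytes_per_param / 1024**3

# bfloat16 stores each parameter in 2 bytes.
# ~2.7e9 parameters is an assumed total for MiniCPM-2B, not from the card.
weights_gb = estimate_weight_vram_gb(2.7e9, 2)
print(f"{weights_gb:.1f} GiB")  # ≈ 5.0 GiB for weights; overhead brings it near 5.5 GB
```

In practice, loading this checkpoint with `transformers` requires `trust_remote_code=True` (the `MiniCPMForCausalLM` architecture ships with the repository) and `torch_dtype=torch.bfloat16` to stay within the listed VRAM budget.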
| Best Alternatives | HF Rank | Context/RAM | Downloads | Likes |
|---|---|---|---|---|
| MiniCPM 2B 128K | 0.3 | 64K / 6 GB | 1233 | 35 |
| MiniCPM 2B Sft Fp32 | 0.3 | 4K / 10.9 GB | 413 | 297 |
| ...iCPM 2B RAFT Lora Hotpotqa Dev | 0.2 | 4K / 5.5 GB | 25 | 0 |
| MiniCPM MoE 8x2B | 0.2 | 4K / 27.7 GB | 127 | 39 |
| MiniCPM Duplex | 0.2 | 4K / 5.5 GB | 19 | 1 |
| MiniCPM 2B DPO Bf16 | 0.2 | 4K / 5.5 GB | 948 | 38 |
| ...iniCPM 2B DPO Bf16 Safetensors | 0.2 | 4K / 5.5 GB | 6 | 1 |
| ...iniCPM 2B DPO Fp32 Safetensors | 0.2 | 4K / 10.9 GB | 6 | 1 |
| ...iniCPM 2B Sft Fp32 Safetensors | 0.2 | 4K / 10.9 GB | 6 | 1 |
| MiniCPM 2B History | 0.2 | 4K / GB | 28 | 16 |