LLM Explorer: A Curated Large Language Model Directory and Analytics

Pupu Bmg by aoyuqc

Which open-source LLMs or SLMs are you looking for? 18,857 models are listed in total.


Autotrain compatible · Endpoints compatible · Llama · PyTorch · Region: US · Sharded

Rank the Pupu Bmg Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Pupu Bmg (aoyuqc/pupu-bmg)

Best Alternatives to Pupu Bmg

Best Alternatives                     | HF Rank | Context / RAM | Downloads | Likes
Bagel DPO 7B V0.1                     | 67.95   | 32K / 14.4 GB | 2347      | 40
Internlm2 7B Llama                    | 66.94   | 32K / 15.5 GB | 1599      | 8
Llama2 Init Mistral                   | 60.98   | 4K / 14.4 GB  | 2551      | 0
A I 0xtom 7B Slerp                    | 60.46   | 32K / 14.4 GB | 258       | 0
AIRIC The Mistral                     | 59.95   | 32K / 14.4 GB | 2067      | 3
Synatra RP Orca 2 7B V0.1             | 59.55   | 4K / 13.5 GB  | 3202      | 6
Deepseek Llm 7B Chat                  | 59.27   | 4K / 13.9 GB  | 7264      | 58
UltraQwen 7B                          | 59.17   | 32K / 15.4 GB | 1771      | 2
...rnlm2 20B Llama 4.0bpw H6 EXL2     | 58.5    | 32K / 11 GB   | 5         | 1
Mistral 7B Guanaco1k Ep2              | 58.13   | 32K / 29 GB   | 3271      | 3
Note: a green score (e.g. "73.2") indicates that the model outperforms aoyuqc/pupu-bmg.

Pupu Bmg Parameters and Internals

LLM Name: Pupu Bmg
Repository: aoyuqc/pupu-bmg (open on 🤗 Hugging Face)
Model Size: 7b
Required VRAM: 13.5 GB
Updated: 2024-02-28
Maintainer: aoyuqc
Model Type: llama
Model Files: 10.0 GB (1-of-2), 3.5 GB (2-of-2)
Model Architecture: LlamaForCausalLM
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.33.1
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: bfloat16
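
The internals listed above translate directly into a standard Hugging Face Transformers loading call. The following is a minimal sketch, assuming the checkpoint is publicly downloadable under the repo id shown on this page (aoyuqc/pupu-bmg) and that torch and accelerate are installed; the prompt string is only a placeholder.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "aoyuqc/pupu-bmg"  # repo id from this page; availability not verified here

# LlamaTokenizer with a 32000-token vocabulary and <s>/</s>/<unk> special tokens
tokenizer = AutoTokenizer.from_pretrained(model_id)

# bfloat16 weights (~13.5 GB), sharded into two files (10.0 GB + 3.5 GB);
# device_map="auto" (needs the accelerate package) places shards on available devices
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "List three practical uses of a 7B llama-architecture language model."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Context length / model max length is 4096 tokens, so keep prompt + new tokens under that
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

On a single GPU, budget roughly the listed 13.5 GB of VRAM for the bfloat16 weights, plus headroom for activations and the KV cache.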
Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v2024022003