MPT 7B Storywriter Pythia ChatBase Merge by TehVenom


Tags: Autotrain compatible · Endpoints compatible · GPT-NeoX · PyTorch · Region: US · Sharded

MPT 7B Storywriter Pythia ChatBase Merge Parameters and Internals

LLM Name: MPT 7b Storywriter Pythia ChatBase Merge
Repository: Open on 🤗
Model Size: 7b
Required VRAM: 13.6 GB
Updated: 2024-07-27
Maintainer: TehVenom
Model Type: gpt_neox
Model Files: 2.0 GB (1-of-7), 2.0 GB (2-of-7), 2.0 GB (3-of-7), 2.0 GB (4-of-7), 2.0 GB (5-of-7), 2.0 GB (6-of-7), 1.6 GB (7-of-7)
Model Architecture: GPTNeoXForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.28.0
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50432
Torch Data Type: float16
MPT 7B Storywriter Pythia ChatBase Merge (TehVenom/MPT-7b_Storywriter-Pythia_ChatBase-Merge)
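The figures above can be cross-checked: a float16 checkpoint stores 2 bytes per parameter, so the seven shards (13.6 GB in total) imply roughly 6.8B parameters, consistent with the stated 7b model size. Below is a minimal loading sketch, assuming the standard Hugging Face `transformers` API; `device_map="auto"` additionally requires the `accelerate` package, and the helper function names are illustrative, not part of the model release.

```python
REPO_ID = "TehVenom/MPT-7b_Storywriter-Pythia_ChatBase-Merge"
CONTEXT_LENGTH = 2048  # context length / model max length from the table above
SHARDS_GB = [2.0, 2.0, 2.0, 2.0, 2.0, 2.0, 1.6]  # the seven shard sizes listed above


def approx_params_billions(total_gb: float, bytes_per_param: int = 2) -> float:
    """Parameter count (in billions) implied by checkpoint size; float16 = 2 bytes/param."""
    return total_gb / bytes_per_param


def generation_kwargs(prompt_tokens: int, context_length: int = CONTEXT_LENGTH) -> dict:
    """Cap new tokens so prompt + completion stay inside the 2048-token window."""
    return {"max_new_tokens": max(0, context_length - prompt_tokens)}


def load_model():
    """Download (~13.6 GB on first call) and load the sharded fp16 checkpoint."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
    model = AutoModelForCausalLM.from_pretrained(
        REPO_ID,
        torch_dtype=torch.float16,  # matches the stated torch data type
        device_map="auto",          # needs `accelerate`; places shards across devices
    )
    return tokenizer, model
```

Calling `load_model()` pulls all seven shards on first use; `approx_params_billions(sum(SHARDS_GB))` gives 6.8, in line with the 7b label.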

Best Alternatives to MPT 7B Storywriter Pythia ChatBase Merge

| Best Alternatives | HF Rank | Context/RAM | Downloads | Likes |
|---|---|---|---|---|
| Literature 7B 16384 | 0.1 | 16K / 36 GB | 18 | 11 |
| RedPajama 7B 16384 | 0.1 | 16K / 36 GB | 13 | 4 |
| Stablelm Tuned Alpha 7B | 0.2 | 4K / 31.9 GB | 2060 | 359 |
| Stablelm Base Alpha 7B | 0.2 | 4K / 31.9 GB | 1475 | 210 |
| Stablelm 7B Sft V7 Epoch 3 | 0.2 | 4K / 32.4 GB | 1109 | 67 |
| StableLManticore 7B | 0.1 | 4K / 16 GB | 15 | 1 |
| Pythia 6.9B Deduped 4K | 0.1 | 4K / 27.2 GB | 111 | 1 |
| Stablelm 7B | 0.1 | 4K / 31.9 GB | 11 | 2 |
| Sarashina1 7B | 0.3 | 2K / 13.9 GB | 690 | 0 |
| Open Calm 7B | 0.2 | 2K / 13.9 GB | 3762 | 203 |
Note: a green score (e.g. "73.2") indicates that the model outperforms TehVenom/MPT-7b_Storywriter-Pythia_ChatBase-Merge.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072501