LLM Explorer: A Curated Large Language Model Directory and Analytics

Pythia 12B Pre V8.12.5K Steps by OpenAssistant



Tags: autotrain-compatible, endpoints-compatible, gpt_neox, has-space, license:apache-2.0, pytorch, region:us, sharded

Rank the Pythia 12B Pre V8.12.5K Steps Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Pythia 12B Pre V8.12.5K Steps (OpenAssistant/pythia-12b-pre-v8-12.5k-steps)

Best Alternatives to Pythia 12B Pre V8.12.5K Steps

Best Alternatives                    Context / VRAM    HF Rank
Dolly V2 12B                         2K / 23.8 GB      51051917
...sst Sft 4 Pythia 12B Epoch 3.5    2K / 23.8 GB      9087349
Oasst Sft 1 Pythia 12B               2K / 23.8 GB      5446279
Pythia 12B                           2K / 23.8 GB      10274125
Pythia 12B Deduped                   2K / 23.8 GB      1473550
Lotus 12B                            2K / 23.8 GB      198726
Pythia 12B Deduped V0                2K / 23.8 GB      48825
Pythia 12B Sft V8 7K Steps           2K / 23.8 GB      294021
Pythia 12B V0                        2K / 23.8 GB      18521
H2ogpt Gm Oasst1 En 1024 12B         2K / 23.8 GB      21165

Pythia 12B Pre V8.12.5K Steps Parameters and Internals

LLM Name: Pythia 12B Pre V8.12.5K Steps
Repository: Open on 🤗
Model Size: 12b
Required VRAM: 23.8 GB
Model Type: gpt_neox
Model Files: 10.0 GB (1-of-3), 9.9 GB (2-of-3), 3.9 GB (3-of-3)
Model Architecture: GPTNeoXForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.28.0.dev0
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50288
Initializer Range: 0.02
Torch Data Type: float16
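As a rough sanity check (this arithmetic is mine, not part of the listing), the required-VRAM figure lines up with what a 12B-parameter model stored in float16 should occupy, and with the sum of the three checkpoint shards. A minimal Python sketch:

```python
# Estimate checkpoint size for a 12B-parameter model in float16
# (2 bytes per parameter). The shard sizes come from the model files
# list above; the parameter count is the nominal "12B", so the
# estimate is approximate rather than an exact repository value.
params = 12e9                  # ~12 billion parameters (nominal)
bytes_per_param = 2            # float16 = 2 bytes
est_gb = params * bytes_per_param / 1e9
shards_gb = 10.0 + 9.9 + 3.9   # 23.8 GB total across the 3 shards
print(f"estimated: {est_gb:.1f} GB, listed shards: {shards_gb:.1f} GB")
```

The ~24 GB estimate agrees with the listed 23.8 GB, which is why a single 24 GB GPU is typically the floor for running this model in half precision without quantization.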
Original data from Hugging Face, OpenCompass, and various public Git repos.
Release v2024022003