LLM Explorer: A Curated Large Language Model Directory and Analytics

Pythia 12B Pre 2000 by andreaskoepf



Tags: Autotrain compatible | Endpoints compatible | GPT-NeoX | License: apache-2.0 | PyTorch | Region: us | Sharded

Model: Pythia 12B Pre 2000 (andreaskoepf/pythia-12b-pre-2000)
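
Because the checkpoint is a standard gpt_neox model hosted on the Hugging Face Hub, it can be loaded with the transformers library. A minimal sketch, assuming a recent transformers install, the accelerate package for device placement, and roughly 24 GB of GPU memory for the float16 weights (the prompt below is illustrative):

    # Minimal sketch: load and sample from Pythia 12B Pre 2000.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "andreaskoepf/pythia-12b-pre-2000"

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        torch_dtype=torch.float16,  # matches the listed Torch data type
        device_map="auto",          # requires accelerate; places the 3 shards automatically
    )

    prompt = "The Pythia suite of language models"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=50)
    print(tokenizer.decode(output[0], skip_special_tokens=True))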

Best Alternatives to Pythia 12B Pre 2000

Model                               Context / RAM   Downloads   Likes
Dolly V2 12B                        2K / 23.8 GB         4898    1922
...sst Sft 4 Pythia 12B Epoch 3.5   2K / 23.8 GB         8824     351
Oasst Sft 1 Pythia 12B              2K / 23.8 GB         5213     279
Pythia 12B                          2K / 23.8 GB        11195     125
Pythia 12B Deduped                  2K / 23.8 GB        13464      50
Lotus 12B                           2K / 23.8 GB         1878      26
Pythia 12B Deduped V0               2K / 23.8 GB          308      25
Pythia 12B Sft V8 7K Steps          2K / 23.8 GB         3478      21
Pythia 12B V0                       2K / 23.8 GB          195      21
Pythia 12B Pre V8.12.5K Steps       2K / 23.8 GB         2540       6
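
The Downloads and Likes columns track live Hugging Face Hub statistics, so the numbers above drift over time. Current figures can be queried with the huggingface_hub client; a small sketch using this page's own repo id plus one alternative whose Hub id is known (EleutherAI/pythia-12b):

    # Sketch: fetch current download/like counts from the Hugging Face Hub.
    from huggingface_hub import HfApi

    api = HfApi()
    for repo_id in ["andreaskoepf/pythia-12b-pre-2000", "EleutherAI/pythia-12b"]:
        info = api.model_info(repo_id)
        print(f"{repo_id}: downloads={info.downloads}, likes={info.likes}")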

Pythia 12B Pre 2000 Parameters and Internals

LLM Name: Pythia 12B Pre 2000
Repository: andreaskoepf/pythia-12b-pre-2000 (open on Hugging Face)
Model Size: 12B
Required VRAM: 23.8 GB
Updated: 2024-02-28
Maintainer: andreaskoepf
Model Type: gpt_neox
Model Files: 9.8 GB (1-of-3), 9.9 GB (2-of-3), 4.1 GB (3-of-3)
Model Architecture: GPTNeoXForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.26.1
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50288
Initializer Range: 0.02
Torch Data Type: float16
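
The Required VRAM figure lines up with the shard sizes and with float16 storage (2 bytes per parameter): the three files sum to 23.8 GB, which implies roughly 11.9 billion parameters, i.e. the nominal 12B. A back-of-envelope check:

    # Back-of-envelope check: shard sizes vs. float16 parameter count.
    shard_gb = [9.8, 9.9, 4.1]           # the three listed model files
    total_gb = sum(shard_gb)             # 23.8 GB, matching Required VRAM
    approx_params = total_gb * 1e9 / 2   # 2 bytes per parameter in float16
    print(f"{total_gb:.1f} GB -> ~{approx_params / 1e9:.1f}B parameters")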
Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v2024022003