LLM Explorer: A Curated Large Language Model Directory and Analytics

H2ogpt Oasst1 512 12B by h2oai

Which open-source LLMs or SLMs are you looking for? 18,857 models are listed in total.


Tags: Autotrain compatible · Dataset: h2oai/openassistant oa... · En · Gpt · Gpt neox · Has space · License: apache-2.0 · Open-source · Pytorch · Region: us · Sharded

Rank the H2ogpt Oasst1 512 12B Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
H2ogpt Oasst1 512 12B (h2oai/h2ogpt-oasst1-512-12b)

Best Alternatives to H2ogpt Oasst1 512 12B

Best Alternatives | Context / VRAM | HF Rank
Dolly V2 12B | 2K / 23.8 GB | 48981922
...sst Sft 4 Pythia 12B Epoch 3.5 | 2K / 23.8 GB | 8824351
Oasst Sft 1 Pythia 12B | 2K / 23.8 GB | 5213279
Pythia 12B | 2K / 23.8 GB | 11195125
Pythia 12B Deduped | 2K / 23.8 GB | 1346450
Lotus 12B | 2K / 23.8 GB | 187826
Pythia 12B Deduped V0 | 2K / 23.8 GB | 30825
Pythia 12B Sft V8 7K Steps | 2K / 23.8 GB | 347821
Pythia 12B V0 | 2K / 23.8 GB | 19521
Pythia 12B Pre V8.12.5K Steps | 2K / 23.8 GB | 25406
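All of the alternatives above are 12B-class models, which is why they share the same 23.8 GB VRAM figure: at float16, each parameter occupies 2 bytes, so roughly 12 billion parameters need about 24 GB for the weights alone (the listed 23.8 GB reflects a parameter count slightly under 12B). A minimal sketch of that back-of-the-envelope estimate, with the parameter count as an assumption:

```python
# Rough VRAM estimate for holding 12B-class weights in float16.
# The ~12e9 parameter count is an assumption for illustration;
# the table above lists 23.8 GB as the actual requirement.

BYTES_PER_PARAM_FP16 = 2  # float16 = 16 bits = 2 bytes per weight


def weight_memory_gb(n_params: float,
                     bytes_per_param: int = BYTES_PER_PARAM_FP16) -> float:
    """Memory needed just for the weights, in decimal gigabytes."""
    return n_params * bytes_per_param / 1e9


print(weight_memory_gb(12e9))  # 24.0 -- close to the 23.8 GB listed
```

Note this covers weights only; activations, the KV cache, and framework overhead add to the total at inference time.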

H2ogpt Oasst1 512 12B Parameters and Internals

LLM Name | H2ogpt Oasst1 512 12B
Repository | h2oai/h2ogpt-oasst1-512-12b (Open on 🤗)
Model Size | 12b
Required VRAM | 23.9 GB
Model Type | gpt_neox
Model Files | 5.0 GB (1-of-5), 4.8 GB (2-of-5), 4.9 GB (3-of-5), 5.0 GB (4-of-5), 4.2 GB (5-of-5)
Supported Languages | en
Model Architecture | GPTNeoXForCausalLM
Context Length | 2048
Model Max Length | 2048
Transformers Version | 4.28.1
Tokenizer Class | GPTNeoXTokenizer
Vocabulary Size | 50688
Initializer Range | 0.02
Torch Data Type | float16
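The model ships as five sharded checkpoint files, and their sizes should add up to the "Required VRAM" figure above. A minimal sanity-check sketch, using the shard sizes listed for h2oai/h2ogpt-oasst1-512-12b:

```python
# Sanity-check: the five shard sizes listed above should sum to the
# "Required VRAM" figure of 23.9 GB for h2oai/h2ogpt-oasst1-512-12b.

shard_sizes_gb = [5.0, 4.8, 4.9, 5.0, 4.2]  # shards 1-of-5 .. 5-of-5

total_gb = round(sum(shard_sizes_gb), 1)  # round to match the table's precision
print(total_gb)  # 23.9
```

Sharding keeps each file small enough to download and memory-map independently, which is why the repository is tagged "Sharded".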
Original data from Hugging Face, OpenCompass, and various public Git repos.
Release v2024022003