LLM Explorer: A Curated Large Language Model Directory and Analytics

GPT Neo 2.7B Horni by LyaaaaaGames

Which open-source LLM or SLM are you looking for? 18,732 models indexed in total.


Tags: Autotrain compatible, Endpoints compatible, GPT Neo, PyTorch, Region: US, Sharded

Rank the GPT Neo 2.7B Horni Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
GPT Neo 2.7B Horni (LyaaaaaGames/GPT-Neo-2.7B-Horni)

Best Alternatives to GPT Neo 2.7B Horni

Best Alternatives           Context / RAM    Downloads    Likes
GPT Neo 2.7B Onnx Js        2K / n/a         6            0
Pygmalion 2.7B              2K / 0 GB        3            1
Pygmalion 2.7B              2K / 5.4 GB      4034         50
GPT Neo 2.7B Shinen         2K / 5.4 GB      1348         20
GPT Neo 2.7B Horni          2K / 5.4 GB      1702         18
Bhaskara                    2K / 5.4 GB      15           13
GPT Neo 2.7B Picard         2K / 5.4 GB      274          7
GPT Neo 2.7B Janeway        2K / 5.4 GB      629          6
GPT Neo 2.7B AID            2K / 5.4 GB      597          4
GPT Neo 2.7B Horni LN       2K / 5.4 GB      259          4
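
The download and like counts above appear to mirror per-repository statistics on the Hugging Face Hub (an assumption; the directory does not document its data pipeline). Below is a minimal sketch of how such counters can be read with the huggingface_hub Python client; only the LyaaaaaGames/GPT-Neo-2.7B-Horni repo id is confirmed by this page.

# Minimal sketch: read download/like counters for one repository via the
# Hugging Face Hub API. Requires the huggingface_hub package and network access.
from huggingface_hub import HfApi

api = HfApi()
repo_id = "LyaaaaaGames/GPT-Neo-2.7B-Horni"  # repo id confirmed on this page

info = api.model_info(repo_id)
# ModelInfo carries the same downloads/likes counters shown in directory tables.
print(f"{repo_id}: downloads={info.downloads}, likes={info.likes}")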

GPT Neo 2.7B Horni Parameters and Internals

LLM Name: GPT Neo 2.7B Horni
Repository: LyaaaaaGames/GPT-Neo-2.7B-Horni (open on Hugging Face 🤗)
Model Size: 2.7B
Required VRAM: 6.6 GB
Updated: 2024-02-21
Maintainer: LyaaaaaGames
Model Type: gpt_neo
Model Files: 34 shards: 0.0 GB (1-of-34), 0.3 GB (2-of-34), 0.2 GB each (3-of-34 through 33-of-34), 0.1 GB (34-of-34)
Model Architecture: GPTNeoForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.27.4
Tokenizer Class: GPT2Tokenizer
Beginning of Sentence Token: <|endoftext|>
End of Sentence Token: <|endoftext|>
Unk Token: <|endoftext|>
Vocabulary Size: 50257
Initializer Range: 0.02
Torch Data Type: float16
Activation Function: gelu_new
Layer Norm Epsilon: 1.0E-5
Summary First Dropout: 0.1
Summary Type: cls_index
Errors: replace
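
The internals listed above (gpt_neo architecture, GPT2Tokenizer, float16 weights, 34 sharded files, 2048-token context) point to the standard transformers loading path. The snippet below is a minimal sketch under that assumption; the prompt and generation settings are illustrative and not taken from this page.

# Minimal loading sketch based on the listed internals; not an official recipe.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "LyaaaaaGames/GPT-Neo-2.7B-Horni"

tokenizer = AutoTokenizer.from_pretrained(repo_id)   # resolves to GPT2Tokenizer
model = AutoModelForCausalLM.from_pretrained(        # resolves to GPTNeoForCausalLM
    repo_id,
    torch_dtype=torch.float16,  # matches the listed Torch Data Type (float16)
)

prompt = "The knight drew his sword and"
inputs = tokenizer(prompt, return_tensors="pt")

# Keep prompt + new tokens well under the 2048-token context length listed above.
output_ids = model.generate(**inputs, max_new_tokens=50, do_sample=True)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))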
Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v2024022003