GPT Neo 2.7B Horni by LyaaaaaGames


Tags: AutoTrain compatible · Endpoints compatible · GPT Neo · PyTorch · Region: US · Sharded

GPT Neo 2.7B Horni Benchmarks

Scores are percentages (nn.n%) showing how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
GPT Neo 2.7B Horni (LyaaaaaGames/GPT-Neo-2.7B-Horni)

GPT Neo 2.7B Horni Parameters and Internals

Model Type: text generation
Additional Notes: This is a sharded variant intended for creative writing and role-playing applications. Details of the sharding approach and any further adaptations for that purpose are not specified. The original Horni fine-tune was published by KoboldAI; this sharded copy is hosted on the Hugging Face platform by LyaaaaaGames.
LLM Name: GPT Neo 2.7B Horni
Repository 🤗: https://huggingface.co/LyaaaaaGames/GPT-Neo-2.7B-Horni
Model Size: 2.7b
Required VRAM: 6.6 GB
Updated: 2024-12-14
Maintainer: LyaaaaaGames
Model Type: gpt_neo
Model Files: 34 shards — 0.0 GB (1-of-34), 0.3 GB (2-of-34), 0.2 GB each (3-of-34 through 33-of-34), 0.1 GB (34-of-34)
Model Architecture: GPTNeoForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.27.4
Tokenizer Class: GPT2Tokenizer
Beginning of Sentence Token: <|endoftext|>
End of Sentence Token: <|endoftext|>
Unk Token: <|endoftext|>
Vocabulary Size: 50257
Torch Data Type: float16
Activation Function: gelu_new
Errors: replace
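
The configuration above can be exercised with the Hugging Face transformers library. The snippet below is a minimal sketch, not taken from the model card: it assumes transformers and torch are installed and the listed repository id is reachable, and the prompt text and sampling settings are illustrative only.

```python
# Minimal loading sketch (assumption: transformers + torch installed, GPU optional).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "LyaaaaaGames/GPT-Neo-2.7B-Horni"

# The GPT2Tokenizer class is resolved automatically from the repository files.
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# float16 matches the checkpoint's listed Torch data type and keeps VRAM near
# the stated 6.6 GB; the 34 shards are downloaded and merged automatically.
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float16)
model = model.to("cuda" if torch.cuda.is_available() else "cpu")

prompt = "The knight opened the door and"  # illustrative prompt, not from the card
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep prompt + generated tokens within the 2048-token context window.
output = model.generate(**inputs, max_new_tokens=100, do_sample=True, top_p=0.9)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Note that the card lists <|endoftext|> for the BOS, EOS, and UNK tokens and no separate padding token, so batching several prompts at once would require assigning a pad token explicitly.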

Best Alternatives to GPT Neo 2.7B Horni

Model                              | Context / RAM  | Downloads | Likes
ChildModel 02                      | 2K / 5.3 GB    | 6         | 0
GPT Neo 2.7B Lama                  | 2K / 10.6 GB   | 14        | 0
PRIME2 Openai                      | 2K / 6.6 GB    | 7         | 1
EleutherAI GPT Neo 2.7B 4bits      | 2K / 1.7 GB    | 9         | 0
GPT Neo 2.7B                       | 2K / 10.7 GB   | 245553    | 446
Pygmalion 2.7B                     | 2K / 5.4 GB    | 1301      | 54
... Style Transfer Using Examples  | 2K / 5.4 GB    | 15        | 1
Pygmalion 2.7B                     | 2K / 0 GB      | 19        | 1
GPT Neo 2.7B Shinen                | 2K / 6.6 GB    | 60        | 0
GPT Neo 2.7B Horni LN              | 2K / 6.6 GB    | 45        | 0
Note: a green score (e.g. "73.2") means that the model is better than LyaaaaaGames/GPT-Neo-2.7B-Horni.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124