Neo 7B by m-a-p


Tags: Autotrain compatible · Conversational · Endpoints compatible · License: apache-2.0 · Llama · Region: us · Safetensors · Sharded · TensorFlow

Neo 7B (m-a-p/neo_7b)

Best Alternatives to Neo 7B

| Best Alternatives | HF Rank | Context / RAM | Downloads / Likes |
|---|---|---|---|
| Vicuna 7B V1.5 | 55.27 | 4K / 13.5 GB | 704806237 |
| Llama2 Chinese 7B Chat | 54.23 | 4K / 13.5 GB | 7057211 |
| Vicuna 7B V1.5 16K | 54.02 | 4K / 13.5 GB | 3501583 |
| Deepseek Llm 7B Base | 52.33 | 4K / 13.9 GB | 358529 |
| Deepseek Llm 7B Chat | 49.55 | 4K / 13.9 GB | 867565 |
| Vicuna 7B V1.5 GPTQ | 48.5 | 4K / 3.9 GB | 141915 |
| Vicuna 7B V1.5 AWQ | 48.5 | 4K / 3.9 GB | 873 |
| Vicuna 7B V1.5 16K GPTQ | 47.4 | 4K / 3.9 GB | 3818310 |
| Vicuna 7B V1.5 16K AWQ | 47.4 | 4K / 3.9 GB | 401 |
| Vicuna 7B V1.5 16K Gptq | 47.4 | 4K / 3.9 GB | 150 |

Note: a green score (e.g. "73.2") means that the model outperforms m-a-p/neo_7b.

Neo 7B Parameters and Internals

LLM Name: Neo 7b
Repository: m-a-p/neo_7b (Hugging Face)
Model Size: 7b
Required VRAM: 15.5 GB
Updated: 2024-05-22
Maintainer: m-a-p
Model Type: llama
Model Files: 4.8 GB (1-of-4), 4.9 GB (2-of-4), 5.0 GB (3-of-4), 0.8 GB (4-of-4)
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.39.3
Tokenizer Class: NEOTokenizer
Padding Token: <unk>
Vocabulary Size: 64256
Initializer Range: 0.02
Torch Data Type: bfloat16
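
The 15.5 GB VRAM requirement matches the sum of the four sharded files (4.8 + 4.9 + 5.0 + 0.8 GB) and is roughly what ~7.7B parameters occupy in bfloat16 (2 bytes per parameter), before activations and KV cache. The sketch below shows one way to load the repository with Hugging Face Transformers (version 4.39.3 or later, per the listing above); because the listed tokenizer class is the custom NEOTokenizer, `trust_remote_code=True` is assumed to be required, and the prompt and generation settings are placeholders rather than maintainer recommendations.

```python
# Minimal loading sketch for m-a-p/neo_7b, based on the parameters listed above.
# Assumption: the custom NEOTokenizer requires trust_remote_code=True; verify against the repo.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "m-a-p/neo_7b"

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # listed Torch Data Type; ~15.5 GB of weights
    device_map="auto",           # needs `accelerate`; places the 4 shards on available devices
    trust_remote_code=True,
)

# Hypothetical prompt; the model's listed context length is 8192 tokens.
prompt = "Explain in one sentence what a causal language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

`device_map="auto"` is used so the bfloat16 weights can be split across GPUs or partially offloaded to CPU when a single ~16 GB card is not enough.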


Original data from Hugging Face, OpenCompass, and various public Git repositories (release v2024042801).