Prosparse Llama 2 13B by SparseLLM


  Arxiv:2310.04564   Arxiv:2312.12456   Arxiv:2402.03804   Arxiv:2402.13516   Custom code   En   Feature-extraction   Region:us   Safetensors   Sharded   Sparsellama   Tensorflow

Prosparse Llama 2 13B Benchmarks

Prosparse Llama 2 13B (SparseLLM/prosparse-llama-2-13b)

Prosparse Llama 2 13B Parameters and Internals

Model Type: text generation
Use Cases:
  Areas: research, commercial applications
  Limitations: potential for biased or objectionable responses; tested only in English
  Considerations: developers should perform safety testing and tuning tailored to their applications
Training Details:
  Data Sources: StarCoder, Wikipedia, Pile, UltraChat, P3, PAQ, Unnatural Instructions, Flan, Super-Natural Instructions
  Data Volume: 134.22 billion tokens
  Methodology: ProSparse with ReLU substitution and regularization
  Training Time: 16,000 steps
  Hardware Used: 32 A100 GPUs
Model Architecture: activation sparsity using ReLU in the FFNs (see the sketch below)
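
The methodology above comes down to swapping the FFN gate activation from SiLU to ReLU so that many intermediate activations become exactly zero, with a regularization term pushing activations further toward zero during training. The PyTorch module below is an illustrative sketch only; the class and helper names are hypothetical and are not the repository's actual module code.

import torch
import torch.nn as nn

class ReLUGatedFFN(nn.Module):
    # LLaMA-style gated FFN with the gate activation swapped from SiLU to ReLU,
    # the substitution that produces exact zeros (activation sparsity).
    def __init__(self, hidden_size: int, intermediate_size: int):
        super().__init__()
        self.gate_proj = nn.Linear(hidden_size, intermediate_size, bias=False)
        self.up_proj = nn.Linear(hidden_size, intermediate_size, bias=False)
        self.down_proj = nn.Linear(intermediate_size, hidden_size, bias=False)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        gate = self.act(self.gate_proj(x))  # many entries here are exactly zero
        # During ProSparse-style training, an additional L1-style penalty on these
        # gate activations (with a growing coefficient) encourages higher sparsity,
        # e.g. loss = task_loss + lambda_t * gate.abs().mean()
        return self.down_proj(gate * self.up_proj(x))

def sparsity_ratio(gate: torch.Tensor) -> float:
    # Fraction of gate activations that are exactly zero.
    return (gate == 0).float().mean().item()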
LLM Name: Prosparse Llama 2 13B
Repository: https://huggingface.co/SparseLLM/prosparse-llama-2-13b
Model Size: 13b
Required VRAM: 52.3 GB
Updated: 2025-02-22
Maintainer: SparseLLM
Model Type: sparsellama
Model Files: 4.9 GB: 1-of-11, 5.0 GB: 2-of-11, 5.0 GB: 3-of-11, 5.0 GB: 4-of-11, 5.0 GB: 5-of-11, 4.8 GB: 6-of-11, 4.8 GB: 7-of-11, 4.8 GB: 8-of-11, 5.0 GB: 9-of-11, 5.0 GB: 10-of-11, 3.0 GB: 11-of-11
Supported Languages: en
Model Architecture: SparseLlamaForCausalLM
License: llama2
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.31.0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: bfloat16
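
Given the details above (custom modeling code, the SparseLlamaForCausalLM architecture, bfloat16 weights, 4096-token context), a minimal loading sketch with the Hugging Face transformers library could look like the following; device_map="auto" and the generation settings are assumptions about a typical setup rather than part of the model card.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "SparseLLM/prosparse-llama-2-13b"

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,   # matches the card's Torch Data Type
    trust_remote_code=True,       # required: the repo ships custom SparseLlama code
    device_map="auto",            # assumption: place the ~52 GB of weights automatically
)

prompt = "Activation sparsity in large language models means"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))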

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227