Pragna 1B by soketlabs

Model Card on HF 🤗: https://huggingface.co/soketlabs/pragna-1b


Pragna 1B Parameters and Internals

Model Type:
text generation

Use Cases:
research, commercial applications

Supported Languages:
Hindi, Bangla, Gujarati, English

Training Details:
Data sources: Bhasha-wiki, SlimPajama, Sangraha-Verified
Context length: 2048
Training infrastructure: OpenAI's open-source Triton language and the GenAI Studio platform

Model Architecture:
Pragna-1B is a decoder-only transformer inspired by TinyLlama, with 22 layers, 32 attention heads, a hidden dimension of 2048, an expansion (feed-forward) dimension of 5632, a vocabulary size of 69,632, Rotary Positional Encoding, RMSNorm, SiLU (Sigmoid Linear Unit) activations, and Grouped Query Attention.
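As a sanity check, the hyperparameters above roughly reproduce the ~1B parameter count and the 2.5 GB bfloat16 footprint listed below. This sketch assumes 4 KV heads for Grouped Query Attention (TinyLlama's choice; the card does not state Pragna's value) and untied input/output embeddings, so treat the exact figure as an estimate:

```python
# Rough parameter count for Pragna-1B from the hyperparameters above.
layers, heads, hidden, ffn, vocab = 22, 32, 2048, 5632, 69632
kv_heads = 4                 # assumption (TinyLlama-style GQA), not from the card
head_dim = hidden // heads   # 64

embed = vocab * hidden       # input embedding table
lm_head = vocab * hidden     # output projection (assumed untied)

attn = (hidden * hidden                      # Q projection
        + 2 * kv_heads * head_dim * hidden   # K and V projections (GQA)
        + hidden * hidden)                   # output projection
mlp = 3 * hidden * ffn       # gate, up, and down projections (SiLU MLP)

total = embed + lm_head + layers * (attn + mlp)
print(f"{total / 1e9:.2f}B params, {total * 2 / 1e9:.1f} GB in bfloat16")
```

Under these assumptions the total comes to about 1.25B parameters, which at 2 bytes per parameter matches the 2.5 GB VRAM figure in the table below.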
LLM Name: Pragna 1B
Repository 🤗: https://huggingface.co/soketlabs/pragna-1b
Model Size: 1b
Required VRAM: 2.5 GB
Updated: 2025-02-22
Maintainer: soketlabs
Model Type: llama
Model Files: 2.5 GB
Supported Languages: hi bn gu en
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 69632
Torch Data Type: bfloat16
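The table above implies standard transformers usage (LlamaForCausalLM, LlamaTokenizer, bfloat16 weights). A minimal loading sketch, assuming the transformers (>= 4.36.2, per the table) and torch packages are installed; the repo id comes from the card, and the Hindi prompt is just an illustrative placeholder:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id from the card; the Auto* classes resolve to
# LlamaTokenizer / LlamaForCausalLM from the repo's config.
repo_id = "soketlabs/pragna-1b"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the card's Torch Data Type
)

# The card lists a 2048-token context; keep prompt + generation within it.
inputs = tokenizer("भारत की राजधानी", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```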

Best Alternatives to Pragna 1B

Best Alternatives          | Context / RAM   | Downloads / Likes
LWM Text Chat 1M           | 1024K / 13.5 GB | 2084175
LWM Text 1M                | 1024K / 13.5 GB | 49128
JOSIE 1M Base              | 1024K / 13.5 GB | 121
JOSIE 1M Base              | 1024K / 13.5 GB | 61
Llama 3.2 1B               | 128K / 2.5 GB   | 108502871584
Llama 3.2 1B Instruct      | 128K / 2.5 GB   | 1716586773
Llama 3.2 1B Instruct      | 128K / 2.5 GB   | 11271463
MiniThinky V2 1B Llama 3.2 | 128K / 4.9 GB   | 729438
Lancer 1 1B Instruct       | 128K / 2.5 GB   | 1102
Llama Express.1 Math       | 128K / 2.5 GB   | 4057



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227