TinyDolphin 2.8 1.1B by cognitivecomputations


Tags: Autotrain compatible · Datasets: bigcode/starcoderdata, cerebras/slimpajama-627B, teknium/openhermes · English · Endpoints compatible · Llama · Region: US · Safetensors

TinyDolphin 2.8 1.1B Benchmarks

TinyDolphin 2.8 1.1B (cognitivecomputations/TinyDolphin-2.8-1.1b)

TinyDolphin 2.8 1.1B Parameters and Internals

Model Type 
language model, text generation
Additional Notes 
Intended as a compact, efficient language model for applications with tight compute and memory budgets; it builds on an optimized variant of the Llama architecture for such deployments.
Supported Languages 
en (proficient)
Training Details 
Data Sources:
cerebras/SlimPajama-627B, bigcode/starcoderdata, teknium/openhermes, Dolphin 2.8 dataset by Eric Hartford
Data Volume:
3 trillion tokens
Methodology:
Pretrained with optimized training techniques on the Llama architecture and tokenizer
Training Time:
90 days
Hardware Used:
16 A100-40G GPUs
Model Architecture:
Same architecture and tokenizer as Llama 2. Compact with only 1.1B parameters.
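
A back-of-envelope check on the training figures above: 3 trillion tokens over 90 days on 16 GPUs works out to roughly 386k tokens/s across the cluster, or about 24k tokens/s per A100-40G. Since the card states the model reuses the Llama 2 architecture and tokenizer, a minimal sketch for confirming this locally with the Hugging Face transformers library is shown below; the repo ID comes from this card, the commented values come from the spec table that follows, and the note on the two extra vocabulary entries is an assumption.

    from transformers import AutoConfig

    # Repo ID as listed on this card; assumes `transformers` is installed.
    config = AutoConfig.from_pretrained("cognitivecomputations/TinyDolphin-2.8-1.1b")

    print(config.model_type)               # "llama" (same family as Llama 2)
    print(config.max_position_embeddings)  # 4096 (the context length listed below)
    print(config.vocab_size)               # 32002 (base 32000 plus 2 added tokens, assumed chat markers)
    print(config.torch_dtype)              # torch.bfloat16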
LLM Name: TinyDolphin 2.8 1.1B
Repository: 🤗 https://huggingface.co/cognitivecomputations/TinyDolphin-2.8-1.1b
Model Size: 1.1b
Required VRAM: 2.2 GB
Updated: 2025-02-22
Maintainer: cognitivecomputations
Model Type: llama
Model Files: 2.2 GB
Supported Languages: en
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 32002
Torch Data Type: bfloat16
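
The Required VRAM figure is consistent with the parameter count and data type: 1.1B parameters × 2 bytes per bfloat16 weight ≈ 2.2 GB for the weights alone, before activations and KV cache. A minimal loading-and-generation sketch with transformers follows; the prompt and sampling settings are illustrative assumptions rather than values from the card, and device_map="auto" additionally assumes the accelerate package is installed.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "cognitivecomputations/TinyDolphin-2.8-1.1b"  # repo ID from this card

    tokenizer = AutoTokenizer.from_pretrained(repo)
    # bfloat16 matches the Torch Data Type above (~2.2 GB of weights).
    model = AutoModelForCausalLM.from_pretrained(
        repo, torch_dtype=torch.bfloat16, device_map="auto"  # device_map needs `accelerate`
    )

    # Illustrative prompt; sampling settings are assumptions, not from the card.
    inputs = tokenizer("Explain what a context window is.", return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))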

Best Alternatives to TinyDolphin 2.8 1.1B

Best Alternatives                     Context / RAM   Downloads   Likes
Coven Tiny 1.1b 32k Orpo Alpha        32K / 2.2 GB    161         2
Test Mix 01                           32K / 2.2 GB    166         0
Palmer Merge Test 5                   32K / 2.2 GB    157         0
...llama 1.1B 16K Instructions V4     32K / 2.2 GB    98          0
Palmer 002 32K                        32K / 2.2 GB    171         0
TinyLlama 1.1B 32K Instruct           32K / 2.2 GB    235         12
Tinyllama History Chat V1.1           32K / 2.2 GB    122         0
TinyLlama 1.1B 32K                    32K / 2.2 GB    121         28
TinyJ.O.S.I.E. 1.1B 32K Base          32K / 2.2 GB    30          1
Tinyllama 32k                         32K / 2.2 GB    84          0



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227