Pythia 6.9B Deduped 4K by CarperAI


Tags: Autotrain compatible · Dataset: eleutherai/the_pile_de... · En · Endpoints compatible · Gpt neox · Pytorch · Region: us · Sharded

Pythia 6.9B Deduped 4K Benchmarks

Benchmark scores (percentages) indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Pythia 6.9B Deduped 4K (CarperAI/pythia-6.9b-deduped-4k)

Pythia 6.9B Deduped 4K Parameters and Internals

Model Type 
text generation
Additional Notes 
Training should have used sequence length warmup to increase the context length from 2048, but it was not applied (an illustrative sketch of such a schedule follows below).
Supported Languages 
en (unknown)
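For readers unfamiliar with the term in the note above: sequence length warmup gradually ramps the training context length up to the target instead of switching to it in one step. The sketch below is purely illustrative, with made-up schedule parameters; per the note, no such warmup was actually applied to this model.

```python
# Illustrative sequence-length-warmup schedule (hypothetical parameters;
# the note above says this was NOT applied when training this model).
def seq_len_at_step(step: int, start_len: int = 2048, target_len: int = 4096,
                    warmup_steps: int = 10_000) -> int:
    """Linearly ramp the training context length from start_len to target_len."""
    if step >= warmup_steps:
        return target_len
    frac = step / warmup_steps
    return int(start_len + frac * (target_len - start_len))

print(seq_len_at_step(0))       # 2048
print(seq_len_at_step(5_000))   # 3072
print(seq_len_at_step(20_000))  # 4096
```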
Training Details 
Data Sources:
EleutherAI/the_pile_deduplicated
Data Volume:
134,217,728,000 tokens
Methodology:
Fine-tuning with a 4096-token context length; training was resumed from the 143,000-step checkpoint and continued to step 175,500
Context Length:
4096
Hardware Used:
16 nodes × 8 A100 40GB GPUs
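As a rough consistency check (assuming the original Pythia global batch of 1,024 sequences per step, which this page does not state): 1,024 sequences × 4,096 tokens × 32,000 steps = 134,217,728,000 tokens, which matches the quoted data volume and is close to the ~32,500 steps implied by the checkpoint range.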
LLM Name: Pythia 6.9B Deduped 4K
Repository: https://huggingface.co/CarperAI/pythia-6.9b-deduped-4k
Model Size: 7b
Required VRAM: 27.2 GB
Updated: 2025-02-22
Maintainer: CarperAI
Model Type: gpt_neox
Model Files: 34 shards of 0.8 GB each (1-of-34 through 34-of-34)
Supported Languages: en
Model Architecture: GPTNeoXForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.27.4
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50432
Torch Data Type: float32
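Putting the fields above together (GPT-NeoX architecture, 4096-token context, float32 weights across 34 shards totalling roughly 27 GB), a minimal loading and generation sketch with the Hugging Face transformers API might look like the following. The prompt and generation settings are arbitrary examples, not part of the model card.

```python
# Minimal sketch: load CarperAI/pythia-6.9b-deduped-4k and generate text.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CarperAI/pythia-6.9b-deduped-4k"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float32,  # listed dtype; float16 roughly halves memory use
    device_map="auto",          # requires the `accelerate` package; omit to load on CPU
)

prompt = "The Pile is a large, diverse"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Loading in float16 instead of the listed float32 roughly halves the memory footprint at a small numerical cost; as a base (non-instruct) model, it is best prompted with plain text continuations rather than chat-style instructions.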

Best Alternatives to Pythia 6.9B Deduped 4K

Best Alternatives               Context / RAM     Downloads   Likes
Literature 7B 16384             16K / 36 GB       17          14
RedPajama 7B 16384              16K / 36 GB       14          4
Stablelm Tuned Alpha 7B         4K / 31.9 GB      5819        360
Stablelm Base Alpha 7B          4K / 31.9 GB      2613        209
Stablelm 7B Sft V7 Epoch 3      4K / 32.4 GB      2025        67
StableLManticore 7B             4K / 16 GB        3           1
Stablelm 7B                     4K / 31.9 GB      7           2
Sarashina1 7B                   2K / 13.9 GB      338         0
Dolly V2 7B                     2K / 13.8 GB      11366       149
Open Calm 7B                    2K / 13.9 GB      3471        206



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227