Mpt 30B Qlora Compatible by jondurbin


Tags: Autotrain compatible · Custom code · Endpoints compatible · MPT · PyTorch · Region: US · Sharded

Mpt 30B Qlora Compatible Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Mpt 30B Qlora Compatible (jondurbin/mpt-30b-qlora-compatible)

Mpt 30B Qlora Compatible Parameters and Internals

Model Type: text generation
Additional Notes: The model is tuned with the airoboros prompt format, which is mostly aligned with Vicuna's.
Training Details:
Methodology: Modifications that make the model compatible with gradient checkpointing during QLoRA training.
Input Output:
Input Format: JSONL with 'instruction' and 'response' fields
Accepted Modalities: text
Performance Tips: Set gradient accumulation steps to 1 to avoid a potential bug in gradient accumulation (see the training sketch below).
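
The notes above translate fairly directly into a training setup. Below is a minimal sketch of QLoRA fine-tuning against this checkpoint using transformers, peft, and bitsandbytes. Only the JSONL fields ('instruction'/'response'), gradient checkpointing, and gradient accumulation steps of 1 come from the card; the Vicuna-style prompt template, the `target_modules=["Wqkv"]` choice, the hyperparameters, and the `train.jsonl` path are illustrative assumptions.

```python
# Sketch: QLoRA fine-tuning of jondurbin/mpt-30b-qlora-compatible.
# Assumptions (not from the model card): Vicuna-style prompt template,
# LoRA target module "Wqkv", hyperparameters, and the train.jsonl path.
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    BitsAndBytesConfig,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

MODEL_ID = "jondurbin/mpt-30b-qlora-compatible"

# 4-bit NF4 quantization of the frozen base weights -- the "Q" in QLoRA.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
tokenizer.pad_token = tokenizer.eos_token  # GPT-NeoX tokenizer has no pad token
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=bnb_config,
    trust_remote_code=True,  # the repo ships custom MPT modeling code
    device_map="auto",
)
model.config.use_cache = False               # required with checkpointing
model = prepare_model_for_kbit_training(model)

lora = LoraConfig(
    r=64, lora_alpha=16, lora_dropout=0.05,
    target_modules=["Wqkv"],  # assumption: MPT fuses Q/K/V into one projection
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)

def format_example(row):
    # Assumed Vicuna-style template (the card says "mostly aligned with Vicuna").
    text = (
        "A chat between a curious user and an assistant.\n"
        f"USER: {row['instruction']}\nASSISTANT: {row['response']}"
    )
    return tokenizer(text, truncation=True, max_length=2048)

dataset = load_dataset("json", data_files="train.jsonl")["train"]
dataset = dataset.map(format_example, remove_columns=["instruction", "response"])

args = TrainingArguments(
    output_dir="mpt30b-qlora-out",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=1,  # per the card: avoids a grad-accum bug
    gradient_checkpointing=True,    # the point of this "compatible" variant
    bf16=True,
    num_train_epochs=3,
    logging_steps=10,
)

Trainer(
    model=model,
    args=args,
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
```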
LLM Name: Mpt 30B Qlora Compatible
Repository: 🤗 https://huggingface.co/jondurbin/mpt-30b-qlora-compatible
Model Size: 30b
Required VRAM: 60.1 GB
Updated: 2024-12-22
Maintainer: jondurbin
Model Type: mpt
Model Files: 9.8 GB (1-of-7), 9.9 GB (2-of-7), 9.9 GB (3-of-7), 9.9 GB (4-of-7), 9.9 GB (5-of-7), 9.9 GB (6-of-7), 0.8 GB (7-of-7)
Model Architecture: MPTForCausalLM
Model Max Length: 8192
Transformers Version: 4.28.1
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50432
Torch Data Type: bfloat16
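
Given the architecture and dtype listed above, loading the checkpoint for plain inference is straightforward. The sketch below assumes hardware that can hold roughly the listed 60.1 GB of bfloat16 weights (or sharding/offload via `device_map="auto"`); the prompt string is illustrative only, not a documented template.

```python
# Sketch: loading jondurbin/mpt-30b-qlora-compatible for inference.
# trust_remote_code=True is required because the repo ships custom MPT code;
# bfloat16 matches the "Torch Data Type" listed above (~60 GB of weights).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "jondurbin/mpt-30b-qlora-compatible"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)  # GPTNeoXTokenizer, vocab 50432
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",  # shard/offload across available devices
)

prompt = "USER: What is gradient checkpointing?\nASSISTANT:"  # illustrative
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```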

Best Alternatives to Mpt 30B Qlora Compatible

Best Alternatives | Context / RAM | Downloads | Likes
Mpt 30B Chat | 0K / 60.1 GB | 1428 | 203
Mpt 30B | 0K / 60.1 GB | 2044 | 341
Mpt 30B Instruct | 0K / 60.1 GB | 1263 | 101
Mpt 30B Orca Mini | 0K / 180.5 GB | 17 | 1
Mpt 30B V2 | 0K / 60.1 GB | 13 | 10
Mpt 30B V3 | 0K / 60.1 GB | 12 | 2
Mpt 30B Qlora Multi GPU | 0K / GB | 16 | 1
Mpt 30B Peft Compatible | 0K / 60.1 GB | 14 | 8
...s Mpt 30B Gpt4 1p4 Five Epochs | 0K / 60.1 GB | 14 | 7
...t 30B Instruct Peft Compatible | 0K / 60.1 GB | 13 | 2
Note: a green score (e.g. "73.2") means the model is better than jondurbin/mpt-30b-qlora-compatible.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217