Mpt 7B 8K Instruct Peft Compatible by eluzhnica


Tags: arxiv:2010.04245, arxiv:2108.12409, arxiv:2205.14135, autotrain-compatible, composer, custom-code, ext-8k, instruct, llm-foundry, mosaicml, mpt, pytorch, region:us, sharded

Mpt 7B 8K Instruct Peft Compatible Benchmarks

Benchmark scores for eluzhnica/mpt-7b-8k-instruct-peft-compatible are reported as percentages relative to the reference models Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

Mpt 7B 8K Instruct Peft Compatible Parameters and Internals

Model Type 
text generation, decoder-only transformer
Additional Notes 
The architecture includes modifications such as FlashAttention and ALiBi and does not use learned positional embeddings or biases (a configuration sketch follows below).
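For reference, here is a minimal configuration sketch. It assumes the config layout of the upstream MosaicML MPT releases (an attn_config dict with an attn_impl key and an init_device field) rather than anything documented for this fork; verify against the checkpoint's config.json before relying on it.

```python
# Minimal configuration sketch, assuming the upstream MPT config layout
# (attn_config dict with an 'attn_impl' key). Not taken from this repository's
# documentation; check config.json for the actual fields.
import torch
import transformers

name = "eluzhnica/mpt-7b-8k-instruct-peft-compatible"

config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
# 'torch' selects the plain PyTorch attention path; the upstream MPT code also
# accepts 'triton' for a fused FlashAttention-style kernel when it is installed.
config.attn_config["attn_impl"] = "torch"
config.init_device = "cuda:0"  # initialize weights directly on the GPU

model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    config=config,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)
```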
Training Details 
Data Sources:
Dolly HHRLHF, Competition Math, DuoRC, CoT GSM8K, QASPER, QuALITY, SummScreen FD, Spider (see the prompt-format sketch after this section)
Data Volume:
approximately 43.9 million tokens
Methodology:
Finetuning of a custom decoder-only transformer architecture, using the MPT-7B-chat tokenizer
Context Length:
2048
Training Time:
6.3 hours
Hardware Used:
8x NVIDIA A100 80GB GPUs
Model Architecture:
Modified decoder-only transformer using FlashAttention and ALiBi, with no positional embeddings or biases
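Because the finetuning mix is built around Dolly HHRLHF, the instruct format is presumably the dolly-style template used by the upstream MPT instruct models. The exact wording below is an assumption rather than a quote from this model card; a small formatting helper makes the shape concrete.

```python
# Dolly-style instruction template assumed from the Dolly HHRLHF data source;
# confirm the exact wording against the upstream MPT instruct documentation.
INSTRUCTION_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n"
    "### Instruction:\n{instruction}\n### Response:\n"
)

def format_prompt(instruction: str) -> str:
    """Wrap a raw instruction in the assumed instruct template."""
    return INSTRUCTION_TEMPLATE.format(instruction=instruction)

print(format_prompt("Summarize the plot of Hamlet in two sentences."))
```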
Input Output 
Input Format:
Text input using MPT-7B-chat tokenizer
Accepted Modalities:
text
Output Format:
Text output
Performance Tips:
Pass trust_remote_code=True to from_pretrained; the checkpoint relies on custom MPT modeling code (see the loading sketch below).
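A minimal load-and-generate sketch under the notes above: trust_remote_code=True for the custom MPT code, bfloat16 weights, and the assumed dolly-style prompt. Generation parameters (max_new_tokens, temperature) are illustrative choices, not values from the model card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

name = "eluzhnica/mpt-7b-8k-instruct-peft-compatible"

# trust_remote_code=True is required because the checkpoint ships custom MPT code.
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(
    name,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)

generator = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    device=0,  # assumes a single CUDA device; omit to run on CPU
)

# Assumed dolly-style instruct prompt (see the sketch in the training section).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n"
    "### Instruction:\nExplain what ALiBi does in one paragraph.\n### Response:\n"
)
out = generator(prompt, max_new_tokens=128, do_sample=True, temperature=0.7)
print(out[0]["generated_text"])
```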
LLM Name: Mpt 7B 8K Instruct Peft Compatible
Repository (Hugging Face): https://huggingface.co/eluzhnica/mpt-7b-8k-instruct-peft-compatible
Model Size: 7b
Required VRAM: 13.3 GB
Updated: 2025-02-22
Maintainer: eluzhnica
Model Type: mpt
Instruction-Based: Yes
Model Files: 9.9 GB (1-of-2), 3.4 GB (2-of-2)
Context Length: 8k
Model Architecture: MPTForCausalLM
License: cc-by-sa-3.0
Model Max Length: 8192
Transformers Version: 4.30.2
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50432
Torch Data Type: bfloat16
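The repository name advertises PEFT compatibility, so a plausible use is attaching LoRA adapters with the peft library. The sketch below is an assumption rather than the maintainer's recipe: the target_modules names (Wqkv, out_proj) follow the usual MPT attention layout and should be verified with model.named_modules() against this checkpoint.

```python
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

name = "eluzhnica/mpt-7b-8k-instruct-peft-compatible"
model = AutoModelForCausalLM.from_pretrained(
    name,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)

# Hypothetical LoRA configuration: the target_modules names assume the usual
# MPT attention layout (Wqkv, out_proj); verify them with model.named_modules().
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["Wqkv", "out_proj"],
)

peft_model = get_peft_model(model, lora_config)
peft_model.print_trainable_parameters()  # LoRA adapters are the only trainable weights
```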

Best Alternatives to Mpt 7B 8K Instruct Peft Compatible

Model | Context / RAM | Downloads | Likes
Mpt 7B Chat | 0K / 13.3 GB | 95245 | 512
Mpt 7B Instruct | 0K / 13.3 GB | 15434 | 469
Mpt 7B Int8 Ov | 0K / 0 GB | 27 | 0
Sea Lion 7B Instruct | 0K / 15 GB | 357 | 23
Mpt 7B 8K Instruct | 0K / 13.3 GB | 2057 | 26
Sea Lion 7B Instruct Research | 0K / 15 GB | 56 | 14
Results | 0K / 13.3 GB | 6 | 0
Mpt 7B 8K Chat Sharded Bf16 | 0K / 13.4 GB | 7 | 1
Vigogne Mpt 7B Instruct | 0K / 13.4 GB | 8 | 0
...pt 7B Instruct Peft Compatible | 0K / 13.3 GB | 43 | 0


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227