Airoboros C34B 2.1 AWQ by TheBloke


Tags: 4-bit, Autotrain compatible, AWQ, Base model: jondurbin/airoboros..., Base model (quantized): jondurbin/airoboros..., Dataset: jondurbin/airoboros-2...., Llama, Quantized, Region: us, Safetensors, Sharded, Tensorflow

Airoboros C34B 2.1 AWQ Benchmarks

Scores (nn.n%) show how Airoboros C34B 2.1 AWQ (TheBloke/Airoboros-c34B-2.1-AWQ) compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Airoboros C34B 2.1 AWQ Parameters and Internals

Model Type: llama

Additional Notes: This model contains a prompt-formatting bug; version 2.2 is expected to fix it. Agent-specific prompt format usage and reWOO-style execution planning are available. Contributions to the project's dataset or functionality are welcome.

Input/Output:
Input Format: A chat. USER: {prompt} ASSISTANT:
Performance Tips: For multi-round chats, use "USER:" as a stopping criterion so inference stops before the model begins generating the next user turn, as shown in the sketch below.
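A minimal usage sketch of the prompt template and the "USER:" early stop, assuming a recent transformers release with AWQ support (autoawq installed) and a GPU with enough VRAM for the 18.3 GB of weights; the question text and sampling settings are illustrative only.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/Airoboros-c34B-2.1-AWQ"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Airoboros 2.1 template: a short system line, then USER/ASSISTANT turns.
prompt = "A chat. USER: Why is the sky blue? ASSISTANT:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
reply = tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)

# Emulate "stop on USER:" for multi-round chats by truncating at the next user turn.
reply = reply.split("USER:")[0].strip()
print(reply)
```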
LLM Name: Airoboros C34B 2.1 AWQ
Repository: https://huggingface.co/TheBloke/Airoboros-c34B-2.1-AWQ
Model Name: Airoboros c34B 2.1
Model Creator: Jon Durbin
Base Model(s): jondurbin/airoboros-c34b-2.1
Model Size: 34b
Required VRAM: 18.3 GB
Updated: 2025-02-05
Maintainer: TheBloke
Model Type: llama
Model Files: 9.9 GB (1-of-2), 8.4 GB (2-of-2)
AWQ Quantization: Yes
Quantization Type: awq
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.31.0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: bfloat16
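The configuration values listed above can be checked directly against the repository's config and tokenizer files; a small sketch, assuming only that transformers is installed (the commented values mirror the table above).

```python
from transformers import AutoConfig, AutoTokenizer

model_id = "TheBloke/Airoboros-c34B-2.1-AWQ"
config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

print(config.architectures)             # ['LlamaForCausalLM']
print(config.max_position_embeddings)   # 16384 (context length / model max length)
print(config.torch_dtype)               # torch.bfloat16
print(tokenizer.__class__.__name__)     # fast wrapper around LlamaTokenizer
print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.unk_token)  # <s> </s> <unk>
print(tokenizer.vocab_size)             # 32000
```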

Best Alternatives to Airoboros C34B 2.1 AWQ

Model | Context / RAM | Downloads / Likes
Airoboros C34B 3.1.2 AWQ | 16K / 18.3 GB | 81
Airoboros C34b 2.2.1 AWQ | 16K / 18.3 GB | 120
Airoboros C34B 2.2 AWQ | 16K / 18.3 GB | 91
HelpingAI2.5 5B | 128K / 10.3 GB | 5602
HelpingAI2.5 5B | 128K / 10.3 GB | 612
Ko Llama 3.1 5B Instruct | 128K / 23.4 GB | 50
Llama 3.1 5B Instruct | 8K / 10.9 GB | 3397
Triangulum 5B | 8K / 10.9 GB | 1028
Triangulum 5B It | 8K / 10.9 GB | 658
Llama 3 5B Sheard | 8K / 11.7 GB | 99


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227