AquilaChat2 34B by BAAI


Tags: Arxiv:2408.07410, Aquila, Autotrain compatible, Custom code, Pytorch, Region:us, Sharded
Model Card on HF 🤗: https://huggingface.co/BAAI/AquilaChat2-34B

AquilaChat2 34B Benchmarks

[Benchmark chart for AquilaChat2 34B (BAAI/AquilaChat2-34B): scores are shown as percentages (nn.n%) relative to the reference models Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").]

AquilaChat2 34B Parameters and Internals

Model Type: chat model, language model

Use Cases
Areas: research, commercial applications
Limitations: data leakage identified; GSM8K test data was included in the training dataset

Supported Languages: en (high), zh (high)

Training Details
Data Sources: WMT22 (en-zh), CLUEWSC, Winograd, HellaSwag, OpenBookQA, PIQA, ARC-e, BUSTSM, BoolQ, TruthfulQA, RAFT, ChID, EPRSTMT, TNEWS, OCNLI, SEM-Chinese, MMLU, C-Eval, CMMLU, CSL, HumanEval
Data Volume: 2 trillion tokens
Context Length: 16000

Safety Evaluation
Findings: data leakage issue identified with GSM8K
Risk Categories: misinformation, bias

Input Output
Input Format: text input
Accepted Modalities: text
Output Format: text output

Release Notes
Version: 1.2
Date: 2023-10-25
Notes: AquilaChat2-34B builds on the previous version and matches or exceeds GPT-3.5 across various evaluation dimensions.
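
As a rough illustration of the text-in/text-out chat interface described above, the sketch below loads the checkpoint with Hugging Face transformers and generates from a plain prompt. This is a minimal sketch, not the official BAAI example: trust_remote_code is needed for the repo's custom Aquila modeling code, and any chat-specific prompt template from the official model card is not reproduced here.

```python
# Minimal inference sketch (assumptions: plain prompt, no official chat template applied).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BAAI/AquilaChat2-34B"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # matches the listed Torch Data Type
    device_map="auto",            # needs roughly 67 GB of GPU memory in bf16
    trust_remote_code=True,       # the repo ships custom Aquila modeling code
)
model.eval()

prompt = "Give three reasons to visit Beijing."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=True, top_p=0.9)

# Strip the prompt tokens and print only the generated continuation.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```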
LLM Name: AquilaChat2 34B
Repository 🤗: https://huggingface.co/BAAI/AquilaChat2-34B
Model Size: 34b
Required VRAM: 67.2 GB
Updated: 2025-02-05
Maintainer: BAAI
Model Type: aquila
Model Files: 10.0 GB (1-of-7), 9.8 GB (2-of-7), 9.7 GB (3-of-7), 9.7 GB (4-of-7), 9.7 GB (5-of-7), 9.7 GB (6-of-7), 8.6 GB (7-of-7)
Model Architecture: AquilaForCausalLM
License: other
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.31.0
Tokenizer Class: GPT2Tokenizer
Vocabulary Size: 100008
Torch Data Type: bfloat16
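
The 67.2 GB Required VRAM figure is consistent with the parameter count and data type listed above: roughly 34 billion bfloat16 parameters at 2 bytes each, which also matches the sum of the seven shard sizes. A quick, illustrative check (weights only; real usage also needs room for activations and the KV cache):

```python
# Back-of-the-envelope weight-memory estimate; treat it as a lower bound.
params = 34e9                 # ~34B parameters
bytes_per_param = 2           # bfloat16 = 2 bytes
weights_gb = params * bytes_per_param / 1e9
print(f"~{weights_gb:.1f} GB of weights in bf16")   # ~68.0 GB, close to the listed 67.2 GB

shards = [10.0, 9.8, 9.7, 9.7, 9.7, 9.7, 8.6]        # GB, from the file list above
print(f"shard total: {sum(shards):.1f} GB")          # 67.2 GB
```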

Quantized Models of the AquilaChat2 34B

Model                        Likes   Downloads   VRAM
AquilaChat2 34B GGUF         3       226         14 GB
AquilaChat2 34B GPTQ         2       30          18 GB
AquilaChat2 34B AWQ          1       8           19 GB
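
The quantized conversions above bring the footprint down to roughly 14-19 GB. As a hedged sketch only, a GGUF file could be run with llama-cpp-python along these lines; the file name and quantization level are placeholders, and this assumes your llama.cpp build supports the Aquila architecture:

```python
# Hypothetical example: the GGUF file name below is a placeholder for whichever
# quantization you downloaded from the AquilaChat2 34B GGUF repo.
from llama_cpp import Llama

llm = Llama(
    model_path="aquilachat2-34b.Q3_K_M.gguf",  # placeholder path and quant level
    n_ctx=4096,                                # matches the listed context length
    n_gpu_layers=-1,                           # offload all layers to GPU if available
)
out = llm("Give three reasons to visit Beijing.", max_tokens=200)
print(out["choices"][0]["text"])
```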

Best Alternatives to AquilaChat2 34B

Best Alternatives               Context / RAM     Downloads   Likes
AquilaChat2 34B 16K             16K / 67.2 GB     136         25
Aquila2 34B                     8K / 136.4 GB     2744        17
H2ogpt 16K Aquilachat2 34B      4K / 67.2 GB      75          4
AquilaChat2 34B 16K GPTQ        4K / 18.7 GB      26          6
AquilaChat2 34B GPTQ            4K / 18.7 GB      30          2
AquilaChat2 34B 16K AWQ         4K / 19.3 GB      9           4
AquilaChat2 34B AWQ             4K / 19.3 GB      8           1
Note: green Score (e.g. "73.2") means that the model is better than BAAI/AquilaChat2-34B.

Rank the AquilaChat2 34B Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227