Saily 220B AWQ by TheBloke


Tags: 4-bit, AWQ, autotrain-compatible, llama, quantized, safetensors, sharded, tensorflow, en, region:us
Base model: deepnight-research/Saily_220B (quantized)
Datasets: EleutherAI/pile, meta-math/MetaMathQA, tiiuae/falcon-refinedweb
Model Card on HF 🤗: https://huggingface.co/TheBloke/Saily_220B-AWQ


Saily 220B AWQ Parameters and Internals

Model Type: llama
Use Cases
Limitations: Saily 220B may generate incorrect or biased content.
Additional Notes: The config.json included in the repository files is not accurate; do not rely on it.
Supported Languages: en (English)
Training Details
Data Sources: tiiuae/falcon-refinedweb, EleutherAI/pile, meta-math/MetaMathQA, Unnatural Code (JavaScript, Python, C++)
Hardware Used: 4 x A100 80GB, 2 x A100 80GB
Model Architecture: Llama2-70B merges
Input Output
Input Format: Below is an instruction that describes a task. Write a response that appropriately completes the request. ### Instruction: {prompt} ### Response:
Accepted Modalities: text
Output Format: text
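The Input Format above is the Alpaca-style instruction template. It can be wrapped in a small helper; this is a sketch (the function name is our own, and the line breaks follow the standard Alpaca layout, since the card lists the template on a single line):

```python
# Sketch: build the Alpaca-style prompt template listed under "Input Format".
# The wording comes from the model card; build_prompt is our own helper name.
def build_prompt(instruction: str) -> str:
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

print(build_prompt("Summarize the Llama 2 license in one sentence."))
```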
Release Notes
Version: v1
Date: 17 December 2023
Notes: Releasing Saily_220B, built on top of Llama2-70B merges with various fine-tuning datasets.
LLM Name: Saily 220B AWQ
Repository 🤗: https://huggingface.co/TheBloke/Saily_220B-AWQ
Model Name: Saily 220B
Model Creator: DEEPNIGHT
Base Model(s): Saily 220B (deepnight-research/Saily_220B)
Model Size: 220b
Required VRAM: 109.1 GB
Updated: 2025-02-05
Maintainer: TheBloke
Model Type: llama
Model Files: 9.9 GB (1-of-11), 9.9 GB (2-of-11), 9.9 GB (3-of-11), 10.0 GB (4-of-11), 9.9 GB (5-of-11), 9.9 GB (6-of-11), 10.0 GB (7-of-11), 9.9 GB (8-of-11), 9.9 GB (9-of-11), 10.0 GB (10-of-11), 9.8 GB (11-of-11)
Supported Languages: en
AWQ Quantization: Yes
Quantization Type: awq
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: float16
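The listed shard sizes can be cross-checked against the Required VRAM figure, and the checkpoint loads through the usual transformers path for AWQ models. A minimal sketch (the shard list is transcribed from the table above; the loading stub is an assumption based on the listed transformers version and requires the autoawq package plus roughly 109 GB of GPU memory, so it is defined but not called here):

```python
# Shard sizes (GB) transcribed from the "Model Files" list above.
SHARD_SIZES_GB = [9.9, 9.9, 9.9, 10.0, 9.9, 9.9, 10.0, 9.9, 9.9, 10.0, 9.8]

# Summing the 11 shards reproduces the 109.1 GB "Required VRAM" figure.
total_gb = round(sum(SHARD_SIZES_GB), 1)
print(f"{len(SHARD_SIZES_GB)} shards, {total_gb} GB total")

MODEL_ID = "TheBloke/Saily_220B-AWQ"

def load_model():
    """Loading sketch only: downloads ~109 GB of shards and needs matching
    GPU memory. Assumes transformers >= 4.37 (per the card) with autoawq
    installed; device_map="auto" spreads the shards across available GPUs."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    return tokenizer, model
```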

Best Alternatives to Saily 220B AWQ

Model | Context / RAM | Downloads/Likes
...l Llama 220M GQA 32K Theta Sft | 32K / 0.4 GB | 82
Smol Llama 220M GQA 32K Theta | 32K / 0.4 GB | 81
Saily 220B | 4K / 417 GB | 238120
Saily 220B GPTQ | 4K / 105.2 GB | 151
Smol Llama 220M GQA | 2K / 0.4 GB | 363312
Smol Llama 220M Openhermes | 2K / 0.4 GB | 13485
...mol Llama 220M GQA Fineweb Edu | 2K / 0.4 GB | 331
Smol Llama 220M Open Instruct | 2K / 0.4 GB | 652
Smol Llama 220M Bees Internal | 2K / 0.4 GB | 91
Beecoder 220M Python | 2K / 0.4 GB | 112


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227