Saily 220B GPTQ by TheBloke


Tags: 4-bit · Autotrain compatible · Base model: deepnight-research/... · Base model (quantized): deepnight... · Dataset: eleutherai/pile · Dataset: meta-math/metamathqa · Dataset: tiiuae/falcon-refinedw... · En · GPTQ · Llama · Quantized · Region: us · Safetensors · Sharded · Tensorflow

Saily 220B GPTQ Benchmarks

nn.n% — How the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o") or GPT-4 ("gpt4").
Saily 220B GPTQ (TheBloke/Saily_220B-GPTQ)

Saily 220B GPTQ Parameters and Internals

Model Type 
llama
Additional Notes 
Limitations include the possibility of incorrect or biased content generation.
Supported Languages 
English (primary)
Training Details 
Data Sources:
tiiuae/falcon-refinedweb, EleutherAI/pile, meta-math/MetaMathQA, Unnatural Code
Methodology:
Built on Llama2-70B merges: 10 models fine-tuned on specific domains, then combined in multiple merges while keeping the Logical-Understanding and Reasoning models constant.
Hardware Used:
4 x A100 80GB GPUs, 2 x A100 80GB GPUs
Model Architecture:
Llama2-70B fine-tuned models with multiple merges.
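The card describes the merge methodology only at a high level. As an illustration, linear parameter averaging is one common merge technique; the sketch below is a generic example and not necessarily the method Saily 220B actually used:

```python
# Illustrative only: linear parameter averaging, one common model-merge
# technique. The card does not state which merge method Saily 220B used.
def merge_state_dicts(dicts, weights=None):
    """Average corresponding parameters across several model state dicts."""
    if weights is None:
        weights = [1.0 / len(dicts)] * len(dicts)
    merged = {}
    for name in dicts[0]:
        merged[name] = sum(w * d[name] for w, d in zip(weights, dicts))
    return merged

# Toy example with scalar "parameters" standing in for weight tensors:
a = {"layer.weight": 1.0, "layer.bias": 0.0}
b = {"layer.weight": 3.0, "layer.bias": 2.0}
print(merge_state_dicts([a, b]))  # each parameter is the average of a and b
```

In practice merges operate on full weight tensors, and weighted (non-uniform) averages let one constituent model dominate, which matches the card's note about keeping certain models constant across merges.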
Input Output 
Input Format:
Use the Alpaca Prompt Format with Instruction and Response tags.
Output Format:
Textual response following the given instruction.
Performance Tips:
Load in 4-bit or 8-bit to reduce memory use; use INSTRUCT or CHAT-INSTRUCT mode in Text-Generation-WebUI.
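The Alpaca format mentioned above wraps the user's request in Instruction/Response tags. A minimal sketch of a prompt builder follows; the helper name and the preamble sentence are the standard Alpaca template, which the card implies but does not quote verbatim:

```python
def build_alpaca_prompt(instruction: str) -> str:
    """Wrap an instruction in the Alpaca prompt format expected by the model."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

print(build_alpaca_prompt("Summarize GPTQ quantization in one sentence."))
```

The model's completion is then generated after the `### Response:` tag; Text-Generation-WebUI applies this template automatically in INSTRUCT mode.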
Release Notes 
Version:
v1
Date:
17th December, 2023
Notes:
Released as a powerful model built on Llama2-70B merges; 10 models fine-tuned on specific domains; uses public datasets and internal transcriptions.
LLM Name: Saily 220B GPTQ
Repository 🤗: https://huggingface.co/TheBloke/Saily_220B-GPTQ
Model Name: Saily 220B
Model Creator: DEEPNIGHT
Base Model(s): Saily 220B (deepnight-research/Saily_220B)
Model Size: 220b
Required VRAM: 105.2 GB
Updated: 2024-12-21
Maintainer: TheBloke
Model Type: llama
Model Files: 49.0 GB (1-of-3), 48.9 GB (2-of-3), 7.3 GB (3-of-3)
Supported Languages: en
GPTQ Quantization: Yes
Quantization Type: gptq
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: float16
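The Required VRAM figure corresponds to the sum of the three safetensors shards; a quick sanity check, with shard sizes taken from the Model Files entry above:

```python
# Shard sizes (GB) from the Model Files entry of the spec table.
shards = [49.0, 48.9, 7.3]
total_gb = round(sum(shards), 1)
print(total_gb)  # matches the listed Required VRAM of 105.2 GB
```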

Best Alternatives to Saily 220B GPTQ

Best Alternatives                        Context / RAM    Downloads  Likes
Smol Llama 220M GQA 32K Theta            32K / 0.4 GB     49         1
...l Llama 220M GQA 32K Theta Sft        32K / 0.4 GB     10         2
Saily 220B                               4K / 417 GB      1979       20
Saily 220B AWQ                           4K / 109.1 GB    17         0
Smol Llama 220M GQA                      2K / 0.4 GB      2711       12
Smol Llama 220M Openhermes               2K / 0.4 GB      1208       5
...mol Llama 220M GQA Fineweb Edu        2K / 0.4 GB      55         1
Smol Llama 220M Open Instruct            2K / 0.4 GB      33         1
Smol Llama 220M Bees Internal            2K / 0.4 GB      21         1
Beecoder 220M Python                     2K / 0.4 GB      34         2
Note: a green score (e.g. "73.2") means that the model is better than TheBloke/Saily_220B-GPTQ.

Rank the Saily 220B GPTQ Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217