Sabia 7B AWQ by TheBloke


Tags: Arxiv:2304.07880, 4-bit, Autotrain compatible, AWQ, Base model:maritaca-ai/sabia-7..., Base model:quantized:maritaca-..., Llama, Pt, Quantized, Region:us, Safetensors
Model Card on HF 🤗: https://huggingface.co/TheBloke/sabia-7B-AWQ


Sabia 7B AWQ Parameters and Internals

Model Type 
auto-regressive language model
Use Cases 
Primary Use Cases:
Recommended for few-shot tasks rather than zero-shot tasks (see the prompting sketch at the end of this section).
Additional Notes 
The model is trained solely on a language modeling objective without fine-tuning for instruction following.
Supported Languages 
pt (Native proficiency)
Training Details 
Data Sources:
Portuguese subset of ClueWeb22 (continued pretraining of LLaMA-1-7B)
Data Volume:
A Portuguese corpus of roughly 7 billion tokens; the model was further trained on an additional 10 billion tokens (approx. 1.4 epochs of that corpus)
Context Length:
2048
Model Architecture:
Sabiá-7B is an auto-regressive language model that uses the same architecture as LLaMA-1-7B
Input Output 
Input Format:
Text input
Accepted Modalities:
text
Output Format:
Text generation
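
Because the base model is a plain causal LM with no instruction tuning, prompts work best when framed as few-shot completions. Below is a minimal sketch of running such a prompt against the AWQ checkpoint; it assumes Transformers 4.35+ with the autoawq and accelerate packages installed (so the quantized weights load through the standard AutoModelForCausalLM path), and the Portuguese question/answer pairs are illustrative only, not taken from the model card.

```python
# Minimal sketch: few-shot completion against the AWQ checkpoint.
# Assumes: pip install "transformers>=4.35" autoawq accelerate, and a CUDA GPU
# with roughly 4 GB of free VRAM for the 3.9 GB quantized weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/sabia-7B-AWQ"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# The model is not instruction-tuned, so show it a few worked examples and let
# it complete the pattern (illustrative Portuguese question/answer pairs).
prompt = (
    "Pergunta: Qual é a capital do Brasil?\n"
    "Resposta: Brasília\n\n"
    "Pergunta: Qual é a capital da França?\n"
    "Resposta: Paris\n\n"
    "Pergunta: Qual é a capital de Portugal?\n"
    "Resposta:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=16, do_sample=False)

# Decode only the newly generated tokens.
new_tokens = output[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```

Since a raw language model will keep generating further Pergunta/Resposta pairs, it is common to cut the completion at the first newline or pass a custom stopping criterion.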
LLM Name: Sabia 7B AWQ
Repository 🤗: https://huggingface.co/TheBloke/sabia-7B-AWQ
Model Name: Sabia 7B
Model Creator: Maritaca AI
Base Model(s): Sabia 7B (maritaca-ai/sabia-7b)
Model Size: 7b
Required VRAM: 3.9 GB
Updated: 2025-03-13
Maintainer: TheBloke
Model Type: llama
Model Files: 3.9 GB
Supported Languages: pt
AWQ Quantization: Yes
Quantization Type: awq
Model Architecture: LlamaForCausalLM
License: other
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.35.2
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
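
The limits listed above (2048-token context, 32,000-entry vocabulary, <s>/</s>/<unk> special tokens) are all readable from the repository's config and tokenizer files. A small sketch, assuming the Hugging Face Hub is reachable and Transformers 4.35+ is installed:

```python
# Quick check of the listed limits straight from the Hub metadata.
from transformers import AutoConfig, AutoTokenizer

model_id = "TheBloke/sabia-7B-AWQ"
config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

print(config.model_type)               # expected: llama
print(config.max_position_embeddings)  # expected: 2048
print(config.vocab_size)               # expected: 32000
print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.unk_token)  # <s> </s> <unk>
print(tokenizer.model_max_length)      # expected: 2048
```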

Best Alternatives to Sabia 7B AWQ

| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Smaugv0.1 AWQ | 195K / 19.3 GB | 8 | 1 |
| Yarn Llama 2 7B 64K AWQ | 64K / 3.9 GB | 120 | 0 |
| Calm2 7B Chat AWQ | 32K / 4.4 GB | 114 | 2 |
| Llama 2 7B 32K Instruct AWQ | 32K / 3.9 GB | 119 | 2 |
| ... SWE Llama 7B Updated 4bit AWQ | 16K / 3.9 GB | 81 | 0 |
| ...Llama 7B Python Hf W4 G128 AWQ | 16K / 3.9 GB | 3942 | 0 |
| CodeLlama 7B Instruct AWQ | 16K / 3.9 GB | 1416 | 4 |
| Pandalyst 7B V1.2 AWQ | 16K / 3.9 GB | 86 | 1 |
| Tora Code 7B V1.0 AWQ | 16K / 3.9 GB | 78 | 0 |
| ...eechless Tora Code 7B V1.0 AWQ | 16K / 3.9 GB | 89 | 1 |



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227