Aira 2 Portuguese 560M by nicholasKluge


Tags: Alignment, Assistant, AutoTrain compatible, BLOOM, CO2 eq emissions, Conversation, Dataset: nicholaskluge/instruct..., Endpoints compatible, Instruct, pt, PyTorch, Region: us, Safetensors

Aira 2 Portuguese 560M Benchmarks

Aira 2 Portuguese 560M (nicholasKluge/Aira-2-portuguese-560M)

Aira 2 Portuguese 560M Parameters and Internals

Model Type 
Instruction-tuned, Text generation, Assistant, Conversation
Use Cases 
Areas:
Research, Commercial applications
Applications:
Text generation, Conversation assistance
Primary Use Cases:
Creating conversational agents, Instruction-following tasks
Limitations:
Hallucinations, Biases and toxicity, Repetition and verbosity
Additional Notes 
The model aims to generate accurate in-context responses; however, care should be taken regarding its well-documented limitations.
Supported Languages 
Portuguese
Training Details 
Data Sources:
nicholasKluge/instruct-aira-dataset
Methodology:
The model was trained on a dataset of prompts and completions generated synthetically by prompting already instruction-tuned models (ChatGPT, Llama, Open-Assistant, etc.).
Hardware Used:
1 NVIDIA A100-SXM4-40GB
Model Architecture:
Based on BLOOM
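The prompt-and-completion records described under Methodology can be sketched as simple pairs. The field names below are illustrative assumptions, not the actual schema of nicholasKluge/instruct-aira-dataset:

```python
# Illustrative shape of one synthetic instruct-tuning record.
# Field names are assumptions; consult the dataset card for the real schema.
record = {
    "prompt": "Explique o que é aprendizado de máquina.",       # instruction
    "completion": "Aprendizado de máquina é uma área da IA...",  # model-written answer
}

# During instruction tuning, prompt and completion are concatenated
# into a single training string.
training_text = record["prompt"] + " " + record["completion"]
print(training_text)
```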
Input Output 
Input Format:
String with special tokens
Accepted Modalities:
Text
Output Format:
Text responses
Performance Tips:
Set the repetition_penalty, temperature, top_k, and top_p generation parameters to curb repetitive or overly verbose outputs.
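As a sketch, the decoding controls named above correspond to Hugging Face `generate()` keyword arguments. The numeric values here are illustrative starting points, not tuned recommendations for this model:

```python
# Decoding settings that curb repetition and verbosity.
# Values are illustrative placeholders, not tuned recommendations.
generation_kwargs = {
    "do_sample": True,          # enable sampling so temperature/top_k/top_p apply
    "temperature": 0.7,         # < 1.0 sharpens the token distribution
    "top_k": 50,                # sample only from the 50 most likely tokens
    "top_p": 0.9,               # nucleus sampling probability-mass cutoff
    "repetition_penalty": 1.2,  # > 1.0 penalizes already-generated tokens
    "max_new_tokens": 200,      # hard cap on response length
}
```

With a loaded model, these would be passed as `model.generate(**inputs, **generation_kwargs)`.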
Release Notes 
Version:
1.0
Date:
2023
Notes:
Initial release with instruction-tuned capabilities and text generation.
LLM Name: Aira 2 Portuguese 560M
Repository 🤗: https://huggingface.co/nicholasKluge/Aira-2-portuguese-560M
Model Size: 560M
Required VRAM: 0 GB
Updated: 2025-05-21
Maintainer: nicholasKluge
Model Type: bloom
Instruction-Based: Yes
Model Files: 2.2 GB, 2.7 GB, 2.2 GB, 0.0 GB, 0.0 GB
Supported Languages: pt
Model Architecture: BloomForCausalLM
License: bigscience-bloom-rail-1.0
Transformers Version: 4.33.1
Tokenizer Class: BloomTokenizer
Padding Token: <|pad|>
Vocabulary Size: 250684
Torch Data Type: float32
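Given the "string with special tokens" input format and the `<|pad|>` padding token listed above, a minimal prompt-building sketch might look like this. The start/end token names are hypothetical placeholders; check the repository's tokenizer configuration for the actual strings:

```python
# Hypothetical special tokens; only <|pad|> is documented above.
# Verify the real token strings in the model's tokenizer config.
START_TOKEN = "<|startofinstruction|>"  # assumed name, not confirmed
END_TOKEN = "<|endofinstruction|>"      # assumed name, not confirmed
PAD_TOKEN = "<|pad|>"                   # documented padding token

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the model's special tokens."""
    return f"{START_TOKEN}{instruction}{END_TOKEN}"

print(build_prompt("Qual é a capital do Brasil?"))
```

The resulting string would then be tokenized with the BloomTokenizer loaded via `AutoTokenizer.from_pretrained("nicholasKluge/Aira-2-portuguese-560M")`.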


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227