GlorIA 1.3B by NOVA-vision-language


Tags: Autotrain compatible · Dataset:assin2 · Dataset:dlb/plue · Dataset:europarl bilingual · Dataset:nova-vision-language/c... · Dataset:oscar-corpus/oscar-230... · Dataset:portulan/glue-ptpt · Decoder · Endpoints compatible · European portuguese · Foundation model · Glória · Gpt neo · Gptneo · Pt · Pytorch · Region:us · Safetensors · Vision


GlorIA 1.3B Parameters and Internals

Model Type 
text generation, foundation model
Use Cases 
Areas:
research, educational applications
Primary Use Cases:
text generation with a focus on European Portuguese content
Limitations:
Usage is restricted to research purposes only.
Additional Notes 
GlórIA is specifically designed for European Portuguese support.
Supported Languages 
Portuguese (Native)
Training Details 
Data Sources:
ArquivoPT News PT-PT Dataset, ClueWeb-Large PT-PT, Europarl PT-PT, OpenSubtitles PT-PT, OSCAR PT-PT, PT WIKI
Data Volume:
approximately 35B tokens
Model Architecture:
GPTNeo with 24 layers and a hidden size of 2048.
Input Output 
Input Format:
Text input prompts
Accepted Modalities:
text
Output Format:
Generated text completion
Performance Tips:
Use the recommended generation configuration settings for better results.
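The loading and generation steps above can be sketched with the Hugging Face `transformers` library. The sampling settings below (`top_p`, `temperature`, `repetition_penalty`) are illustrative assumptions, not the maintainers' official recommended configuration; consult the model card on Hugging Face for the actual values.

```python
MODEL_ID = "NOVA-vision-language/GlorIA-1.3B"

def generate_pt(prompt: str, max_new_tokens: int = 60) -> str:
    """Generate a European Portuguese continuation for `prompt`.

    The transformers import is done lazily so the sketch can be read and
    inspected without triggering a multi-gigabyte model download.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,           # sampling hyperparameters are assumptions,
        top_p=0.9,                # not the published recommended config
        temperature=0.8,
        repetition_penalty=1.2,
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Example prompt in European Portuguese
    print(generate_pt("A história de Portugal começa"))
```

Since the model uses a `GPT2Tokenizer` and a 2048-token context, prompts longer than the context window must be truncated before generation.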
LLM Name: GlorIA 1.3B
Repository 🤗: https://huggingface.co/NOVA-vision-language/GlorIA-1.3B
Model Size: 1.3b
Required VRAM: 5.4 GB
Updated: 2024-12-26
Maintainer: NOVA-vision-language
Model Type: gpt_neo
Model Files: 5.4 GB, 5.4 GB
Supported Languages: pt
Model Architecture: GPTNeoForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.29.1
Tokenizer Class: GPT2Tokenizer
Vocabulary Size: 50258
Torch Data Type: float32
Activation Function: gelu_new
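The listed VRAM requirement follows directly from the weight count and data type: float32 stores each parameter in 4 bytes, so roughly 1.35B parameters (an assumption inferred from the 5.4 GB file size, since "1.3B" is a rounded label) occupy about 5.4 GB. A minimal sketch of the arithmetic:

```python
def weight_memory_gb(n_params: float, bytes_per_param: int = 4) -> float:
    """Size of the raw model weights in decimal gigabytes."""
    return n_params * bytes_per_param / 1e9

# ~1.35B parameters: assumed count, back-solved from the 5.4 GB checkpoint
fp32_gb = weight_memory_gb(1.35e9, 4)   # float32, as shipped
fp16_gb = weight_memory_gb(1.35e9, 2)   # halved if loaded in float16

print(f"float32: {fp32_gb:.1f} GB, float16: {fp16_gb:.1f} GB")
# → float32: 5.4 GB, float16: 2.7 GB
```

Actual VRAM use at inference time is higher, since activations and the KV cache come on top of the weights.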

Best Alternatives to GlorIA 1.3B

| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| GPT NEO 1.3B Wiki | 2K / 5.3 GB | 43 | 1 |
| GPT Neo 1.3B Lama | 2K / 5.3 GB | 18 | 0 |
| GPT Neo 1.3B Alpaca | 2K / 5.3 GB | 9 | 0 |
| ...ni Neo 1.3B Mental Health Lora | 2K / 1.6 GB | 729 | 2 |
| Adonalsium GPT Neo 1.3B | 2K / 0 GB | 18 | 1 |
| GPT Neo 1.3B Cs Finetuning | 2K / 0.5 GB | 22 | 0 |
| GPT NeoX 1.3B Viet Long 70 | 2K / 5.3 GB | 14 | 0 |
| ...oX 1.3B Viet Custom 30 General | 2K / 5.3 GB | 13 | 0 |
| GPT NeoX 1.3B Viet Custom 10 | 2K / 5.3 GB | 13 | 0 |
| GPT NeoX 1.3B Viet Custom 30 | 2K / 5.3 GB | 14 | 0 |
Note: green Score (e.g. "73.2") means that the model is better than NOVA-vision-language/GlorIA-1.3B.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227