Falcon2 5.5B Spanish by ssmits


Tags: Merged Model, Autotrain compatible, Base model: finetune: tiiuae/fal..., Base model: tiiuae/falcon-11b, Conversational, Custom code, Endpoints compatible, Es, Falcon, Region: us, Safetensors, Sharded, Tensorflow, Tiiuae/falcon-11b

Falcon2 5.5B Spanish Benchmarks

Falcon2 5.5B Spanish (ssmits/Falcon2-5.5B-Spanish)

Falcon2 5.5B Spanish Parameters and Internals

Model Type: pre-trained language model
Use Cases:
  Primary use cases: summarization, text generation, chatbot
  Limitations: will not generalize appropriately to non-supported languages; carries web stereotypes and biases
Additional Notes: Pruning was done based on layer-similarity analysis to maintain performance while reducing model size.
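The card does not include the author's pruning script, so the following is only a minimal sketch of what a layer-similarity analysis over the tiiuae/falcon-11B base model could look like: it measures how closely each decoder layer's output tracks its input on a sample prompt, with high similarity marking candidate layers for removal. The sample text, the adjacent-layer cosine-similarity metric, and the single-prompt setup are illustrative assumptions, not the method actually used for this model.

```python
# Sketch only: score how similar each Falcon-11B layer's output is to its input,
# so the most redundant layers can be identified before pruning/merging.
# Loading the full 11B model in bfloat16 needs a large GPU (or several).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-11B"  # base model named on this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

sample = "La economía española creció durante el último trimestre."  # illustrative text
inputs = tokenizer(sample, return_tensors="pt").to(model.device)

with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

# hidden_states is a tuple: embedding output followed by one entry per decoder layer.
hidden_states = outputs.hidden_states
for layer_idx in range(len(hidden_states) - 1):
    h_in = hidden_states[layer_idx][0].float()       # (seq_len, hidden_dim)
    h_out = hidden_states[layer_idx + 1][0].float()
    sim = torch.nn.functional.cosine_similarity(h_in, h_out, dim=-1).mean()
    print(f"layer {layer_idx:2d}: mean cosine similarity {sim:.4f}")
    # Higher similarity => the layer changes the representation less,
    # making it a candidate for pruning.
```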
Supported Languages: English (high), German (high), Spanish (high), French (high), Italian (high), Portuguese (high), Polish (high), Dutch (high), Romanian (high), Czech (high), Swedish (high)
Training Details:
  Data sources: wikimedia/wikipedia Spanish (es) subset (see the loading sketch after this list)
  Data volume: 5T tokens, with ~1B tokens of continued pre-training (~1M rows of ~1k tokens)
  Methodology: mergekit
  Context length: 1000
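As a rough illustration of pulling the corpus named under Data Sources, the sketch below streams the Spanish subset of wikimedia/wikipedia from the Hugging Face Hub. The dump date "20231101" is an assumption (the card only names the dataset), and the row handling is illustrative.

```python
# Minimal sketch: stream the Spanish Wikipedia subset used for continued pre-training.
# The "20231101" dump date is assumed; the card only says "wikimedia/wikipedia Spanish (es) subset".
from datasets import load_dataset

wiki_es = load_dataset("wikimedia/wikipedia", "20231101.es", split="train", streaming=True)

# ~1B tokens were reportedly used (~1M rows of ~1k tokens); iterate lazily.
for i, row in enumerate(wiki_es):
    if i >= 3:
        break
    print(row["title"], len(row["text"]))
```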
Responsible AI Considerations:
  Fairness: Falcon2-5.5B may reflect web stereotypes and biases.
  Mitigation strategies: fine-tuning for specific tasks; guardrails for production use.
LLM Name: Falcon2 5.5B Spanish
Repository 🤗: https://huggingface.co/ssmits/Falcon2-5.5B-Spanish
Base Model(s): Falcon 11B (tiiuae/falcon-11B)
Merged Model: Yes
Model Size: 11b
Required VRAM: 10.9 GB
Updated: 2024-12-22
Maintainer: ssmits
Model Type: falcon
Model Files: 0.9 GB (1-of-12), 1.0 GB (2-of-12), 0.9 GB (3-of-12), 0.9 GB (4-of-12), 1.0 GB (5-of-12), 0.9 GB (6-of-12), 0.9 GB (7-of-12), 1.0 GB (8-of-12), 0.9 GB (9-of-12), 0.9 GB (10-of-12), 1.0 GB (11-of-12), 0.6 GB (12-of-12)
Supported Languages: es
Model Architecture: FalconForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.40.2
Is Biased: 0
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|endoftext|>
Vocabulary Size: 65024
Torch Data Type: bfloat16
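Putting the metadata above together (FalconForCausalLM, bfloat16 weights, 8192 max length, and a "Custom code" tag), a minimal usage sketch might look like the following; the prompt and generation settings are illustrative and not taken from the card.

```python
# Minimal usage sketch based on the listed metadata; not an official example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "ssmits/Falcon2-5.5B-Spanish"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,   # matches the listed torch data type
    device_map="auto",
    trust_remote_code=True,       # the listing carries a "Custom code" tag
)

# This is a pre-trained (non-instruct) model, so plain completion prompting is assumed.
prompt = "Resume en una frase la historia de la ciudad de Sevilla:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=80, do_sample=True, top_p=0.9)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```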

Best Alternatives to Falcon2 5.5B Spanish

Best Alternatives              Context / RAM     Downloads   Likes
Falcon 11B                     8K / 22.1 GB      21564       213
Falcon2 5.5B Multilingual      8K / 10.9 GB      41          24
Falcon2 5.5B Polish            8K / 10.9 GB      2196        1
Falcon2 5.5B German            8K / 10.9 GB      398         0
Falcon2 11B                    8K / 6.6 GB       18          0
Enron Falcon 11B               8K / 7.6 GB       14          1
Falcon2 5.5B Czech             8K / 10.9 GB      42          0
Falcon2 5.5B Portuguese        8K / 10.9 GB      27          0
Falcon2 5.5B Norwegian         8K / 10.9 GB      21          1
Falcon2 5.5B Dutch             8K / 10.9 GB      29          1


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217