Falcon H1 1.5B Deep Base by tiiuae


Tags: autotrain-compatible, endpoints-compatible, falcon-h1, safetensors, region:us
Languages: ar, cs, de, en, es, fr, hi, it, ja, ko, nl, pl, pt, ro, ru, sv, ur, zh

Falcon H1 1.5B Deep Base Benchmarks

nn.n% indicates how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Falcon H1 1.5B Deep Base (tiiuae/Falcon-H1-1.5B-Deep-Base)

Falcon H1 1.5B Deep Base Parameters and Internals

LLM Name: Falcon H1 1.5B Deep Base
Repository 🤗: https://huggingface.co/tiiuae/Falcon-H1-1.5B-Deep-Base
Model Size: 1.5B
Required VRAM: 3.1 GB
Updated: 2025-06-01
Maintainer: tiiuae
Model Type: falcon_h1
Model Files: 3.1 GB
Model Architecture: FalconH1ForCausalLM
License: other
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.52.0.dev0
Tokenizer Class: PreTrainedTokenizer
Vocabulary Size: 65536
Torch Data Type: bfloat16
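The listed VRAM figure follows directly from the parameter count and data type: bfloat16 stores each parameter in 2 bytes, so a ~1.5B-parameter model needs roughly 3 GiB for its weights alone. A minimal sketch of that arithmetic, plus a hedged loading snippet (the repo id comes from the table above; the exact `from_pretrained` arguments are a common pattern, not taken from this page):

```python
# Back-of-the-envelope check on the "Required VRAM: 3.1 GB" figure above:
# bfloat16 weights take 2 bytes per parameter.
params = 1.5e9                      # nominal parameter count from the table
bytes_per_param = 2                 # bfloat16
weight_gib = params * bytes_per_param / 1024**3
print(f"~{weight_gib:.1f} GiB of weights")
# The listed 3.1 GB file size is slightly larger, suggesting the true
# parameter count sits a bit above the rounded "1.5B".

# Loading sketch (assumes a transformers build with FalconH1 support,
# e.g. at least the 4.52.0.dev0 listed above; downloads ~3.1 GB):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("tiiuae/Falcon-H1-1.5B-Deep-Base")
# model = AutoModelForCausalLM.from_pretrained(
#     "tiiuae/Falcon-H1-1.5B-Deep-Base",
#     torch_dtype="bfloat16",
#     device_map="auto",
# )
```

Note that inference also needs room for activations and the KV/state cache, so the weight size is a lower bound on actual VRAM use.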

Best Alternatives to Falcon H1 1.5B Deep Base

Best Alternatives                 Context / RAM    Downloads  Likes
Falcon H1 1.5B Deep Instruct      128K / 3.1 GB    1178       10
Falcon H1 1.5B Instruct           128K / 3.1 GB    1080       4
Falcon H1 1.5B Base               128K / 3.1 GB    801        2
Note: a green score (e.g. "73.2") means the model outperforms tiiuae/Falcon-H1-1.5B-Deep-Base.

Rank the Falcon H1 1.5B Deep Base Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass, and various public git repos.
Release v20241227