Distill Bloom 1b3 10x by bigscience



Distill Bloom 1b3 10x Benchmarks

Distill Bloom 1b3 10x (bigscience/distill-bloom-1b3-10x)

Distill Bloom 1b3 10x Parameters and Internals

Model Type 
Transformer-based Language Model
Use Cases 
Areas:
Public research, Language generation
Applications:
Text generation, Information Extraction, Question Answering, Summarization
Primary Use Cases:
Pretrained base model to be adapted for specific tasks
Additional Notes 
Model developed by BigScience, supported by the French government and the organizations of its contributors.
Supported Languages 
ak (Akan), ar (Arabic), as (Assamese), bm (Bambara), bn (Bengali), ca (Catalan), code (Programming Languages), en (English), es (Spanish), eu (Basque), fon (Fon), fr (French), gu (Gujarati), hi (Hindi), id (Indonesian), ig (Igbo), ki (Kikuyu), kn (Kannada), lg (Luganda), ln (Lingala), ml (Malayalam), mr (Marathi), ne (Nepali), nso (Northern Sotho), ny (Chichewa), or (Odia), pa (Punjabi), pt (Portuguese), rn (Kirundi), rw (Kinyarwanda), sn (Shona), st (Sesotho), sw (Swahili), ta (Tamil), te (Telugu), tn (Tswana), ts (Tsonga), tum (Tumbuka), tw (Twi), ur (Urdu), vi (Vietnamese), wo (Wolof), xh (Xhosa), yo (Yoruba), zh (Chinese), zhs (Simplified Chinese), zht (Traditional Chinese), zu (Zulu)
Training Details 
Data Volume:
1.5TB pre-processed text
Methodology:
10x distillation (a generic distillation-loss sketch follows this section)
Context Length:
2048
Training Time:
March to July 2022 (estimated)
Hardware Used:
384 A100 80GB GPUs, Jean Zay Public Supercomputer
Model Architecture:
Modified from Megatron-LM GPT2
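The card lists the methodology only as "10x distillation" and does not include the distillation recipe itself. As a generic illustration (not BigScience's actual training code), token-level knowledge distillation typically minimises the KL divergence between the teacher's and the student's softened next-token distributions:

```python
# Generic knowledge-distillation loss sketch (illustrative only; not the
# actual training code used for distill-bloom-1b3-10x).
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between softened teacher and student token distributions."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2
```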
Input Output 
Input Format:
Text
Accepted Modalities:
text
Output Format:
Text
Performance Tips:
Under the BLOOM RAIL license, use in high-stakes settings is prohibited. A minimal text-generation sketch follows this section.
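A minimal text-in / text-out sketch using the Hugging Face transformers text-generation pipeline (assuming transformers with a PyTorch backend is installed; the prompt and generation length are illustrative):

```python
# Minimal usage sketch: plain text in, plain text out.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="bigscience/distill-bloom-1b3-10x",  # repo id from this card
)

prompt = "BLOOM is a multilingual language model that"
outputs = generator(prompt, max_new_tokens=40)  # illustrative generation length
print(outputs[0]["generated_text"])
```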
Release Notes 
Version:
1.0.0
Date:
18 July 2022
Notes:
Intermediate checkpoint of a work-in-progress project. This model is a distilled version of Bloom-1B3 (10x distillation).
LLM Name: Distill Bloom 1b3 10x
Repository: https://huggingface.co/bigscience/distill-bloom-1b3-10x
Model Size: 166.3m
Required VRAM: 0.3 GB
Updated: 2025-02-22
Maintainer: bigscience
Model Type: bloom
Model Files: 0.3 GB
Supported Languages: ak ar as bm bn ca code en es eu fr gu hi id ig ki kn lg ln ml mr ne ny or pa pt rn rw sn st sw ta te tn ts tw ur vi wo xh yo zh zu
Model Architecture: BloomForCausalLM
License: bigscience-bloom-rail-1.0
Transformers Version: 4.21.0.dev0
Vocabulary Size: 250880
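The listed architecture and vocabulary size can be verified against the Hub metadata; a minimal sketch, assuming transformers is installed:

```python
# Sketch: confirm the details listed above directly from the Hub metadata.
from transformers import AutoConfig, AutoTokenizer

repo_id = "bigscience/distill-bloom-1b3-10x"

config = AutoConfig.from_pretrained(repo_id)
tokenizer = AutoTokenizer.from_pretrained(repo_id)

print(config.architectures)  # expected: ['BloomForCausalLM']
print(config.vocab_size)     # expected: 250880
print(len(tokenizer))        # should match the vocabulary size above
```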

Best Alternatives to Distill Bloom 1b3 10x

Best Alternatives    Context / RAM    Downloads    Likes
Bloom Small 166      0K / 0.7 GB      236          2


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227