OLMo Bitnet 1B by NousResearch


Tags: Arxiv:2402.17764, Autotrain compatible, Custom code, Dataset:allenai/dolma, Endpoints compatible, Olmo, Pytorch, Region:us


OLMo Bitnet 1B Parameters and Internals

Model Type: 1-bit LLM
Additional Notes: Trained using the OLMo platform for research purposes.
Training Details:
  Data Sources: first 60B tokens of the Dolma dataset
  Data Volume: 60B tokens
  Methodology: the 1-bit LLM training method described in "The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits" (arXiv:2402.17764)
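In the cited method, each weight matrix is quantized to the ternary values {-1, 0, +1} (about 1.58 bits per weight, i.e. log2(3)) using an absmean scaling factor; full-precision master weights are kept during training and quantized on the fly, which is consistent with the roughly fp32-sized 4.7 GB checkpoint listed below. A minimal PyTorch sketch of that quantizer, written from the paper's description rather than from this repository's training code (the function name is ours):

```python
import torch

def absmean_ternary_quantize(w: torch.Tensor, eps: float = 1e-6):
    """Quantize a weight tensor to {-1, 0, +1} with an absmean scale,
    per 'The Era of 1-bit LLMs' (arXiv:2402.17764)."""
    gamma = w.abs().mean()                        # per-tensor absmean scale
    w_q = (w / (gamma + eps)).round().clamp(-1, 1)
    return w_q, gamma                             # dequantize as w_q * gamma

w = torch.randn(2048, 2048)
w_q, gamma = absmean_ternary_quantize(w)
print(w_q.unique())  # tensor([-1., 0., 1.])
```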
LLM Name: OLMo Bitnet 1B
Repository: https://huggingface.co/NousResearch/OLMo-Bitnet-1B
Model Size: 1B
Required VRAM: 4.7 GB
Updated: 2024-12-22
Maintainer: NousResearch
Model Type: olmo
Model Files: 4.7 GB
Model Architecture: OLMoModelForCausalLM
License: apache-2.0
Transformers Version: 4.38.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|padding|>
Vocabulary Size: 50280
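The "Custom code" tag means the OLMoModelForCausalLM architecture ships with the repository rather than with the Transformers library, so loading requires trust_remote_code=True. A minimal loading sketch, assuming the Transformers version listed above; the dtype and generation settings are illustrative choices, not values taken from the repo:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "NousResearch/OLMo-Bitnet-1B"

# trust_remote_code pulls in the custom OLMo modeling code from the repo.
model = AutoModelForCausalLM.from_pretrained(
    repo, torch_dtype=torch.bfloat16, trust_remote_code=True
)
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)

inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Loading in bfloat16 roughly halves the memory footprint relative to the 4.7 GB full-precision checkpoint; the listed <|padding|> token only comes into play for batched, padded inputs.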


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217