Bitnet B1 58 Large by 1bitLLM


Tags: arXiv:2402.17764 · Autotrain compatible · Endpoints compatible · Llama · Region: us · Safetensors

Bitnet B1 58 Large Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Bitnet B1 58 Large (1bitLLM/bitnet_b1_58-large)

Bitnet B1 58 Large Parameters and Internals

Additional Notes 
This is a reproduction of the BitNet b1.58 paper; the reported evaluations can be replicated using the provided commands.
Training Details 
Data Sources:
RedPajama dataset
Data Volume:
100B tokens
Methodology:
two-stage learning-rate and weight-decay schedule
Context Length:
2048
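The "two-stage" schedule refers to changing the learning rate and weight decay partway through training. The sketch below is illustrative only: the stage boundary, peak learning rate, and decay values are assumptions, not the exact hyperparameters used for this model.

```python
def two_stage_schedule(step, total_steps, peak_lr=1.5e-3, final_lr=1e-5,
                       warmup_steps=375, stage_boundary=0.5,
                       wd_stage1=0.1, wd_stage2=0.0):
    """Return (learning_rate, weight_decay) for a given training step.

    Illustrative two-stage schedule: linear warmup, then linear decay
    of the LR, with weight decay switched off at the stage boundary.
    All hyperparameter values here are placeholders, not the paper's.
    """
    boundary = int(total_steps * stage_boundary)
    if step < warmup_steps:
        lr = peak_lr * step / warmup_steps          # linear warmup
    else:
        progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
        lr = peak_lr + (final_lr - peak_lr) * progress  # linear decay
    # Stage 2: weight decay is disabled for the rest of training.
    wd = wd_stage1 if step < boundary else wd_stage2
    return lr, wd
```

A training loop would call this once per step and write the returned values into the optimizer's parameter groups.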
LLM Name: Bitnet B1 58 Large
Repository: 🤗 https://huggingface.co/1bitLLM/bitnet_b1_58-large
Model Size: 728.8M
Required VRAM: 2.9 GB
Updated: 2025-05-04
Maintainer: 1bitLLM
Model Type: llama
Model Files: 2.9 GB
Model Architecture: BitnetForCausalLM
License: mit
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.39.0
Tokenizer Class: BitnetTokenizer
Padding Token: <pad>
Vocabulary Size: 32002
Torch Data Type: float16
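BitNet b1.58 constrains each weight to the ternary set {-1, 0, +1} using the paper's "absmean" quantization: scale the weight matrix by its mean absolute value, then round and clip. A minimal NumPy sketch of that scheme (the function name and epsilon are illustrative choices, not from the model's code):

```python
import numpy as np

def absmean_quantize(W, eps=1e-6):
    """Quantize a weight matrix to ternary {-1, 0, +1} via the
    absmean scheme described in the BitNet b1.58 paper:
    gamma = mean(|W|), W_q = clip(round(W / gamma), -1, 1)."""
    gamma = np.abs(W).mean()
    W_q = np.clip(np.round(W / (gamma + eps)), -1, 1)
    return W_q.astype(np.int8), gamma

# Approximate dequantization recovers W as W_q * gamma; storing W_q
# needs only ~1.58 bits per weight (log2 of 3 states), hence the name.
```

In practice this quantization is applied to the linear-layer weights at matmul time during training, with full-precision latent weights kept for the optimizer updates.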

Best Alternatives to Bitnet B1 58 Large

Best Alternatives | Context / RAM | Downloads | Likes
Bitnet Instruct Q16 Gguf | 2K / 2.9 GB | 6 | 0

Rank the Bitnet B1 58 Large Capabilities

🆘 Have you tried this model? Rate its performance. This feedback will help the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

What open-source LLMs or SLMs are you in search of? 46981 in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227