Baichuan 7B by sharpbai


Tags: Arxiv:1910.07467 · Arxiv:2009.03300 · Autotrain compatible · Baichuan · Custom code · En · Endpoints compatible · Pytorch · Region:us · Sharded · Zh
Model Card on HF 🤗: https://huggingface.co/sharpbai/Baichuan-7B


Baichuan 7B Parameters and Internals

Model Type: text generation
Use Cases
Areas: research, commercial applications
Supported Languages: Chinese (supported), English (supported)
Training Details
Data Volume: 1.2 trillion tokens
Methodology: based on the Transformer architecture
Context Length: 4096
Model Architecture: Transformer with rotary positional embeddings (RoPE), SwiGLU feed-forward layers, and pre-normalization based on RMSNorm (see the sketch after this list)
Input Output
Input Format: text
Accepted Modalities: text
Output Format: generated text
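
The three architectural pieces named above are the standard LLaMA-style components. As a minimal PyTorch sketch (illustrative only; the class names, dimensions, and the rotate-pairs RoPE variant are assumptions, not Baichuan's actual source):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RMSNorm(nn.Module):
    """Root-mean-square layer norm: no mean subtraction, no bias."""
    def __init__(self, dim: int, eps: float = 1e-6):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(dim))
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        rms = x.pow(2).mean(dim=-1, keepdim=True).add(self.eps).rsqrt()
        return self.weight * (x * rms)

class SwiGLU(nn.Module):
    """SwiGLU feed-forward: down(SiLU(gate(x)) * up(x))."""
    def __init__(self, dim: int, hidden: int):
        super().__init__()
        self.gate = nn.Linear(dim, hidden, bias=False)
        self.up = nn.Linear(dim, hidden, bias=False)
        self.down = nn.Linear(hidden, dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down(F.silu(self.gate(x)) * self.up(x))

def rotary_embed(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Rotary position embedding: rotate channel pairs of q/k by
    position-dependent angles. x: (batch, seq_len, n_heads, head_dim)."""
    _, seq_len, _, head_dim = x.shape
    half = head_dim // 2
    freqs = base ** (-torch.arange(half, dtype=torch.float32) / half)
    angles = torch.arange(seq_len, dtype=torch.float32)[:, None] * freqs[None, :]
    cos = angles.cos()[None, :, None, :]
    sin = angles.sin()[None, :, None, :]
    x1, x2 = x[..., :half], x[..., half:]
    return torch.cat([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)
```

In a pre-norm block, RMSNorm is applied to the residual stream before the attention and SwiGLU sublayers rather than after, which is what "pre-normalization based on RMSNorm" refers to.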
LLM Name: Baichuan 7B
Repository 🤗: https://huggingface.co/sharpbai/Baichuan-7B
Model Size: 7B
Required VRAM: 13.8 GB
Updated: 2025-04-19
Maintainer: sharpbai
Model Type: baichuan
Model Files: 35 shards (1-of-35: 0.0 GB; 2-of-35: 0.5 GB; 3-of-35 through 34-of-35: 0.4 GB each; 35-of-35: 0.5 GB)
Supported Languages: zh, en
Model Architecture: BaiChuanForCausalLM
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.30.2
Tokenizer Class: BaiChuanTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 64000
Torch Data Type: float16
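
Given the specs above (custom modeling code, float16 weights in 35 shards, 4096-token context), loading follows the usual transformers pattern with remote code enabled. A minimal usage sketch; the prompt mirrors the completion-style example from the upstream Baichuan-7B model card, and device_map="auto" assumes accelerate is installed:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "sharpbai/Baichuan-7B"

# trust_remote_code=True is required: the repo ships the custom
# BaiChuanForCausalLM / BaiChuanTokenizer classes listed above.
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,  # matches the float16 checkpoint (~13.8 GB VRAM)
    device_map="auto",          # places the 35 shards across available devices
    trust_remote_code=True,
)

# Base (non-chat) model, so use completion-style prompting.
inputs = tokenizer("登鹳雀楼->王之涣\n夜雨寄北->", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, repetition_penalty=1.1)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the checkpoint is sharded into 35 small files, from_pretrained downloads and assembles them automatically; no manual merging is needed.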

Best Alternatives to Baichuan 7B

Best Alternatives            Context / RAM    Downloads  Likes
Baichuan 7B                  4K / 14 GB       16481      838
WisdomInterrogatory          4K / 13.9 GB     10         25
MedChatZH                    4K / 14 GB       65         8
Qiaoban Bc                   4K / 28 GB       48         7
Baichuan 7B Instruction      4K / 14 GB       7          2
Baichuan 7B Sft 001          4K / 14 GB       10         3
Baichuan 7B Sft              4K / 14 GB       49         76
HuatuoGPT 7B                 4K / 28 GB       65         22
Baichuan 7B Sharded          4K / 13.9 GB     6          1
Firefly Baichuan 7B          4K / 14 GB       10         9



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227