Jetmoe 8B Chat by jetmoe

Arxiv:2404.07413   Alignment-handbook   Autotrain compatible   Base model (finetune): jetmoe/jetmoe-8b   Conversational   Custom code   Datasets: HuggingFaceH4/airoboros-3.2, HuggingFaceH4/capybara, HuggingFaceH4/Code-Feedback, HuggingFaceH4/orca-math-word-problems-200k, HuggingFaceH4/SystemChat, HuggingFaceH4/ultrachat_200k   Generated from trainer   Jetmoe   Region: us   Safetensors   Sharded   Tensorflow
Model Card on HF 🤗: https://huggingface.co/jetmoe/jetmoe-8b-chat
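
The sketch below shows one way to load that checkpoint with Hugging Face transformers and run a single chat turn. It is illustrative only and not part of the model card: trust_remote_code=True is assumed because of the "Custom code" tag (recent transformers releases also include native JetMoE support), and the prompt is a placeholder.

```python
# Hypothetical usage sketch (not from the model card): load jetmoe/jetmoe-8b-chat
# and generate a chat reply with Hugging Face transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jetmoe/jetmoe-8b-chat"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # ~17 GB of weights in bf16, per the card's VRAM figure
    device_map="auto",
    trust_remote_code=True,       # assumed: the repo is tagged "Custom code"
)

# Build the prompt with the tokenizer's chat template (placeholder user message).
messages = [{"role": "user", "content": "Explain mixture-of-experts in one paragraph."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```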

Jetmoe 8B Chat Parameters and Internals

Model Type 
text generation
Additional Notes 
JetMoE-8B is fully open-sourced and academia-friendly: it is trained on public datasets and can be finetuned on consumer-grade hardware.
Training Details 
Data Sources:
HuggingFaceH4/ultrachat_200k, HuggingFaceH4/airoboros-3.2, HuggingFaceH4/Code-Feedback, HuggingFaceH4/orca-math-word-problems-200k, HuggingFaceH4/SystemChat, HuggingFaceH4/capybara
Data Volume:
1.25T tokens
Methodology:
MiniCPM's two-phase training method with a mixture of open-source datasets.
Training Time:
2 weeks on a 96×H100 GPU cluster
Hardware Used:
96×H100 GPU cluster
Model Architecture:
24 blocks; each block combines Mixture of Attention heads (MoA) and Mixture of MLP Experts (MoE), with 8 experts of which 2 are activated for each input token.
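
As a rough illustration of the expert routing described above (8 MLP experts, 2 active per token), here is a minimal, generic top-2 mixture-of-experts layer. It is a sketch of the general technique, not JetMoE's actual implementation; the layer sizes and router design are assumptions.

```python
# Generic top-2 MoE layer (illustrative sketch, not JetMoE's real code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoE(nn.Module):
    def __init__(self, d_model: int, d_hidden: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.SiLU(), nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                      # x: (tokens, d_model)
        logits = self.router(x)                # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)   # renormalize over the 2 selected experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):            # only the selected experts run per token
            for e in range(len(self.experts)):
                mask = idx[:, k] == e
                if mask.any():
                    w = weights[mask, k].unsqueeze(-1)
                    out[mask] += w * self.experts[e](x[mask])
        return out

moe = Top2MoE(d_model=64, d_hidden=256)
print(moe(torch.randn(10, 64)).shape)          # torch.Size([10, 64])
```

The nested loop is written for clarity; production MoE layers batch tokens per expert instead of iterating, but the routing math is the same.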
LLM Name: Jetmoe 8B Chat
Repository 🤗: https://huggingface.co/jetmoe/jetmoe-8b-chat
Base Model(s): Jetmoe 8B (jetmoe/jetmoe-8b)
Model Size: 8B
Required VRAM: 17 GB
Updated: 2025-02-22
Maintainer: jetmoe
Model Type: jetmoe
Model Files: 4.9 GB (1-of-4), 4.9 GB (2-of-4), 4.9 GB (3-of-4), 2.3 GB (4-of-4)
Model Architecture: JetMoEForCausalLM
License: apache-2.0
Model Max Length: 4096
Is Biased: 1
Padding Token: </s>
Vocabulary Size: 32000
Activation Function: silu
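
The notes above state that JetMoE-8B can be finetuned on consumer-grade hardware. Below is a minimal sketch of one common way to do that with 4-bit quantization plus LoRA adapters via peft and bitsandbytes; the hyperparameters and the "all-linear" target selection are assumptions for illustration, not the recipe used for this checkpoint.

```python
# Hypothetical finetuning setup (assumed, not the official recipe): 4-bit base
# weights + LoRA adapters, so only a small set of parameters is trained.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

model_id = "jetmoe/jetmoe-8b-chat"
bnb = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.bfloat16)

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb,
    device_map="auto",
    trust_remote_code=True,
)

# Attach low-rank adapters to every linear layer; rank/alpha/dropout are illustrative.
lora = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules="all-linear", task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # only the adapter weights are trainable
# From here, train with any standard causal-LM loop or trainer on a chat dataset.
```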

Best Alternatives to Jetmoe 8B Chat

Best Alternatives    Context / RAM    Downloads / Likes
Jetmoe 8B            0K / 17 GB       3227245
Jetmoe 8B Sft        0K / 17 GB       7246


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227