Jamba V0.1 9B by TechxGenus


Tags: Autotrain compatible, Custom code, Endpoints compatible, Jamba, Mamba, MoE, Region: us, Safetensors, Sharded, Tensorflow


Jamba V0.1 9B Parameters and Internals

Model Type 
Joint Attention, Mamba, Generative Text Model, Dense Model, Mixture-of-Experts
Use Cases 
Areas:
Research, Commercial applications
Applications:
Fine-tuning for chat/instruct versions
Primary Use Cases:
Foundation layer for training and developing custom solutions
Limitations:
Did not undergo any alignment for instruct/chat interactions
Considerations:
Guardrails should be added for responsible and safe use.
Additional Notes 
Jamba is the first production-scale Mamba implementation. It is a state-of-the-art, hybrid SSM-Transformer LLM.
Training Details 
Methodology:
Joint Attention and Mamba
Context Length:
256,000 tokens (256K)
Model Architecture:
Hybrid SSM-Transformer LLM
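To verify the context window and hybrid layout without downloading the weights, the configuration can be inspected. A minimal sketch, assuming `transformers` is installed; treating `max_position_embeddings` as the 256K context field is an assumption based on the public Jamba release:

```python
# Minimal sketch: inspect the Jamba config without downloading weights.
# max_position_embeddings is assumed to encode the 256K context window;
# print the full config to see the hybrid attention/Mamba layer layout.
from transformers import AutoConfig

config = AutoConfig.from_pretrained(
    "TechxGenus/Jamba-v0.1-9B",
    trust_remote_code=True,  # repo ships custom code for transformers < 4.40
)
print(config.max_position_embeddings)  # expected on the order of 262144 (256K)
print(config)
```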
Responsible AI Considerations 
Mitigation Strategies:
The model ships without safety moderation mechanisms or guardrails; deployers should add their own before use.
Input Output 
Input Format:
Text prompts should include the BOS token when evaluating the model (the bundled LlamaTokenizer adds it by default).
Accepted Modalities:
Text
Output Format:
Generated text
Performance Tips:
The model can be loaded in BF16/FP16 via `torch_dtype` for better performance, with `attn_implementation` for FlashAttention2. Quantization with bitsandbytes is supported to fit longer sequences; see the sketch below.
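Putting the tips together, a minimal loading-and-generation sketch, assuming transformers >= 4.40 (native Jamba support), flash-attn, and a CUDA GPU; this is not the maintainer's official recipe, and the example prompt is arbitrary:

```python
# Minimal sketch: BF16 + FlashAttention2 loading, per the performance tips.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "TechxGenus/Jamba-v0.1-9B"
tokenizer = AutoTokenizer.from_pretrained(repo)

model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",
    device_map="auto",
    trust_remote_code=True,  # needed on transformers < 4.40
)

# The LlamaTokenizer prepends BOS by default, matching the input-format note.
inputs = tokenizer("In the recent Super Bowl LVIII,", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])

# Alternative: 8-bit quantization with bitsandbytes to fit longer sequences.
# Skipping the Mamba blocks mirrors the upstream Jamba-v0.1 card; treat the
# exact settings as assumptions.
# from transformers import BitsAndBytesConfig
# quant = BitsAndBytesConfig(load_in_8bit=True, llm_int8_skip_modules=["mamba"])
# model = AutoModelForCausalLM.from_pretrained(repo, quantization_config=quant, device_map="auto")
```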
Release Notes 
Version:
v0.1
Notes:
Dense version without Mixture-of-Experts, created by extracting the weights of the first expert from each MoE layer (see the sketch below).
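The conversion idea, loosely: keep only expert 0's MLP weights from each MoE layer and drop the router, so the checkpoint becomes dense. A minimal sketch under assumed checkpoint key names; the module names (".experts.0.", "router") follow common MoE layouts and are not the maintainer's actual script:

```python
def extract_first_expert(moe_state_dict: dict) -> dict:
    """Collapse an MoE checkpoint to a dense one by keeping expert 0."""
    dense = {}
    for name, tensor in moe_state_dict.items():
        if ".experts.0." in name:
            # Expert 0's projections become the dense MLP weights.
            dense[name.replace(".experts.0.", ".")] = tensor
        elif ".experts." in name or "router" in name:
            # Drop the remaining experts and the routing gate.
            continue
        else:
            # Attention, Mamba, embeddings, norms: copied unchanged.
            dense[name] = tensor
    return dense
```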
LLM Name: Jamba V0.1 9B
Repository 🤗: https://huggingface.co/TechxGenus/Jamba-v0.1-9B
Model Size: 9B
Required VRAM: 18.5 GB
Updated: 2025-02-05
Maintainer: TechxGenus
Model Type: jamba
Model Files: 4.9 GB (1-of-4), 4.9 GB (2-of-4), 4.9 GB (3-of-4), 3.8 GB (4-of-4)
Model Architecture: JambaForCausalLM
License: apache-2.0
Transformers Version: 4.39.3
Tokenizer Class: LlamaTokenizer
Padding Token: <|pad|>
Vocabulary Size: 65536
Torch Data Type: bfloat16

Best Alternatives to Jamba V0.1 9B

Best Alternatives | Context / RAM | Downloads | Likes
Jamba 2xMoE | 256K / 48.6 GB | 5 | 0
Asp 9B Inst Base | 0K / 18.5 GB | 14 | 1



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227