OLMoE 1B 7B 0924 by allenai


Tags: arxiv:2409.02060 · autotrain compatible · CO2 eq emissions · dataset:allenai/olmoe-mix-0924 · en · endpoints compatible · moe · olmo · olmoe · region:us · safetensors · sharded · tensorflow

OLMoE 1B 7B 0924 Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

OLMoE 1B 7B 0924 Parameters and Internals

LLM Name: OLMoE 1B 7B 0924
Repository: https://huggingface.co/allenai/OLMoE-1B-7B-0924
Model Size: 1b
Required VRAM: 13.8 GB
Updated: 2024-09-20
Maintainer: allenai
Model Type: olmoe
Model Files: 5.0 GB (1-of-3), 5.0 GB (2-of-3), 3.8 GB (3-of-3)
Supported Languages: en
Model Architecture: OlmoeForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.43.0.dev0
Tokenizer Class: GPTNeoXTokenizer
Padding Token: <|padding|>
Vocabulary Size: 50304
Torch Data Type: bfloat16
OLMoE 1B 7B 0924 (allenai/OLMoE-1B-7B-0924)
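The configuration above (OlmoeForCausalLM, 4096-token context, bfloat16 weights totaling roughly 13.8 GB across three shards) maps onto the standard transformers loading path. A minimal sketch, assuming a transformers build that includes OLMoE support (the card lists 4.43.0.dev0) and the accelerate package for device placement:

```python
# Minimal sketch: loading allenai/OLMoE-1B-7B-0924 with Hugging Face transformers.
# Assumes a transformers version with OlmoeForCausalLM support, the accelerate package
# for device_map="auto", and enough memory for ~13.8 GB of bfloat16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMoE-1B-7B-0924"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # GPTNeoXTokenizer, vocab size 50304
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the checkpoint's Torch data type
    device_map="auto",           # spreads the sharded weights across available devices
)

# Simple generation within the 4096-token context window.
inputs = tokenizer("Mixture-of-experts models work by", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```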

Best Alternatives to OLMoE 1B 7B 0924

Best Alternatives | Context / RAM | Downloads | Likes
OLMoE 1B 7B 0924 Instruct | 4K / 13.8 GB | 1509 | 65
OLMoE 1B 7B 0924 SFT | 4K / 13.8 GB | 249 | 15
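The download and like counts above are a snapshot from the listed data sources and will drift over time; they can be refreshed programmatically from the Hugging Face Hub. A minimal sketch, assuming the huggingface_hub package (not part of this listing):

```python
# Minimal sketch: fetching current download and like counts for the alternatives listed above.
# Assumes the huggingface_hub package; values on this page are a snapshot and will differ.
from huggingface_hub import HfApi

api = HfApi()
for repo_id in ["allenai/OLMoE-1B-7B-0924-Instruct", "allenai/OLMoE-1B-7B-0924-SFT"]:
    info = api.model_info(repo_id)
    print(f"{repo_id}: downloads={info.downloads}, likes={info.likes}")
```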


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072803