OLMo 1B 0724 Hf by allenai


Tags: arXiv:2402.00838 · Autotrain compatible · Dataset: allenai/dolma · English · Endpoints compatible · OLMo · Region: US · Safetensors · Sharded · TensorFlow
Model Card on HF 🤗: https://huggingface.co/allenai/OLMo-1B-0724-hf

OLMo 1B 0724 Hf Benchmarks

OLMo 1B 0724 Hf (allenai/OLMo-1B-0724-hf)

OLMo 1B 0724 Hf Parameters and Internals

Model Type 
Autoregressive transformer language model
Use Cases 
Areas:
Research, NLP Applications
Primary Use Cases:
Language Modeling, Text Generation
Limitations:
May generate harmful or biased content
Considerations:
Users should verify generated facts for validity
Additional Notes 
Developed with support from Databricks, Kempner Institute, AMD, CSC, and UW.
Supported Languages 
English
Training Details 
Data Sources:
Dolma dataset version 1.7
Data Volume:
3.05 trillion tokens
Methodology:
Staged training with a cosine learning rate schedule
Hardware Used:
MI250X GPUs, A100-40GB GPUs
Model Architecture:
Transformer-style autoregressive model
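The training methodology above mentions a cosine learning-rate schedule. A minimal sketch of such a schedule with linear warmup follows; the warmup length, peak rate, and step counts here are illustrative placeholders, not OLMo's actual hyperparameters:

```python
import math

def cosine_lr(step: int, max_steps: int, peak_lr: float,
              warmup_steps: int = 0, min_lr: float = 0.0) -> float:
    """Linear warmup to peak_lr, then cosine decay to min_lr."""
    if step < warmup_steps:
        # Linear warmup phase: scale up from 0 to peak_lr.
        return peak_lr * step / max(1, warmup_steps)
    # Cosine decay phase: progress goes from 0.0 to 1.0 after warmup.
    progress = (step - warmup_steps) / max(1, max_steps - warmup_steps)
    return min_lr + 0.5 * (peak_lr - min_lr) * (1 + math.cos(math.pi * progress))

# Example: 10-step warmup, 100 total steps, peak learning rate 1.0.
# → 0.0 at step 0, 1.0 at step 10, 0.5 at the decay midpoint, 0.0 at step 100.
```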
Release Notes 
Version:
1B July 2024
Notes:
Updated release with improved HellaSwag performance, trained on a refined dataset.
LLM Name: OLMo 1B 0724 Hf
Repository 🤗: https://huggingface.co/allenai/OLMo-1B-0724-hf
Model Size: 1B
Required VRAM: 5.1 GB
Updated: 2025-03-13
Maintainer: allenai
Model Type: olmo
Model Files: 4.7 GB (shard 1 of 2), 0.4 GB (shard 2 of 2)
Supported Languages: en
Model Architecture: OlmoForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.40.2
Tokenizer Class: GPTNeoXTokenizer
Padding Token: <|padding|>
Vocabulary Size: 50304
Torch Data Type: float32
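Given the details above (OlmoForCausalLM architecture, Transformers 4.40.2, float32 weights), the checkpoint loads through the standard transformers auto classes. A minimal sketch, assuming the repository ID listed above; the prompt and sampling settings are illustrative:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "allenai/OLMo-1B-0724-hf"  # repository listed above

def generate(prompt: str, max_new_tokens: int = 50) -> str:
    # Note: downloads ~5.1 GB of float32 weights on first use.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        top_k=50,
        top_p=0.95,
    )
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Language modeling is"))
```

Loading in float32 matches the listed torch data type; passing `torch_dtype` to `from_pretrained` can halve the memory footprint at some cost in precision.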

Best Alternatives to OLMo 1B 0724 Hf

Best Alternatives          Context / RAM   Downloads   Likes
OLMo 1B Base Shakespeare   4K / 5.1 GB     26          0
OLMo 1B Hf                 2K / 4.7 GB     20198       20
AMD OLMo 1B                2K / 4.7 GB     3511        25
AMD OLMo 1B SFT DPO        2K / 4.7 GB     1311        20
AMD OLMo 1B SFT            2K / 4.7 GB     663         19
Olmo Oasst 2e              2K / 4.7 GB     89          0
Olmo Oasst 1e              2K / 4.7 GB     7           0


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227