Evo 1 8K Base by togethercomputer


Tags: Arxiv:2102.02611, Arxiv:2203.14343, Arxiv:2206.11893, Arxiv:2210.09298, Arxiv:2302.10866, Arxiv:2303.06349, Arxiv:2310.18780, Autotrain compatible, Biology, Custom code, Deep signal processing, Ext 8k, Genomics, Hybrid, Long context, Region:us, Safetensors, Sharded, Stripedhyena, Tensorflow

Evo 1 8K Base Benchmarks

Evo 1 8K Base (togethercomputer/evo-1-8k-base)

Evo 1 8K Base Parameters and Internals

Model Type: biological foundation model, long-context modeling, sequence modeling
Additional Notes: Weights of 15 intermediate pretraining checkpoints from phases 1 and 2 are released as branches of the HuggingFace repository.
Training Details:
Data Sources: OpenGenome, a prokaryotic whole-genome dataset
Data Volume: ~300 billion tokens
Context Length: 8192
Model Architecture: StripedHyena, a hybrid architecture combining multi-head attention with gated convolutions arranged in Hyena blocks
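As a rough illustration of the hybrid layout described above, the sketch below interleaves attention blocks into a stack of Hyena gated-convolution blocks. The interleaving ratio and the function name are hypothetical assumptions for illustration, not Evo's actual configuration.

```python
# Illustrative sketch of a StripedHyena-style hybrid stack: the layer
# schedule interleaves occasional multi-head attention blocks into a
# stack of Hyena gated-convolution blocks.
# NOTE: the 1-in-4 ratio below is a hypothetical example, not Evo's
# actual layout.

def striped_layout(n_layers: int, attn_every: int = 4) -> list[str]:
    """Return a layer-type schedule: an attention block at every
    `attn_every`-th position, Hyena gated-convolution blocks elsewhere."""
    return ["attention" if (i + 1) % attn_every == 0 else "hyena"
            for i in range(n_layers)]

print(striped_layout(8))
# -> ['hyena', 'hyena', 'hyena', 'attention', 'hyena', 'hyena', 'hyena', 'attention']
```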
Input Output:
Performance Tips: Keep the 'poles' and 'residues' parameters in float32 precision for longer prompts or for training.
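The precision tip above can be sketched as a small dtype-selection helper: force any parameter whose name contains 'poles' or 'residues' to float32, and leave everything else at the model's default precision. The parameter names and the bfloat16 default below are illustrative assumptions, not Evo's actual parameter tree.

```python
# Minimal sketch, assuming hypothetical parameter names: keep the
# StripedHyena filter state ('poles'/'residues') in float32 even when
# the rest of the model runs in a lower precision such as bfloat16.

KEEP_FP32 = ("poles", "residues")

def select_dtype(param_name: str, default_dtype: str = "bfloat16") -> str:
    """Return the precision a parameter should use: float32 for the
    filter state ('poles'/'residues'), the model default otherwise."""
    if any(key in param_name for key in KEEP_FP32):
        return "float32"
    return default_dtype

# Example over a few hypothetical parameter names:
for name in ["blocks.0.filter.poles", "blocks.0.filter.residues",
             "blocks.0.attn.Wqkv.weight"]:
    print(name, "->", select_dtype(name))
```

In a real loading pipeline the same predicate would drive a per-parameter cast (e.g. upcasting the matching tensors after loading the checkpoint) rather than returning a string.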
Release Notes:
Version: 1.1_fix
Notes: Fixed an incorrect permutation of some projections that affected generation quality.
LLM Name: Evo 1 8K Base
Repository: https://huggingface.co/togethercomputer/evo-1-8k-base
Model Size: 6.5b
Required VRAM: 12.9 GB
Updated: 2025-02-23
Maintainer: togethercomputer
Model Type: stripedhyena
Model Files: 5.0 GB (1-of-3), 4.9 GB (2-of-3), 3.0 GB (3-of-3), 16.8 GB
Context Length: 8k
Model Architecture: StripedHyenaModelForCausalLM
License: apache-2.0
Vocabulary Size: 512
Torch Data Type: bfloat16

Best Alternatives to Evo 1 8K Base

Best Alternatives | Context / RAM | Downloads | Likes
Evo 1 131K Base | 0K / 12.9 GB | 7035 | 106
Evo 1K Test | 0K / 12.9 GB | 11 | 1

Rank the Evo 1 8K Base Capabilities

Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs.

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227