EveryoneLLM 7B Gemma Base by rombodawg


Tags: Autotrain compatible, Endpoints compatible, Gemma, License: other, Merge, Region: us, Safetensors, Sharded, TensorFlow

Rank the EveryoneLLM 7B Gemma Base Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
EveryoneLLM 7B Gemma Base (rombodawg/EveryoneLLM-7b-Gemma-Base)

Best Alternatives to EveryoneLLM 7B Gemma Base

Best Alternatives | Context / RAM | Downloads | Likes
Kaggle Math Model Gemma V1 | 12K / 17.1 GB | 6 | 0
Gemma Tiny Random | 8K / 0 GB | 19 | 0
... Gemma Sft African Ultraalpaca | 8K / 0 GB | 8 | 0
...emma 7b Model Pratikndwaipayan | 8K / 5.6 GB | 21 | 2
Quantized Gemma 7B It | 8K / 5.9 GB | 87 | 4
SeaLLM 7B V2.5 Mlx Quantized | 8K / 5.9 GB | 91 | 2
Quantized Gemma 7B | 8K / 5.9 GB | 58 | 0
Gemma 7B Finetuned | 8K / 7.6 GB | 13 | 4
Temma 7B V0 | 8K / 7.6 GB | 1 | 1
...egemma 7B It Openvino Int8 Cpu | 8K / 8.6 GB | 6 | 0
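The table above is a point-in-time snapshot; a similar listing can be regenerated from the Hugging Face Hub. The sketch below is illustrative only (it assumes the `huggingface_hub` package and uses a hypothetical search string, not the site's own ranking logic) and prints Gemma 7B models sorted by download count.

```python
# Illustrative sketch: browse comparable Gemma 7B models on the Hugging Face Hub.
# Assumes the `huggingface_hub` package is installed; the search string is an example.
from huggingface_hub import HfApi

api = HfApi()

# List models matching "gemma 7b", sorted by downloads in descending order.
for model in api.list_models(search="gemma 7b", sort="downloads", direction=-1, limit=10):
    print(f"{model.id}: {model.downloads} downloads, {model.likes} likes")
```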

EveryoneLLM 7B Gemma Base Parameters and Internals

LLM Name: EveryoneLLM 7B Gemma Base
Repository: rombodawg/EveryoneLLM-7b-Gemma-Base (Hugging Face)
Model Size: 7b
Required VRAM: 17.1 GB
Updated: 2024-07-05
Maintainer: rombodawg
Model Type: gemma
Model Files: 10.0 GB (1-of-2), 7.1 GB (2-of-2)
Model Architecture: GemmaForCausalLM
License: other
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.39.0.dev0
Tokenizer Class: GemmaTokenizer
Padding Token: <pad>
Vocabulary Size: 256000
Initializer Range: 0.02
Torch Data Type: float16
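As a minimal loading sketch based on the fields above (GemmaForCausalLM architecture, float16 weights, 8192-token context), the model can be loaded with the standard `transformers` Auto classes. This is an assumption-laden example, not the maintainer's documented usage: it presumes transformers >= 4.39, the `accelerate` package for `device_map="auto"`, and roughly the listed 17.1 GB of VRAM for the fp16 checkpoint; the prompt is illustrative.

```python
# Minimal loading sketch for rombodawg/EveryoneLLM-7b-Gemma-Base.
# Assumes transformers >= 4.39, accelerate installed, and ~17.1 GB VRAM for fp16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "rombodawg/EveryoneLLM-7b-Gemma-Base"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # GemmaTokenizer, <pad> padding token
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the listed Torch data type
    device_map="auto",          # places the two sharded checkpoint files on available devices
)

prompt = "Write a short note explaining what a merged language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```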


Original data from Hugging Face, OpenCompass, and various public git repos (release v2024042801).