Gemma Mling 7B by beomi


Tags: autotrain-compatible, endpoints-compatible, gemma, model-index, pytorch, safetensors, sharded, tensorflow, region:us
Languages: en, ja, ko, zh
Model Card on HF 🤗: https://huggingface.co/beomi/gemma-mling-7b

Gemma Mling 7B Benchmarks

nn.n%: how the model compares to the reference models Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Gemma Mling 7B (beomi/gemma-mling-7b)

Gemma Mling 7B Parameters and Internals

Model Type 
text generation, multilingual
Use Cases 
Areas:
Content Creation and Communication, Research and Education
Applications:
Text Generation, Natural Language Processing Research, Language Learning Tools, Knowledge Exploration
Limitations:
Biases in training data, Task complexity challenges, Language nuance difficulties, Factual inaccuracies, Lack of common sense
Considerations:
Guidelines and precautions for responsible use are provided
Additional Notes 
Model card provides comprehensive details on performance, risks, and ethical considerations
Supported Languages 
ko (primary), en (primary), zh (primary), ja (primary)
Training Details 
Data Sources:
range3/cc100-ja, Skywork/SkyPile-150B, llama2ko dataset (ko/en), cis-lmu/Glot500
Data Volume:
100B tokens
Responsible AI Considerations 
Fairness:
Screening for socio-cultural biases and pre-processing input data
Transparency:
Model card details the architecture, capabilities, limitations, and evaluation processes
Accountability:
Open model development with considerations for ethical risks
Mitigation Strategies:
Continuous monitoring, evaluation metrics, human review, and de-biasing techniques
Input Output 
Input Format:
Text string
Accepted Modalities:
text
Output Format:
Generated multilingual text
Performance Tips:
Longer context generally leads to better outputs, up to a point (see the usage sketch after this section)
Release Notes 
Version:
2024.04.15
Notes:
First release of Gemma-Mling 7B model
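As a rough illustration of the text-in/text-out interface described above, here is a minimal usage sketch with Hugging Face transformers. The prompt and sampling settings are illustrative assumptions, not values from the model card.

```python
# Minimal usage sketch for beomi/gemma-mling-7b (assumes a GPU with ~18 GB
# of free VRAM, matching the 17.8 GB listed below, and the `accelerate`
# package for device_map="auto").
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "beomi/gemma-mling-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the checkpoint's bfloat16 weights
    device_map="auto",           # the 18 safetensors shards are assembled automatically
)

# Plain text string in, generated multilingual text out.
prompt = "다음 문장을 영어로 번역하세요: 오늘 날씨가 참 좋네요."  # "Translate into English: The weather is lovely today."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```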
LLM Name: Gemma Mling 7B
Repository 🤗: https://huggingface.co/beomi/gemma-mling-7b
Model Size: 7b
Required VRAM: 17.8 GB
Updated: 2024-12-21
Maintainer: beomi
Model Type: gemma
Model Files: 18 safetensors shards (shard 1: 1.6 GB; shards 2-17: 0.9-1.0 GB each; shard 18: 0.7 GB)
Supported Languages: ko, en, zh, ja
Model Architecture: GemmaForCausalLM
License: other
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.38.2
Tokenizer Class: GemmaTokenizer
Padding Token: <pad>
Vocabulary Size: 256000
Torch Data Type: bfloat16
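To sanity-check the settings in the table above without downloading the full 17.8 GB of weights, the config and tokenizer can be loaded on their own. This is a sketch using the standard transformers API; the expected values in the comments simply mirror the table.

```python
# Sketch: verify the listed settings from the config and tokenizer alone
# (no weight download); expected values are copied from the table above.
from transformers import AutoConfig, AutoTokenizer

model_id = "beomi/gemma-mling-7b"

config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

print(config.architectures)            # expected: ['GemmaForCausalLM']
print(config.max_position_embeddings)  # expected: 8192
print(config.vocab_size)               # expected: 256000
print(config.torch_dtype)              # expected: torch.bfloat16
print(type(tokenizer).__name__)        # expected: GemmaTokenizer (or its fast variant)
print(tokenizer.pad_token)             # expected: <pad>

# The 256k vocabulary covers the four primary languages; the sample
# sentences here are arbitrary, just to show tokenization round-trips.
for text in ["안녕하세요", "こんにちは", "你好", "Hello"]:
    print(text, "->", len(tokenizer(text).input_ids), "tokens")
```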

Best Alternatives to Gemma Mling 7B

Best Alternatives | Context / RAM | Downloads | Likes
Kaggle Math Model Gemma V1 | 12K / 17.1 GB | 24 | 0
Gemma 1.1 7B It | 8K / 17.1 GB | 15665 | 267
SeaLLM 7B V2.5 | 8K / 17.1 GB | 14958 | 49
SauerkrautLM Gemma 7B | 8K / 17.1 GB | 7778 | 13
Zephyr 7B Gemma V0.1 | 8K / 17.1 GB | 763 | 121
... Codegemma 2 7B It Alpaca V1.3 | 8K / 17.1 GB | 19 | 0
Codegemma 7B | 8K / 17.1 GB | 3463 | 170
Codegemma 7B It | 8K / 17.1 GB | 1964 | 205
DiscoPOP Zephyr 7B Gemma | 8K / 17.1 GB | 7085 | 36
Gemma 7B Openhermes V0.80 | 8K / 17 GB | 7468 | 1


Original data from Hugging Face, OpenCompass, and various public Git repos.
Release v20241217