Model Type: text generation, multilingual

Use Cases
Areas: Content Creation and Communication, Research and Education
Applications: Text Generation, Natural Language Processing Research, Language Learning Tools, Knowledge Exploration
Limitations: Biases in training data, task complexity challenges, language nuance difficulties, factual inaccuracies, lack of common sense
Considerations: Guidelines and precautions for responsible use are provided

Additional Notes: The model card provides comprehensive details on performance, risks, and ethical considerations.

Supported Languages: ko (primary), en (primary), zh (primary), ja (primary); a brief usage sketch follows.
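
The snippet below is a minimal, hedged usage sketch for multilingual text generation with the Hugging Face `transformers` pipeline. The repository id `beomi/gemma-mling-7b` and the Korean example prompt are illustrative assumptions, not details taken from this card.

```python
# Hypothetical usage sketch; the repo id below is an assumption, not confirmed by this card.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="beomi/gemma-mling-7b",  # assumed Hugging Face repo id for Gemma-Mling 7B
    device_map="auto",             # place the 7B weights automatically (GPU if available)
)

# Prompts can be written in any of the primary languages (ko, en, zh, ja).
prompt = "머신러닝이 무엇인지 한 문장으로 설명해 주세요."  # "Explain machine learning in one sentence."
outputs = generator(prompt, max_new_tokens=128, do_sample=True, temperature=0.7)
print(outputs[0]["generated_text"])
```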

Training Details
Data Sources: range3/cc100-ja, Skywork/SkyPile-150B, llama2ko dataset (ko/en), cis-lmu/Glot500 (a dataset-loading sketch follows)
Data Volume: (not specified)
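
As a hedged illustration only, the snippet below shows how the publicly hosted corpora named above could be inspected with the Hugging Face `datasets` library. It is not the authors' training pipeline, and the split ("train") and column ("text") names are assumptions.

```python
# Illustrative only: stream a few records from two of the listed corpora without
# downloading them in full. Split ("train") and column ("text") names are assumptions.
from datasets import load_dataset

cc100_ja = load_dataset("range3/cc100-ja", split="train", streaming=True)
skypile = load_dataset("Skywork/SkyPile-150B", split="train", streaming=True)

for name, stream in [("range3/cc100-ja", cc100_ja), ("Skywork/SkyPile-150B", skypile)]:
    sample = next(iter(stream))
    print(name, "->", sample["text"][:120].replace("\n", " "))

# cis-lmu/Glot500 is organized per language and typically requires a language
# config name as a second argument; check the dataset card for valid names.
```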

Responsible AI Considerations
Fairness: Screening for socio-cultural biases and pre-processing of input data
Transparency: The model card details the architecture, capabilities, limitations, and evaluation processes
Accountability: Open model development with consideration of ethical risks
Mitigation Strategies: Continuous monitoring, evaluation metrics, human review, and de-biasing techniques

Input/Output
Input Format: Text prompt
Accepted Modalities: Text
Output Format: Generated multilingual text
Performance Tips: Longer context generally leads to better outputs, up to a certain point (see the sketch below)
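
The sketch below illustrates the plain text-in, text-out interface and the tip above about providing richer context. The repo id and the generation settings are assumptions for illustration, not recommendations from this card.

```python
# Minimal text-in / text-out sketch; repo id and settings are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "beomi/gemma-mling-7b"  # assumed Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# A longer, more specific prompt (more context) generally steers the output better,
# up to the model's context window, as noted in the performance tip above.
prompt = (
    "다음 한국어 문장을 영어로 번역하세요.\n"   # "Translate the following Korean sentence into English."
    "문장: 오늘 날씨가 정말 좋네요.\n"
    "번역:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=64)
new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```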

Release Notes
Version: (not specified)
Notes: First release of the Gemma-Mling 7B model