Model Type:

Use Cases
Areas: research, foundation for further specialization, fine-tuning
Applications: summarization, text generation, chatbot
Primary Use Cases: Quantum computing concept explanation
Limitations: limited generalization to languages other than the model's primary language (Danish)
Considerations: Requires adequate assessment and mitigation of risks before production use.
Additional Notes: The merge is guided by layer similarity measured on Danish text inputs; a sketch of this analysis follows below.
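
The following is a minimal sketch of the kind of layer-similarity analysis PruneMe performs, assuming a placeholder source checkpoint (`danish-base-model`), a block size of 4, and a single Danish sentence in place of a real evaluation corpus; none of these values come from this card.

```python
# Minimal sketch of a PruneMe-style layer-similarity analysis on Danish text.
# MODEL_ID and BLOCK are illustrative placeholders, not values from this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "danish-base-model"  # placeholder: substitute the actual source checkpoint
BLOCK = 4                       # number of consecutive layers evaluated as a removable block

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)
model.eval()

# A Danish sample; in practice this would be a larger corpus such as the
# wikimedia/wikipedia Danish subset named under Training Details.
text = "København er hovedstaden i Danmark."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    out = model(**inputs, output_hidden_states=True)

# hidden_states holds num_layers + 1 tensors: the embeddings plus every decoder layer.
states = out.hidden_states
for start in range(len(states) - BLOCK):
    a = states[start].flatten(0, 1).float()
    b = states[start + BLOCK].flatten(0, 1).float()
    sim = torch.nn.functional.cosine_similarity(a, b, dim=-1).mean().item()
    print(f"layers {start}..{start + BLOCK}: mean cosine similarity {sim:.4f}")
```

Blocks whose inputs and outputs are highly similar are the natural candidates for removal.
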
Supported Languages: da (primary), en (limited), de (limited), es (limited), fr (limited), it (limited), pt (limited), pl (limited), nl (limited), ro (limited), cs (limited), sv (limited)
Training Details
Data Sources: wikimedia/wikipedia Danish subset
Data Volume:
Methodology: Continued pre-training combined with a layer-pruning strategy using PruneMe
Model Architecture: The pruned model is formed by combining layer ranges [0, 25] and [56, 59] of the source model with the passthrough merge method; see the configuration sketch below.
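
A passthrough merge over the layer ranges named above is typically expressed as a mergekit slice configuration. The sketch below writes such a config, assuming a hypothetical source checkpoint name (`danish-base-model`) and output file name that are not taken from this card.

```python
# Sketch of a mergekit "passthrough" config covering the layer ranges listed
# under Model Architecture; the source checkpoint name is a placeholder.
from textwrap import dedent

config = dedent("""\
    slices:
      - sources:
          - model: danish-base-model   # placeholder source checkpoint
            layer_range: [0, 25]
      - sources:
          - model: danish-base-model   # placeholder source checkpoint
            layer_range: [56, 59]
    merge_method: passthrough
    dtype: bfloat16
""")

with open("prune-merge.yaml", "w") as handle:
    handle.write(config)

# The pruned model would then be produced with mergekit, e.g.:
#   mergekit-yaml prune-merge.yaml ./pruned-model
```

The passthrough method simply stacks the listed layer ranges without interpolating weights, which is why it pairs naturally with similarity-based layer pruning.
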
Responsible AI Considerations
Fairness: The model may carry stereotypes and biases present in its online training data.
Mitigation Strategies: Fine-tuning for specific tasks is recommended, along with appropriate precautions for production use.
Input Output
Input Format:
Accepted Modalities:
Output Format:
Performance Tips: Fine-tune the model on the specific tasks of interest; a basic usage sketch follows below.
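
Since the input and output fields above are left unspecified, the following is a minimal text-generation sketch using the Transformers library; the repository id and the Danish prompt are illustrative placeholders, not values taken from this card.

```python
# Minimal inference sketch; MODEL_ID is a placeholder, not the actual
# repository id of this model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "your-namespace/danish-pruned-model"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)
model.eval()

# An illustrative Danish continuation prompt.
prompt = "Danmark er et land i Skandinavien, som"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=False)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

As noted under Performance Tips, fine-tuning on the target task is recommended before relying on the base model's raw outputs.
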