| Category | Field | Details |
| --- | --- | --- |
| Model Type | | phi-msft, text-generation |
| Use Cases | Areas | |
| | Limitations | Generate Inaccurate Code and Facts; Unreliable Responses to Instruction; Language Limitations; Potential Societal Biases; Toxicity; Verbosity |
| Additional Notes | | Phi-2 is intended for research purposes only and has not been fine-tuned through reinforcement learning from human feedback. |
| Supported Languages | | |
| Training Details | Data Sources | NLP synthetic data created by AOAI GPT-3.5; filtered web data from Falcon RefinedWeb and SlimPajama |
| | Data Volume | |
| | Context Length | |
| | Training Time | |
| | Hardware Used | |
| | Model Architecture | Transformer-based model with a next-word prediction objective (see the objective sketch below the table) |
| Input Output | Input Format | `Instruct: {prompt}\nOutput:` (see the prompting sketch below the table) |
| | Accepted Modalities | |
| | Output Format | |
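The Model Architecture row names a next-word prediction objective. The sketch below is purely illustrative, not Phi-2's training code: it shows the standard causal language-modeling loss, where position t predicts token t+1 and is scored with cross-entropy. All tensor shapes and values are invented for the example.

```python
# Illustrative sketch of a next-word prediction (causal LM) objective.
# Shapes and values are invented for the example; this is not Phi-2 code.
import torch
import torch.nn.functional as F

batch, seq_len, vocab_size = 2, 8, 100
logits = torch.randn(batch, seq_len, vocab_size)          # stand-in model outputs
tokens = torch.randint(0, vocab_size, (batch, seq_len))   # stand-in token ids

# Position i predicts token i+1: drop the last logit, take labels from i+1 on.
shift_logits = logits[:, :-1, :].reshape(-1, vocab_size)
shift_labels = tokens[:, 1:].reshape(-1)

loss = F.cross_entropy(shift_logits, shift_labels)  # per-token next-word loss
print(f"next-word prediction loss: {loss.item():.3f}")
```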
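The Input Format row gives the QA-style prompt template verbatim. Below is a minimal sketch of using that template with the Hugging Face transformers library; the checkpoint id `microsoft/phi-2`, the `trust_remote_code=True` flag, and the generation settings are assumptions not stated in this card.

```python
# Minimal sketch: prompting the model with the card's documented
# "Instruct: {prompt}\nOutput:" template via Hugging Face transformers.
# The model id "microsoft/phi-2" and trust_remote_code=True are
# assumptions; they are not stated in this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-2"  # assumed Hugging Face checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float32,
    trust_remote_code=True,
)

# Fill the template exactly as the card specifies.
prompt = "Instruct: Explain what a transformer model is in one sentence.\nOutput:"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=80)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Since the model has not been fine-tuned with RLHF (see Additional Notes), the completion may run past the answer, so callers often truncate at the first blank line or apply a stop string.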