Phi3 Mini Context Ignore by Satandon1999


Tags: Autotrain compatible, Code, Conversational, Custom code, En, Endpoints compatible, Instruct, Phi3, Region:us, Safetensors, Sharded, Tensorflow

Phi3 Mini Context Ignore Benchmarks

No benchmark scores are reported for this model (Satandon1999/phi3-mini-context-ignore). Where present, the listing's scores are percentages comparing a model against the reference models Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

Phi3 Mini Context Ignore Parameters and Internals

Model Type 
text-generation
Use Cases 
Areas:
commercial, research
Applications:
Memory/compute constrained environments, latency-bound scenarios, strong reasoning tasks such as code, math, and logic
Primary Use Cases:
language and multimodal research, generative AI-powered features
Limitations:
Not specifically evaluated for all downstream purposes; performance varies across modalities
Considerations:
Developers should apply debiasing and further mitigations for accuracy, safety, and fairness.
Supported Languages 
English (en)
Training Details 
Data Sources:
Publicly available documents filtered for quality, high-quality educational data, code
Data Volume:
3.3T tokens
Methodology:
Supervised fine-tuning (SFT) and Direct Preference Optimization (DPO); a minimal DPO sketch follows this block
Context Length:
128000
Training Time:
7 days
Hardware Used:
512 H100-80G
Model Architecture:
3.8B parameter dense decoder-only Transformer model
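
Since the card names SFT and DPO as the alignment methodology, here is a minimal sketch of what the DPO stage could look like with the Hugging Face TRL library. Everything in it is an assumption for illustration: the base checkpoint, the toy preference pairs, the hyperparameters, and the TRL version (>= 0.9, where DPOConfig exists and DPOTrainer still accepts tokenizer=). This is not the model's actual recipe.

    # Hypothetical DPO sketch using TRL; placeholders throughout, not the
    # actual training recipe behind this model.
    from datasets import Dataset
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from trl import DPOConfig, DPOTrainer

    base = "microsoft/Phi-3-mini-128k-instruct"  # assumed base model
    model = AutoModelForCausalLM.from_pretrained(base, trust_remote_code=True)
    tokenizer = AutoTokenizer.from_pretrained(base)

    # DPO expects prompt/chosen/rejected columns; toy example data only.
    pairs = Dataset.from_dict({
        "prompt": ["<|user|>\nIs the sky green?<|end|>\n<|assistant|>\n"],
        "chosen": ["No, the sky typically appears blue.<|end|>"],
        "rejected": ["Yes, bright green.<|end|>"],
    })

    args = DPOConfig(output_dir="phi3-dpo", beta=0.1,
                     per_device_train_batch_size=1, max_steps=1)
    trainer = DPOTrainer(model=model, ref_model=None, args=args,
                         train_dataset=pairs, tokenizer=tokenizer)
    trainer.train()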
Safety Evaluation 
Methodologies:
Supervised fine-tuning (SFT), Direct Preference Optimization (DPO)
Findings:
Can potentially behave unfairly or offend, Possibility of generating nonsensical content, Quality of Service may vary based on language variety
Risk Categories:
misinformation, stereotype perpetuation
Ethical Considerations:
Developers must adhere to responsible AI practices and ensure compliance with laws and regulations.
Responsible AI Considerations 
Fairness:
Models may under- or over-represent certain groups; decisions about use cases should be sensitive to these limitations.
Transparency:
Detailed transparency related to the training and evaluation process is provided.
Accountability:
Developers are responsible for ensuring fair and compliant use.
Mitigation Strategies:
Supervised fine-tuning and direct preference optimizations are used to align with human preferences and safety guidelines.
Input Output 
Input Format:
Chat format. E.g. <|user|>Question<|end|><|assistant|>...
Accepted Modalities:
text
Output Format:
Generated text in response to inputs
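
Putting the input format above into practice, the following is a hedged generation sketch with transformers. It assumes the repo ships a Phi-3-style chat template (so apply_chat_template renders the <|user|>...<|end|><|assistant|> layout shown above) and that trust_remote_code=True is needed given the "Custom code" tag; it is untested against this exact checkpoint.

    # Minimal generation sketch following the stated chat format.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "Satandon1999/phi3-mini-context-ignore"
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(
        repo,
        torch_dtype=torch.bfloat16,  # matches the listed torch data type
        device_map="auto",
        trust_remote_code=True,
    )

    messages = [{"role": "user", "content": "Solve: 12 * 7 = ?"}]
    # For Phi-3-style templates this renders "<|user|>...<|end|><|assistant|>".
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=64)
    print(tokenizer.decode(output[0][inputs.shape[-1]:],
                           skip_special_tokens=True))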
LLM Name: Phi3 Mini Context Ignore
Repository: https://huggingface.co/Satandon1999/phi3-mini-context-ignore
Model Size: 3.8b
Required VRAM: 7.7 GB
Updated: 2025-02-22
Maintainer: Satandon1999
Model Type: phi3
Instruction-Based: Yes
Model Files: 5.0 GB (part 1 of 2), 2.7 GB (part 2 of 2)
Supported Languages: en
Model Architecture: Phi3ForCausalLM
License: mit
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.39.3
Tokenizer Class: LlamaTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 32064
Torch Data Type: bfloat16
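
A quick way to sanity-check a download against the specs listed above. The attribute names are the standard transformers AutoConfig/AutoTokenizer API; the expected values in the comments come from this listing.

    # Verify the repo's config against the listed specs.
    from transformers import AutoConfig, AutoTokenizer

    repo = "Satandon1999/phi3-mini-context-ignore"
    config = AutoConfig.from_pretrained(repo, trust_remote_code=True)
    tokenizer = AutoTokenizer.from_pretrained(repo)

    print(config.architectures)            # expect ['Phi3ForCausalLM']
    print(config.max_position_embeddings)  # expect 131072 (128K context)
    print(config.torch_dtype)              # expect torch.bfloat16
    print(config.vocab_size)               # expect 32064
    print(tokenizer.pad_token)             # expect <|endoftext|>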

Best Alternatives to Phi3 Mini Context Ignore

Model                        | Context / RAM  | Downloads | Likes
Phi 3.5 Mini Instruct        | 128K / 7.7 GB  | 721615    | 819
Phi 3 Mini 128K Instruct     | 128K / 7.7 GB  | 174176    | 1636
NuExtract 1.5                | 128K / 7.7 GB  | 115378    | 198
NuExtract V1.5               | 128K / 7.7 GB  | 108511    | 89
Phi 3.5 Mini TitanFusion 0.1 | 128K / 7.7 GB  | 165       | 0
Glider                       | 128K / 15.4 GB | 1504      | 36
Saka 3.8B                    | 128K / 7.7 GB  | 309       | 1
ECE EIFFEL 3Bv2              | 128K / 7.7 GB  | 10        | 0
Samantha2.0 Phi 3.5 Mini ITA | 128K / 7.7 GB  | 4121      | 0
Artemide 3.5                 | 128K / 7.7 GB  | 7453      | 2



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227