Einstein V4 Phi2 by Weyaxi


Tags: Autotrain compatible, Axolotl, Base model:finetune:microsoft/..., Base model:microsoft/phi-2, Biology, Chatml, Chemistry, Conversational, Custom code, Dataset:allenai/ai2 arc, Dataset:bigbio/med qa, Dataset:camel-ai/biology, Dataset:camel-ai/chemistry, Dataset:camel-ai/math, Dataset:camel-ai/physics, Dataset:cot-alpaca-gpt4-from-o..., Dataset:derek-thomas/scienceqa, Dataset:glaiveai/glaive-code-a..., Dataset:jondurbin/airoboros-3...., Dataset:knowrohit07/saraswati-..., Dataset:ldjnr/capybara, Dataset:lmsys/lmsys-chat-1m, Dataset:mandyyyyii/scibench, Dataset:meta-math/metamathqa-4..., Dataset:metaeval/reclor, Dataset:migtissera/synthia-v1...., Dataset:open-orca/slimorca, Dataset:openbookqa, Dataset:piqa, Dataset:sablo/oasst2 curated, Dataset:scibench, Dataset:sciq, Dataset:stem-ai-mtl/electrical..., Dataset:tiger-lab/mathinstruct, Dataset:tiger-lab/scienceeval, Einstein, En, Endpoints compatible, Finetuned, Generated from trainer, Gpt4, Instruct, Math, Model-index, Phi, Phi2, Physics, Region:us, Safetensors, Science, Sharded, Synthetic data, Tensorflow
Model Card on HF 🤗: https://huggingface.co/Weyaxi/Einstein-v4-phi2

Einstein V4 Phi2 Benchmarks

Einstein V4 Phi2 (Weyaxi/Einstein-v4-phi2)

Einstein V4 Phi2 Parameters and Internals

Model Type 
text-generation
Additional Notes 
This model is a fully fine-tuned version of microsoft/phi-2 on diverse datasets. Quantized versions are available.
Training Details 
Data Sources:
allenai/ai2_arc, camel-ai/physics, camel-ai/chemistry, camel-ai/biology, camel-ai/math, metaeval/reclor, openbookqa, mandyyyyii/scibench, derek-thomas/ScienceQA, TIGER-Lab/ScienceEval, jondurbin/airoboros-3.2, LDJnr/Capybara, Cot-Alpaca-GPT4-From-OpenHermes-2.5, STEM-AI-mtl/Electrical-engineering, knowrohit07/saraswati-stem, sablo/oasst2_curated, glaiveai/glaive-code-assistant, lmsys/lmsys-chat-1m, TIGER-Lab/MathInstruct, bigbio/med_qa, meta-math/MetaMathQA-40K, piqa
Context Length:
2048
Hardware Used:
8xRTX3090, 1xRTXA6000
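
The tag list above includes Chatml and Conversational, so prompts are presumably expected in ChatML form. Below is a minimal sketch of that template, assuming the standard ChatML roles; the helper function and example messages are illustrative and not taken from the model card, so verify against the repository's chat template before relying on it.

```python
# Hypothetical helper: builds a ChatML-style prompt string.
# ChatML usage is inferred from the model's tags; confirm with the repo's chat template.
def build_chatml_prompt(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    system="You are a helpful assistant for science questions.",
    user="Summarize Newton's second law in one sentence.",
)
```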
LLM Name: Einstein V4 Phi2
Repository 🤗: https://huggingface.co/Weyaxi/Einstein-v4-phi2
Base Model(s): Phi 2 (microsoft/phi-2)
Model Size: 2.8b
Required VRAM: 5.6 GB
Updated: 2025-02-05
Maintainer: Weyaxi
Model Type: phi
Model Files: 5.0 GB (1-of-2), 0.6 GB (2-of-2), 0.0 GB
Supported Languages: en
Model Architecture: PhiForCausalLM
License: other
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.38.2
Tokenizer Class: CodeGenTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 51200
Torch Data Type: bfloat16
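
The 5.6 GB VRAM figure is consistent with the weight footprint alone: roughly 2.8B parameters × 2 bytes per bfloat16 value ≈ 5.6 GB, before activations and KV cache. Given the specs above (PhiForCausalLM, bfloat16 weights, 2048-token context), here is a minimal loading sketch with Hugging Face transformers; the prompt and sampling parameters are illustrative rather than values from the card, and trust_remote_code mirrors the "Custom code" tag.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Weyaxi/Einstein-v4-phi2"

# bfloat16 matches the Torch Data Type listed above (~5.6 GB of weights).
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

# ChatML-style prompt (see the template sketch in the Training Details section).
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nExplain entropy in two sentences.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

# Illustrative generation call; sampling parameters are not from the card.
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```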

Best Alternatives to Einstein V4 Phi2

Best Alternatives | Context / RAM | Downloads | Likes
MFANN3bv0.24 | 128K / 11.1 GB | 5 | 0
MFANN3b | 128K / 11.1 GB | 116 | 0
MFANN3bv1.3 | 128K / 11.1 GB | 13 | 0
MFANN3bv1.1 | 128K / 11.1 GB | 16 | 0
MFANN3bv0.23 | 128K / 11.1 GB | 6 | 0
MFANN3b SFT | 128K / 5.6 GB | 169 | 0
MFANN3b Rebase | 128K / 11.1 GB | 10 | 0
MFANN3bv1.2 | 126K / 11.1 GB | 32 | 0
MFANN Phigments Slerp V2 | 32K / 5.6 GB | 134 | 0
MFANN3bv0.22 | 32K / 11.1 GB | 5 | 0


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227