Gemma 2B FT 500 Orca Maths by anwesh


Tags: 4bit, AutoTrain compatible, Base model (finetune): unsloth/ge..., Base model: unsloth/gemma-2b-it..., Conversational, En, Endpoints compatible, Gemma, PyTorch, Quantized, Region: US, SFT, Sharded, TRL, Unsloth

Gemma 2B FT 500 Orca Maths Benchmarks

nn.n% — How the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Gemma 2B FT 500 Orca Maths (anwesh/gemma-2B-FT-500-orca-maths)

Gemma 2B FT 500 Orca Maths Parameters and Internals

Model Type: text-generation-inference, transformers, unsloth, gemma, trl, sft
Additional Notes: The model was fine-tuned with Unsloth, which reduces training time and memory use.
Training Details
Methodology: Fine-tuned using Hugging Face's TRL library together with Unsloth for up to 2x faster training.
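The recipe above (TRL's SFTTrainer on top of Unsloth's 4-bit Gemma base) can be sketched roughly as follows. This is a hedged reconstruction, not the maintainer's recorded configuration: the dataset slice, LoRA rank, text field, and training arguments are all illustrative assumptions (the "500" in the model name suggests a 500-example subset of an orca-math dataset).

```python
# Sketch only: assumed hyperparameters, not the actual training config.
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer
from unsloth import FastLanguageModel

# Load the 4-bit Gemma 2B instruct base listed on this card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/gemma-2b-it-bnb-4bit",
    max_seq_length=8192,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small set of weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,  # LoRA rank (assumed)
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Assumed source: a 500-example slice of orca-math word problems.
dataset = load_dataset(
    "microsoft/orca-math-word-problems-200k", split="train[:500]"
)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="question",  # field name is an assumption
    max_seq_length=8192,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```

Unsloth's speedup comes from fused kernels and 4-bit base weights; TRL's SFTTrainer handles the supervised fine-tuning loop over the formatted examples.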
LLM Name: Gemma 2B FT 500 Orca Maths
Repository 🤗: https://huggingface.co/anwesh/gemma-2B-FT-500-orca-maths
Base Model(s): Gemma 2B It Bnb 4bit (unsloth/gemma-2b-it-bnb-4bit)
Model Size: 2B
Required VRAM: 5.1 GB
Updated: 2024-12-22
Maintainer: anwesh
Model Type: gemma
Model Files: 5.0 GB (1-of-2), 0.1 GB (2-of-2)
Supported Languages: en
Quantization Type: 4bit
Model Architecture: GemmaForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.41.2
Tokenizer Class: GemmaTokenizer
Padding Token: <pad>
Vocabulary Size: 256000
Torch Data Type: float16

Best Alternatives to Gemma 2B FT 500 Orca Maths

Best Alternatives | Context / RAM | Downloads | Likes
Vi Gemma 2B RAG | 8K / 5.1 GB | 891 | 13
... 2B It Hermes Function Calling | 8K / 5.1 GB | 21 | 0
Gemma 2B Bnb 4bit | 8K / 2.1 GB | 3340 | 15
Gemma 1.1 2B It Bnb 4bit | 8K / 2.1 GB | 1265 | 4
Gemma 2B It Bnb 4bit | 8K / 2.1 GB | 1869 | 18
My AwesomeFinance Model | 8K / 2.1 GB | 13 | 0
... 2.8 Gemma 2B HQQ 1bit Smashed | 8K / 1.3 GB | 12 | 0
... 2.8 Gemma 2B HQQ 4bit Smashed | 8K / 2.1 GB | 12 | 0
... 2.8 Gemma 2B HQQ 2bit Smashed | 8K / 1.6 GB | 11 | 0
Emo AI 3B | 8K / 5.1 GB | 12 | 1
Note: a green score (e.g. "73.2") means that the model is better than anwesh/gemma-2B-FT-500-orca-maths.

Rank the Gemma 2B FT 500 Orca Maths Capabilities

Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

What open-source LLMs or SLMs are you in search of? 40,066 are indexed in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217