Gemma 7B Us Minecraft by emre570


Tags: 4bit, Adapter, Base model:adapter:unsloth/gem..., Base model:unsloth/gemma-1.1-7..., Dataset:naklecha/minecraft-que..., Finetuned, Gemma, Lora, Peft, Quantized, Region:us, Safetensors

Gemma 7B Us Minecraft Benchmarks

nn.n% — How the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o") or GPT-4 ("gpt4").
Gemma 7B Us Minecraft (emre570/gemma-7b-us-minecraft)

Gemma 7B Us Minecraft Parameters and Internals

Use Cases 
Limitations:
Sometimes generates inappropriate answers; sometimes generates meaningless answers
Considerations:
The developer is investigating the meaningless answers
Additional Notes 
The model sometimes generates meaningless answers; the developer is investigating this and notes that they are a beginner in the field.
Training Details 
Data Sources:
naklecha/minecraft-question-answer-700k
Data Volume:
100k rows
Methodology:
LoRA fine-tuning for 1 epoch (see the sketch below)
Training Time:
2 hours 20 minutes
Hardware Used:
NVIDIA RTX 4090
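
The training setup above can be reproduced roughly as follows. This is a minimal sketch, not the author's actual script: it assumes the Unsloth + TRL stack implied by the base model name, and the sequence length, batch size, learning rate, and the dataset's "question"/"answer" column names are assumptions not stated on this page. The LoRA hyperparameters (r=32, alpha=32, dropout=0, seven projection target modules) and the 100k-row data volume come from the card; note that the SFTTrainer keyword arguments vary across trl versions.

```python
# Hypothetical reconstruction of the fine-tuning setup described above.
from unsloth import FastLanguageModel
from datasets import load_dataset
from trl import SFTTrainer
from transformers import TrainingArguments

MAX_SEQ_LENGTH = 2048  # assumption: not stated on the model card

# Load the 4-bit quantized base model this adapter sits on.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/gemma-1.1-7b-it-bnb-4bit",
    max_seq_length=MAX_SEQ_LENGTH,
    load_in_4bit=True,
)

# Attach LoRA adapters with the hyperparameters listed in the parameter table.
model = FastLanguageModel.get_peft_model(
    model,
    r=32,
    lora_alpha=32,
    lora_dropout=0,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    bias="none",
)

# 100k rows of the Minecraft QA dataset, per the card's "Data Volume".
dataset = load_dataset(
    "naklecha/minecraft-question-answer-700k", split="train[:100000]"
)

def to_text(example):
    # Assumed column names; check the dataset card before running.
    prompt = (
        f"<start_of_turn>user\n{example['question']}<end_of_turn>\n"
        f"<start_of_turn>model\n{example['answer']}<end_of_turn>"
    )
    return {"text": prompt}

dataset = dataset.map(to_text)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=MAX_SEQ_LENGTH,
    args=TrainingArguments(
        output_dir="gemma-7b-us-minecraft",
        num_train_epochs=1,               # matches "LoRA fine-tuning for 1 epoch"
        per_device_train_batch_size=2,    # assumption
        gradient_accumulation_steps=4,    # assumption
        learning_rate=2e-4,               # assumption
        bf16=True,                        # supported on the RTX 4090 used here
    ),
)
trainer.train()

# Save only the LoRA adapter weights.
model.save_pretrained("gemma-7b-us-minecraft")
```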
LLM Name: Gemma 7B Us Minecraft
Repository 🤗: https://huggingface.co/emre570/gemma-7b-us-minecraft
Base Model(s): unsloth/gemma-1.1-7b-it-bnb-4bit
Model Size: 7b
Required VRAM: 0 GB
Updated: 2025-02-05
Maintainer: emre570
Model Files: 0.2 GB, 0.2 GB, 0.0 GB, 0.0 GB
Quantization Type: 4bit
Model Architecture: Adapter
Is Biased: none
Tokenizer Class: GemmaTokenizer
Padding Token: <pad>
PEFT Type: LORA
LoRA Model: Yes
PEFT Target Modules: v_proj|o_proj|gate_proj|q_proj|up_proj|k_proj|down_proj
LoRA Alpha: 32
LoRA Dropout: 0
R Param: 32
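
For reference, the adapter can be loaded on top of its 4-bit base model for inference roughly like this. A minimal sketch, assuming peft, transformers, and bitsandbytes are installed; the example question is invented, and if the adapter repository does not ship a tokenizer, load it from the base model instead.

```python
# Minimal inference sketch: base model + LoRA adapter from this repo.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "unsloth/gemma-1.1-7b-it-bnb-4bit"
adapter_id = "emre570/gemma-7b-us-minecraft"

tokenizer = AutoTokenizer.from_pretrained(adapter_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)

# Hypothetical example question.
messages = [{"role": "user", "content": "How do I craft a diamond pickaxe?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    outputs = model.generate(inputs, max_new_tokens=128)

# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```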

Best Alternatives to Gemma 7B Us Minecraft

Best Alternatives                    Context / RAM    Downloads   Likes
Atomgpt Mistral Tc Supercon          0K / 0.2 GB      177         3
...hat Hf Hotel Booking Assistant    0K / 0.2 GB      6           0
Mistral Numericnlg FV                0K / 0.3 GB      5           0
Mistral Wikitable FV                 0K / 0.3 GB      5           0
Mistral Charttotext FV               0K / 0.3 GB      5           0
Gemma7b SFT                          0K / 0 GB        19          0
Gemma7b SFT                          0K / 0 GB        19          0
RP Format Test                       0K / 0.7 GB      3           4
Synthetic Soul 1k Mistral 128        0K / 1.3 GB      54          0
Mistral Ielts Evaluation Base        0K / 0.2 GB      3           1
Note: a green score (e.g. "73.2") means that the model is better than emre570/gemma-7b-us-minecraft.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227