Phibode 3 Mini 4K Ultraalpaca by recogna-nlp


Tags: Autotrain compatible, Conversational, Custom code, Endpoints compatible, Instruct, Phi3, Region: US, Safetensors, Sharded, Tensorflow

Phibode 3 Mini 4K Ultraalpaca Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Phibode 3 Mini 4K Ultraalpaca (recogna-nlp/phibode-3-mini-4k-ultraalpaca)

Phibode 3 Mini 4K Ultraalpaca Parameters and Internals

Model Type: text generation

Use Cases
Limitations: the model is a work in progress and still has issues generating text in Portuguese.

Additional Notes: designed for users with limited computational resources.

Supported Languages: Portuguese (the model was refined for Portuguese).

Training Details
Data Sources: the UltraAlpaca dataset translated into Portuguese.
Methodology: LoRA fine-tuning.
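The LoRA fine-tuning named above can be sketched with a toy NumPy example: instead of updating a full weight matrix, LoRA trains two low-rank factors whose product is added to the frozen weight. The dimensions and rank below are illustrative assumptions, not the values used by recogna-nlp.

```python
import numpy as np

# Illustrative LoRA update: keep the pretrained weight W frozen and train
# two low-rank factors B (d x r) and A (r x k); the adapted weight is
# W + (alpha / r) * B @ A. Dimensions and rank here are toy values.
d, k, r, alpha = 64, 64, 8, 16
rng = np.random.default_rng(0)

W = rng.standard_normal((d, k))          # frozen pretrained weight
A = rng.standard_normal((r, k)) * 0.01   # trainable down-projection
B = np.zeros((d, r))                     # trainable up-projection (zero init)

W_adapted = W + (alpha / r) * B @ A

# With B initialized to zero, the adapter starts as an exact no-op:
assert np.allclose(W_adapted, W)

# Trainable parameters drop from d*k to r*(d + k):
print(d * k, r * (d + k))  # 4096 1024
```

The zero initialization of B is the standard LoRA choice: training starts from the base model's behavior and only gradually moves away from it.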
Input Output
Input Format: expected input roles are system and user.
Accepted Modalities: text
Output Format: generated text
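The system/user roles listed above can be rendered into the chat format used by the base Phi-3 Mini 4K Instruct model (`<|system|>`, `<|user|>`, `<|assistant|>`, `<|end|>` tags). In practice you would let `tokenizer.apply_chat_template` do this; that this fine-tune keeps the base template, and the helper below, are assumptions for illustration.

```python
# Hypothetical helper that renders a list of role/content messages into the
# base Phi-3 chat format. Whether this fine-tune uses the same template as
# the base model is an assumption; prefer tokenizer.apply_chat_template.
def build_phi3_prompt(messages):
    parts = []
    for m in messages:
        parts.append(f"<|{m['role']}|>\n{m['content']}<|end|>\n")
    parts.append("<|assistant|>\n")  # generation continues after this tag
    return "".join(parts)

prompt = build_phi3_prompt([
    {"role": "system", "content": "Você é um assistente útil."},
    {"role": "user", "content": "Explique o que é aprendizado de máquina."},
])
print(prompt)
```

The trailing `<|assistant|>` tag is what cues the model to produce the generated-text output described above.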
LLM Name: Phibode 3 Mini 4K Ultraalpaca
Repository 🤗: https://huggingface.co/recogna-nlp/phibode-3-mini-4k-ultraalpaca
Model Size: 3.8b
Required VRAM: 15.4 GB
Updated: 2025-01-24
Maintainer: recogna-nlp
Model Type: phi3
Instruction-Based: Yes
Model Files: 5.0 GB (1-of-4), 5.0 GB (2-of-4), 5.0 GB (3-of-4), 0.4 GB (4-of-4)
Model Architecture: Phi3ForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.40.1
Tokenizer Class: LlamaTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 32064
Torch Data Type: float32
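The 15.4 GB VRAM figure is consistent with the stated size and dtype: 3.8 billion parameters stored in float32 (4 bytes each) take roughly 15.2 GB, which matches the sum of the four shard sizes listed above. A quick check:

```python
# Sanity check: weight memory from parameter count and dtype vs. shard sizes.
params = 3.8e9           # Model Size: 3.8b
bytes_per_param = 4      # Torch Data Type: float32
weights_gb = params * bytes_per_param / 1e9
shards_gb = 5.0 + 5.0 + 5.0 + 0.4  # the four safetensors shards

print(round(weights_gb, 1), shards_gb)  # 15.2 15.4
```

Loading in float16 would halve this footprint, which is one reason the "limited computational resources" audience noted above often loads such checkpoints in a lower precision.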

Best Alternatives to Phibode 3 Mini 4K Ultraalpaca

Model                             Context / RAM    Downloads / Likes
Phi 3.5 Mini Instruct             128K / 7.7 GB    802329762
Phi 3 Mini 128K Instruct          128K / 7.7 GB    3551711627
NuExtract V1.5                    128K / 7.7 GB    10851189
NuExtract 1.5                     128K / 7.7 GB    21585190
Glider                            128K / 15.4 GB   108331
Phi 3.5 Mini TitanFusion 0.1      128K / 7.7 GB    200
ECE EIFFEL 3Bv2                   128K / 7.7 GB    130
Phi 3.5 Multi Ability Chatbot     128K / 7.7 GB    702
Samantha2.0 Phi 3.5 Mini ITA      128K / 7.7 GB    41810
Phi3.5 Comets 3.8B                128K / 7.7 GB    1200


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227