TinyJ.O.S.I.E. 2x1.1B 32K Base by Goekdeniz-Guelmez


Tags: autotrain-compatible, endpoints-compatible, frankenmoe, lazymergekit, merge, mergekit, mixtral, moe, region:us, safetensors, sharded, tensorflow

TinyJ.O.S.I.E. 2x1.1B 32K Base Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
TinyJ.O.S.I.E. 2x1.1B 32K Base (Goekdeniz-Guelmez/TinyJ.O.S.I.E.-2x1.1B-32k-Base)

TinyJ.O.S.I.E. 2x1.1B 32K Base Parameters and Internals

Model Type: moe, frankenmoe, merge
LLM Name: TinyJ.O.S.I.E. 2x1.1B 32K Base
Repository: https://huggingface.co/Goekdeniz-Guelmez/TinyJ.O.S.I.E.-2x1.1B-32k-Base
Base Model(s): Isaak-Carter/TinyJ.O.S.I.E.-1.1B-32k-Base
Model Size: 1.9B
Required VRAM: 7.5 GB
Updated: 2024-12-17
Maintainer: Goekdeniz-Guelmez
Model Type (config): mixtral
Model Files: 7.5 GB (1-of-1)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.38.2
Tokenizer Class: LlamaTokenizer
Padding Token: <|startoftext|>
Vocabulary Size: 32024
Torch Data Type: float32
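Since the card lists the architecture as MixtralForCausalLM with Transformers 4.38.2, the checkpoint should load through the standard Hugging Face `transformers` auto classes. Below is a minimal, untested sketch; the prompt and generation settings are illustrative, and only the repository ID and context length come from the card itself.

```python
# Constants taken from the model card above.
REPO_ID = "Goekdeniz-Guelmez/TinyJ.O.S.I.E.-2x1.1B-32k-Base"
MAX_CONTEXT = 32768  # context length / model max length

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Hedged usage sketch: load the checkpoint and generate a completion.

    Imports are deferred so the constants above remain importable even
    without `transformers` installed. Expect roughly 7.5 GB of memory
    for the float32 weights listed on the card.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
    model = AutoModelForCausalLM.from_pretrained(REPO_ID, torch_dtype="auto")
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Illustrative call only; downloads ~7.5 GB of weights on first run.
    print(generate("Once upon a time"))
```

Because this is a base (non-instruct) merge, plain text-completion prompts like the one above are more appropriate than chat-style templates.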

Best Alternatives to TinyJ.O.S.I.E. 2x1.1B 32K Base

Best Alternatives | Context / RAM | Downloads | Likes
TinyJ.O.S.I.E. 2x1.1B 32K Base | 32K / 7.5 GB | 8 | 1
HelpingAI Lite 2x1B | 2K / 7.5 GB | 156 | 2
Karasu Moexdareties | 2K / 3.7 GB | 92 | 0
TinyLlamaHerd 2x1.1B | 2K / 3.7 GB | 82 | 1
Tiny Llamix 2x1B | 2K / 7.5 GB | 95 | 0
NeuvilletteBot | 2K / 3.7 GB | 157 | 0
Karasu Instruct1 | 12K / 3.7 GB | 155 | 0
Note: green Score (e.g. "73.2") means that the model is better than Goekdeniz-Guelmez/TinyJ.O.S.I.E.-2x1.1B-32k-Base.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227