Pythia 160M C2s by vandijklab


Tags: arxiv:2304.01373 · autotrain-compatible · dataset:vandijklab/immune-c2s · en · endpoints-compatible · gpt-neox · pytorch · region:us · safetensors · scrna-seq

Pythia 160M C2s Benchmarks

Scores show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Pythia 160M C2s (vandijklab/pythia-160m-c2s)

Pythia 160M C2s Parameters and Internals

Model Type 
causal-lm
Use Cases 
Areas:
Research, Single-cell transcriptomics
Applications:
Single-cell RNA sequencing analyses, Cell type prediction
Primary Use Cases:
Conditional cell generation, Unconditional cell generation
Considerations:
Best used with adequate hardware for full cell generation.
Additional Notes 
Cell2Sentence provides a novel approach for single-cell RNA sequencing data analysis by transforming it into cell sentences.
Supported Languages 
en (proficient)
Training Details 
Data Sources:
Immune tissue dataset from Domínguez et al.
Methodology:
Cell2Sentence method for adapting large language models to single-cell transcriptomics.
Training Time:
20 hours
Hardware Used:
8 A100 40GB GPUs
Model Architecture:
Transforms single-cell RNA sequencing data into sequences of gene names ordered by expression level, termed "cell sentences".
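The core transform described above can be sketched in a few lines: rank a cell's genes by expression, drop unexpressed genes, and emit the gene names in rank order as a space-separated "cell sentence". This is a minimal illustration, not the full Cell2Sentence pipeline (which also fits a rank-to-expression regression for the inverse mapping); the gene names and counts below are hypothetical.

```python
# Minimal sketch of the "cell sentence" transform: order a cell's genes
# by expression (highest first) and join their names into a sentence.
# Gene names and expression values here are illustrative only.

def cell_to_sentence(expression: dict) -> str:
    """Return gene names ordered by descending expression; zero-count genes are dropped."""
    expressed = {gene: value for gene, value in expression.items() if value > 0}
    ranked = sorted(expressed, key=expressed.get, reverse=True)
    return " ".join(ranked)

cell = {"CD3D": 12.0, "MALAT1": 85.0, "GNLY": 0.0, "B2M": 40.0}
print(cell_to_sentence(cell))  # MALAT1 B2M CD3D
```

The resulting text sequence is what the language model is trained and prompted on, which is why the Input Format below is "gene names ordered by expression level".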
Input Output 
Input Format:
Gene names ordered by expression level
Accepted Modalities:
text
Output Format:
Cell sentences with gene expressions
Performance Tips:
Use an A100 GPU for better inference speed and memory capacity.
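Since the model's output is a cell sentence (gene names with implicit rank order), downstream analysis needs to map generated text back toward expression space. The published Cell2Sentence method fits a log-rank regression for this inverse step; the sketch below only recovers each gene's 1-based rank, which is a simplified stand-in for illustration.

```python
# Sketch of post-processing a generated cell sentence: recover each
# gene's rank position. Cell2Sentence proper inverts ranks to expression
# values via a fitted regression; plain ranks are used here only as a
# simplified illustration.

def sentence_to_ranks(sentence: str) -> dict:
    """Map each gene name in a generated cell sentence to its 1-based rank."""
    genes = sentence.split()
    return {gene: rank for rank, gene in enumerate(genes, start=1)}

ranks = sentence_to_ranks("MALAT1 B2M CD3D")
print(ranks)  # {'MALAT1': 1, 'B2M': 2, 'CD3D': 3}
```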
LLM Name: Pythia 160M C2s
Repository: 🤗 https://huggingface.co/vandijklab/pythia-160m-c2s
Model Size: 160m
Required VRAM: 0.6 GB
Updated: 2025-02-05
Maintainer: vandijklab
Model Type: gpt_neox
Model Files: 0.6 GB
Supported Languages: en
Model Architecture: GPTNeoXForCausalLM
License: cc-by-nc-nd-4.0
Context Length: 9200
Model Max Length: 9200
Transformers Version: 4.37.1
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50304
Torch Data Type: float32

Best Alternatives to Pythia 160M C2s

Best Alternatives | Context / RAM | Downloads | Likes
Pythia 160m Sft | 2K / 0 GB | 125 | 0
Pythia 160M | 2K / 0.4 GB | 204445 | 30
Pythia 160M Dolphin Extended | 2K / 0.3 GB | 190 | 0
Pythia 160m Ft CookingRecipes | 2K / 0.6 GB | 133 | 0
Sheared Pythia 160M | 2K / 0.7 GB | 131 | 4
Pythia160m Sft Tldr | 2K / 0.6 GB | 49 | 0
Pythia 160M Storytelling | 2K / 0.3 GB | 8 | 0
Ppo | 2K / 0.3 GB | 138 | 0
... Llm Pythia 160M Pm Gen Ian Nd | 2K / 0.6 GB | 134 | 0
Skibidi Lm | 2K / 0.6 GB | 77 | 0
Note: a green score (e.g. "73.2") means the model is better than vandijklab/pythia-160m-c2s.

Rank the Pythia 160M C2s Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227