Biobert Base Cased V1.1 by dmis-lab


  Endpoints compatible   PyTorch   Region: US

Biobert Base Cased V1.1 Parameters and Internals

LLM Name: Biobert Base Cased V1.1
Repository: Open on 🤗
Required VRAM: 0.4 GB
Model Files: 0.4 GB
Model Architecture: AutoModel
Context Length: 512
Model Max Length: 512
Vocabulary Size: 28996
Hugging Face ID: dmis-lab/biobert-base-cased-v1.1
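Since the entry lists AutoModel as the architecture and a 512-token max length, the checkpoint can be loaded with the Hugging Face transformers library. A minimal sketch (the 768-dimensional hidden size is assumed from the standard BERT-base architecture, and the example sentence is illustrative):

```python
# Load BioBERT v1.1 via the generic AutoModel class listed above.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "dmis-lab/biobert-base-cased-v1.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a biomedical sentence, truncating at the 512-token max length.
inputs = tokenizer(
    "BRCA1 mutations increase the risk of breast cancer.",
    truncation=True,
    max_length=512,
    return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**inputs)

# outputs.last_hidden_state holds one embedding per token:
# shape (batch, seq_len, hidden_size).
print(outputs.last_hidden_state.shape)
```

Because the architecture is a plain AutoModel (no task head), the output is contextual token embeddings; a task-specific head (e.g. for NER or relation extraction) must be added and fine-tuned separately.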

Best Alternatives to Biobert Base Cased V1.1

Best Alternatives               HF Rank   Context / VRAM    Downloads   Likes
Distil Longformer Base 4096     0.1       4K / 0.4 GB       40          0
Daedalus 1                      0.2       1K / GB           13          1
Tiny Random Detr                -         1K / 0.2 GB       8           0
Opengpt2 Pytorch Backward       -         1K / 6 GB         40          1
Opengpt2 Pytorch Forward        -         1K / 6 GB         6           1
Finsent Transformer             -         0.5K / 0.4 GB     6           1
Coref Roberta Large             -         0.5K / 1.4 GB     5           1
Simbert Chinese Tiny            0.2       0.5K / 0 GB       16          0
Bert Chinese L 12 H 768 A 12    0.2       0.5K / 0.4 GB     5           1
Simbert Chinese Base            0.2       0.5K / 0.4 GB     5           0
Note: green Score (e.g. "73.2") means that the model is better than dmis-lab/biobert-base-cased-v1.1.


34266 open-source LLMs and SLMs listed in total.

Original data from Hugging Face, OpenCompass, and various public git repositories.
Release v2024071601