
Table 1.

Parameters of publicly available pre-trained antibody language models.

Model         Parameters   Hidden Size   Intermediate Size   Attention Heads   Layers
antiBERTy     25.76M       512           2048                8                 8
antiBERTa2    202.64M      1024          4096                16                16
BALM-paired   303.92M      1024          4096                16                24
ft-ESM2       652.36M      1280          5120                20                33
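
As a rough sanity check on the table, the sketch below estimates parameter counts from the listed hyperparameters. It assumes standard BERT-style encoder blocks (Q/K/V/output projections, a two-layer feed-forward sublayer, and two LayerNorms per layer) and deliberately omits embedding tables and model-specific output heads, which is why the estimates land slightly below the reported totals; the model names and values are taken directly from the table.

```python
def encoder_params(hidden_size: int, intermediate_size: int, num_layers: int) -> int:
    """Approximate parameter count of a BERT-style encoder stack (no embeddings/heads)."""
    # Self-attention: Q, K, V, and output projections (weights + biases).
    attention = 4 * (hidden_size * hidden_size + hidden_size)
    # Feed-forward: hidden -> intermediate -> hidden (weights + biases).
    feed_forward = (hidden_size * intermediate_size + intermediate_size) + \
                   (intermediate_size * hidden_size + hidden_size)
    # Two LayerNorms per layer (gain + bias each).
    layer_norm = 2 * 2 * hidden_size
    return num_layers * (attention + feed_forward + layer_norm)

# (hidden size, intermediate size, layers) from Table 1
models = {
    "antiBERTy":   (512, 2048, 8),
    "antiBERTa2":  (1024, 4096, 16),
    "BALM-paired": (1024, 4096, 24),
    "ft-ESM2":     (1280, 5120, 33),
}

for name, (h, i, n) in models.items():
    print(f"{name}: ~{encoder_params(h, i, n) / 1e6:.1f}M encoder parameters")
```

Running this gives roughly 25.2M, 201.5M, 302.3M, and 649.4M encoder parameters, consistent with the reported totals of 25.76M, 202.64M, 303.92M, and 652.36M once embeddings and heads are added.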