Nucleotide Transformer: building and evaluating robust foundation models for human genomics
Main

Foundation models in artificial intelligence (AI) are characterized by their large scale, incorporating millions of parameters trained on extensive datasets. These models can be adapted to a wide range of downstream predictive tasks and have profoundly transformed the AI field. Notable examples in natural language processing (NLP) include the language models (LMs) BERT [1] and GPT [2]. LMs have gained significant popularity in recent years owing to their ability to be trained on ...
Read more at nature.com