Pretraining foundation models that adapt to a wide range of molecular tasks has long been pursued by the drug discovery community. While self-supervised learning methods have been developed to leverage the sheer number of unlabeled molecules for pretraining, the landscape of supervised learning remains largely underexplored due to the absence of suitable datasets and codebases. To facilitate the study of supervised learning on molecules, we curate 7 datasets with node- and graph-level supervision and develop a library for studying multi-task learning models. The datasets are separated into 2 categories. First, the Toy-mix category contains 3 small datasets that are well known and well studied in the literature, but with the additional constraint that t...
Self-supervised representation learning (SSL) on biomedical networks provides new opportunities for ...
Molecular pretraining, which learns molecular representations over massive unlabeled data, has becom...
The goal of quantitative structure activity relationship (QSAR) learning is to learn a function that...
Recently, pre-trained foundation models have shown significant advancements in multiple fields. Howe...
Recently, pre-trained foundation models have enabled significant advancements in multiple fields. In...
Models that accurately predict properties based on chemical structure are valuable tools in drug dis...
Multi-task learning for molecular property prediction is becoming increasingly important in drug dis...
Despite the increasing volume of available data, the proportion of experimentally measured data rema...
This directory contains sets of molecules used to train chemical language models in the paper, "Lear...
Virtual (computational) high-throughput screening provides a strategy for prioritizing compounds for...
Extracting informative representations of molecules using Graph neural networks (GNNs) is crucial in...
Models based on machine learning can enable accurate and fast molecular property predictions, which ...
Chemistry research has both high material and computational costs to conduct experiments. Institutio...