Predicting multiple heterogeneous biological and medical targets is a challenge for traditional deep learning models. In contrast to single-task learning, in which a separate model is trained for each target, multi-task learning (MTL) optimizes a single model to predict multiple related targets simultaneously. To address this challenge, we propose the Multi-gate Mixture-of-Experts with Exclusivity (MMoEEx). Our work aims to tackle the heterogeneous MTL setting, in which the same model optimizes multiple tasks with different characteristics. Such a scenario can overwhelm current MTL approaches, which struggle to balance shared and task-specific representations and to optimize tasks with competing optimization paths. Our method makes two key contributions: first, we introduce an approach to induce more diversity among experts, creating representations better suited to highly imbalanced and heterogeneous MTL settings; second, we adopt a two-step optimization [6, 11] approach to balance the tasks at the gradient level. We validate our method on three MTL benchmark datasets, including the Medical Information Mart for Intensive Care (MIMIC-III) database and the PubChem BioAssay (PCBA) dataset.
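The exclusivity mechanism can be pictured as a standard multi-gate mixture-of-experts in which some experts are hidden from every task's gate except one. The sketch below is a minimal PyTorch illustration under that reading; the layer sizes, the single-layer experts and towers, and the random owner assignment are our assumptions for illustration, not the paper's reference implementation, and the two-step optimization is not shown.

```python
import torch
import torch.nn as nn


class MMoEEx(nn.Module):
    """Multi-gate mixture-of-experts with an exclusivity mask over experts."""

    def __init__(self, in_dim, expert_dim, n_experts, n_tasks, exclusivity=0.5):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(in_dim, expert_dim), nn.ReLU())
            for _ in range(n_experts)
        )
        # One gate and one output tower per task.
        self.gates = nn.ModuleList(
            nn.Linear(in_dim, n_experts) for _ in range(n_tasks)
        )
        self.towers = nn.ModuleList(
            nn.Linear(expert_dim, 1) for _ in range(n_tasks)
        )
        # Exclusivity mask: mask[t, e] == 0 hides expert e from task t.
        # A fraction `exclusivity` of experts is assigned to a single,
        # randomly chosen owner task (hypothetical assignment scheme);
        # the remaining experts stay shared across all tasks.
        mask = torch.ones(n_tasks, n_experts)
        n_exclusive = int(exclusivity * n_experts)
        owners = torch.randint(0, n_tasks, (n_exclusive,))
        for e, owner in enumerate(owners):
            mask[:, e] = 0.0
            mask[owner, e] = 1.0
        self.register_buffer("mask", mask)

    def forward(self, x):
        # Stack expert outputs: (batch, n_experts, expert_dim).
        expert_out = torch.stack([f(x) for f in self.experts], dim=1)
        outputs = []
        for t in range(len(self.gates)):
            logits = self.gates[t](x)
            # Masked experts get -inf logits, i.e. exactly zero gate weight.
            logits = logits.masked_fill(self.mask[t] == 0, float("-inf"))
            weights = torch.softmax(logits, dim=-1).unsqueeze(-1)
            mixed = (weights * expert_out).sum(dim=1)
            outputs.append(self.towers[t](mixed))
        return outputs  # one prediction per task
```

Masking at the gate-logit level keeps the softmax weight of an excluded expert exactly zero, so an exclusive expert receives gradients only from its owner task. This is one way to realize the diversity the abstract describes: exclusive experts specialize while the shared experts continue to serve all tasks.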

BibTeX

@article{DBLP:journals/corr/abs-2106-10595,
  author        = {Raquel Aoki and Frederick Tung and Gabriel L. Oliveira},
  title         = {Heterogeneous Multi-task Learning with Expert Diversity},
  journal       = {CoRR},
  volume        = {abs/2106.10595},
  year          = {2021},
  url           = {https://arxiv.org/abs/2106.10595},
  archivePrefix = {arXiv},
  eprint        = {2106.10595},
  timestamp     = {Tue, 29 Jun 2021 16:55:04 +0200},
  biburl        = {https://dblp.org/rec/journals/corr/abs-2106-10595.bib},
  bibsource     = {dblp computer science bibliography, https://dblp.org}
}
