Abstract
---

Bayesian optimization is a powerful paradigm for optimizing black-box functions from scarce and noisy data. Its data efficiency can be further improved by transfer learning from related tasks. While recent transfer models meta-learn a prior from large amounts of data, in the low-data regime methods that exploit the closed-form posterior of Gaussian processes (GPs) have an advantage. In this setting, several analytically tractable transfer-model posteriors have been proposed, but the relative advantages of these methods are not well understood. In this paper, we provide a unified view on hierarchical GP models for transfer learning, which allows us to analyze the relationships between methods. As part of the analysis, we develop a novel closed-form boosted GP transfer model that sits between existing approaches in terms of complexity. We evaluate the performance of the different approaches in large-scale experiments and highlight strengths and weaknesses of the different transfer-learning methods.
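The closed-form GP machinery the abstract relies on can be illustrated with a minimal residual-transfer sketch: fit a GP on plentiful source-task data, then fit a second GP to the scarce target-task residuals, so the source posterior mean acts as a prior mean for the target. This is a generic illustration of the boosting idea using only NumPy; the kernel, hyperparameters, and toy tasks below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def rbf(X1, X2, ls=1.0, var=1.0):
    # Squared-exponential kernel matrix between row vectors of X1 and X2.
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return var * np.exp(-0.5 * d2 / ls**2)

def gp_fit_predict(X, y, Xq, noise=1e-2):
    # Closed-form GP posterior mean and variance at query points Xq.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xq)
    Kss = rbf(Xq, Xq)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    return Ks.T @ alpha, np.diag(Kss - v.T @ v)

rng = np.random.default_rng(0)
f_source = lambda x: np.sin(x)            # data-rich related task
f_target = lambda x: np.sin(x) + 0.3 * x  # scarce target task

Xs = np.linspace(0, 2 * np.pi, 20)[:, None]
ys = f_source(Xs).ravel() + 0.05 * rng.normal(size=20)
Xt = np.linspace(0, 2 * np.pi, 4)[:, None]  # only 4 target observations
yt = f_target(Xt).ravel() + 0.05 * rng.normal(size=4)
Xq = np.linspace(0, 2 * np.pi, 50)[:, None]

# Stage 1: source GP posterior mean, evaluated at target and query inputs.
mu_s_t, _ = gp_fit_predict(Xs, ys, Xt)
mu_s_q, _ = gp_fit_predict(Xs, ys, Xq)

# Stage 2 ("boosting"): a second GP models the target residuals.
mu_r_q, var_r_q = gp_fit_predict(Xt, yt - mu_s_t, Xq)
mu_target = mu_s_q + mu_r_q
```

Because both stages use exact GP posteriors, the combined predictive mean stays analytically tractable, which is the appeal of such hierarchical constructions in the low-data regime.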
| Year | Venue | DocType | Volume | ISSN | Citations | PageRank | References | Authors |
|---|---|---|---|---|---|---|---|---|
| 2022 | INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151 | Conference | 151 | 2640-3498 | 0 | 0.34 | 0 | 6 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Petru Tighineanu | 1 | 0 | 0.34 |
| Kathrin Skubch | 2 | 0 | 0.34 |
| Paul Baireuther | 3 | 0 | 0.34 |
| Attila Reiss | 4 | 0 | 0.34 |
| Felix Berkenkamp | 5 | 0 | 0.34 |
| Julia Vinogradska | 6 | 0 | 0.34 |