Ferenc Huszár 11/14/2019

Meta-Learning Millions of Hyper-parameters using the Implicit Function Theorem


This technical article analyzes a 2019 research paper on meta-learning that uses implicit differentiation, via the Implicit Function Theorem, to optimize millions of hyperparameters. It explains the nested (bilevel) optimization problem underlying hyperparameter search, compares the proposed Neumann series approximation of the inverse Hessian to related methods such as iMAML, and discusses experiments demonstrating the approach's versatility, such as treating the training dataset itself as a hyperparameter.
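As background for the approach summarized above, the object being computed is the gradient of a validation loss through the solution of the inner training problem. A minimal sketch in notation of my own choosing (not taken from the post): let $\lambda$ denote the hyperparameters, $w^*(\lambda) = \arg\min_w \mathcal{L}_T(w, \lambda)$ the trained weights, and $\mathcal{L}_V$ the validation loss. The Implicit Function Theorem then gives the hypergradient

\[
\frac{\mathrm{d}\mathcal{L}_V}{\mathrm{d}\lambda}
  = \frac{\partial \mathcal{L}_V}{\partial \lambda}
  - \frac{\partial \mathcal{L}_V}{\partial w}
    \left[\nabla^2_{w} \mathcal{L}_T\right]^{-1}
    \frac{\partial^2 \mathcal{L}_T}{\partial w \, \partial \lambda},
\]

and the approximation discussed in the article replaces the intractable inverse Hessian with a truncated Neumann series,

\[
\left[\nabla^2_{w} \mathcal{L}_T\right]^{-1}
  \approx \alpha \sum_{j=0}^{K} \left(I - \alpha \nabla^2_{w} \mathcal{L}_T\right)^{j},
\]

which requires only Hessian-vector products and converges as $K \to \infty$ when the spectral radius of $I - \alpha \nabla^2_{w} \mathcal{L}_T$ is below one. The symbols $\alpha$ and $K$ here stand for a generic step size and truncation depth, not specific values from the paper.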
