Meta-Learning with Complex Tasks

PhD Thesis Proposal Defence


Title: "Meta-Learning with Complex Tasks"

by

Mr. Weisen JIANG


Abstract:

Meta-learning aims at extracting knowledge from historical tasks to accelerate
learning on new tasks. It has achieved promising performance in various
applications, and many algorithms have been developed to learn a meta-model
that serves as an initialization or regularization for task-specific
fine-tuning. In this thesis, we focus on meta-learning with complex tasks,
where task-specific models are diverse and a single meta-model cannot
represent all the meta-knowledge.

First, we extend the learning of an efficient meta-regularization from linear
models to nonlinear models via kernelized proximal regularization, allowing
more powerful models such as deep networks to handle complex tasks. The inner
problem is reformulated as a dual problem, and a learnable proximal
regularizer is introduced into the base learner. We propose a novel
meta-learning algorithm to learn the proximal regularizer and establish its
local/global convergence.
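As a rough illustration of the inner problem with a proximal regularizer (a
generic sketch, not the kernelized algorithm from the thesis; all names such
as `proximal_base_learner` and the choice of squared loss are hypothetical),
a task-specific linear model can be fitted while a proximal term pulls the
solution toward a meta-model:

```python
import numpy as np

def proximal_base_learner(X, y, w_meta, lam=0.1, lr=0.1, steps=100):
    """Illustrative inner problem for one task:
        min_w  (1/2n) ||X w - y||^2  +  (lam/2) ||w - w_meta||^2
    The proximal term keeps the task model close to the meta-model.
    """
    n = len(y)
    w = w_meta.copy()  # warm-start at the meta-model
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n + lam * (w - w_meta)
        w -= lr * grad
    return w

# Toy task whose true weights differ from the (zero) meta-model.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
w_meta = np.zeros(3)
w_task = proximal_base_learner(X, y, w_meta)
```

With a small `lam`, the fitted `w_task` tracks the task's own optimum while
being mildly shrunk toward `w_meta`; meta-learning then tunes the
regularization so this pull is helpful rather than harmful.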

Second, we model the task-specific parameters as a mixture of subspaces and
propose a model-agnostic meta-learning algorithm to learn the subspace bases.
Each subspace represents one type of meta-knowledge, and such structured
meta-knowledge accelerates learning on complex tasks more effectively than a
simple meta-model. The proposed algorithm can be used for both linear and
nonlinear models. Empirical results show that it discovers the underlying
subspaces of the task model parameters.
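To give a flavor of the subspace-mixture idea, the sketch below clusters task
parameter vectors into a few low-dimensional subspaces by a generic
k-subspaces alternation (assign each task to its best-fitting subspace, then
refit each basis via SVD). This is a standard illustration under assumed
names like `fit_subspace_mixture`, not the model-agnostic algorithm proposed
in the thesis:

```python
import numpy as np

def fit_subspace_mixture(W, k=2, dim=2, iters=10, seed=0):
    """Cluster task parameter vectors (rows of W) into k linear
    subspaces of dimension `dim` by alternating between
    (1) assigning each task to the subspace with the smallest
        projection residual, and
    (2) refitting each subspace basis from its assigned tasks via SVD.
    """
    rng = np.random.default_rng(seed)
    n, d = W.shape
    bases = [np.linalg.qr(rng.normal(size=(d, dim)))[0] for _ in range(k)]
    assign = rng.integers(k, size=n)
    for _ in range(iters):
        # residual of projecting each task vector onto each subspace
        res = np.stack(
            [np.linalg.norm(W - (W @ B) @ B.T, axis=1) for B in bases],
            axis=1,
        )
        assign = res.argmin(axis=1)
        for j in range(k):
            pts = W[assign == j]
            if len(pts) >= dim:
                # top right singular vectors span the best-fit subspace
                _, _, Vt = np.linalg.svd(pts, full_matrices=False)
                bases[j] = Vt[:dim].T
    return bases, assign
```

Each learned basis plays the role of one "type" of meta-knowledge: a new task
is adapted within whichever subspace fits it best, rather than from a single
shared meta-model.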

Third, we propose an effective and parameter-efficient meta-learning algorithm
for language models. The proposed algorithm learns a pool of multiple
meta-prompts to extract knowledge from meta-training tasks and then constructs
instance-dependent prompts as weighted combinations of all the prompts in the
pool via attention. Only the prompts in the pool are meta-parameters while
the language model is kept frozen, making the method highly
parameter-efficient. A novel soft verbalizer is also proposed to reduce the
human effort of annotating label words.
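The instance-dependent prompt construction can be sketched with standard
scaled dot-product attention over the pool (function and variable names such
as `instance_prompt`, `keys`, and the per-prompt key design are illustrative
assumptions, not the thesis's exact formulation):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def instance_prompt(query, keys, prompts):
    """Combine a pool of meta-learned prompts into one
    instance-dependent prompt via attention.

    query:   instance embedding, shape (dq,)
    keys:    one key per pool prompt, shape (P, dq)
    prompts: the pool, shape (P, L, E)  (P prompts of L tokens, dim E)
    """
    scores = keys @ query / np.sqrt(len(query))  # scaled dot-product
    weights = softmax(scores)                    # one weight per prompt
    # weighted combination over the pool dimension -> shape (L, E)
    return np.tensordot(weights, prompts, axes=1)

rng = np.random.default_rng(0)
prompts = rng.normal(size=(4, 5, 8))  # pool of 4 prompts, 5 tokens each
keys = rng.normal(size=(4, 16))
query = rng.normal(size=16)
prompt = instance_prompt(query, keys, prompts)
```

Because only `keys` and `prompts` would be trained while the language model
stays frozen, the number of meta-parameters is tiny compared with
fine-tuning the model itself.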


Date:                   Tuesday, 5 March 2024

Time:                   4:00pm - 6:00pm

Venue:                  Room 5501
                        Lifts 25/26

Committee Members:      Prof. James Kwok (Supervisor)
                        Dr. Brian Mak (Chairperson)
                        Dr. Junxian He
                        Dr. Yangqiu Song