Description
When we specify a model-model, trainer-trainer, or model-trainer combination, e.g. https://github.com/marrlab/DomainLab/blob/master/examples/conf/vlcs_diva_mldg_dial.yaml,
there can be name collisions of the multiplier:
`self.lambda_ctr = self.aconf.gamma_reg`
https://github.com/marrlab/DomainLab/blob/6126ddeb2df0fe3a07de458cac68e4ad02da3c66/domainlab/algos/trainers/train_mldg.py#L111C19-L111C39

`return [loss_dial], [self.aconf.gamma_reg]`
In this case, setting the command-line argument gamma_reg=1.0 simultaneously sets both multipliers.
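To make the collision concrete, here is a minimal sketch (the class names, the `cal_reg_loss` method, and the `SimpleNamespace` args object are illustrative assumptions; only the shared `aconf.gamma_reg` attribute is taken from the snippets above): both trainers read the same attribute, so one command-line flag feeds two unrelated loss multipliers.

```python
from types import SimpleNamespace

# Hypothetical stand-ins for the two trainers; only the shared use of
# aconf.gamma_reg mirrors the snippets quoted above.
class TrainerMldgSketch:
    def __init__(self, aconf):
        self.aconf = aconf
        # MLDG-side multiplier is read from the shared gamma_reg argument
        self.lambda_ctr = self.aconf.gamma_reg


class TrainerDialSketch:
    def __init__(self, aconf):
        self.aconf = aconf

    def cal_reg_loss(self, loss_dial):
        # DIAL-side multiplier is read from the *same* gamma_reg argument
        return [loss_dial], [self.aconf.gamma_reg]


# A single command-line value therefore controls both multipliers at once:
aconf = SimpleNamespace(gamma_reg=1.0)
print(TrainerMldgSketch(aconf).lambda_ctr)           # 1.0
print(TrainerDialSketch(aconf).cal_reg_loss(0.5))    # ([0.5], [1.0])
```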
Further importance of this issue:
One of the major features of DomainLab is that it can train with multiple loss terms, each weighted by a multiplier. To avoid setting those multipliers manually, there are different multiplier schedulers that adapt the multiplier values at each epoch.
This file is the Trainer which can schedule the values of those multipliers at each epoch:
https://github.com/marrlab/DomainLab/blob/master/domainlab/algos/trainers/train_hyper_scheduler.py
Inside the Trainer, a hyper_scheduler has to be specified. In the fbopt branch, we define a fully automatic controller for scheduling the multipliers; in the main branch, the schedulers are defined here:
https://github.com/marrlab/DomainLab/blob/master/domainlab/algos/trainers/hyper_scheduler.py
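As an illustration of what such a scheduler does (a generic warm-up sketch under my own assumptions, not DomainLab's actual hyper_scheduler implementation), each multiplier can be ramped from 0 to its target value over a fixed number of warm-up epochs:

```python
class WarmupSchedulerSketch:
    """Generic warm-up sketch: linearly ramps every multiplier from 0 to its
    target value over total_steps epochs (illustrative, not the real API)."""

    def __init__(self, total_steps=100, **kwargs):
        self.total_steps = total_steps
        self.dict_par_setpoint = kwargs  # target value per multiplier name

    def __call__(self, epoch):
        ratio = min(epoch / self.total_steps, 1.0)
        return {name: target * ratio
                for name, target in self.dict_par_setpoint.items()}


# Usage: halfway through the warm-up, each multiplier is at half its target
scheduler = WarmupSchedulerSketch(total_steps=100, gamma_reg=0.1, beta_d=1.0)
print(scheduler(50))   # {'gamma_reg': 0.05, 'beta_d': 0.5}
```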
First, those multipliers (a.k.a. hyperparameters) are initialized:
`self.hyper_scheduler = self.model.hyper_init(scheduler)`
At each epoch, the multipliers (a.k.a. hyperparameters) are updated via the scheduler:
`self.model.hyper_update(epoch, self.hyper_scheduler)`
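Putting the two calls together, the Trainer's control flow looks roughly like the following sketch (the loop and `train_one_epoch` are placeholders; only the `hyper_init` and `hyper_update` calls come from the lines quoted above):

```python
def train_with_scheduled_multipliers(model, scheduler_cls, num_epochs, train_one_epoch):
    """Sketch of the control flow around hyper_init / hyper_update;
    train_one_epoch stands in for the actual per-epoch training logic."""
    # the model builds a scheduler over its own named multipliers
    hyper_scheduler = model.hyper_init(scheduler_cls)
    for epoch in range(num_epochs):
        # the scheduler pushes this epoch's multiplier values back into the model
        model.hyper_update(epoch, hyper_scheduler)
        train_one_epoch(model, epoch)
```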
All of the model's own multipliers are handled the same way, e.g. DIVA's beta_d, beta_y, beta_x:
DomainLab/domainlab/models/model_diva.py, line 112 at 43d1cda:
`trainer=None, beta_d=self.beta_d, beta_y=self.beta_y, beta_x=self.beta_x`
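For the model side, a hypothetical sketch of how such named multipliers could be exposed to the scheduler; the `hyper_init`/`hyper_update` bodies below are simplified stand-ins, not DomainLab's actual implementation, and only the multiplier names beta_d, beta_y, beta_x are taken from the DIVA line above:

```python
class ModelDivaSketch:
    """Illustrative model owning three named multipliers (names from DIVA)."""

    def __init__(self, beta_d=1.0, beta_y=1.0, beta_x=1.0):
        self.beta_d, self.beta_y, self.beta_x = beta_d, beta_y, beta_x

    def hyper_init(self, scheduler_cls):
        # hand the scheduler the target value of each named multiplier,
        # e.g. the WarmupSchedulerSketch above
        return scheduler_cls(beta_d=self.beta_d,
                             beta_y=self.beta_y,
                             beta_x=self.beta_x)

    def hyper_update(self, epoch, scheduler):
        # overwrite each multiplier with the scheduler's value for this epoch
        for name, value in scheduler(epoch).items():
            setattr(self, name, value)
```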