Deep Models

DCCA

class cca_zoo.deepmodels._dcca.DCCA(latent_dims, objective=<class 'cca_zoo.deepmodels._objectives.MCCA'>, encoders=None, r=0, eps=1e-05, **kwargs)[source]

A class used to fit a DCCA model.

Citation

Andrew, Galen, et al. “Deep canonical correlation analysis.” International conference on machine learning. PMLR, 2013.

Constructor for DCCA

Parameters
  • latent_dims (int) – number of latent dimensions

  • objective – CCA objective; trace-norm CCA (MCCA) by default

  • encoders – list of encoder networks

  • r (float) – regularisation parameter of the trace-norm CCA objective, analogous to ridge CCA. Should be very small; if you encounter numerical errors, reduce it further.

  • eps (float) – epsilon used throughout for numerical stability. Should be very small; if you encounter numerical errors, reduce it further.
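
Example (a minimal construction sketch, not taken from the library's documentation). It assumes the package re-exports DCCA, architectures and objectives from cca_zoo.deepmodels, and that the Encoder documented under Model Architectures accepts latent_dims and feature_size arguments; any torch.nn.Module encoders with a matching output size could be substituted.

    from cca_zoo.deepmodels import DCCA, architectures, objectives

    latent_dims = 10
    feature_sizes = [784, 784]  # hypothetical input dimensionality of each view

    # one encoder network per view
    encoders = [
        architectures.Encoder(latent_dims=latent_dims, feature_size=f)
        for f in feature_sizes
    ]

    # the trace-norm (MCCA) objective is the default; CCA, GCCA or TCCA may be passed instead
    model = DCCA(latent_dims=latent_dims, encoders=encoders, objective=objectives.MCCA)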

DCCA by Non-Linear Orthogonal Iterations

class cca_zoo.deepmodels._dcca_noi.DCCA_NOI(latent_dims, N, encoders=None, r=0, rho=0.2, eps=1e-09, shared_target=False, **kwargs)[source]

A class used to fit a DCCA model by non-linear orthogonal iterations

Citation

Wang, Weiran, et al. “Stochastic optimization for deep CCA via nonlinear orthogonal iterations.” 2015 53rd Annual Allerton Conference on Communication, Control, and Computing (Allerton). IEEE, 2015.

Constructor for DCCA_NOI

Parameters
  • latent_dims (int) – number of latent dimensions

  • N (int) – number of samples used to estimate the covariance

  • encoders – list of encoder networks

  • r (float) – regularisation parameter of the trace-norm CCA objective, analogous to ridge CCA

  • rho (float) – covariance memory parameter, as in the DCCA non-linear orthogonal iterations paper

  • eps (float) – epsilon used throughout for numerical stability

  • shared_target (bool) – not used
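
Example (a minimal construction sketch under the same import and Encoder assumptions as the DCCA example above).

    from cca_zoo.deepmodels import DCCA_NOI, architectures

    latent_dims = 10
    n_train = 1000  # hypothetical number of training samples, used for the covariance estimate

    encoders = [
        architectures.Encoder(latent_dims=latent_dims, feature_size=784)
        for _ in range(2)
    ]

    # rho controls how much of the running covariance estimate is carried over between batches
    model = DCCA_NOI(latent_dims=latent_dims, N=n_train, encoders=encoders, rho=0.2)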

Deep Canonically Correlated Autoencoders

class cca_zoo.deepmodels._dccae.DCCAE(latent_dims, objective=<class 'cca_zoo.deepmodels._objectives.MCCA'>, encoders=None, decoders=None, r=0, eps=1e-05, lam=0.5, latent_dropout=0, img_dim=None, recon_loss_type='mse', **kwargs)[source]

A class used to fit a DCCAE model.

Citation

Wang, Weiran, et al. “On deep multi-view representation learning.” International conference on machine learning. PMLR, 2015.

Constructor for DCCAE

Parameters
  • latent_dims (int) – number of latent dimensions

  • objective – CCA objective; trace-norm CCA (MCCA) by default

  • encoders – list of encoder networks

  • decoders – list of decoder networks

  • r (float) – regularisation parameter of the trace-norm CCA objective, analogous to ridge CCA. Should be very small; if you encounter numerical errors, reduce it further.

  • eps (float) – epsilon used throughout for numerical stability. Should be very small; if you encounter numerical errors, reduce it further.

  • lam – weight of the reconstruction loss (the correlation loss is weighted by 1 - lam)
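
Example (a minimal construction sketch; the Decoder architecture and its arguments are assumed to mirror the Encoder).

    from cca_zoo.deepmodels import DCCAE, architectures

    latent_dims = 10
    encoders = [architectures.Encoder(latent_dims=latent_dims, feature_size=784) for _ in range(2)]
    decoders = [architectures.Decoder(latent_dims=latent_dims, feature_size=784) for _ in range(2)]

    # lam balances the two loss terms: lam * reconstruction + (1 - lam) * correlation
    model = DCCAE(latent_dims=latent_dims, encoders=encoders, decoders=decoders, lam=0.5)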

Deep Tensor CCA

class cca_zoo.deepmodels._dtcca.DTCCA(latent_dims, encoders=None, r=0, eps=1e-05, **kwargs)[source]

A class used to fit a DTCCA model.

A thin wrapper around DCCA that uses the TCCA objective, followed by TCCA post-processing.

Citation

Wong, Hok Shing, et al. “Deep Tensor CCA for Multi-view Learning.” IEEE Transactions on Big Data (2021).

Constructor for DTCCA

Parameters
  • latent_dims (int) – number of latent dimensions

  • encoders – list of encoder networks

  • r (float) – regularisation parameter of the trace-norm CCA objective, analogous to ridge CCA. Should be very small; if you encounter numerical errors, reduce it further.

  • eps (float) – epsilon used throughout for numerical stability. Should be very small; if you encounter numerical errors, reduce it further.
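
Example (a minimal construction sketch with three views, since the tensor objective is designed for more than two views; same Encoder assumptions as above).

    from cca_zoo.deepmodels import DTCCA, architectures

    latent_dims = 5
    feature_sizes = [100, 200, 300]  # hypothetical dimensionality of each view

    encoders = [
        architectures.Encoder(latent_dims=latent_dims, feature_size=f)
        for f in feature_sizes
    ]
    model = DTCCA(latent_dims=latent_dims, encoders=encoders)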

Deep Variational CCA

class cca_zoo.deepmodels._dvcca.DVCCA(latent_dims, encoders=None, decoders=None, private_encoders=None, latent_dropout=0, img_dim=None, recon_loss_type='mse', **kwargs)[source]

A class used to fit a DVCCA model.

Citation

Wang, Weiran, et al. “Deep variational canonical correlation analysis.” arXiv preprint arXiv:1610.03454 (2016).

https://arxiv.org/pdf/1610.03454.pdf

https://github.com/pytorch/examples/blob/master/vae/main.py

Constructor for DVCCA

Parameters
  • latent_dims (int) – number of latent dimensions

  • encoders – list of encoder networks

  • decoders – list of decoder networks

  • private_encoders (Optional[Iterable[BaseEncoder]]) – list of private (view specific) encoder networks
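
Example (a minimal construction sketch; the variational flag on the Encoder is an assumption, since DVCCA needs an encoder that produces a mean and log-variance).

    from cca_zoo.deepmodels import DVCCA, architectures

    latent_dims = 10

    # a single shared (variational) encoder and one decoder per view;
    # passing private_encoders in addition gives the "DVCCA-private" variant of the paper
    encoders = [architectures.Encoder(latent_dims=latent_dims, feature_size=784, variational=True)]
    decoders = [architectures.Decoder(latent_dims=latent_dims, feature_size=784) for _ in range(2)]
    model = DVCCA(latent_dims=latent_dims, encoders=encoders, decoders=decoders)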

Deep CCA by Stochastic Decorrelation Loss

class cca_zoo.deepmodels._dcca_sdl.DCCA_SDL(latent_dims, N, encoders=None, r=0, rho=0.2, eps=1e-05, shared_target=False, lam=0.5, **kwargs)[source]

A class used to fit a Deep CCA by Stochastic Decorrelation model.

Citation

Chang, Xiaobin, Tao Xiang, and Timothy M. Hospedales. “Scalable and effective deep CCA via soft decorrelation.” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2018.

Constructor for DCCA_SDL

Parameters
  • latent_dims (int) – number of latent dimensions

  • N (int) – number of samples used to estimate the covariance

  • encoders – list of encoder networks

  • r (float) – regularisation parameter of the trace-norm CCA objective, analogous to ridge CCA

  • rho (float) – covariance memory parameter, as in the DCCA non-linear orthogonal iterations paper

  • eps (float) – epsilon used throughout for numerical stability

  • shared_target (bool) – not used
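
Example (a minimal construction sketch under the same assumptions as the DCCA_NOI example; treating lam as the weight of the decorrelation term is an inference from the constructor signature).

    from cca_zoo.deepmodels import DCCA_SDL, architectures

    latent_dims = 10
    encoders = [architectures.Encoder(latent_dims=latent_dims, feature_size=784) for _ in range(2)]

    # N is the number of training samples used for the running covariance estimate
    model = DCCA_SDL(latent_dims=latent_dims, N=1000, encoders=encoders, lam=0.5)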

Deep CCA by Barlow Twins

class cca_zoo.deepmodels._dcca_barlow_twins.BarlowTwins(latent_dims, encoders=None, lam=1, **kwargs)[source]

A class used to fit a Barlow Twins model.

Citation

Zbontar, Jure, et al. “Barlow twins: Self-supervised learning via redundancy reduction.” arXiv preprint arXiv:2103.03230 (2021).

Constructor for Barlow Twins

Parameters
  • latent_dims (int) – number of latent dimensions

  • encoders – list of encoder networks

  • lam – weight of the off-diagonal (redundancy reduction) loss terms
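
Example (a minimal construction sketch under the same import and Encoder assumptions as above).

    from cca_zoo.deepmodels import BarlowTwins, architectures

    latent_dims = 128  # redundancy-reduction methods typically use wider embeddings
    encoders = [architectures.Encoder(latent_dims=latent_dims, feature_size=784) for _ in range(2)]

    # lam weights the off-diagonal terms of the cross-correlation matrix against the diagonal terms
    model = BarlowTwins(latent_dims=latent_dims, encoders=encoders, lam=1)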

Split Autoencoders

class cca_zoo.deepmodels._splitae.SplitAE(latent_dims, encoder=<class 'cca_zoo.deepmodels._architectures.Encoder'>, decoders=None, latent_dropout=0, recon_loss_type='mse', img_dim=None, **kwargs)[source]

A class used to fit a Split Autoencoder model.

Citation

Ngiam, Jiquan, et al. “Multimodal deep learning.” ICML. 2011.

Parameters
  • latent_dims (int) – number of latent dimensions

  • encoder (BaseEncoder) – encoder network (SplitAE uses a single encoder)

  • decoders – list of decoder networks
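
Example (a minimal construction sketch; SplitAE takes a single encoder and one decoder per view to be reconstructed).

    from cca_zoo.deepmodels import SplitAE, architectures

    latent_dims = 10
    encoder = architectures.Encoder(latent_dims=latent_dims, feature_size=784)
    decoders = [architectures.Decoder(latent_dims=latent_dims, feature_size=784) for _ in range(2)]
    model = SplitAE(latent_dims=latent_dims, encoder=encoder, decoders=decoders)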

Deep Objectives

class cca_zoo.deepmodels._objectives.CCA(latent_dims, r=0, eps=0.001)[source]

Differentiable CCA Loss. The loss() method takes the outputs of each view’s network and solves the CCA problem as in Andrew et al.’s original paper.

Parameters
  • latent_dims (int) – the number of latent dimensions

  • r (float) – regularisation as in regularized CCA. Makes the problem well posed when batch size is similar to the number of latent dimensions

  • eps (float) – an epsilon parameter used in some operations

class cca_zoo.deepmodels._objectives.MCCA(latent_dims, r=0, eps=0.001)[source]

Differentiable MCCA Loss. The loss() method takes the outputs of each view’s network and solves the multiset eigenvalue problem, as in e.g. https://arxiv.org/pdf/2005.11914.pdf

Parameters
  • latent_dims (int) – the number of latent dimensions

  • r (float) – regularisation as in regularized CCA. Makes the problem well posed when batch size is similar to the number of latent dimensions

  • eps (float) – an epsilon parameter used in some operations

class cca_zoo.deepmodels._objectives.GCCA(latent_dims, r=0, eps=0.001)[source]

Differentiable GCCA Loss. The loss() method takes the outputs of each view’s network and solves the generalized CCA eigenproblem, as in https://arxiv.org/pdf/2005.11914.pdf

Parameters
  • latent_dims (int) – the number of latent dimensions

  • r (float) – regularisation as in regularized CCA. Makes the problem well posed when batch size is similar to the number of latent dimensions

  • eps (float) – an epsilon parameter used in some operations

class cca_zoo.deepmodels._objectives.TCCA(latent_dims, r=0, eps=0.0001)[source]

Differentiable TCCA Loss.

Parameters
  • latent_dims (int) – the number of latent dimensions

  • r (float) – regularisation as in regularized CCA. Makes the problem well posed when batch size is similar to the number of latent dimensions

  • eps (float) – an epsilon parameter used in some operations
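
Example (a sketch of selecting an objective; as the DCCA signature above shows, the objective is passed as a class rather than an instance, and the model constructs it internally with its own latent_dims, r and eps; the objectives import path assumes the package re-exports the module documented above).

    from cca_zoo.deepmodels import DCCA, architectures, objectives

    latent_dims = 10
    encoders = [architectures.Encoder(latent_dims=latent_dims, feature_size=784) for _ in range(2)]

    # swap the objective class to change the correlation loss being optimised
    model_gcca = DCCA(latent_dims=latent_dims, encoders=encoders, objective=objectives.GCCA)
    model_tcca = DCCA(latent_dims=latent_dims, encoders=encoders, objective=objectives.TCCA)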

Callbacks

class cca_zoo.deepmodels._callbacks.CorrelationCallback[source]
on_train_epoch_end(trainer, pl_module)[source]

Called when the train epoch ends.

To access all batch outputs at the end of the epoch, either:

  1. Implement training_epoch_end in the LightningModule and access outputs via the module OR

  2. Cache data across train batch hooks inside the callback implementation to post-process in this hook.

Return type

None

on_validation_epoch_end(trainer, pl_module)[source]

Called when the val epoch ends.

Return type

None

class cca_zoo.deepmodels._callbacks.GenerativeCallback[source]
on_validation_epoch_end(trainer, pl_module)[source]

Called when the val epoch ends.

Return type

None
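
Example (a sketch of attaching the callbacks during training; it assumes the deep models are PyTorch Lightning modules, as the hooks above suggest, and that train_loader is a placeholder for a DataLoader yielding multi-view batches in the format the library’s data utilities produce).

    import pytorch_lightning as pl
    from cca_zoo.deepmodels import DCCA, architectures
    from cca_zoo.deepmodels._callbacks import CorrelationCallback

    latent_dims = 10
    encoders = [architectures.Encoder(latent_dims=latent_dims, feature_size=784) for _ in range(2)]
    model = DCCA(latent_dims=latent_dims, encoders=encoders)

    # logs train/validation correlations at the end of each epoch
    trainer = pl.Trainer(max_epochs=10, callbacks=[CorrelationCallback()])
    # trainer.fit(model, train_loader)  # train_loader: your multi-view DataLoader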

Model Architectures