Abstract
We review a new method of performing Canonical Correlation Analysis (CCA) with Artificial Neural Networks. We have previously [4, 5] compared its capabilities with those of standard statistical methods on simple data sets such as an abstraction of random dot stereograms [2]. In this paper, we show that this original rule is only one of a family of rules, all of which use Hebbian and anti-Hebbian learning to find correlations between data sets: we derive slightly different rules from Becker's information-theoretic criteria and from probabilistic assumptions. We then derive a robust version of this last rule and compare the effectiveness of all these rules on a standard data set.
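To make the flavour of such rules concrete, the following is a minimal sketch (not the authors' exact formulation) of one common neural-CCA learning rule of this family: each of two linear units is trained with a Hebbian term driven by the *other* unit's output, while an anti-Hebbian term with an adaptive gain keeps each output's variance near one. The data set, learning rates, and gain dynamics here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic data: two 2-D input streams sharing one
# strongly correlated component (a stand-in for paired data sets).
n = 20000
shared = rng.normal(size=n)
x1 = np.column_stack([shared + 0.1 * rng.normal(size=n), rng.normal(size=n)])
x2 = np.column_stack([shared + 0.1 * rng.normal(size=n), rng.normal(size=n)])

w1 = rng.normal(scale=0.1, size=2)   # weights of unit 1 (y1 = w1 . x1)
w2 = rng.normal(scale=0.1, size=2)   # weights of unit 2 (y2 = w2 . x2)
lam1 = lam2 = 1.0                    # adaptive gains enforcing unit variance
eta, eta0 = 0.001, 0.001             # learning rates (assumed values)

for _ in range(3):                   # a few passes over the data
    for i in range(n):
        y1 = w1 @ x1[i]
        y2 = w2 @ x2[i]
        # Hebbian term (input times the OTHER unit's output) increases
        # the correlation E[y1*y2]; the anti-Hebbian term (-lam * own
        # output) bounds each unit's output variance.
        w1 += eta * x1[i] * (y2 - lam1 * y1)
        w2 += eta * x2[i] * (y1 - lam2 * y2)
        # Gains adapt so that E[y^2] is driven towards 1.
        lam1 += eta0 * (y1 * y1 - 1.0)
        lam2 += eta0 * (y2 * y2 - 1.0)

Y1, Y2 = x1 @ w1, x2 @ w2
corr = np.corrcoef(Y1, Y2)[0, 1]
print(f"learned canonical correlation: {corr:.3f}")
```

At a fixed point of these updates, E[x1 (y2 - lam1 y1)] = 0 with E[y1^2] = 1, which is exactly the generalized eigenproblem solved by classical CCA; the learned outputs on this data should recover the strong correlation carried by the shared component.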