<?xml version="1.0" encoding="UTF-8"?><xml><records><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>47</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Ghojogh, Benyamin</style></author><author><style face="normal" font="default" size="100%">Karray, Fakhri</style></author><author><style face="normal" font="default" size="100%">Crowley, Mark</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Generalized Subspace Learning by Roweis Discriminant Analysis</style></title><secondary-title><style face="normal" font="default" size="100%">International Conference on Image Analysis and Recognition</style></secondary-title></titles><keywords><keyword><style face="normal" font="default" size="100%">Data Reduction</style></keyword><keyword><style face="normal" font="default" size="100%">Manifold learning</style></keyword><keyword><style face="normal" font="default" size="100%">Numerosity Reduction</style></keyword></keywords><dates><year><style face="normal" font="default" size="100%">2020</style></year></dates><urls><web-urls><url><style face="normal" font="default" size="100%">http://arxiv.org/abs/1910.05437</style></url></web-urls></urls><publisher><style face="normal" font="default" size="100%">Springer</style></publisher><pub-location><style face="normal" font="default" size="100%">Póvoa de Varzim, Portugal (virtual)</style></pub-location><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">We present a new method that generalizes subspace learning based on eigenvalue and generalized eigenvalue problems. This method, Roweis Discriminant Analysis (RDA), is named after Sam Roweis, to whom the field of subspace learning owes a great deal. 
RDA is a family of infinitely many algorithms, of which Principal Component Analysis (PCA), Supervised PCA (SPCA), and Fisher Discriminant Analysis (FDA) are special cases. One of the extreme special cases, which we name Double Supervised Discriminant Analysis (DSDA), uses the labels twice; it is novel and has not appeared elsewhere. We propose a dual for RDA for some special cases. We also propose kernel RDA, which generalizes kernel PCA, kernel SPCA, and kernel FDA, using both dual RDA and representation theory. Our theoretical analysis explains previously known facts, such as why SPCA can use regression but FDA cannot, why PCA and SPCA have duals but FDA does not, why kernel PCA and kernel SPCA use the kernel trick but kernel FDA does not, and why PCA is the best linear method for reconstruction. Roweisfaces and kernel Roweisfaces are also proposed, generalizing eigenfaces, Fisherfaces, supervised eigenfaces, and their kernel variants. We also report experiments showing the effectiveness of RDA and kernel RDA on several benchmark datasets.</style></abstract></record></records></xml>