A Survey of Multilinear Subspace Learning for Tensor Data


1 Introduction

With the advances in data collection and storage capabilities, massive multidimensional data are being generated on a daily basis in a wide range of emerging applications, and learning algorithms for knowledge extraction from these data are becoming more and more important. Two-dimensional (2D) data include gray-level images in computer vision and pattern recognition [1-4], multichannel EEG signals in biomedical engineering [5,6], and gene expression data in bioinformatics [7]. Three-dimensional (3D) data include 3D objects in generic object recognition [8], hyperspectral cubes in remote sensing [9], and gray-level video sequences in activity or gesture recognition for surveillance or human-computer interaction (HCI) [10,11]. There are also many multidimensional signals in medical image analysis [12], content-based retrieval [1,13], and space-time super-resolution [14] for digital cameras with limited spatial and temporal resolution. In addition, many streaming data and mining data are frequently organized as third-order tensors [15-17]. Data in environmental sensor monitoring are often organized in three modes of time, location, and type [17]. Data in social network analysis are usually organized in three modes of time, author, and keywords [17]. Data in network forensics are often organized in three modes of time, source, and destination, and data in web graph mining are commonly organized in three modes of source, destination, and text [15].

These massive multidimensional data are usually very high-dimensional, with a large amount of redundancy, and occupy only a subspace of the input space [18]. Thus, for feature extraction, dimensionality reduction is frequently employed to map high-dimensional data to a low-dimensional space while retaining as much information as possible [18,19]. However, this is a challenging problem due to the large variability and complex pattern distribution of the input data, and the limited number of samples available for training in practice [20]. Linear subspace learning (LSL) algorithms are traditional dimensionality reduction techniques that represent input data as vectors and solve for an optimal linear mapping to a lower-dimensional space. Unfortunately, they often become inadequate when dealing with massive multidimensional data: they result in very high-dimensional vectors, lead to the estimation of a large number of parameters, and break the natural structure and correlation in the original data [2,21,22].

Due to the challenges in the emerging applications above, there has been a pressing need for more effective dimensionality reduction schemes for massive multidimensional data. Recently, interest has grown in multilinear subspace learning (MSL) [2,21-26], a novel approach to dimensionality reduction of multidimensional data in which the input data are represented in their natural multidimensional form as tensors. Figure 1 shows two examples of tensor data representations, for a face image and a silhouette sequence. MSL has the potential to learn more compact and useful representations than LSL [21,27], and it is expected to have future impact both in developing new MSL algorithms and in solving problems in applications involving massive multidimensional data. The research on MSL has gradually progressed from heuristic exploration to systematic investigation [28], and both unsupervised and supervised MSL algorithms have been proposed in the past few years [2,21-26].

It should be noted that MSL belongs to tensor data analysis, or tensor-based computation and modeling, which is more general and has a much wider scope. Multilinear algebra, the basis of tensor data analysis, has been studied in mathematics for several decades [29-31], and there are a number of recent survey papers summarizing developments in tensor data analysis. E.g., Qi et al. review numerical multilinear algebra and its applications in [32]. Muti and Bourennane [33] survey new filtering methods for multicomponent data
modelled as tensors, for noise reduction in color images and multicomponent seismic data. Acar and Yener [34] survey unsupervised multiway data analysis for discovering structures in higher-order data sets, in applications such as chemistry, neuroscience, and social network analysis. Kolda and Bader [35] provide an overview of higher-order tensor decompositions and their applications in psychometrics, chemometrics, signal processing, etc. These survey papers primarily focus on unsupervised tensor data analysis through factor decomposition. In addition, Zafeiriou [36] provides an overview of both unsupervised and supervised nonnegative tensor factorization (NTF) [37,38], with NTF algorithms and their applications in visual representation and recognition discussed.

In contrast, this survey paper focuses on a systematic introduction to the field of MSL. To the best knowledge of the authors, this is the first unifying survey of both unsupervised and supervised MSL algorithms. For detailed introduction and review on multilinear algebra, multilinear decomposition, and NTF, the

(a) A 2D face. (b) A 3D silhouette sequence.
Fig. 1. Illustration of real-world data in their natural tensor representation.

Table 1
List of Important Acronyms

Acronym  Description
EMP      Elementary Multilinear Projection
LDA      Linear Discriminant Analysis
LSL      Linear Subspace Learning
MSL      Multilinear Subspace Learning
PCA      Principal Component Analysis
SSS      Small Sample Size
TTP      Tensor-to-Tensor Projection
TVP      Tensor-to-Vector Projection
VVP      Vector-to-Vector Projection
readers should go to [30,36,39,40], while this paper serves as a complement to these surveys. In the rest of this paper, Section 2 first introduces notations and basic multilinear algebra, and then addresses multilinear projections for direct mapping from high-dimensional tensorial representations to low-dimensional vectorial or tensorial representations. This section also provides insights into the relationships among different projections. Section 3 formulates a unifying MSL framework for systematic treatment of the MSL problem; a typical approach to solving MSL is presented, and several algorithmic issues are examined. Under the MSL framework, existing MSL algorithms are reviewed, analyzed, and categorized into taxonomies in Section 4. Finally, MSL applications are summarized in Section 5, and future research topics are covered in Section 6. For easy reference, Table 1 lists several important acronyms used in this paper.

2 Fundamentals and Multilinear Projections

This section first reviews the notations and some basic multilinear operations [30,31,41] that are necessary in defining the MSL problem. The important concepts of multilinear projections are then introduced, including the elementary multilinear projection (EMP), the tensor-to-vector projection (TVP), and the tensor-to-tensor projection (TTP), and their relationships are explored. Table 2 summarizes the important symbols used in this paper for quick reference.

Table 2
List of Important Notations

Notation    Description
I_n         the input dimensionality of the n-mode
M           the number of training samples
N           the order of a tensor object, i.e., the number of indices (modes)
P           the number of EMPs in a TVP
P_n         the n-mode dimensionality in the projected space of a TTP
U^(n)       the nth projection matrix
u_p^(n)     the n-mode projection vector of the pth EMP
vec(·)      the vectorized representation
X           an input tensor sample
Y           the projection of X on {U^(n)}
y(p)        the projection of X on {u_p^(n), n = 1, ..., N}
|| · ||_F   the Frobenius norm

2.1 Notations

This paper follows the notation conventions in the multilinear algebra, pattern recognition, and adaptive learning literature [30,31,41]. Vectors are denoted by lowercase boldface letters, e.g., x; matrices by uppercase boldface letters, e.g., X; and tensors by calligraphic letters, e.g., X. Their elements are denoted with indices in parentheses. Indices are denoted by lowercase letters spanning the range from 1 to the uppercase letter of the index, e.g., p = 1, 2, ..., P. In addressing part of a vector/matrix/tensor, "*" denotes the full range of the respective index, and n1:n2 denotes indices ranging from n1 to n2. In this paper, only real-valued data are considered.

2.2 Basic Multilinear Algebra

As in [30,31,41], an Nth-order tensor is denoted as A ∈ R^{I_1 × I_2 × ... × I_N}. It is addressed by N indices i_n, n = 1, ..., N, with each i_n addressing the n-mode of A. The n-mode product of a tensor A by a matrix U ∈ R^{J_n × I_n}, denoted as A ×_n U, is a tensor with entries [30]

(A ×_n U)(i_1, ..., i_{n-1}, j_n, i_{n+1}, ..., i_N) = Σ_{i_n} A(i_1, ..., i_N) · U(j_n, i_n).   (1)
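As a concrete illustration, the n-mode product can be computed by unfolding the tensor along the n-mode, multiplying by the matrix, and folding the result back. Below is a minimal numpy sketch; the function names are ours, not from the paper, modes are 0-based in code, and the column ordering of the unfolding follows numpy's row-major reshape, which may differ from the ordering convention in [30] (the n-mode product itself does not depend on that ordering):

```python
import numpy as np

def unfold(A, n):
    """n-mode unfolding A_(n): each column is an n-mode vector of A.
    Modes are 0-based here (mode n in the text is axis n-1)."""
    return np.moveaxis(A, n, 0).reshape(A.shape[n], -1)

def fold(M, n, shape):
    """Inverse of unfold: reshape M back into a tensor of the given shape."""
    rest = [s for i, s in enumerate(shape) if i != n]
    return np.moveaxis(M.reshape([shape[n]] + rest), 0, n)

def mode_n_product(A, U, n):
    """The n-mode product A x_n U of eq. (1): every n-mode vector of A
    is multiplied by U, so the n-mode dimension changes from I_n to J_n."""
    new_shape = list(A.shape)
    new_shape[n] = U.shape[0]
    return fold(U @ unfold(A, n), n, new_shape)
```

For example, with A of size 8 × 6 × 4 (as in Fig. 2) and U ∈ R^{5×6}, mode_n_product(A, U, 1) returns a tensor of size 8 × 5 × 4.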
Fig. 2. Visual illustration of n-mode vectors: (a) a tensor A ∈ R^{8×6×4}, (b) the 1-mode vectors, (c) the 2-mode vectors, and (d) the 3-mode vectors.

Fig. 3. Visual illustration of the 1-mode unfolding of a third-order tensor.

The scalar product of two tensors A, B ∈ R^{I_1 × I_2 × ... × I_N} is defined as

< A, B > = Σ_{i_1} Σ_{i_2} ... Σ_{i_N} A(i_1, i_2, ..., i_N) · B(i_1, i_2, ..., i_N),   (2)

and the Frobenius norm of A is defined as [30]

|| A ||_F = sqrt(< A, A >).   (3)

The n-mode vectors of A are defined as the I_n-dimensional vectors obtained from A by varying its index i_n while keeping all the other indices fixed. A rank-one tensor A equals the outer product of N vectors:

A = u^(1) ∘ u^(2) ∘ ... ∘ u^(N),   (4)

which means that

A(i_1, i_2, ..., i_N) = u^(1)(i_1) · u^(2)(i_2) · ... · u^(N)(i_N)   (5)

for all values of the indices. Unfolding A along the n-mode is denoted as

A_(n) ∈ R^{I_n × (I_1 × ... × I_{n-1} × I_{n+1} × ... × I_N)},   (6)

where the column vectors of A_(n) are the n-mode vectors of A. Figures 2(b), 2(c), and 2(d) illustrate the 1-mode, 2-mode, and 3-mode vectors of the tensor A in Fig. 2(a), respectively. Figure 3 shows the 1-mode unfolding of the tensor A in Fig. 2(a).

The distance between tensors A and B can be measured by the Frobenius norm:
dist(A, B) = || A − B ||_F.   (7)

Although this is a tensor-based measure, it is equivalent to a distance measure on the corresponding vector representations, as proven in [42]. Let vec(A) be the vector representation (vectorization) of A; then

dist(A, B) = || vec(A) − vec(B) ||_2.   (8)

This implies that the distance between two tensors as defined in (7) equals the Euclidean distance between their vectorized representations.

2.3 Multilinear Projections

A multilinear subspace is defined through a multilinear projection that maps the input tensor data from one space to another (lower-dimensional) space [43]. Therefore, multilinear projection is a key concept in MSL. There are three basic multilinear projections, classified by the input and output of the projection: the traditional vector-to-vector projection (VVP), TTP, and TVP.

2.3.1 Vector-to-Vector Projection

Linear projection is a standard transformation used widely in various applications [44,45]. A linear projection takes a vector x ∈ R^I and projects it to another vector y ∈ R^P using a projection matrix U ∈ R^{I×P}:

y = U^T x = x ×_1 U^T.   (9)

In typical pattern recognition applications, P < I; therefore, linear projection is a VVP. When the input to a VVP is an Nth-order tensor X with N > 1, it needs to be reshaped into a vector as x = vec(X) before projection. Figure 4(a) illustrates the VVP of a tensor object A. Besides the traditional linear projection, there are alternative ways to project a tensor to a low-dimensional space, as shown in Fig. 4(b) and discussed below.

(a) Vector-to-vector (linear) projection. (b) Alternative ways to project a tensor.
Fig. 4. Ways to project a tensor to a low-dimensional space.

2.3.2 Tensor-to-Tensor Projection

A tensor can also be projected to another tensor of the same order, in what is named a TTP. An Nth-order tensor X resides in the tensor space R^{I_1} ⊗ R^{I_2} ⊗ ... ⊗ R^{I_N} [30,43], which can be viewed as the Kronecker product of N vector (linear) spaces R^{I_1}, R^{I_2}, ..., R^{I_N} [43]. To project a tensor X in the tensor space R^{I_1} ⊗ R^{I_2} ⊗ ... ⊗ R^{I_N} to another tensor Y in a lower-dimensional tensor space R^{P_1} ⊗ R^{P_2} ⊗ ... ⊗ R^{P_N}, where P_n ≤ I_n for all n, N projection matrices {U^(n) ∈ R^{I_n × P_n}, n = 1, ..., N} are used, so that [31]

Y = X ×_1 U^(1)T ×_2 U^(2)T ... ×_N U^(N)T.   (10)

The projection can be done in N steps, where in the nth step each n-mode vector is projected to a lower dimension P_n by U^(n), as shown in Fig. 5(a). Figure 5(b) demonstrates how to project a tensor in the 1-mode using a 1-mode projection matrix, which projects each 1-mode vector of the original tensor to a low-dimensional vector.

(a) Projection of a tensor in all modes. (b) Projection of a tensor in one mode.
Fig. 5. Illustration of tensor-to-tensor projection.

2.3.3 Tensor-to-Vector Projection

The third multilinear projection is the TVP, which is referred to as the rank-one projection in some works [46-48]. It projects a tensor to a vector, and it can be viewed as multiple projections from a tensor to a scalar, as illustrated in Fig. 6(a), where the TVP of a tensor A ∈ R^{8×6×4} to a P×1 vector consists of P projections from A to a scalar. Thus, the projection from a tensor to a scalar is considered first.

A tensor X ∈ R^{I_1 × I_2 × ... × I_N} can be projected to a point y through N unit projection vectors {u^(1), u^(2), ..., u^(N)} as

y = X ×_1 u^(1)T ×_2 u^(2)T ... ×_N u^(N)T,   || u^(n) || = 1 for n = 1, ..., N,   (11)

where || · || is the Euclidean norm for vectors. Using the scalar product (2), this can be written as

y = < X, u^(1) ∘ u^(2) ∘ ... ∘ u^(N) >.   (12)

Denote U = u^(1) ∘ u^(2) ∘ ... ∘ u^(N); then y = < X, U >. This multilinear projection {u^(1), u^(2), ..., u^(N)} is named an elementary multilinear projection (EMP): the projection of a tensor on a single line (resulting in a scalar), with one projection vector in each mode. Figure 6(b) illustrates the EMP of a tensor.

Thus, the TVP of a tensor object X to a vector y ∈ R^P in a P-dimensional vector space consists of P EMPs {u_p^(1), u_p^(2), ..., u_p^(N)}, p = 1, ..., P, which can be written concisely as {u_p^(n), n = 1, ..., N}_{p=1}^{P}. The TVP from X to y is then

y = X ×_{n=1}^{N} {u_p^(n), n = 1, ..., N}_{p=1}^{P},   (13)

where the pth component of y is given by the pth EMP: y(p) = X ×_1 u_p^(1)T ×_2 u_p^(2)T ... ×_N u_p^(N)T.

(a) Tensor-to-vector projection. (b) Elementary multilinear projection.
Fig. 6. Illustration of tensor-to-vector projection through elementary multilinear projections.
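The three multilinear projections above can be sketched in a few lines of numpy. This is an illustrative sketch, not code from the paper: the function names ttp, emp, and tvp are ours, modes are 0-based, and the unit-norm constraint of (11) is assumed rather than enforced.

```python
import numpy as np

def ttp(X, Us):
    """Tensor-to-tensor projection, eq. (10): Y = X x_1 U1^T x_2 ... x_N UN^T,
    with U^(n) of size I_n x P_n, so Y has size P_1 x ... x P_N."""
    Y = X
    for n, U in enumerate(Us):
        # project the n-mode vectors of Y by U^T (from I_n down to P_n)
        Y = np.moveaxis(np.tensordot(U.T, np.moveaxis(Y, n, 0), axes=1), 0, n)
    return Y

def emp(X, us):
    """Elementary multilinear projection, eqs. (11)-(12): tensor -> scalar,
    with one projection vector per mode."""
    y = X
    for u in reversed(us):  # contract the last remaining mode each time
        y = np.tensordot(y, u, axes=([y.ndim - 1], [0]))
    return float(y)

def tvp(X, emps):
    """Tensor-to-vector projection, eq. (13): P EMPs give a P-vector."""
    return np.array([emp(X, us) for us in emps])
```

For example, for X ∈ R^{8×6×4}, a TTP with U^(1) ∈ R^{8×2}, U^(2) ∈ R^{6×3}, U^(3) ∈ R^{4×2} yields Y ∈ R^{2×3×2}, and each EMP equals the scalar product < X, u^(1) ∘ u^(2) ∘ u^(3) > of (12).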
