Knowledge Sharing

[Q] Information on principal component analysis

I am looking for PCA-related material with a thorough explanation of dimensionality reduction, ideally a theoretically well-developed treatment. A source-code implementation would be even better. ^^;
Answer 1
  • Answer

    Answer from Kim Eun-jeong

    The following materials were retrieved from the English-language database of the Korea Institute of Industry and Technology Information (http://www.kiniti.re.kr). Full texts of the materials can be requested through the KINITI website. Please take a look.

    * Considering precision of data in reduction of dimensionality and PCA, Brauner, N.; Shacham, M., Comput. Chem. Eng. (UK) vol.24, no.12 (2000), p.2603-11, 14 ref(s)
      Abstract: Reduction of dimensionality of the data space in process data analysis is considered. A new stepwise collinearity diagnostic (SCD) procedure is presented, which employs indicators based on the estimated signal-to-noise ratio in the data to measure the collinearity between the variables. The SCD procedure selects a maximal subset of non-collinear variables and identifies the corresponding collinear subsets of variables. Using SCD, the dimension of the data space is reduced to the dimension of the maximal non-collinear subset. In process monitoring applications, the data associated with the surplus variables can be used to distinguish between process and sensor failures. Two examples demonstrating the advantages of the proposed method over principal component analysis (PCA) are presented.

    * Local PCA algorithms, Weingessel, A.; Hornik, K., IEEE Trans. Neural Netw. (USA) vol.11, no.6 (Nov. 2000), p.1242-50
      Abstract: In recent years, various principal component analysis (PCA) algorithms have been proposed. In this paper we use a general framework to describe those PCA algorithms which are based on Hebbian learning. For an important subset of these algorithms, the local algorithms, we fully describe their equilibria, where all lateral connections are set to zero, and their local stability. We show how the parameters in the PCA algorithms have to be chosen to obtain an algorithm that converges to a stable equilibrium which provides principal component extraction.

    * Fast dimensionality reduction and simple PCA, Partridge, M.; Calvo, R.A., Intell. Data Anal. (USA) vol.2, no.3 (Aug. 1998), 12 ref(s)
      Abstract: A fast and simple algorithm for approximately calculating the principal components (PCs) of a dataset, and so reducing its dimensionality, is described. This simple principal components analysis (SPCA) method was used for dimensionality reduction of two high-dimensional image databases, one of handwritten digits and one of handwritten Japanese characters. It was tested and compared with other techniques. On both databases SPCA shows a fast convergence rate compared with other methods and robustness to the reordering of the samples.
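    Since the question also asks for a source-code implementation, here is a minimal PCA sketch in Python with NumPy. This is an independent illustration of the standard covariance-eigendecomposition formulation, not code from any of the papers above; the function name `pca` and its interface are chosen for this example.

    ```python
    import numpy as np

    def pca(X, n_components):
        """Project X (n_samples x n_features) onto its top principal components.

        Returns (scores, components, explained_variance).
        """
        # Center the data: PCA is defined on mean-centered variables.
        X_centered = X - X.mean(axis=0)

        # Covariance matrix of the features (n_features x n_features).
        cov = np.cov(X_centered, rowvar=False)

        # Eigendecomposition; eigh is appropriate because cov is symmetric.
        eigvals, eigvecs = np.linalg.eigh(cov)

        # Sort eigenpairs in descending order of variance explained.
        order = np.argsort(eigvals)[::-1]
        eigvals, eigvecs = eigvals[order], eigvecs[:, order]

        # Keep the leading directions and project the data onto them.
        components = eigvecs[:, :n_components]
        scores = X_centered @ components
        return scores, components, eigvals[:n_components]
    ```

    For large, dense datasets a truncated SVD of the centered data matrix is usually preferred over forming the covariance matrix explicitly, which is essentially what the fast approximate methods in the references above try to avoid computing in full.
    
    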
    No comments have been posted.