Heesung Lee received the B.S., M.S., and Ph.D. degrees in Electrical and Electronic Engineering from Yonsei University, Seoul, Korea, in 2003, 2005, and 2010, respectively. From 2011 to 2014, he was a managing researcher with the S1 Corporation, Seoul, Korea. Since 2015, he has been with the Department of Railroad Electrical and Electronics Engineering at Korea National University of Transportation, Uiwang-si, Gyeonggi-do, Korea, where he is currently an assistant professor. His current research interests include computational intelligence, biometrics, and intelligent railroad systems.

E-mail: hslee0717@ut.ac.kr


#### Original Article

Int. J. Fuzzy Log. Intell. Syst. 2018; 18(2): 120-125

Published online June 25, 2018. https://doi.org/10.5391/IJFIS.2018.18.2.120

## Combining Locality Preserving Projection with Global Information for Efficient Recognition

Heesung Lee

Department of Railroad Electrical and Electronics Engineering, Korea National University of Transportation, Uiwang-si, Korea

Correspondence to: Heesung Lee (hslee0717@ut.ac.kr)

Received: May 8, 2018; Revised: June 1, 2018; Accepted: June 7, 2018

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

### Abstract

This paper proposes a new feature extraction scheme that combines global and local features. The proposed method uses principal component analysis (PCA) and linear discriminant analysis (LDA) for the global property and locality preserving projections (LPP) for the local property of the pattern. PCA and LDA are known for extracting the most descriptive features after projection, while LPP is known for preserving the neighborhood structure of the data set. The proposed combining method integrates global and local descriptive information and finds an efficient set of alternatives beyond PCA, LDA, and LPP in the parametric space. Further, in order to find optimal parameters, the genetic algorithm (GA) is employed. Experiments are performed with four data sets selected from the UCI machine learning repository to show the performance of the proposed algorithm.

Keywords: PCA, LDA, LPP, GA, Feature space, UCI machine learning repository

### 1. Introduction

For general pattern recognition systems, patterns can be represented by certain features [1]. It is also a well-known fact that irrelevant or redundant features degrade classification accuracy, so constructing an appropriate feature subset is essential to building a good classification algorithm. Therefore, when developing recognition schemes for real-world objects or concepts using computational methods, the selection of an appropriate set of features is of considerable importance [2]. In this paper, we propose a new feature extraction scheme that combines locality preserving projection with global information. The proposed method uses principal component analysis (PCA) and linear discriminant analysis (LDA) for the global property and locality preserving projections (LPP) for the local property of the pattern to improve classification ability. PCA and LDA are well known for preserving the most descriptive features after projection, while LPP is known for preserving the neighborhood structure of the data set [3]. Our combining method can integrate global and local descriptive information and find a richer set of alternatives beyond PCA, LDA, and LPP. In order to find the hybrid features adaptively and to find the optimal parameters, the genetic algorithm (GA) is employed. GAs have been used for many scientific and engineering optimization and search problems [4]. GAs provide an adaptive and robust computational procedure modeled on the mechanics of natural genetic systems [5]. GAs can be considered a good solution for finding a richer set of alternatives beyond PCA, LDA, and LPP, because this search is a complicated problem in a large search space [6].

We briefly introduce preliminaries, including PCA, LDA, LPP, and GA, in Section 2. The proposed GA-based method for combining global and local features is explained in Section 3. In Section 4, we assess the experimental performance of the proposed system. Finally, the conclusion is drawn in Section 5.

### 2. Preliminaries

### 2.1 Principal Component Analysis

PCA chooses a dimensionality-reducing linear projection that maximizes the scatter of all projected samples. Let us assume that an object is represented by the n-dimensional feature vector

$X = [x_1, x_2, \ldots, x_n]^T,$

and we have a set of objects

$Z = [X_1, X_2, \ldots, X_N],$

where each object belongs to one of C classes. The total scatter matrix $S_T$ is defined as

$S_T = \sum_{j=1}^{N} (X_j - \mu)(X_j - \mu)^T,$

where $\mu \in \mathbb{R}^n$ is the mean vector of all objects, computed by

$\mu = \frac{1}{N} \sum_{j=1}^{N} X_j.$

The objective of PCA is to maximize the determinant of the total scatter matrix of the projected samples:

$W_{\mathrm{opt}} = \arg\max_W \left| W^T S_T W \right|.$

The new feature vectors $Y_j \in \mathbb{R}^m$ for j = 1, 2, …, N are defined by the following linear transformation:

$Y_j = W^T X_j \quad \text{for } j = 1, 2, \ldots, N,$

where $W \in \mathbb{R}^{n \times m}$ is a matrix with orthonormal columns. The $W_{\mathrm{opt}}$ that maximizes the objective function is given by the set of n-dimensional eigenvectors of $S_T$ corresponding to the m largest eigenvalues. A drawback of PCA is that the scatter being maximized is due not only to the between-class scatter that is useful for classification, but also to the within-class scatter that, for classification purposes, is unwanted information [7].
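As an illustration, the eigenvector computation above can be sketched in NumPy. This is a hypothetical minimal implementation, not the author's code; the data are random stand-ins:

```python
import numpy as np

def pca_projection(Z, m):
    """Return W (n x m): eigenvectors of the total scatter matrix S_T
    for the m largest eigenvalues. Z is n x N with objects as columns."""
    mu = Z.mean(axis=1, keepdims=True)        # mean vector of all objects
    centered = Z - mu
    S_T = centered @ centered.T               # total scatter matrix
    eigvals, eigvecs = np.linalg.eigh(S_T)    # ascending order for symmetric S_T
    return eigvecs[:, ::-1][:, :m]            # keep the m largest

# project 20 hypothetical 5-dimensional objects down to m = 2 features
rng = np.random.default_rng(0)
Z = rng.normal(size=(5, 20))
W = pca_projection(Z, m=2)
Y = W.T @ Z                                   # Y_j = W^T X_j for each column
```

Because `eigh` returns orthonormal eigenvectors, the resulting W has orthonormal columns as required.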

### 2.2 Linear Discriminant Analysis

LDA is a linear projection method that maximizes the ratio of the determinant of the between-class covariance matrix to the determinant of the within-class covariance matrix [8]. The within-class scatter matrix $S_W$ and the between-class scatter matrix $S_b$ are defined by

$S_W = \sum_{c=1}^{C} \sum_{j=1}^{N_c} (X_j^c - \mu_c)(X_j^c - \mu_c)^T,$

and

$S_b = \sum_{c=1}^{C} (\mu_c - \mu)(\mu_c - \mu)^T,$

where $X_j^c$ is the jth object of class c, $N_c$ is the number of objects in class c, and $\mu_c \in \mathbb{R}^n$ is the mean of class c.

$S_W$ and $S_b$ represent the scatter of features around the mean of each class and the scatter of the class means around the overall mean, respectively [9]. The objective of LDA is to maximize $S_b$ while minimizing $S_W$:

$W_{\mathrm{opt}} = \arg\max_W \frac{\left| W^T S_b W \right|}{\left| W^T S_W W \right|}.$

It has been proven that this ratio is maximized when the column vectors of the projection matrix $W_{\mathrm{opt}}$ are the eigenvectors of $S_W^{-1} S_b$.
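A minimal NumPy sketch of this computation, assuming the unweighted $S_b$ defined above and hypothetical two-class data (illustrative only):

```python
import numpy as np

def lda_projection(X, labels, m):
    """Columns of the returned matrix are the eigenvectors of
    S_W^{-1} S_b with the m largest eigenvalues (X is n x N)."""
    n, N = X.shape
    mu = X.mean(axis=1, keepdims=True)
    S_W = np.zeros((n, n))
    S_b = np.zeros((n, n))
    for c in np.unique(labels):
        Xc = X[:, labels == c]
        mu_c = Xc.mean(axis=1, keepdims=True)
        S_W += (Xc - mu_c) @ (Xc - mu_c).T    # within-class scatter
        S_b += (mu_c - mu) @ (mu_c - mu).T    # between-class scatter
    # eigenvectors of S_W^{-1} S_b (pinv guards against a singular S_W)
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(S_W) @ S_b)
    order = np.argsort(eigvals.real)[::-1]    # largest eigenvalues first
    return eigvecs[:, order[:m]].real

# toy two-class example with hypothetical data
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(0.0, 1.0, (4, 15)), rng.normal(3.0, 1.0, (4, 15))])
y = np.array([0] * 15 + [1] * 15)
W = lda_projection(X, y, m=1)
```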

### 2.3 Locality Preserving Projections

Though the global information of features is important, the local structure is also significant in many real-world applications [3, 10]. LPP was proposed for learning a locality preserving subspace; it seeks to preserve the intrinsic geometry and local structure of the samples. The objective function of LPP is

$\min \sum_{ij} (Y_i - Y_j)^2 S_{ij},$

where S is a similarity matrix computed by

$S_{ij} = \begin{cases} e^{-\|X_i - X_j\|^2 / t}, & \|X_i - X_j\|^2 < \varepsilon, \\ 0, & \text{otherwise}. \end{cases}$

Here, ɛ is the radius of the local neighborhood and defines the locality. Following some simple algebraic steps, we see that

$\frac{1}{2}\sum_{ij}(Y_i - Y_j)^2 S_{ij} = \frac{1}{2}\sum_{ij}(W^T X_i - W^T X_j)^2 S_{ij} = \sum_{ij} W^T X_i S_{ij} X_i^T W - \sum_{ij} W^T X_i S_{ij} X_j^T W = W^T Z D Z^T W - W^T Z S Z^T W = W^T Z (D - S) Z^T W = W^T Z L Z^T W,$

where D is a diagonal matrix whose entries are the column sums of S, and L = D − S is the Laplacian matrix. Because a larger value of $D_{ii}$ means that $Y_i$ is more important, the following constraint can be imposed:

$Y^T D Y = 1 \Rightarrow W^T Z D Z^T W = 1.$

Finally, we obtain the reduced optimization problem:

$\arg\min_W W^T Z L Z^T W \quad \text{subject to} \quad W^T Z D Z^T W = 1.$

The $W_{\mathrm{opt}}$ that minimizes the objective function is given by the eigenvectors corresponding to the m smallest eigenvalues of the generalized eigenvalue problem [11]

$Z L Z^T W = \lambda Z D Z^T W.$
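The steps above (heat-kernel similarity, Laplacian, generalized eigenproblem) can be sketched in NumPy. This is a hypothetical minimal sketch: the small ridge added to $Z D Z^T$ for invertibility and the random example data are assumptions, not part of the paper:

```python
import numpy as np

def lpp_projection(Z, m, eps, t=1.0):
    """LPP: solve Z L Z^T w = lambda Z D Z^T w and keep the eigenvectors
    of the m smallest eigenvalues. Z is n x N; eps sets the neighborhood."""
    n, N = Z.shape
    # similarity matrix S: heat kernel inside the eps-neighborhood
    d2 = ((Z[:, :, None] - Z[:, None, :]) ** 2).sum(axis=0)  # squared distances
    S = np.where(d2 < eps, np.exp(-d2 / t), 0.0)
    D = np.diag(S.sum(axis=0))               # diagonal of column sums of S
    L = D - S                                # graph Laplacian
    A = Z @ L @ Z.T
    B = Z @ D @ Z.T + 1e-8 * np.eye(n)       # small ridge keeps B invertible
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(B, A))
    order = np.argsort(eigvals.real)         # ascending: smallest first
    return eigvecs[:, order[:m]].real

# hypothetical data: 12 objects with n = 4 features, reduced to m = 2
rng = np.random.default_rng(0)
Z = rng.normal(size=(4, 12))
W = lpp_projection(Z, m=2, eps=10.0)
```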

### 2.4 Genetic Algorithm

The GA typically maintains a fixed-size population of individuals that represent the set of candidate solutions for the optimization problem to be solved. The goodness of each candidate solution is evaluated using a fitness function. The population of the GA evolves as a set of genetic operators is applied. The basic genetic operators are selection, crossover, and mutation. In the selection process, some individuals are selected to be copied into a tentative next population; an individual is selected with the number of copies proportional to its fitness value. The copied individuals are altered to search for a globally optimal solution in the crossover and mutation processes. The GA is simple yet provides an adaptive and robust optimization methodology. The basic configuration of the GA is given below [6].
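The three operators described above can be sketched as a minimal real-coded GA in Python. The fitness function, population size, and gene encoding here are illustrative stand-ins, not the paper's actual configuration:

```python
import random

def run_ga(fitness, n_genes, pop_size=30, generations=50,
           crossover_rate=0.6, mutation_rate=0.05):
    """Minimal real-coded GA: fitness-proportional (roulette-wheel)
    selection, one-point crossover, and per-gene mutation."""
    pop = [[random.random() for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(ind) for ind in pop]
        # selection: copies drawn with probability proportional to fitness
        pop = [random.choices(pop, weights=scores, k=1)[0][:]
               for _ in range(pop_size)]
        # crossover: swap tails of adjacent pairs at a random cut point
        for i in range(0, pop_size - 1, 2):
            if random.random() < crossover_rate:
                cut = random.randrange(1, n_genes)
                pop[i][cut:], pop[i + 1][cut:] = pop[i + 1][cut:], pop[i][cut:]
        # mutation: re-randomize a gene with small probability
        for ind in pop:
            for g in range(n_genes):
                if random.random() < mutation_rate:
                    ind[g] = random.random()
    return max(pop, key=fitness)

# toy fitness (illustrative only): reward genes close to 1
random.seed(0)
best = run_ga(lambda ind: sum(ind) + 1e-9, n_genes=4)
```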


### 3. Combining Global and Local Information

We propose a method which uses PCA and LDA for the global property and LPP for the local property of the pattern to improve classification ability. It can integrate global and local descriptive information and find a richer set of alternatives beyond PCA, LDA, and LPP. Based on these, we can obtain new features which preserve the global and local structure simultaneously:

$H_i = \omega_1 Y_i^1 + \omega_2 Y_i^2 + \omega_3 Y_i^3 \quad \text{for } i = 1, 2, \ldots, m,$

where ω1, ω2, and ω3 are three weight parameters and $Y_i^1$, $Y_i^2$, and $Y_i^3$ are the features obtained by

$Y_i^j = (W_{\mathrm{opt}}^j)^T X_i \quad \text{for } i = 1, 2, \ldots, m,\; j = 1, 2, 3,$

where $W_{\mathrm{opt}}^j$ for j = 1, 2, 3 are the optimal linear transformation matrices of PCA, LDA, and LPP, respectively.

Because different values of ω1, ω2, and ω3 make this combination provide a rich set of alternatives to PCA, LDA, and LPP, we have to find the best parameter setting. For this reason, we employ the GA to select the weight parameters. The GA can be considered a good solution for finding a richer set of alternatives beyond PCA, LDA, and LPP in the parametric space, because this is a complicated combinatorial problem. Further, we select the locality parameter ɛ using the GA to find local features efficiently. The chromosome used for optimization is shown in Figure 1.
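The weighted combination is straightforward to sketch; the three feature matrices below are random stand-ins for the PCA, LDA, and LPP projections, and the weights are arbitrary example values:

```python
import numpy as np

def combine_features(Y1, Y2, Y3, w):
    """H = w1*Y1 + w2*Y2 + w3*Y3: weighted combination of the PCA,
    LDA, and LPP feature matrices (all with matching shapes)."""
    w1, w2, w3 = w
    return w1 * Y1 + w2 * Y2 + w3 * Y3

# random stand-ins for the three projected feature sets (m = 3, N = 10)
rng = np.random.default_rng(1)
Y1, Y2, Y3 = (rng.normal(size=(3, 10)) for _ in range(3))
H = combine_features(Y1, Y2, Y3, w=(0.5, 0.3, 0.2))
```

In the paper, the weight triple (together with ɛ) is what the GA chromosome encodes and optimizes.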

### 4. Experimental Results

For experiments, four databases of the UCI repository [12] are used to test the proposed method. The databases used in the experiments are summarized in Table 1.

In this experiment, we use the genetic parameters given in Table 2. We divide each database into four subsets, using three subsets for training and the remaining subset as the test set. We performed four independent runs to show the general performance of the algorithms. The training set is used to train the classifier and find the optimal parameters, and the test set is used to evaluate the performance of the proposed method. The fitness of a chromosome is computed as the training accuracy of leave-one-out cross-validation on the training set with a nearest neighbor (NN) classifier [13]. The results of the three previous methods using only PCA, LDA, and LPP are given in Tables 3, 4, and 5, respectively, and the accuracy of the proposed method in Table 6. In Tables 3–6, the numbers in parentheses denote the standard deviation of the four runs.
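The fitness evaluation can be sketched as follows, assuming a 1-NN rule with squared Euclidean distance; the toy features and labels are hypothetical:

```python
import numpy as np

def loo_nn_accuracy(H, labels):
    """Leave-one-out accuracy of a 1-NN classifier on the combined
    feature matrix H (m x N); usable as a GA fitness value."""
    d2 = ((H[:, :, None] - H[:, None, :]) ** 2).sum(axis=0)  # pairwise sq. dist.
    np.fill_diagonal(d2, np.inf)          # a sample cannot be its own neighbor
    nearest = d2.argmin(axis=1)           # nearest other sample for each column
    return float((labels[nearest] == labels).mean())

# two well-separated clusters: every left-out sample finds a same-class neighbor
H = np.array([[0.0, 0.1, 5.0, 5.1]])
labels = np.array([0, 0, 1, 1])
acc = loo_nn_accuracy(H, labels)          # 1.0 for this toy example
```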

Further, the results of the proposed method are compared with those of the conventional feature selection methods in Figure 2. It can be seen that, on all four databases, the proposed method shows better recognition accuracy than the previous methods. The reason might be that the proposed method considers global and local structures simultaneously, while the conventional methods consider only one aspect of the structure. Further, optimal parameters can be obtained using the GA to improve classification ability.

### 5. Conclusion

In this paper, we proposed an improved technique for recognizing real-world objects or concepts using computational methods. In a classifier, the selection of an appropriate set of features is of considerable importance. While PCA and LDA aim to preserve the global structure and LPP the local structure, it is more important that local and global structures be considered simultaneously. Therefore, we have proposed a new feature extraction method which can preserve both the extrinsic and intrinsic geometry of the data. To find the optimal parameters, the GA is used. As a result, we obtain a feature extraction scheme with greater classification ability, whose features have both global and local properties. The proposed method was applied to four databases selected from the UCI machine learning repository and its effectiveness was demonstrated.

### Acknowledgements

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP; Ministry of Science, ICT & Future Planning) (No. NRF-2017R1C1B5018408).

Figure 1. Structure of the chromosome.

Figure 2. Correct classification ratio for four databases of the UCI repository.
```
procedure Genetic Algorithm
begin
    initialize P(t);
    while termination-condition not satisfied do
    begin
        evaluate P(t);
        select P(t + 1) from P(t);
        crossover P(t + 1);
        mutation P(t + 1);
        t = t + 1;
    end
end
```

Table 1. Databases used for experiments.

| Database | Number of instances | Number of classes | Number of features |
|----------|---------------------|-------------------|--------------------|
| Sonar    | 208                 | 3                 | 60                 |
| Spambase | 4601                | 2                 | 57                 |
| SPECTF   | 267                 | 2                 | 44                 |
| Heart    | 270                 | 2                 | 13                 |

Table 2. GA parameters.

| Parameter       | Value |
|-----------------|-------|
| Crossover rate  | 0.6   |
| Mutation rate   | 0.05  |
| Population size | 100   |
| Generation      | 200   |

Table 3. Average, the best, and the worst accuracy using PCA.

| Database | Average        | Best  | Worst |
|----------|----------------|-------|-------|
| Sonar    | 82.35 (0.053)  | 86.27 | 74.51 |
| Spambase | 81.84 (0.004)  | 82.35 | 81.39 |
| SPECTF   | 69.70 (0.081)  | 74.24 | 57.58 |
| Heart    | 62.31 (0.022)  | 64.17 | 59.70 |

Table 4. Average, the best, and the worst accuracy using LDA.

| Database | Average        | Best  | Worst |
|----------|----------------|-------|-------|
| Sonar    | 74.02 (0.079)  | 80.40 | 62.75 |
| Spambase | 78.37 (0.022)  | 81.65 | 77.13 |
| SPECTF   | 73.11 (0.054)  | 75.76 | 65.15 |
| Heart    | 53.36 (0.056)  | 59.70 | 46.27 |

Table 5. Average, the best, and the worst accuracy using LPP.

| Database | Average        | Best  | Worst |
|----------|----------------|-------|-------|
| Sonar    | 84.80 (0.040)  | 90.20 | 80.39 |
| Spambase | 82.04 (0.002)  | 82.26 | 81.82 |
| SPECTF   | 73.86 (0.050)  | 80.30 | 68.19 |
| Heart    | 55.97 (0.019)  | 58.21 | 53.73 |

Table 6. Average, the best, and the worst accuracy of the proposed method.

| Database | Average        | Best  | Worst |
|----------|----------------|-------|-------|
| Sonar    | 86.27 (0.679)  | 92.16 | 76.47 |
| Spambase | 83.13 (0.009)  | 83.57 | 82.26 |
| SPECTF   | 75.37 (0.084)  | 83.33 | 63.64 |
| Heart    | 63.43 (0.026)  | 65.67 | 59.70 |

### References

1. Liu, Y, and Dellaert, F (1998). A classification based similarity metric for 3D image retrieval. Proceedings of the IEEE International Conference on Computer Vision and Pattern Recognition, Santa Barbara, CA, pp. 800-805.
2. Lee, H, Hong, S, Nizami, IF, and Kim, E (2010). An efficient design of nearest neighbor classifier for various-scale problems. Pattern Recognition Letters. 31, 1020-1027.
3. Lee, H, Hong, S, An, S, and Kim, E (2008). Efficient classification scheme based on hybrid global and local properties of feature. Proceedings of the International Conference on Control, Automation and Systems, Seoul, Korea, pp. 2126-2129.
4. Davis, L (1991). Handbook of Genetic Algorithms. New York, NY: Van Nostrand Reinhold.
5. Lee, H, and Kim, E (2015). Genetic outlier detection for a robust support vector machine. International Journal of Fuzzy Logic and Intelligent Systems. 15, 96-101.
6. Lee, H, Kim, E, and Park, M (2007). A genetic feature weighting scheme for pattern recognition. Integrated Computer-Aided Engineering. 14, 161-171.
7. Belhumeur, PN, Hespanha, JP, and Kriegman, DJ (1997). Eigenfaces vs. fisherfaces: recognition using class specific linear projection. IEEE Transactions on Pattern Analysis and Machine Intelligence. 19, 711-720.
8. Kim, K, and Yeom, S (2017). Investigation on the growth of green bean sprouts with linear discriminant analysis. International Journal of Fuzzy Logic and Intelligent Systems. 17, 315-322.
9. Cho, H, and Moon, S (2009). Comparison of PCA and LDA based face recognition algorithms under illumination variations. Proceedings of the ICROS-SICE International Joint Conference, Fukuoka, Japan, pp. 4025-4030.
10. He, X, Yan, S, Hu, Y, Niyogi, P, and Zhang, HJ (2005). Face recognition using Laplacianfaces. IEEE Transactions on Pattern Analysis and Machine Intelligence. 27, 328-340.
11. Zheng, Z, Tang, F, Tan, W, Jia, J, and Yang, J (2007). Gabor feature-based face recognition using supervised locality preserving projection. Signal Processing. 87, 2473-2483.
12. Murphy, PM, and Aha, DW (1994). UCI Repository of Machine Learning Databases. Irvine, CA: Department of Information and Computer Science, University of California.
13. Ho, S, Liu, C, and Liu, S (2002). Design of an optimal nearest neighbor classifier using an intelligent genetic algorithm. Pattern Recognition Letters. 23, 1495-1503.