The enormous flow of information across fields of technology calls for novel Data Mining methodologies that can process and interpret large volumes of data and classify them into specific categories with high reliability. A central difficulty in Data Classification is describing the similarity among data items, and a poorly chosen similarity measure leads to poor classification accuracy. Classification algorithms such as decision trees require partitioning the underlying domain, which is a difficult task. The conventional k Nearest Neighbor is a lazy algorithm that typically uses Euclidean distance to measure similarity; it defers data processing until needed and is susceptible to noise. This paper proposes a novel approach to overcome these problems in Data Classification by integrating the Data Mining technique Weight Adapted k-Nearest Neighbor (kNN) with a Genetic Algorithm. k Nearest Neighbor is selected for classification because it works directly on numerically valued data without the need for discretisation.
[...] Selection: Roulette Wheel Selection
Crossover Rate: 0.7 (single-point crossover)
Mutation Rate: 0.001
Number of Generations: 150

Table 4.4 Performance measure (precision and recall on training and testing datasets, in %) for the Balance Scale dataset using conventional k-NN classification for different k values. [table values not recoverable]

Table 4.6 Performance measure (precision and recall on training and testing datasets, in %) for the Balance Scale dataset using weight adapted k-NN classification with Genetic Algorithm. [table values not recoverable]

For the Balance Scale dataset, the precision and recall values are high when the Genetic Algorithm with weight adapted kNN is used. [...]
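The GA configuration listed above can be sketched as follows. This is an illustrative implementation, not the paper's code: roulette-wheel selection, single-point crossover at rate 0.7, bit-flip mutation at rate 0.001, and a fixed number of generations. The fitness function here is a toy stand-in (counting 1-bits); in the paper, fitness would be the classification accuracy of the weight adapted k-NN.

```python
import random

# Hypothetical GA sketch matching the stated parameters.
POP_SIZE, CHROM_LEN, GENERATIONS = 20, 16, 150
CROSSOVER_RATE, MUTATION_RATE = 0.7, 0.001

def fitness(chrom):
    # Toy fitness: count of 1-bits (stand-in for k-NN accuracy).
    return sum(chrom)

def roulette_select(pop, fits):
    # Roulette Wheel Selection: pick proportionally to fitness.
    pick = random.uniform(0, sum(fits))
    acc = 0.0
    for chrom, f in zip(pop, fits):
        acc += f
        if acc >= pick:
            return chrom
    return pop[-1]

def evolve():
    pop = [[random.randint(0, 1) for _ in range(CHROM_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        fits = [fitness(c) for c in pop]
        nxt = []
        while len(nxt) < POP_SIZE:
            p1 = roulette_select(pop, fits)
            p2 = roulette_select(pop, fits)
            c1, c2 = p1[:], p2[:]
            if random.random() < CROSSOVER_RATE:
                # Single-point crossover: swap tails at a random cut.
                pt = random.randint(1, CHROM_LEN - 1)
                c1, c2 = p1[:pt] + p2[pt:], p2[:pt] + p1[pt:]
            for child in (c1, c2):
                for i in range(CHROM_LEN):
                    if random.random() < MUTATION_RATE:
                        child[i] ^= 1  # bit-flip mutation
                nxt.append(child)
        pop = nxt[:POP_SIZE]
    return max(pop, key=fitness)

best = evolve()
```

With classification accuracy substituted for the toy fitness, each chromosome would encode one candidate set of attribute weights.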
[...] [Table of precision and recall on training and testing datasets for the conventional k-NNC on the Iris dataset; values not recoverable]

Table 4.3 Performance measure (precision and recall on training and testing datasets, in %) for the Iris dataset using weight adapted k-NN with Genetic Algorithm. [table values not recoverable]

The precision and recall values for different k values show a good performance improvement for the proposed weight adapted kNN with Genetic Algorithm.

ILLUSTRATION II: BALANCE SCALE DATASET

The performance measures for the conventional k-NN classification, the weight adapted k-NN classification and the Genetic-Algorithm-based k-NN for the Balance Scale dataset are elaborated in Tables 4.4, 4.5 and 4.6 respectively.

Table 4.2 Performance measure (precision and recall on training and testing datasets) for the Iris dataset using weight adapted k-NN classification for different k values. [table values not recoverable] [...]
[...] The proposed Weight Adapted k Nearest Neighbor with Genetic Algorithm classification is shown in the figure. In most real-world applications, the importance of the different attributes in a dataset keeps changing over time. To accommodate this, weights (ranks) are assigned to each attribute in the dataset, and the attribute with the highest weight is considered first in k Nearest Neighbor classification; this is called weighted k Nearest Neighbor classification. To improve classification accuracy and to optimize the weights assigned to the attributes, weight adapted k Nearest Neighbor classification is combined with a Genetic Algorithm. [...]
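The weighted classification step described above can be sketched as follows. This is an illustrative example, not the paper's implementation: each attribute's contribution to the Euclidean distance is scaled by its weight, and the class is decided by a majority vote among the k nearest training patterns. The sample data and weight values are invented for illustration; in the proposed method the weights would come from the Genetic Algorithm.

```python
import math
from collections import Counter

def weighted_distance(a, b, weights):
    # Euclidean distance with each attribute scaled by its weight.
    return math.sqrt(sum(w * (x - y) ** 2
                         for x, y, w in zip(a, b, weights)))

def weighted_knn_classify(train, labels, query, weights, k=3):
    # Rank training patterns by weighted distance to the query,
    # then take a majority vote among the k nearest.
    ranked = sorted(range(len(train)),
                    key=lambda i: weighted_distance(train[i], query, weights))
    votes = Counter(labels[i] for i in ranked[:k])
    return votes.most_common(1)[0][0]

# Invented toy data: the second attribute is given a high weight,
# so it dominates the distance computation.
train = [(1.0, 0.1), (1.1, 0.2), (5.0, 0.9), (5.2, 1.0)]
labels = ["A", "A", "B", "B"]
print(weighted_knn_classify(train, labels, (1.05, 0.15), (1.0, 10.0)))  # → A
```

Setting all weights to 1.0 recovers the conventional k-NN; the GA's job is to find the weight vector that maximizes classification accuracy.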
[...] Fadavi Amiri, "Feature Reduction of Nearest Neighbor Classifiers using Genetic Algorithm", Proceedings of World Academy of Science, Engineering and Technology, December 2006, pp. 36-
Jing Peng, Douglas R. Heisterkamp, and H.K. Dai, "Adaptive Quasiconformal Kernel Nearest Neighbor Classification", IEEE Transactions on Pattern Analysis and Machine Intelligence, May 2004, pp. 656
H.A. Guvenir and A. Akkus, "Weighted K-Nearest Neighbor Classification on Feature Projections", in: Proceedings of the 12th International Symposium on Computer and Information Sciences (ISCIS'97), Antalya, Turkey, 1997, pp. 44-
J.D. Kelly and L. Davis. [...]
[...] So we normalize the values in the dataset using the equation

X'i,j = (Xi,j - mink(Xk,j)) / (maxk(Xk,j) - mink(Xk,j))

where Xi,j is the jth feature of the ith pattern, X'i,j is the corresponding normalized feature, and k ranges from 1 to n, where n is the total number of patterns.

WEIGHT ADAPTED K-NN CLASSIFICATION

In this paper, we propose the Weight Adapted k-Nearest Neighbor (WAkNN) classification algorithm, which is based on the k-NN classification paradigm. In WAkNN, the weights of the features are learnt using an iterative algorithm. [...]
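The min-max normalization equation above can be sketched in a few lines. This is an illustrative implementation under the assumption that patterns are stored as rows of numeric features; each feature column is rescaled by its minimum and maximum over all n patterns, mapping every value into [0, 1].

```python
def min_max_normalize(data):
    # Transpose so each tuple holds one feature column.
    cols = list(zip(*data))
    mins = [min(c) for c in cols]
    maxs = [max(c) for c in cols]
    # X'[i][j] = (X[i][j] - min_k X[k][j]) / (max_k X[k][j] - min_k X[k][j]);
    # a constant column (max == min) is mapped to 0.0 to avoid division by zero.
    return [
        tuple((x - mn) / (mx - mn) if mx > mn else 0.0
              for x, mn, mx in zip(row, mins, maxs))
        for row in data
    ]

# Invented toy patterns: 3 patterns, 2 features on different scales.
patterns = [(2.0, 100.0), (4.0, 200.0), (6.0, 400.0)]
print(min_max_normalize(patterns))
# first feature: 2 → 0.0, 4 → 0.5, 6 → 1.0
```

Normalization matters here because unscaled features with large ranges would otherwise dominate the (weighted) Euclidean distance regardless of the learnt weights.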
using our reader.