CORRELATION WEIGHTED HETEROGENEOUS EUCLIDEAN-OVERLAP METRIC

Chaoqun Li and Hongwei Li

References

[1] E. Frank, M. Hall, and B. Pfahringer, Locally weighted naive Bayes, Proc. Conf. on Uncertainty in Artificial Intelligence, Morgan Kaufmann, 2003, 249–256.
[2] B. Wang and H. Zhang, Probability based metrics for locally weighted naive Bayes, Proc. 20th Canadian Conf. on Artificial Intelligence, 2007, 180–191.
[3] R.D. Short and K. Fukunaga, The optimal distance measure for nearest neighbour classification, IEEE Transactions on Information Theory, 27(5), 1981, 622–627.
[4] J.P. Myles and D.J. Hand, The multi-class metric problem in nearest neighbour discrimination rules, Pattern Recognition, 23(11), 1990, 1291–1297.
[5] E. Blanzieri and F. Ricci, Probability based metrics for nearest neighbor classification and case-based reasoning, Proc. 3rd International Conf. on Case-Based Reasoning and Development, Lecture Notes in Computer Science, 1650, 1999, 14–28.
[6] C. Stanfill and D. Waltz, Toward memory-based reasoning, Communications of the ACM, 29, 1986, 1213–1228.
[7] S. Cost and S. Salzberg, A weighted nearest neighbor algorithm for learning with symbolic features, Machine Learning, 10(1), 1993, 57–78.
[8] R. John, S. Kasif, S. Salzberg, and D.W. Aha, Towards a better understanding of memory-based and Bayesian classifiers, Proc. 11th International Conf. on Machine Learning, New Brunswick, NJ, Morgan Kaufmann, 1994, 242–250.
[9] D.R. Wilson and T.R. Martinez, Improved heterogeneous distance functions, Journal of Artificial Intelligence Research, 6(1), 1997, 1–34.
[10] P.N. Tan, M. Steinbach, and V. Kumar, Introduction to data mining, 1st ed. (Boston: Pearson Education, Inc., 2006).
[11] J.G. Cleary and L.E. Trigg, K*: An instance-based learner using an entropic distance measure, Proc. 12th International Machine Learning Conference, Tahoe City, CA, Morgan Kaufmann, 1995, 108–114.
[12] H. Wang, Nearest neighbors by neighborhood counting, IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(6), 2006, 942–953.
[13] D.W. Aha, Tolerating noisy, irrelevant, and novel attributes in instance-based learning algorithms, International Journal of Man-Machine Studies, 36(2), 1992, 267–287.
[14] D. Wettschereck and D.W. Aha, Weighting features, Proc. 1st International Conf. on Case-Based Reasoning Research and Development, Lecture Notes in Computer Science, 1010, Springer-Verlag, London, UK, 1995, 347–358.
[15] R. Kohavi, P. Langley, and Y. Yun, The utility of feature weighting in nearest-neighbor algorithms, Poster Papers: 9th European Conf. on Machine Learning, Prague, Czech Republic, 1997. Unpublished.
[16] G. John, R. Kohavi, and K. Pfleger, Irrelevant features and the subset selection problem, Proc. 11th International Conf. on Machine Learning, Morgan Kaufmann, 1994, 121–129.
[17] M.A. Hall, Correlation-based feature selection for discrete and numeric class machine learning, Proc. 17th International Conf. on Machine Learning, 2000, 359–366.
[18] L. Yu and H. Liu, Efficient feature selection via analysis of relevance and redundancy, Journal of Machine Learning Research, 5, 2004, 1205–1224.
[19] H. Peng, F. Long, and C. Ding, Feature selection based on mutual information: Criteria of max-dependency, max-relevance, and min-redundancy, IEEE Transactions on Pattern Analysis and Machine Intelligence, 27(8), 2005, 1226–1238.
[20] D.R. Wilson, Advances in instance-based learning algorithms, Doctoral Dissertation, Brigham Young University, Provo, UT, 1997.
[21] N. Friedman, D. Geiger, and M. Goldszmidt, Bayesian network classifiers, Machine Learning, 29, 1997, 131–161.
[22] E.H. Han, G. Karypis, and V. Kumar, Text categorization using weight adjusted k-nearest neighbor classification, Technical report, Department of CS, University of Minnesota, 1999.
[23] L. Jiang, D. Wang, Z. Cai, S. Jiang, and X. Yan, Scaling up the accuracy of k-nearest-neighbor classifiers: A naive-Bayes hybrid, International Journal of Computers and Applications, 31(1), 2009, 36–43.
[24] C. Li, L. Jiang, and J. Wu, Distance and attribute weighted k-nearest-neighbor and its application in reservoir porosity prediction, Journal of Information and Computation Science, 6(2), 2009, 845–851.
[25] W.H. Press, S.A. Teukolsky, W.T. Vetterling, and B.P. Flannery, Numerical Recipes in C, 2nd ed. (Cambridge: Cambridge University Press, 1988).
[26] I.H. Witten and E. Frank, Data mining: Practical machine learning tools and techniques, 2nd ed. (San Francisco, CA: Morgan Kaufmann, 2005).
[27] C. Nadeau and Y. Bengio, Inference for the generalization error, Machine Learning, 52(3), 2003, 239–281.
