- Open Access
- Article
On a Kernel k-Means Algorithm
Fachhochschule für Ökonomie und Management, BWL, Wirtschaftsinformatik, Arnulfstrasse 30, D-80335 München, Germany
* Author to whom correspondence should be addressed.
Journal of Engineering Research and Sciences, Volume 3, Issue 10, Page # 37-46, 2024; DOI: 10.55708/js0310004
Keywords: Clustering, K-means Algorithm, Kernels
Received: 29 August 2024, Revised: 08 October 2024, Accepted: 10 October 2024, Published Online: 22 October 2024
(This article belongs to the Special Issue on Multidisciplinary Sciences and Advanced Technology 2024 & Section Biochemical Research Methods (BRM))
APA Style
Falkowski, B.-J. (2024). On a kernel k-means algorithm. Journal of Engineering Research and Sciences, 3(10), 37-46. https://doi.org/10.55708/js0310004
Chicago/Turabian Style
Falkowski, Bernd-Jürgen. “On a Kernel k-Means Algorithm.” Journal of Engineering Research and Sciences 3, no. 10 (2024): 37-46. https://doi.org/10.55708/js0310004.
IEEE Style
B.-J. Falkowski, “On a kernel k-means algorithm,” Journal of Engineering Research and Sciences, vol. 3, no. 10, pp. 37-46, 2024, doi: 10.55708/js0310004.
This is the extended version of a paper presented at CISP-BMEI 2023. After a general introduction, kernels are described by showing how they arise from considerations concerning elementary geometrical properties. They appear as generalizations of the scalar product, which in turn is the algebraic version of length and angle. By introducing the Reproducing Kernel Hilbert Space it is shown how operations in a high-dimensional feature space can be performed without explicitly using an embedding function (the “kernel trick”). The general section of the paper lists some kernels and sophisticated kernel clustering algorithms; the continuing popularity of the k-means algorithm, by contrast, is probably due to its simplicity. This explains why an elegant version of an iterative k-means algorithm originally established by Duda is treated. It was extended to a kernel algorithm by the author; however, its performance still depended heavily on the initialization. In this paper, previous results on the original k-means algorithm are transferred to the kernel version, thus removing this drawback. Moreover, the algorithm is slightly modified to allow for an easy quantification of the improvements to the target function after initialization.
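As an illustration (not part of the paper itself), the sketch below shows how the “kernel trick” mentioned in the abstract lets k-means compute feature-space distances from kernel evaluations alone, without an explicit embedding function. It is a generic batch kernel k-means, not the Duda-style iterative update treated in the paper; the RBF kernel, its gamma value, and the random initialization are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def kernel_kmeans(K, k, n_iter=100, seed=0):
    """Batch kernel k-means on a precomputed n x n kernel matrix K.

    The squared feature-space distance of point i to the mean of cluster C
    uses kernel entries only (the "kernel trick"):
        K[i, i] - (2/|C|) * sum_{j in C} K[i, j] + (1/|C|^2) * sum_{j, l in C} K[j, l]
    """
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, n)            # random initialization
    for _ in range(n_iter):
        dist = np.full((n, k), np.inf)
        for c in range(k):
            idx = np.where(labels == c)[0]
            if idx.size == 0:
                continue                      # empty cluster keeps infinite distance
            dist[:, c] = (np.diag(K)
                          - 2.0 * K[:, idx].mean(axis=1)
                          + K[np.ix_(idx, idx)].mean())
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break                             # assignments stable: converged
        labels = new_labels
    return labels

# Tiny usage example: two well-separated Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(3.0, 0.3, (20, 2))])
print(kernel_kmeans(rbf_kernel(X, X, gamma=0.5), k=2))
```

Because the update uses only the kernel matrix, the same loop works for any positive-definite kernel; the sensitivity to the random initialization visible here is exactly the drawback the paper addresses.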
- B.-J. Falkowski, “Maximum likelihood estimates and a kernel k-means iterative algorithm for normal mixtures”, “USB Proceedings IECON 2020 The 46th Annual Conference of the IEEE Industrial Electronics Society”, Online Singapore, doi:10.1109/IECON43393.2020.9254276.
- J. Shawe-Taylor, N. Cristianini, Kernel Methods for Pattern Analysis, Cambridge University Press, 2004.
- B.-J. Falkowski, “On certain generalizations of inner product similarity measures”, Journal of the American Society for Information Science, vol. 49, no. 9, 1998.
- G. Salton, Automatic Text Processing, Addison-Wesley Series in Computer Science, 1989.
- N. Aronszajn, “Theory of reproducing kernels”, Transactions of the American Mathematical Society, vol. 68, 1950.
- G. Wahba, “Support vector machines, reproducing kernel Hilbert spaces, and randomized GACV”, “Advances in Kernel Methods, Support Vector Learning”, MIT Press, 1999.
- G. Wahba, “An introduction to reproducing kernel Hilbert spaces and why they are so useful”, “IFAC Publications”, Rotterdam, 2003.
- A. Gretton, “Introduction to RKHS and some simple kernel algorithms”, UCL, 2019.
- D. Knuth, The Art of Computer Programming. Vol. 1, Fundamental Algorithms, 2nd Edition, 1973.
- “Polynomial kernel”, https://en.wikipedia.org/wiki/Polynomial_kernel, accessed: 2024-03-24.
- I. Schoenberg, “On certain metric spaces arising from Euclidean spaces and their embedding in Hilbert space”, Annals of Mathematics, vol. 38, no. 4, 1937.
- I. Schoenberg, “Metric spaces and positive definite functions”, Transactions of the American Mathematical Society, vol. 41, 1938.
- C. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, 2013, reprinted.
- B.-J. Falkowski, “Mercer kernels and 1-cohomology”, N. Baba, R. Howlett, L. Jain, eds., “Proc. of the 5th Intl. Conference on Knowledge Based Intelligent Engineering Systems and Allied Technologies (KES 2001)”, IOS Press, 2001.
- B.-J. Falkowski, “Mercer kernels and 1-cohomology of certain semi-simple Lie groups”, V. Palade, R. Howlett, L. Jain, eds., “Proc. of the 7th Intl. Conference on Knowledge-Based Intelligent Information and Engineering Systems (KES 2003)”, vol. 2773 of LNAI, Springer Verlag, 2003.
- A. Jain, “Data clustering: 50 years beyond k-means”, Pattern Recognition Letters, 2009, doi:10.1016/j.patrec.2009.09.011.
- R. Duda, P. Hart, D. Stork, Pattern Classification, Wiley, 2017, reprinted.
- R. Chitta, “Kernel-based clustering of big data”, Ph.D. thesis, Michigan State University, 2015.
- “Kernel clustering”, http://www.cse.msu.edu>cse902>ppt>Ker..., last visited: 20.01.2019.
- B.-J. Falkowski, “A kernel iterative k-means algorithm”, “Advances in Intelligent Systems and Computing 1051, Proceedings of ISAT 2019”, Springer-Verlag, 2019.
- B.-J. Falkowski, “Bregman divergencies, triangle inequality, and maximum likelihood estimates for normal mixtures”, “Proceedings of the 2022 Intelligent Systems Conference (Intellisys)”, vol. 1, Springer, 2022, doi:10.1007/978-3-031-16072-1_12.
- S. Gallant, “Perceptron-based learning algorithms”, IEEE Transactions on Neural Networks, vol. 1, no. 2, 1990.
- D. Stork, “Solution manual to accompany pattern classification, 2nd ed.”, PDF can be obtained via Wiley.
- D. Arthur, S. Vassilvitskii, “k-means++: The advantages of careful seeding”, “Proceedings of the Eighteenth Annual ACM-SIAM Symposium on Discrete Algorithms (SODA)”, 2007.
- S. Har-Peled, B. Sadri, “How fast is the k-means method?”, “ACM-SIAM Symposium on Discrete Algorithms (SODA)”, 2005.
- “Kernel k-means and spectral clustering”, https://www.cs.utexas.edu/~inderjit/public_papers/kdd_spectral_kernelkmeans.pdf, last visited: 19.06.2018.