
Information Theoretic Learning
Description:
This book offers a comprehensive, unified treatment of Information Theoretic Learning (ITL) algorithms for adapting linear and nonlinear learning machines in both supervised and unsupervised settings. ITL replaces the second-order statistical descriptors of conventional learning (covariance, L2 distances, correlation functions) with scalars and functions grounded in information theory: entropy, mutual information, and correntropy. Because these quantities characterize the probabilistic structure of the data beyond second-order statistics, ITL can improve performance while avoiding the heavy computational cost of full Bayesian approaches. The key enabler is a non-parametric estimator of Renyi's quadratic entropy that depends only on pairwise differences between data samples. The book systematically compares the performance of ITL algorithms with their second-order counterparts across a diverse range of engineering and machine learning applications. Students, practitioners, and researchers in statistical signal processing, computational intelligence, and machine learning will find here the theoretical background needed to grasp the core concepts, practical algorithms for implementing a variety of applications, and relatively unexplored directions that may stimulate future research.
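To make the description concrete, below is a minimal sketch of the kind of estimator the blurb refers to: a Parzen-window estimate of Renyi's quadratic entropy, computed purely from pairwise differences between samples via the "information potential" V(X) = (1/N^2) Σ_i Σ_j G(x_i − x_j), with H2(X) = −log V(X). This is an illustrative implementation under standard assumptions (a Gaussian kernel, a user-chosen bandwidth `sigma`), not code from the book itself.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Non-parametric estimate of Renyi's quadratic entropy H2(X).

    Uses the information potential
        V(X) = (1/N^2) * sum_{i,j} G_{sigma*sqrt(2)}(x_i - x_j),
        H2(X) = -log V(X),
    where G is a zero-mean Gaussian kernel. Convolving two Parzen
    kernels of width sigma yields one of width sigma*sqrt(2), so the
    estimate needs only pairwise sample differences, with no explicit
    density model fitted.
    """
    x = np.asarray(x, dtype=float).reshape(-1)
    n = x.size
    diffs = x[:, None] - x[None, :]      # all N*N pairwise differences
    s2 = 2.0 * sigma**2                  # kernel variance after convolution
    kernel = np.exp(-diffs**2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    v = kernel.sum() / n**2              # information potential V(X)
    return -np.log(v)
```

As a sanity check, more widely spread data should yield a larger entropy estimate, since the pairwise kernel interactions shrink as samples move apart; the bandwidth `sigma` is a free parameter that in practice is tuned to the data scale.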


