Presenter: Haonan Zhang
Affiliation: Southwest University
    Abstract

Kernel least mean square based on the Nyström method (NysKLMS) has been proposed to fix the network structure of the kernel least mean square (KLMS) algorithm by approximating its large Gram matrix with a low-rank matrix built from samples of the input vectors. However, the computational burden of NysKLMS grows with the input dimension. To alleviate this burden, a novel KLMS algorithm based on a sparse Nyström method (SNKLMS) is proposed in this paper. Unlike NysKLMS, which uses k-means sampling to obtain k centroids from the input data, the proposed SNKLMS algorithm first divides the input data into several clusters and then selects k centroids equally across the obtained clusters. Based on the relation between the Gaussian kernel value and the distance between clusters, some submatrices of the low-rank matrix in SNKLMS can be omitted, further reducing the number of multiplications and additions and yielding a sparse low-rank matrix. Simulations on channel equalization and chaotic time-series prediction validate the superiority of SNKLMS.

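The abstract describes the overall recipe: build a Nyström feature map from clustered centroids, sparsify the kernel evaluations for far-away centroids, and then run a plain LMS update in the resulting explicit feature space. The sketch below illustrates that pipeline under stated assumptions; the function names (`nystrom_map`, `run_klms`), the thresholding rule used for sparsification, and all hyper-parameter values are illustrative choices, not the authors' implementation (which selects centroids equally across clusters and omits whole submatrices rather than thresholding individual kernel entries).

```python
# Minimal sketch of a Nystrom-approximated KLMS filter with a sparsified
# kernel mapping, loosely following the abstract. All names, the threshold
# rule, and the hyper-parameters are assumptions for illustration.
import numpy as np
from sklearn.cluster import KMeans


def gaussian_kernel(X, C, sigma):
    # Pairwise Gaussian kernel between rows of X and centroids C.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(C**2, 1)[None, :] - 2 * X @ C.T
    return np.exp(-d2 / (2 * sigma**2))


def nystrom_map(C, sigma, eps=1e-10):
    # Eigendecompose the small centroid Gram matrix K_cc = V diag(lam) V^T
    # and return a map x -> diag(lam)^{-1/2} V^T k_c(x).
    K_cc = gaussian_kernel(C, C, sigma)
    lam, V = np.linalg.eigh(K_cc)
    keep = lam > eps
    W = V[:, keep] / np.sqrt(lam[keep])          # m x r projection

    def feature(x, threshold=0.0):
        k_c = gaussian_kernel(x[None, :], C, sigma)[0]
        # Sparsification: zero out centroids with a tiny kernel response,
        # a stand-in for omitting far-away cluster blocks in the abstract.
        if threshold > 0:
            k_c = np.where(k_c >= threshold, k_c, 0.0)
        return k_c @ W

    return feature


def run_klms(X, d, k=32, sigma=1.0, mu=0.5, threshold=1e-3):
    # Centroids from k-means clustering of the inputs; SNKLMS instead
    # samples centroids equally across the clusters it forms.
    C = KMeans(n_clusters=k, n_init=10).fit(X).cluster_centers_
    phi = nystrom_map(C, sigma)
    w = np.zeros(phi(X[0]).shape[0])
    errs = []
    for x, dn in zip(X, d):
        z = phi(x, threshold)
        e = dn - w @ z                           # prediction error
        w = w + mu * e * z                       # LMS update in feature space
        errs.append(e)
    return w, np.array(errs)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))
    d = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)   # toy target
    _, errs = run_klms(X, d)
    print("final MSE:", np.mean(errs[-100:] ** 2))
```

Because the Nyström feature map has a fixed dimension (at most k), the filter's memory and per-sample cost stay constant, which is the structural advantage over growing-dictionary KLMS that the abstract refers to.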