Kernel KNN Functions

Code

knn.m

function [ y ] = knn( X, X_train, y_train, K)
%KNN K-Nearest Neighbors Algorithm.
%
%   INPUT:   X:          testing sample features, P-by-N_test matrix.
%            X_train:    training sample features, P-by-N matrix.
%            y_train:    training sample labels, 1-by-N vector.
%            K:          take k in k-Nearest Neighbors.
%   OUTPUT:  y:          predicted labels, 1-by-N_test vector.
%Author: X-Lion
%Date:   20150807
[~,N_test] = size(X);
predicted_label = zeros(1,N_test);
for i = 1:N_test
    % calculate the K nearest neighbors and their distances
    [dists,neighbors] = top_K_neighbors(X_train,y_train,X(:,i),K);
    % recognize the label of the test vector
    predicted_label(i) = recog(y_train(neighbors),max(y_train));
end
y = predicted_label;
end
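The same prediction loop can be sketched in NumPy. This is a hypothetical re-implementation for illustration, not part of the original MATLAB code; the function name `knn_predict` and the toy data are my own.

```python
import numpy as np

def knn_predict(X, X_train, y_train, K):
    """Predict a label for each column of X (P-by-N_test), mirroring knn.m."""
    y = np.empty(X.shape[1], dtype=y_train.dtype)
    for i in range(X.shape[1]):
        # squared Euclidean distance from test point i to every training column
        d = np.sum((X_train - X[:, [i]]) ** 2, axis=0)
        neighbors = np.argsort(d)[:K]      # indices of the K nearest points
        labels, counts = np.unique(y_train[neighbors], return_counts=True)
        y[i] = labels[np.argmax(counts)]   # majority vote
    return y

# toy data: two well-separated 2-D clusters
X_train = np.array([[0, 0, 1, 5, 5, 6],
                    [0, 1, 0, 5, 6, 5]], dtype=float)
y_train = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(np.array([[0.5, 5.5], [0.5, 5.5]]), X_train, y_train, 3))
# [0 1]
```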

top_K_neighbors.m

function [ dists,neighbors ] = top_K_neighbors( X_train,y_train,X_test,K )
% TOP_K_NEIGHBORS
%
%   INPUT:   X_test:             the test vector, P-by-1.
%            X_train,y_train:    the training data set.
%            K:                  the K neighbor parameter.
%   OUTPUT:  dists,neighbors:    distances and indices of the top K neighbors.
% Author: X-Lion
% Date:   20150807
[~,N_train] = size(X_train);
test_mat = repmat(X_test,1,N_train);
% The distance is the Euclid Distance.
dist_mat = (X_train-double(test_mat)).^2;
dist_array = sum(dist_mat);

% Alternative kernel-induced distances: d(u,v) = k(u,u) - 2*k(u,v) + k(v,v).
% Uncomment one of the blocks below to replace the Euclidean distance above.

% Linear kernel (k(ui,vj) = ui'*vj)
% for i = 1:N_train
%     Kxx = X_train(:,i)'*X_train(:,i);
%     Kxy = X_train(:,i)'*X_test;
%     Kyy = X_test'*X_test;
%     dist_array(i) = Kxx - 2 * Kxy + Kyy;
% end

% Gaussian kernel (k(ui,vj) = exp(-gamma*||ui-vj||^2))
% Parameter:
%     g: gamma
% g = 0.1;
% for i = 1:N_train
%     Kxx = exp(-g*norm(X_train(:,i)-X_train(:,i))^2);   % always 1
%     Kxy = exp(-g*norm(X_train(:,i)-X_test)^2);
%     Kyy = exp(-g*norm(X_test-X_test)^2);               % always 1
%     dist_array(i) = Kxx - 2 * Kxy + Kyy;
% end

% Polynomial kernel (k(ui,vj) = (gamma*ui'*vj + coef)^d)
% Parameters:
%     g: gamma
%     coef
%     d: degree
% g = 0.01;
% coef = 1;
% d = 2;
% for i = 1:N_train
%     Kxx = (g*X_train(:,i)'*X_train(:,i) + coef)^d;
%     Kxy = (g*X_train(:,i)'*X_test + coef)^d;
%     Kyy = (g*X_test'*X_test + coef)^d;
%     dist_array(i) = Kxx - 2 * Kxy + Kyy;
% end

% Tanh (sigmoid) kernel (k(ui,vj) = tanh(gamma*ui'*vj + coef))
% Parameters:
%     g: gamma
%     coef
% g = 2;
% coef = 1;
% for i = 1:N_train
%     Kxx = tanh(g*X_train(:,i)'*X_train(:,i) + coef);
%     Kxy = tanh(g*X_train(:,i)'*X_test + coef);
%     Kyy = tanh(g*X_test'*X_test + coef);
%     dist_array(i) = Kxx - 2 * Kxy + Kyy;
% end

% The neighbors are the indices of the top K nearest points.
[dists,neighbors] = sort(dist_array);
dists = dists(1:K);
neighbors = neighbors(1:K);
end
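All four commented-out variants use the same identity: the squared distance between two points in the kernel's feature space is d²(x, y) = k(x, x) − 2·k(x, y) + k(y, y). A small NumPy check of this identity (my own illustrative sketch, with the same `g` parameter as in the MATLAB comments):

```python
import numpy as np

# Squared distance in the kernel's feature space:
# d^2(x, y) = k(x, x) - 2*k(x, y) + k(y, y)
def kernel_dist2(kern, x, y):
    return kern(x, x) - 2 * kern(x, y) + kern(y, y)

def linear(u, v):
    return u @ v

g = 0.1  # gamma, same role as `g` in the MATLAB comments
def gaussian(u, v):
    return np.exp(-g * np.linalg.norm(u - v) ** 2)

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

# The linear kernel reproduces the ordinary squared Euclidean distance:
print(kernel_dist2(linear, x, y))            # 13.0 = (1-3)^2 + (2-(-1))^2
# For the Gaussian kernel, k(x, x) = 1, so d^2 = 2*(1 - k(x, y)):
print(np.isclose(kernel_dist2(gaussian, x, y),
                 2 * (1 - gaussian(x, y))))  # True
```

This also shows why the Gaussian `Kxx`/`Kyy` terms in the MATLAB comments are constant 1: the norm of a point minus itself is zero.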

recog.m

function result = recog( K_labels,class_num )
%RECOG
%
%   INPUT:   K_labels:    labels of the K nearest neighbors, 1-by-K vector.
%            class_num:   the largest label value (labels run from 0 to class_num).
%   OUTPUT:  result:      the majority-vote label among K_labels.
%Author: X-Lion
%Date:   20150807
[~,K] = size(K_labels);
class_count = zeros(1,class_num+1);
for i = 1:K
    class_index = K_labels(i)+1;
    class_count(class_index) = class_count(class_index) + 1;
end
[~,result] = max(class_count);
result = result - 1;
end
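The `+1` offset in `recog` exists because labels start at 0 while MATLAB arrays are 1-indexed. A zero-indexed Python sketch of the same majority vote (illustrative only, not the original code):

```python
import numpy as np

def recog(K_labels, class_num):
    """Majority vote over neighbor labels 0..class_num, mirroring recog.m."""
    class_count = np.zeros(class_num + 1, dtype=int)
    for label in K_labels:
        class_count[label] += 1
    # np.argmax breaks ties toward the smallest label, as max() does in MATLAB
    return int(np.argmax(class_count))

print(recog([2, 0, 2, 1, 2], 2))  # 2
```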

References

  • http://blog.csdn.net/rk2900/article/details/9080821
