Note: the dataset is at the end of the article.

(1) Regression tree

import numpy as np
import matplotlib.pyplot as plt
from sklearn import tree

# Load the data
data = np.genfromtxt("data.csv", delimiter=",")
x_data = data[:, 0, np.newaxis]
y_data = data[:, 1, np.newaxis]
plt.scatter(x_data, y_data)
plt.show()

# Fit a regression tree of limited depth
model = tree.DecisionTreeRegressor(max_depth=5)
model.fit(x_data, y_data)

# Predict on a dense grid and plot the piecewise-constant fit
x_test = np.linspace(20, 80, 100)
x_test = x_test[:, np.newaxis]
plt.plot(x_data, y_data, 'b.')
plt.plot(x_test, model.predict(x_test), 'r')
plt.show()
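The choice of max_depth=5 above matters: a deeper tree fits the training points ever more closely, until an unlimited-depth tree memorizes them exactly. A minimal sketch of this effect (assumption: synthetic 1-D data stands in here for data.csv, so the snippet is self-contained):

```python
import numpy as np
from sklearn import tree
from sklearn.metrics import mean_squared_error

# Synthetic 1-D data in the same range as data.csv (an assumption,
# so this snippet runs without the CSV file)
rng = np.random.RandomState(0)
x = np.sort(rng.uniform(20, 80, 100))[:, np.newaxis]
y = 1.3 * x.ravel() + rng.normal(0, 8, 100)

# Training error shrinks as max_depth grows; with max_depth=None
# the tree can place every distinct x value in its own leaf.
errors = {}
for depth in (1, 5, None):
    m = tree.DecisionTreeRegressor(max_depth=depth, random_state=0)
    m.fit(x, y)
    errors[depth] = mean_squared_error(y, m.predict(x))
print(errors)
```

Low training error at full depth is memorization of the noise, not generalization, which is why the tutorial caps the depth at 5.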

# Export the decision tree
import graphviz  # http://www.graphviz.org/
# Note: this model is a regressor, so there are no classes and
# class_names does not apply; the single input feature is named 'x'.
dot_data = tree.export_graphviz(model, out_file=None,
                                feature_names=['x'],
                                filled=True, rounded=True,
                                special_characters=True)
graph = graphviz.Source(dot_data)
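With out_file=None, export_graphviz returns the DOT source as a plain string, so the tree structure can be inspected even without the graphviz package installed. A minimal self-contained sketch (assumption: a tiny toy dataset replaces the model trained above):

```python
import numpy as np
from sklearn import tree

# Toy regression data (an assumption, to keep the snippet standalone)
x = np.arange(10)[:, np.newaxis]
y = np.arange(10, dtype=float)
model = tree.DecisionTreeRegressor(max_depth=2).fit(x, y)

# The returned DOT source is ordinary text describing the graph
dot_data = tree.export_graphviz(model, out_file=None, feature_names=['x'])
print(dot_data.splitlines()[0])  # first line of the DOT source
```

If the graphviz Python package and the Graphviz binaries are installed, `graphviz.Source(dot_data).render("tree", format="png")` would write the tree as an image file.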

(2) Regression tree: predicting house prices

from sklearn import tree
from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split

# Load the California housing dataset
# (the old sklearn.datasets.california_housing module path has been removed;
# import fetch_california_housing from sklearn.datasets directly)
housing = fetch_california_housing()
print(housing.DESCR)

Output: the dataset description (omitted here).

print(housing.data.shape)
print(housing.data[0])
print(housing.target[0])

Output: the shape (20640, 8), i.e. 20,640 samples with 8 features each, followed by the first sample's feature vector and its target (the median house value, in units of $100,000).

x_data = housing.data
y_data = housing.target
# Split the data, fit an unconstrained regression tree, and report R^2
x_train, x_test, y_train, y_test = train_test_split(x_data, y_data)
model = tree.DecisionTreeRegressor()
model.fit(x_train, y_train)
model.score(x_test, y_test)

Output: the R² score on the held-out test set (it varies from run to run, since train_test_split shuffles randomly).
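An unconstrained tree like the one above tends to overfit; in practice max_depth is usually tuned, for example by cross-validation. A minimal sketch of that idea (assumption: make_regression stands in for the housing data, since fetch_california_housing requires a download):

```python
import numpy as np
from sklearn import tree
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score, train_test_split

# Synthetic data stands in for the housing dataset (an assumption)
x_data, y_data = make_regression(n_samples=500, n_features=8,
                                 noise=10.0, random_state=0)
x_train, x_test, y_train, y_test = train_test_split(
    x_data, y_data, random_state=0)

# Pick max_depth by 5-fold cross-validated R^2 on the training set
best_depth, best_score = None, -np.inf
for depth in range(2, 11):
    m = tree.DecisionTreeRegressor(max_depth=depth, random_state=0)
    score = cross_val_score(m, x_train, y_train, cv=5).mean()
    if score > best_score:
        best_depth, best_score = depth, score

# Refit at the chosen depth and evaluate on the held-out split
model = tree.DecisionTreeRegressor(max_depth=best_depth, random_state=0)
model.fit(x_train, y_train)
print(best_depth, model.score(x_test, y_test))
```

The same loop would apply unchanged to the real housing split, only with more samples and a possibly different best depth.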

Dataset "data.csv":

32.502345269453031,31.70700584656992
53.426804033275019,68.77759598163891
61.530358025636438,62.562382297945803
47.475639634786098,71.546632233567777
59.813207869512318,87.230925133687393
55.142188413943821,78.211518270799232
52.211796692214001,79.64197304980874
39.299566694317065,59.171489321869508
48.10504169176825,75.331242297063056
52.550014442733818,71.300879886850353
45.419730144973755,55.165677145959123
54.351634881228918,82.478846757497919
44.164049496773352,62.008923245725825
58.16847071685779,75.392870425994957
56.727208057096611,81.43619215887864
48.955888566093719,60.723602440673965
44.687196231480904,82.892503731453715
60.297326851333466,97.379896862166078
45.618643772955828,48.847153317355072
38.816817537445637,56.877213186268506
66.189816606752601,83.878564664602763
65.41605174513407,118.59121730252249
47.48120860786787,57.251819462268969
41.57564261748702,51.391744079832307
51.84518690563943,75.380651665312357
59.370822011089523,74.765564032151374
57.31000343834809,95.455052922574737
63.615561251453308,95.229366017555307
46.737619407976972,79.052406169565586
50.556760148547767,83.432071421323712
52.223996085553047,63.358790317497878
35.567830047746632,41.412885303700563
42.436476944055642,76.617341280074044
58.16454011019286,96.769566426108199
57.504447615341789,74.084130116602523
45.440530725319981,66.588144414228594
61.89622268029126,77.768482417793024
33.093831736163963,50.719588912312084
36.436009511386871,62.124570818071781
37.675654860850742,60.810246649902211
44.555608383275356,52.682983366387781
43.318282631865721,58.569824717692867
50.073145632289034,82.905981485070512
43.870612645218372,61.424709804339123
62.997480747553091,115.24415280079529
32.669043763467187,45.570588823376085
40.166899008703702,54.084054796223612
53.575077531673656,87.994452758110413
33.864214971778239,52.725494375900425
64.707138666121296,93.576118692658241
38.119824026822805,80.166275447370964
44.502538064645101,65.101711570560326
40.599538384552318,65.562301260400375
41.720676356341293,65.280886920822823
51.088634678336796,73.434641546324301
55.078095904923202,71.13972785861894
41.377726534895203,79.102829683549857
62.494697427269791,86.520538440347153
49.203887540826003,84.742697807826218
41.102685187349664,59.358850248624933
41.182016105169822,61.684037524833627
50.186389494880601,69.847604158249183
52.378446219236217,86.098291205774103
50.135485486286122,59.108839267699643
33.644706006191782,69.89968164362763
39.557901222906828,44.862490711164398
56.130388816875467,85.498067778840223
57.362052133238237,95.536686846467219
60.269214393997906,70.251934419771587
35.678093889410732,52.721734964774988
31.588116998132829,50.392670135079896
53.66093226167304,63.642398775657753
46.682228649471917,72.247251068662365
43.107820219102464,57.812512976181402
70.34607561504933,104.25710158543822
44.492855880854073,86.642020318822006
57.50453330326841,91.486778000110135
36.930076609191808,55.231660886212836
55.805733357942742,79.550436678507609
38.954769073377065,44.847124242467601
56.901214702247074,80.207523139682763
56.868900661384046,83.14274979204346
34.33312470421609,55.723489260543914
59.04974121466681,77.634182511677864
57.788223993230673,99.051414841748269
54.282328705967409,79.120646274680027
51.088719898979143,69.588897851118475
50.282836348230731,69.510503311494389
44.211741752090113,73.687564318317285
38.005488008060688,61.366904537240131
32.940479942618296,67.170655768995118
53.691639571070056,85.668203145001542
68.76573426962166,114.85387123391394
46.230966498310252,90.123572069967423
68.319360818255362,97.919821035242848
50.030174340312143,81.536990783015028
49.239765342753763,72.111832469615663
50.039575939875988,85.232007342325673
48.149858891028863,66.224957888054632
25.128484647772304,53.454394214850524
