Some understanding of "Inferring Decision Trees Using the Minimum Description Length Principle"
Information and Computation 80, 227-248 (1989)
My difficulty is: how do I get the 18.170 bits when computing the coding cost of the decision tree?
Here are the relevant parts of the article.
---------------- the decision tree ----------------
3rd page of the article (page 229, top-right corner)
---------------- formula for computing the cost ----------------
10th page of the article (page 236, top-left corner)
The total cost for this procedure is thus:
L(n, k, b) = log_2(b + 1) + log_2(C(n, k))
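The formula can be checked numerically. Here is a small sketch I wrote myself (the function name and the example values are my own, not from the paper). It computes the cost of transmitting a binary string of length n containing k ones, when k is known not to exceed b:

```python
from math import comb, log2

def L(n: int, k: int, b: int) -> float:
    # log2(b+1) bits to transmit k itself (k can be any of 0..b),
    # plus log2(C(n,k)) bits to pick out which of the C(n,k)
    # length-n strings with exactly k ones was sent
    return log2(b + 1) + log2(comb(n, k))

print(L(10, 3, 10))  # about 10.366 bits
```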
---------------- the sequence for the above decision tree ----------------
13th page of the article (page 239, top-right corner)
All of the above is about encoding the whole decision tree
so that it can be sent from a sender to a receiver.
My difficulty is: how do I get the 18.170 bits mentioned above?
My understanding is:
Outlook: 2 bits
Humidity: log2(3) bits
Windy: log2(2) bits (not mentioned in the article, I am just guessing)
The whole sequence in the paper is:
1 Outlook 1 Humidity 0 N 0 P 0 P 1 Windy 0 N 0 P
Then:
there are 8 binary digits above (the 1s and 0s); the following is my guess and may be wrong:
3 decision nodes cost: 3 bits
5 leaves cost: 5 bits
2 + log2(3) + log2(2) + 3 + 5 + ? = 18.170
so ? = 5.585,
but how do I get 5.585?
Although the following picture (encoding exceptions) contains the number 5.585,
I don't think it can serve as the explanation of the 5.585 above.
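To double-check my arithmetic, here is a small sketch I wrote myself (the breakdown below is my own guess, not taken from the article):

```python
from math import log2

# my tentative breakdown of the 18.170-bit tree-coding cost
outlook  = 2.0      # Outlook: 2 bits, as I understand it
humidity = log2(3)  # Humidity: log2(3) bits
windy    = log2(2)  # Windy: log2(2) bits -- my guess, not stated in the article
nodes    = 3.0      # 3 decision nodes, one '1' bit each
leaves   = 5.0      # 5 leaves, one '0' bit each

known = outlook + humidity + windy + nodes + leaves
gap = 18.170 - known
print(f"accounted for: {known:.3f} bits")  # 12.585
print(f"unexplained:   {gap:.3f} bits")    # 5.585
```

So the question is exactly where those remaining 5.585 bits come from.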
Could you tell me how to get the "18.170 bits" mentioned in this article?
Thanks very much!