Group sparsity
Group lasso
$$\hat{\bm \beta}_\lambda = \arg \min_{\bm \beta} \| \bm Y - \bm X \bm \beta \|_2^2 + \lambda \sum_{g=1}^G \|\bm \beta_{\mathcal{I}_g}\|_2,$$
where $\mathcal{I}_g$ is the index set belonging to the $g$th group of variables, $g = 1, \ldots, G$.
- This penalty can be viewed as an intermediate between the $\ell_1$- and $\ell_2$-type penalties.
The $\ell_1$-penalty treats each coordinate direction differently from other directions, which encourages sparsity in individual coefficients. The $\ell_2$-penalty treats all directions equally and does not encourage sparsity. The group lasso encourages sparsity at the factor (group) level.
- Like ridge regression, the estimates have the attractive property of being invariant under groupwise orthogonal reparameterizations (transformations).
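The objective above can be minimized by proximal gradient descent, where the proximal operator of the group lasso penalty is block soft-thresholding: each group's coefficient block is shrunk toward zero, and set exactly to zero when its $\ell_2$ norm falls below the threshold. A minimal NumPy sketch (the group structure and toy data are made up for illustration):

```python
import numpy as np

def group_soft_threshold(beta, groups, tau):
    """Block soft-thresholding: proximal operator of tau * sum_g ||beta_{I_g}||_2.
    A whole group is zeroed out when its l2 norm is at most tau."""
    out = np.zeros_like(beta)
    for idx in groups:
        norm = np.linalg.norm(beta[idx])
        if norm > tau:
            out[idx] = (1.0 - tau / norm) * beta[idx]
    return out

def group_lasso(X, Y, groups, lam, n_iter=2000):
    """Proximal gradient descent on ||Y - X beta||_2^2 + lam * sum_g ||beta_{I_g}||_2."""
    beta = np.zeros(X.shape[1])
    # 1 / Lipschitz constant of the gradient of the squared-error term
    step = 1.0 / (2.0 * np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        grad = 2.0 * X.T @ (X @ beta - Y)
        beta = group_soft_threshold(beta - step * grad, groups, step * lam)
    return beta

# Toy problem: only the first group carries signal, so the second
# group's coefficients should be set exactly to zero.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4))
Y = X @ np.array([2.0, -1.5, 0.0, 0.0]) + 0.01 * rng.standard_normal(50)
groups = [np.array([0, 1]), np.array([2, 3])]
b = group_lasso(X, Y, groups, lam=5.0)
```

Note that, unlike the plain lasso, the inactive group is removed as a whole: both of its coefficients hit zero simultaneously.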
Group LARS
$$\|\bm M\|_{2,1} = \sum_{i=1}^d \|\bm m_i\|_2,$$
where $\bm m_i$ denotes the $i$th row of $\bm M$.
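The $\ell_{2,1}$ mixed norm is just the sum of the $\ell_2$ norms of the rows, which is what makes it a group-sparsity-inducing penalty on matrices:

```python
import numpy as np

def l21_norm(M):
    """l_{2,1} norm: sum of the l_2 norms of the rows of M."""
    return np.linalg.norm(M, axis=1).sum()

M = np.array([[3.0, 4.0],    # row norm 5
              [0.0, 0.0],    # row norm 0
              [5.0, 12.0]])  # row norm 13
# l21_norm(M) -> 18.0
```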
Group non-negative garrotte
| | group lasso | group LARS | group non-negative garrotte |
|---|---|---|---|
| performance | excellent | comparable | |
| computational efficiency | intensive in large-scale problems | quick | fastest |
| applicability | | | sub-optimal when $p \rightarrow n$; not applicable when $p > n$ |
Correlation constraints
Elastic net: under the elastic net, highly correlated features receive similar weights. This grouping effect is a consequence of the strict convexity contributed by the $\ell_2$ norm.
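The grouping effect can be seen directly with two identical copies of a feature. A minimal proximal-gradient sketch of the elastic net (the objective form and toy data are illustrative assumptions, not a reference implementation):

```python
import numpy as np

def elastic_net(X, Y, lam1, lam2, n_iter=3000):
    """Proximal gradient on 0.5*||Y - Xb||^2 + lam1*||b||_1 + lam2*||b||_2^2."""
    beta = np.zeros(X.shape[1])
    # Lipschitz constant of the smooth part (squared error + ridge term)
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 + 2.0 * lam2)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - Y) + 2.0 * lam2 * beta
        z = beta - step * grad
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam1, 0.0)
    return beta

# Columns 0 and 1 are identical copies of the same feature. Strict
# convexity from the l2 term forces the weight to be split equally,
# whereas the plain lasso would pick one copy arbitrarily.
rng = np.random.default_rng(0)
x = rng.standard_normal(100)
X = np.column_stack([x, x, rng.standard_normal(100)])
Y = 3.0 * x
b = elastic_net(X, Y, lam1=1.0, lam2=1.0)
# b[0] and b[1] agree, each carrying half the signal
```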
References
- Yuan, Ming, and Yi Lin. "Model selection and estimation in regression with grouped variables." Journal of the Royal Statistical Society: Series B (Statistical Methodology) 68.1 (2006): 49-67.
- Zou, Hui, and Trevor Hastie. "Regularization and variable selection via the elastic net." Journal of the Royal Statistical Society: Series B (Statistical Methodology) 67.2 (2005): 301-320.