Multi-class Imbalanced Classification Solutions: An Overview [Ensemble Learning, Data Resampling, Deep Learning (Meta-learning), Anomaly Detection]
Class imbalance (also known as the long-tail problem) refers to classification problems in which the classes are unequally represented in sample size and, consequently, in representation quality.
Class imbalance is widespread in practice, for example in rare-pattern recognition tasks such as financial fraud detection, network intrusion detection, and computer-aided medical diagnosis.
Class imbalance often degrades the predictive performance of conventional machine learning algorithms. Imbalanced learning aims to address this problem, i.e., to learn an unbiased predictive model from imbalanced data.
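The degree of imbalance is commonly summarized by the imbalance ratio, the size of the largest class divided by that of the smallest. A minimal sketch with made-up label counts:

```python
from collections import Counter

# Hypothetical long-tailed 3-class label set (counts are invented for illustration).
y = ["head"] * 900 + ["body"] * 90 + ["tail"] * 10

counts = Counter(y)
# Imbalance ratio: size of the largest class over the smallest one.
imbalance_ratio = max(counts.values()) / min(counts.values())
print(imbalance_ratio)  # 90.0
```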
目录
- 目录
- 框架与库 | Frameworks and Libraries
- Python
- R
- Java
- Scala
- Julia
- 研究论文 | Research Papers
- 综述 | Surveys
- 集成学习 | Ensemble Learning
- 通用集成框架 | General ensemble
- 基于 Boosting 的方法 | Boosting-based
- 基于 Bagging 的方法 | Bagging-based
- 基于代价敏感学习的方法 | Cost-sensitive ensemble
- 数据重采样 | Data resampling
- 过采样 | Over-sampling
- 欠采样 | Under-sampling
- 混合采样 | Hybrid-sampling
- 代价敏感学习 | Cost-sensitive Learning
- 深度学习 | Deep Learning
- 综述 | Surveys
- 图数据挖掘 | Graph Neural Networks
- 难例挖掘 | Hard example mining
- 损失函数设计 | Loss function engineering
- 元学习 | Meta-learning
- 表示学习 | Representation Learning
- 后验概率校准 | Posterior Recalibration
- 半监督/自监督学习 | Semi/Self-supervised Learning
- 课程学习 | Curriculum Learning
- 双阶段训练 | Two-phase Training
- 网络结构 | Network Architecture
- 深度生成网络 | Deep Generative Model
- 不平衡回归 | Imbalanced Regression
- 异常检测 | Anomaly Detection
- 杂项 | Miscellaneous
- 数据集 | Datasets
- Github 项目 | Github Repositories
- 算法实现 & 实用程序 & 教程 | Algorithms & Utilities & Jupyter Notebooks
- 论文列表 | Paper list
- 幻灯片 | Slides
- Contributors ✨
框架与库 | Frameworks and Libraries
Python
imbalanced-ensemble [Github][Documentation][Gallery][Paper]
NOTE: written in python, easy to use.
imbalanced-ensemble is a Python toolbox for quickly implementing and deploying ensemble learning algorithms on class-imbalanced data. It is featured for:
- (i) Unified, easy-to-use APIs with detailed documentation and examples.
- (ii) Support for multi-class imbalanced learning out of the box.
- (iii) Optimized performance with parallelization (via joblib) when possible.
- (iv) Powerful, customizable, interactive training logging and visualizer.
- (v) Full compatibility with other popular packages such as scikit-learn and imbalanced-learn.
- Currently (v0.1.4), it includes more than 15 ensemble algorithms based on re-sampling and cost-sensitive learning (e.g., SMOTEBoost/Bagging, RUSBoost/Bagging, AdaCost, EasyEnsemble, BalanceCascade, SelfPacedEnsemble, ...).
imbalanced-learn [Github][Documentation][Paper]
NOTE: written in python, easy to use.
imbalanced-learn is a Python package offering a number of re-sampling techniques commonly used in datasets showing strong between-class imbalance. It is compatible with scikit-learn and is part of the scikit-learn-contrib projects.
- Currently (v0.8.0), it includes 21 different re-sampling techniques, including over-sampling, under-sampling, and hybrid methods (e.g., SMOTE, ADASYN, TomekLinks, NearMiss, OneSideSelection, SMOTETomek, ...).
- This package also provides many utilities, e.g., a batch generator for Keras/TensorFlow; see the API reference.
smote_variants [Documentation][Github] - A collection of 85 minority over-sampling techniques for imbalanced learning with multi-class oversampling and model selection features (all written in Python, with R and Julia interfaces).
R
- smote_variants [Documentation][Github] - A collection of 85 minority over-sampling techniques for imbalanced learning with multi-class oversampling and model selection features (all written in Python, with R and Julia interfaces).
- caret [Documentation][Github] - Contains the implementation of Random under/over-sampling.
- ROSE [Documentation] - Contains the implementation of ROSE (Random Over-Sampling Examples).
- DMwR [Documentation] - Contains the implementation of SMOTE (Synthetic Minority Over-sampling TEchnique).
Java
KEEL [Github][Paper] - KEEL provides a simple GUI based on data flow to design experiments with different datasets and computational intelligence algorithms (paying special attention to evolutionary algorithms) in order to assess the behavior of the algorithms. This tool includes many widely used imbalanced learning techniques such as (evolutionary) over/under-resampling, cost-sensitive learning, algorithm modification, and ensemble learning methods.
NOTE: wide variety of classical classification, regression, preprocessing algorithms included.
Scala
- undersampling [Documentation][Github] - A Scala library for under-sampling and their ensemble variants in imbalanced classification.
Julia
- smote_variants [Documentation][Github] - A collection of 85 minority over-sampling techniques for imbalanced learning with multi-class oversampling and model selection features (all written in Python, with R and Julia interfaces).
研究论文 | Research Papers
综述 | Surveys
Learning from imbalanced data (IEEE TKDE, 2009, 6000+ citations) [Paper]
- Highly cited, classic survey paper. It systematically reviewed the popular solutions, evaluation metrics, and challenging problems in future research in this area (as of 2009).
Learning from imbalanced data: open challenges and future directions (2016, 900+ citations) [Paper]
- This paper concentrates on the open issues and challenges in imbalanced learning, i.e., extreme class imbalance, imbalance in online/stream learning, multi-class imbalanced learning, and semi/un-supervised imbalanced learning.
Learning from class-imbalanced data: Review of methods and applications (2017, 900+ citations) [Paper]
- A recent exhaustive survey of imbalanced learning methods and applications, a total of 527 papers were included in this study. It provides several detailed taxonomies of existing methods and also the recent trend of this research area.
集成学习 | Ensemble Learning
通用集成框架 | General ensemble
Self-paced Ensemble (ICDE 2020, 20+ citations) [Paper][Code][Slides][Zhihu/知乎][PyPI]
NOTE: versatile solution with outstanding performance and computational efficiency.
MESA: Boost Ensemble Imbalanced Learning with MEta-SAmpler (NeurIPS 2020) [Paper][Code][Video][Zhihu/知乎]
NOTE: learning an optimal sampling policy directly from data.
Exploratory Undersampling for Class-Imbalance Learning (IEEE Trans. on SMC, 2008, 1300+ citations) [Paper]
NOTE: simple but effective solution.
- EasyEnsemble [Code]
- BalanceCascade [Code]
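EasyEnsemble's core idea is to draw several independent balanced under-samples of the majority class instead of a single one, so no majority information is discarded permanently. A minimal sketch of the sampling stage (toy data; the original paper trains an AdaBoost learner on each subset and combines their outputs):

```python
import random

def easy_ensemble_subsets(X_maj, X_min, n_subsets=5, seed=0):
    """Draw `n_subsets` independent random under-samples of the majority
    class, each the size of the minority class, yielding perfectly
    balanced training subsets for the base classifiers."""
    rng = random.Random(seed)
    subsets = []
    for _ in range(n_subsets):
        maj_sample = rng.sample(X_maj, len(X_min))  # without replacement
        subsets.append((maj_sample, list(X_min)))
    return subsets

X_maj = list(range(100))        # 100 majority-class instances (toy data)
X_min = list(range(100, 110))   # 10 minority-class instances
subsets = easy_ensemble_subsets(X_maj, X_min)
```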
基于 Boosting 的方法 | Boosting-based
AdaBoost (1995, 18700+ citations) [Paper][Code] - Adaptive Boosting with C4.5
DataBoost (2004, 570+ citations) [Paper] - Boosting with Data Generation for Imbalanced Data
SMOTEBoost (2003, 1100+ citations) [Paper][Code] - Synthetic Minority Over-sampling TEchnique Boosting
MSMOTEBoost (2011, 1300+ citations) [Paper] - Modified Synthetic Minority Over-sampling TEchnique Boosting
RAMOBoost (2010, 140+ citations) [Paper] [Code] - Ranked Minority Over-sampling in Boosting
RUSBoost (2009, 850+ citations) [Paper] [Code] - Random Under-Sampling Boosting
AdaBoostNC (2012, 350+ citations) [Paper] - Adaptive Boosting with Negative Correlation Learning
EUSBoost (2013, 210+ citations) [Paper] - Evolutionary Under-sampling in Boosting
基于 Bagging 的方法 | Bagging-based
Bagging (1996, 20000+ citations) [Paper][Code] - Bagging predictor
Diversity Analysis on Imbalanced Data Sets by Using Ensemble Models (2009, 400+ citations) [Paper]
- UnderBagging [Code]
- OverBagging [Code]
- SMOTEBagging [Code]
基于代价敏感学习的方法 | Cost-sensitive ensemble
AdaCost (ICML 1999, 800+ citations) [Paper][Code] - Misclassification Cost-sensitive boosting
AdaUBoost (NIPS 1999, 100+ citations) [Paper][Code] - AdaBoost with Unequal loss functions
AsymBoost (NIPS 2001, 700+ citations) [Paper][Code] - Asymmetric AdaBoost and detector cascade
数据重采样 | Data resampling
过采样 | Over-sampling
ROS [Code] - Random Over-sampling
SMOTE (2002, 9800+ citations) [Paper][Code] - Synthetic Minority Over-sampling TEchnique
Borderline-SMOTE (2005, 1400+ citations) [Paper][Code] - Borderline-Synthetic Minority Over-sampling TEchnique
ADASYN (2008, 1100+ citations) [Paper][Code] - ADAptive SYNthetic Sampling
SPIDER (2008, 150+ citations) [Paper][Code(Java)] - Selective Preprocessing of Imbalanced Data
Safe-Level-SMOTE (2009, 370+ citations) [Paper][Code(Java)] - Safe Level Synthetic Minority Over-sampling TEchnique
SVM-SMOTE (2009, 120+ citations) [Paper][Code] - SMOTE based on Support Vectors of SVM
MDO (2015, 150+ citations) [Paper][Code] - Mahalanobis Distance-based Over-sampling for Multi-Class imbalanced problems.
NOTE: See more over-sampling methods at smote-variants.
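The common idea behind SMOTE and its variants is to synthesize minority points by interpolating between a minority sample and one of its minority nearest neighbors. A minimal, non-vectorized sketch (real implementations vectorize the neighbor search and control the amount of over-sampling):

```python
import math
import random

def smote_sample(X_min, k=2, seed=0):
    """For each minority point, pick one of its k nearest minority
    neighbors and generate a synthetic point on the segment between
    them: x_new = x + gap * (neighbor - x), gap ~ U(0, 1)."""
    rng = random.Random(seed)
    synthetic = []
    for x in X_min:
        # k nearest minority neighbors by Euclidean distance
        # (identity check excludes x itself, even if values repeat).
        neighbors = sorted((p for p in X_min if p is not x),
                           key=lambda p: math.dist(x, p))[:k]
        n = rng.choice(neighbors)
        gap = rng.random()
        synthetic.append(tuple(xi + gap * (ni - xi) for xi, ni in zip(x, n)))
    return synthetic

X_min = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
new_points = smote_sample(X_min)
```

Because each synthetic point is a convex combination of two existing minority points, it always lies inside the minority region spanned by its parents.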
欠采样 | Under-sampling
RUS [Code] - Random Under-sampling
CNN (1968, 2100+ citations) [Paper][Code] - Condensed Nearest Neighbor
ENN (1972, 1500+ citations) [Paper] [Code] - Edited Nearest Neighbor rule
TomekLink (1976, 870+ citations) [Paper][Code] - Tomek's modification of Condensed Nearest Neighbor
NCR (2001, 500+ citations) [Paper][Code] - Neighborhood Cleaning Rule
NearMiss-1 & 2 & 3 (2003, 420+ citations) [Paper][Code] - Several kNN approaches to unbalanced data distributions.
CNN with TomekLink (2004, 2000+ citations) [Paper][Code(Java)] - Condensed Nearest Neighbor + TomekLink
OSS (2007, 2100+ citations) [Paper][Code] - One Side Selection
EUS (2009, 290+ citations) [Paper] - Evolutionary Under-sampling
IHT (2014, 130+ citations) [Paper][Code] - Instance Hardness Threshold
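Several of the cleaning-based under-samplers above are built on the Tomek-link criterion: a pair of differently-labeled points forms a Tomek link if each is the other's 1-nearest neighbor, and under-sampling removes the majority-class member (or both). A minimal sketch on toy 1-D data:

```python
import math

def tomek_links(X, y):
    """Return index pairs (i, j), i < j, where x_i and x_j have
    different labels and are mutual 1-nearest neighbors."""
    def nearest(i):
        return min((j for j in range(len(X)) if j != i),
                   key=lambda j: math.dist(X[i], X[j]))
    links = []
    for i in range(len(X)):
        j = nearest(i)
        if y[i] != y[j] and nearest(j) == i and i < j:
            links.append((i, j))
    return links

# Toy data: points 1 and 2 sit close together on opposite sides of the boundary.
X = [(0.0,), (1.0,), (1.2,), (3.0,)]
y = [0, 0, 1, 1]
print(tomek_links(X, y))  # [(1, 2)]
```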
混合采样 | Hybrid-sampling
A Study of the Behavior of Several Methods for Balancing Training Data (2004, 2000+ citations) [Paper]
NOTE: extensive experimental evaluation involving 10 different over/under-sampling methods.
- SMOTE-Tomek [Code]
- SMOTE-ENN [Code]
SMOTE-RSB (2012, 210+ citations) [Paper][Code] - Hybrid Preprocessing using SMOTE and Rough Sets Theory
SMOTE-IPF (2015, 180+ citations) [Paper][Code] - SMOTE with Iterative-Partitioning Filter
代价敏感学习 | Cost-sensitive Learning
CSC4.5 (2002, 420+ citations) [Paper][Code(Java)] - An instance-weighting method to induce cost-sensitive trees
CSSVM (2008, 710+ citations) [Paper][Code(Java)] - Cost-sensitive SVMs for highly imbalanced classification
CSNN (2005, 950+ citations) [Paper][Code(Java)] - Training cost-sensitive neural networks with methods addressing the class imbalance problem.
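A common heuristic for setting misclassification costs (not tied to any single paper above) is to weight each class inversely to its frequency, so that every class contributes equally to a weighted loss in expectation. A minimal sketch:

```python
from collections import Counter

def inverse_frequency_weights(y):
    """Return per-class weights n / (k * n_c), where n is the total
    sample count, k the number of classes, and n_c the size of class c.
    Rare classes receive proportionally larger weights."""
    counts = Counter(y)
    n, k = len(y), len(counts)
    return {c: n / (k * m) for c, m in counts.items()}

y = [0] * 90 + [1] * 10
weights = inverse_frequency_weights(y)
print(weights)  # {0: 0.555..., 1: 5.0}
```

This is the same "balanced" heuristic scikit-learn exposes via `class_weight="balanced"`.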
深度学习 | Deep Learning
综述 | Surveys
A systematic study of the class imbalance problem in convolutional neural networks (2018, 330+ citations) [Paper]
Survey on deep learning with class imbalance (2019, 50+ citations) [Paper]
NOTE: a recent comprehensive survey of the class imbalance problem in deep learning.
图数据挖掘 | Graph Neural Networks
- GraphSMOTE: Imbalanced Node Classification on Graphs with Graph Neural Networks (WSDM 2021) [Paper][Code]
- Topology-Imbalance Learning for Semi-Supervised Node Classification (NeurIPS 2021) [Paper][Code]
- GraphENS: Neighbor-Aware Ego Network Synthesis for Class-Imbalanced Node Classification (ICLR 2022) [Paper][Code]
- LTE4G: Long-Tail Experts for Graph Neural Networks (CIKM 2022) [Paper][Code]
难例挖掘 | Hard example mining
- Training region-based object detectors with online hard example mining (CVPR 2016, 840+ citations) [Paper][Code] - In the later phase of NN training, only do gradient back-propagation for "hard examples" (i.e., with large loss value)
损失函数设计 | Loss function engineering
Focal loss for dense object detection (ICCV 2017, 2600+ citations) [Paper][Code (detectron2)][Code (unofficial)] - A uniform loss function that focuses training on a sparse set of hard examples to prevent the vast number of easy negatives from overwhelming the detector during training.
NOTE: elegant solution, high influence.
Training deep neural networks on imbalanced data sets (IJCNN 2016, 110+ citations) [Paper] - Mean (square) false error that can equally capture classification errors from both the majority class and the minority class.
Deep imbalanced attribute classification using visual attention aggregation (ECCV 2018, 30+ citations) [Paper][Code]
Imbalanced deep learning by minority class incremental rectification (TPAMI 2018, 60+ citations) [Paper] - Class Rectification Loss for minimizing the dominant effect of majority classes by discovering sparsely sampled boundaries of minority classes in an iterative batch-wise learning process.
Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss (NIPS 2019, 10+ citations) [Paper][Code] - A theoretically-principled label-distribution-aware margin (LDAM) loss motivated by minimizing a margin-based generalization bound.
Gradient harmonized single-stage detector (AAAI 2019, 40+ citations) [Paper][Code] - Compared to Focal Loss, which only down-weights "easy" negative examples, GHM also down-weights "very hard" examples as they are likely to be outliers.
Class-Balanced Loss Based on Effective Number of Samples (CVPR 2019, 70+ citations) [Paper][Code] - a simple and generic class-reweighting mechanism based on Effective Number of Samples.
Influence-Balanced Loss for Imbalanced Visual Classification (ICCV 2021) [Paper][Code]
AutoBalance: Optimized Loss Functions for Imbalanced Data (NeurIPS 2021) [Paper]
Label-Imbalanced and Group-Sensitive Classification under Overparameterization (NeurIPS 2021) [Paper][Code]
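As a concrete example of the loss functions above, the binary focal loss of Lin et al. (2017) down-weights well-classified examples by the modulating factor (1 - p_t)^gamma; with gamma = 0 and no alpha-weighting it reduces to plain cross-entropy. A minimal sketch:

```python
import math

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """FL(p_t) = -alpha_t * (1 - p_t)**gamma * log(p_t), where p is the
    predicted probability of the positive class and y in {0, 1}."""
    p_t = p if y == 1 else 1 - p
    alpha_t = alpha if y == 1 else 1 - alpha
    return -alpha_t * (1 - p_t) ** gamma * math.log(p_t)

# A well-classified easy positive is strongly down-weighted...
easy = focal_loss(0.95, y=1)
# ...while a hard, misclassified positive keeps most of its loss.
hard = focal_loss(0.05, y=1)
```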
元学习 | Meta-learning
Learning to model the tail (NIPS 2017, 70+ citations) [Paper] - Transfer meta-knowledge from the data-rich classes in the head of the distribution to the data-poor classes in the tail.
Learning to reweight examples for robust deep learning (ICML 2018, 150+ citations) [Paper][Code] - Implicitly learn a weight function to reweight the samples in gradient updates of DNN.
NOTE: representative work to solve the class imbalance problem through meta-learning.
Meta-weight-net: Learning an explicit mapping for sample weighting (NIPS 2019) [Paper][Code] - Explicitly learn a weight function (with an MLP as the function approximator) to reweight the samples in gradient updates of DNN.
Learning Data Manipulation for Augmentation and Weighting (NIPS 2019) [Paper][Code]
Learning to Balance: Bayesian Meta-Learning for Imbalanced and Out-of-distribution Tasks (ICLR 2020) [Paper][Code]
MESA: Boost Ensemble Imbalanced Learning with MEta-SAmpler (NeurIPS 2020) [Paper][Code][Video]
NOTE: meta-learning-powered ensemble learning
表示学习 | Representation Learning
Learning deep representation for imbalanced classification (CVPR 2016, 220+ citations) [Paper]
Supervised Class Distribution Learning for GANs-Based Imbalanced Classification (ICDM 2019) [Paper]
Decoupling Representation and Classifier for Long-tailed Recognition (ICLR 2020) [Paper][Code]
NOTE: interesting findings on representation learning and classifier learning
Supercharging Imbalanced Data Learning With Energy-based Contrastive Representation Transfer (NeurIPS 2021) [Paper]
后验概率校准 | Posterior Recalibration
Posterior Re-calibration for Imbalanced Datasets (NeurIPS 2020) [Paper][Code]
Long-tail learning via logit adjustment (ICLR 2021) [Paper][Code]
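Post-hoc logit adjustment (Menon et al., ICLR 2021) recalibrates a trained classifier by subtracting tau * log(class prior) from each logit, which compensates for the skewed label distribution at prediction time. A minimal sketch with invented scores and priors:

```python
import math

def logit_adjust(logits, priors, tau=1.0):
    """Subtract tau * log(prior_c) from each class logit; rare classes
    (small priors) receive a larger boost."""
    return [z - tau * math.log(p) for z, p in zip(logits, priors)]

logits = [2.0, 1.8]    # raw scores: class 0 slightly ahead
priors = [0.99, 0.01]  # but class 0 is 99x more frequent in training
adjusted = logit_adjust(logits, priors)
print(adjusted.index(max(adjusted)))  # the rare class now wins
```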
半监督/自监督学习 | Semi/Self-supervised Learning
Rethinking the Value of Labels for Improving Class-Imbalanced Learning (NeurIPS 2020) [Paper][Code][Video]
NOTE: semi-supervised training / self-supervised pre-training helps imbalance learning
Distribution Aligning Refinery of Pseudo-label for Imbalanced Semi-supervised Learning (NeurIPS 2020) [Paper][Code]
ABC: Auxiliary Balanced Classifier for Class-imbalanced Semi-supervised Learning (NeurIPS 2021) [Paper][Code]
Improving Contrastive Learning on Imbalanced Data via Open-World Sampling (NeurIPS 2021) [Paper]
DASO: Distribution-Aware Semantics-Oriented Pseudo-label for Imbalanced Semi-Supervised Learning (CVPR 2022) [Paper][Code]
课程学习 | Curriculum Learning
- Dynamic Curriculum Learning for Imbalanced Data Classification (ICCV 2019) [Paper]
双阶段训练 | Two-phase Training
Brain tumor segmentation with deep neural networks (2017, 1200+ citations) [Paper][Code (unofficial)]
Pre-train on a balanced dataset, then fine-tune the last output layer (before the softmax) on the original, imbalanced data.
网络结构 | Network Architecture
BBN: Bilateral-Branch Network with Cumulative Learning for Long-Tailed Visual Recognition (CVPR 2020) [Paper][Code]
Class-Imbalanced Deep Learning via a Class-Balanced Ensemble (TNNLS 2021) [Paper]
深度生成网络 | Deep Generative Model
- Deep Generative Model for Robust Imbalance Classification (CVPR 2020) [Paper]
不平衡回归 | Imbalanced Regression
Delving into Deep Imbalanced Regression (ICML 2021) [Paper][Code][Video]
Density-based weighting for imbalanced regression (Machine Learning [J], 2021) [Paper][Code]
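The density-based weighting idea carries over from classification counts to continuous targets: estimate the label density and weight each sample by its inverse, so rare target values get larger loss weights. A minimal sketch with a simple triangular kernel (the cited paper uses a proper kernel density estimate):

```python
def density_weights(y, bandwidth=1.0):
    """Weight each regression target by the inverse of a crude kernel
    density estimate over the labels; isolated targets get the
    largest weights."""
    def density(t):
        return sum(max(0.0, 1.0 - abs(t - v) / bandwidth) for v in y) / len(y)
    return [1.0 / density(t) for t in y]

# Targets cluster near 0; the lone sample at 5.0 gets the largest weight.
y = [0.0, 0.1, 0.2, 0.1, 5.0]
w = density_weights(y)
```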
异常检测 | Anomaly Detection
综述 | Surveys
Anomaly detection: A survey (ACM computing surveys, 2009, 9000+ citations) [Paper]
A survey of network anomaly detection techniques (2017, 700+ citations) [Paper]
基于分类的方法 | Classification-based
One-class SVMs for document classification (JMLR, 2001, 1300+ citations) [Paper]
One-class Collaborative Filtering (ICDM 2008, 1000+ citations) [Paper]
Isolation Forest (ICDM 2008, 1000+ citations) [Paper]
Anomaly Detection using One-Class Neural Networks (2018, 200+ citations) [Paper]
Anomaly Detection with Robust Deep Autoencoders (KDD 2017, 170+ citations) [Paper]
杂项 | Miscellaneous
数据集 | Datasets
imbalanced-learn datasets

This collection of datasets is from `imblearn.datasets.fetch_datasets`.

| ID | Name | Repository & Target | Ratio | #S | #F |
|----|------|---------------------|-------|------|-----|
| 1 | ecoli | UCI, target: imU | 8.6:1 | 336 | 7 |
| 2 | optical_digits | UCI, target: 8 | 9.1:1 | 5,620 | 64 |
| 3 | satimage | UCI, target: 4 | 9.3:1 | 6,435 | 36 |
| 4 | pen_digits | UCI, target: 5 | 9.4:1 | 10,992 | 16 |
| 5 | abalone | UCI, target: 7 | 9.7:1 | 4,177 | 10 |
| 6 | sick_euthyroid | UCI, target: sick euthyroid | 9.8:1 | 3,163 | 42 |
| 7 | spectrometer | UCI, target: >=44 | 11:1 | 531 | 93 |
| 8 | car_eval_34 | UCI, target: good, v good | 12:1 | 1,728 | 21 |
| 9 | isolet | UCI, target: A, B | 12:1 | 7,797 | 617 |
| 10 | us_crime | UCI, target: >0.65 | 12:1 | 1,994 | 100 |
| 11 | yeast_ml8 | LIBSVM, target: 8 | 13:1 | 2,417 | 103 |
| 12 | scene | LIBSVM, target: >one label | 13:1 | 2,407 | 294 |
| 13 | libras_move | UCI, target: 1 | 14:1 | 360 | 90 |
| 14 | thyroid_sick | UCI, target: sick | 15:1 | 3,772 | 52 |
| 15 | coil_2000 | KDD, CoIL, target: minority | 16:1 | 9,822 | 85 |
| 16 | arrhythmia | UCI, target: 06 | 17:1 | 452 | 278 |
| 17 | solar_flare_m0 | UCI, target: M->0 | 19:1 | 1,389 | 32 |
| 18 | oil | UCI, target: minority | 22:1 | 937 | 49 |
| 19 | car_eval_4 | UCI, target: vgood | 26:1 | 1,728 | 21 |
| 20 | wine_quality | UCI, wine, target: <=4 | 26:1 | 4,898 | 11 |
| 21 | letter_img | UCI, target: Z | 26:1 | 20,000 | 16 |
| 22 | yeast_me2 | UCI, target: ME2 | 28:1 | 1,484 | 8 |
| 23 | webpage | LIBSVM, w7a, target: minority | 33:1 | 34,780 | 300 |
| 24 | ozone_level | UCI, ozone, data | 34:1 | 2,536 | 72 |
| 25 | mammography | UCI, target: minority | 42:1 | 11,183 | 6 |
| 26 | protein_homo | KDD CUP 2004, minority | 111:1 | 145,751 | 74 |
| 27 | abalone_19 | UCI, target: 19 | 130:1 | 4,177 | 10 |

Imbalanced Databases

Link: GitHub - gykovacs/common_datasets: machine learning databases
Github 项目 | Github Repositories
算法实现 & 实用程序 & 教程 | Algorithms & Utilities & Jupyter Notebooks
imbalanced-algorithms - Python-based implementations of algorithms for learning on imbalanced data.
imbalanced-dataset-sampler - A (PyTorch) imbalanced dataset sampler for oversampling low frequent classes and undersampling high frequent ones.
class_imbalance - Jupyter Notebook presentation for class imbalance in binary classification.
Multi-class-with-imbalanced-dataset-classification - Perform multi-class classification on imbalanced 20-news-group dataset.
Advanced Machine Learning with scikit-learn: Imbalanced classification and text data - Different approaches to feature selection, and resampling methods for imbalanced data.
论文列表 | Paper list
Anomaly Detection Learning Resources by yzhao062 - Anomaly detection related books, papers, videos, and toolboxes.
Paper-list-on-Imbalanced-Time-series-Classification-with-Deep-Learning - Imbalanced Time-series Classification
幻灯片 | Slides
- acm_imbalanced_learning - slides and code for the ACM Imbalanced Learning talk on 27th April 2016 in Austin, TX.