Contents

Tri-party Deep Network Representation

Essence

Thinking

Abstract

Introduction

Problem Definition

Tri-DNR pipelines

Model Architecture


Tri-party Deep Network Representation

Essence

1) DeepWalk extracts graph structure information, i.e., the structural node order (the context of surrounding nodes in random walks).

2) Text context information = the order of text words

= the semantics of the text

Thinking

Graph node representation uses structure, node content, and label information. What if I applied betweenness-centrality clustering (a method for identifying research teams) to social message clustering?

Abstract

concept: network representation -> represent each node in a vector format rather than in other formats

Aim: to learn a low-dimensional vector vi for each node vi in the network, so that nodes that are close to each other in network topology, have similar text content, or share the same class label are close in the representation space.
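This aim can be illustrated with a toy check: under any good embedding, a node should be more similar to a node it shares structure/text/label with than to an unrelated one. A minimal sketch (all vectors below are made up, not learned):

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity, the usual closeness measure in embedding space."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# hypothetical learned embeddings: v0 and v1 are topological neighbours with
# similar text and the same label; v2 shares none of these with v0
v0 = np.array([1.0, 0.2, 0.0])
v1 = np.array([0.9, 0.3, 0.1])
v2 = np.array([-0.8, 0.1, 0.9])

sim_close = cosine(v0, v1)
sim_far = cosine(v0, v2)
```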

problem: existing methods only focus on one aspect of node information and cannot leverage node labels.

solution: Tri-DNR

node information: node structure, node context, node labels -> jointly learn the optimal node representation. In this paper, "node information" is not a single concept but an umbrella term.

1) Network structural level. Tri-DNR exploits the inter-node relationship by maximizing the probability of observing surrounding nodes given a node in random walks.

2) Node content level. Tri-DNR captures node-word correlations by maximizing the co-occurrence of a word sequence given a node.

3) Node label level. Tri-DNR models node-label correspondence by maximizing the probability of a word sequence given a class label.
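The three levels above can be sketched as three log-likelihood terms over shared embedding matrices. This is a minimal numpy illustration, not the paper's exact formulation: the dimensions, the weight `alpha`, and the full-softmax scoring are my own assumptions (word2vec-style models typically approximate the softmax in practice). The key coupling is that the content and label terms score against the same word output matrix:

```python
import numpy as np

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

def log_prob(center_vec, out_vecs, target_idx):
    """log p(target | center) under a full softmax over output vectors."""
    return float(np.log(softmax(out_vecs @ center_vec)[target_idx]))

rng = np.random.default_rng(0)
d, n_nodes, n_words, n_labels = 8, 4, 6, 2
node_in = rng.normal(size=(n_nodes, d))   # input node vectors (the embeddings)
node_out = rng.normal(size=(n_nodes, d))
word_out = rng.normal(size=(n_words, d))  # shared word output layer (the coupling)
label_in = rng.normal(size=(n_labels, d))

# 1) structure level: p(surrounding node | node) from random-walk windows
l_struct = log_prob(node_in[0], node_out, 1)
# 2) content level: p(word | node) ties nodes to words via word_out
l_content = log_prob(node_in[0], word_out, 3)
# 3) label level: p(word | label) uses the same word_out, tying labels to content
l_label = log_prob(label_in[0], word_out, 3)

alpha = 0.5  # illustrative text/label weight
loss = -((1 - alpha) * l_struct + alpha * (l_content + l_label))
```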

Introduction

Problem: networked data are complex, with rich node content information, while the network structure itself is naturally sparse.

solution: encodes each node in a common, continuous, and low-dimensional space.

-> problem: existing methods mainly employ either network-structure-based methods or node-content-based methods.

(1) Methods based on network structure.

problem: these methods take the network structure as input but ignore the content information associated with each node.

(2) Content perspectives

problem: TF-IDF, LDA, etc. do not consider the context information of a document.

-> suboptimal representations

solution: skip-gram model -> paragraph vector model for an arbitrary piece of text.
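The paragraph vector idea (the PV-DM variant) can be sketched as follows. Only the forward scoring step is shown, with tiny illustrative dimensions and random vectors in place of trained ones: the document vector acts as an extra "word" averaged with the context to predict the next word.

```python
import numpy as np

def pvdm_logits(doc_vec, context_vecs, word_out):
    """PV-DM style: average the document vector with context word vectors,
    then score every vocabulary word (full softmax logits, no training)."""
    h = np.vstack([doc_vec, context_vecs]).mean(axis=0)
    return word_out @ h

rng = np.random.default_rng(1)
d, vocab = 8, 5
word_in = rng.normal(size=(vocab, d))   # input word vectors
word_out = rng.normal(size=(vocab, d))  # output word vectors
doc_vec = rng.normal(size=d)            # the paragraph vector

# a 3-word context window (word ids are illustrative) predicts the next word
context = word_in[[0, 1, 2]]
logits = pvdm_logits(doc_vec, context, word_out)
pred = int(np.argmax(logits))
```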

problem: the drawback of existing methods is twofold:

1) They only utilize one source of information (shallow).

2) All methods learn network embeddings in a fully unsupervised way, even though node labels provide useful information.

challenges: the main challenges of learning latent representations for network nodes are:

(1) How to integrate network structure, node content, and label information, in order to exploit both structure and text information for network representation?

solution: TADW

-> problems: TADW has following drawbacks:

- The accurate matrix M for factorization is non-trivial and very difficult to obtain, so TADW has to factorize an approximate matrix.

- It simply ignores the context of the text information, so it cannot capture the semantics of words and nodes.

- TADW requires expensive matrix operations.

solution: Tri-DNR

- Network structure level: maximizing the probability of surrounding nodes -> inter-node relationships.

- Node content and label levels: maximizing the co-occurrence of word sequences, conditioned on a node and on a label respectively.

(2) How to model all three sources in a neural network?

A straightforward solution for challenge 2: separately learn a node vector using DeepWalk for the network structure and a document vector via the paragraph vector model.

-> Concatenate: node vector + document vector
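The concatenation baseline amounts to stacking the two independently trained per-node vectors side by side (random values stand in for the real DeepWalk and paragraph-vector outputs here):

```python
import numpy as np

rng = np.random.default_rng(2)
n_nodes, d_struct, d_text = 4, 16, 16

# placeholders for independently trained embeddings:
# struct_vecs would come from DeepWalk, text_vecs from paragraph vectors
struct_vecs = rng.normal(size=(n_nodes, d_struct))
text_vecs = rng.normal(size=(n_nodes, d_text))

# the naive baseline: concatenate the two vectors for each node
combined = np.hstack([struct_vecs, text_vecs])
```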

Problem: this is suboptimal; it ignores the label information and overlooks the interactions between network structure and text information.

Problem Definition

G = (V, E, D, C)

V denotes nodes; E denotes edges; D denotes the node text; C denotes labels.
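One way to hold the four components in code (the class and field names are my own, not from the paper; labels are kept partial, since not every node need be labelled):

```python
from dataclasses import dataclass

@dataclass
class InfoNetwork:
    """G = (V, E, D, C): nodes, edges, node texts, and (partial) node labels."""
    V: list   # node ids
    E: list   # list of (u, v) edge pairs
    D: dict   # node -> text string
    C: dict   # node -> class label; only labelled nodes appear

G = InfoNetwork(
    V=[0, 1, 2],
    E=[(0, 1), (1, 2)],
    D={0: "deep network representation", 1: "random walk", 2: "label info"},
    C={0: "ML"},  # labels may be available for only a subset of nodes
)
```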

Tri-DNR pipelines

(1) Random walk sequence generation

-> uses the network structure to capture node relationships

(2) Coupled neural network model learning
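The first pipeline step can be sketched as a truncated random-walk generator; the resulting node sequences play the role of "sentences" for the structural model. The toy graph, walk count, and walk length below are illustrative assumptions:

```python
import random

def random_walks(adj, walks_per_node=2, walk_len=5, seed=0):
    """Generate truncated random walks; each walk is a 'sentence' of node
    ids, so a node's skip-gram context is its structural neighbourhood."""
    rng = random.Random(seed)
    walks = []
    for _ in range(walks_per_node):
        for start in adj:
            walk = [start]
            while len(walk) < walk_len:
                nbrs = adj[walk[-1]]
                if not nbrs:
                    break  # dead end: stop this walk early
                walk.append(rng.choice(nbrs))
            walks.append(walk)
    return walks

# toy undirected graph: a 0-1-2 triangle plus pendant node 3
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
walks = random_walks(adj)
```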

Model Architecture

(1) Inter-node relationship modeling: node vectors learned from random walk sequences.

(2) Node-content correlation assessment: the contextual information of words within a document.

(3) Connections

(4) Label-content correspondence modeling

-> the text information and label information will jointly affect V', the output representation of word wj, which will further propagate back to influence the input representation of vi∈V in the network.

As a result, the node representations (the input vectors of nodes) are enhanced by the network structure, text content, and label information together.
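The propagation through the shared output layer can be sketched with single softmax-regression updates. This is a simplification under assumed names and sizes (a full softmax gradient, hand-picked learning rate): a label-word pair first updates the shared word output matrix, and a node-word pair then trains against that changed matrix, so label information indirectly shapes the node vector.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def sgd_step(in_vec, out_vecs, target, lr=0.1):
    """One softmax cross-entropy step: returns updated copies of the input
    vector and the shared output matrix (the coupling point)."""
    p = softmax(out_vecs @ in_vec)
    err = p.copy()
    err[target] -= 1.0                        # dL/dlogits for cross-entropy
    new_out = out_vecs - lr * np.outer(err, in_vec)
    new_in = in_vec - lr * (out_vecs.T @ err)
    return new_in, new_out

rng = np.random.default_rng(3)
d, vocab = 6, 5
word_out = rng.normal(size=(vocab, d))        # shared output layer V'
node_vec = rng.normal(size=d)
label_vec = rng.normal(size=d)

# a (label, word) pair updates word_out; the node's next update then
# sees the label-influenced word_out
_, word_out = sgd_step(label_vec, word_out, target=2)
node_vec, word_out = sgd_step(node_vec, word_out, target=2)
```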
