Predicting Road Traffic Vehicle Speed with a GCN-LSTM Model

GCN: graph convolutional network (a variant of graph neural network, GNN)    LSTM: long short-term memory network

Vehicle Speed Forecasting Based On GCN-LSTM Combined Model

Summary

This work presents a multistep traffic flow forecasting framework based on an attribute-augmented spatio-temporal graph convolutional network combined with a long short-term memory network (GCN-LSTM) to address traffic flow forecasting on a road network. The model captures the complicated dependency structure among road nodes: using localized spectral graph convolution, it gathers information from the K-order local neighbors of each road node. Replacing the single-hop neighborhood matrix with the K-order local neighborhood broadens the receptive field of the graph convolution, so information is collected from neighboring nodes more precisely and the high-order neighborhood of each road node is fully considered rather than only its first-order neighbors. In addition, an external-attribute augmentation unit is designed to incorporate external factors that influence traffic flow (weather, points of interest, time, etc.) and strengthen the framework's forecasting reliability. The empirical findings show that the model performs well when static attributes, dynamic attributes, and their combination are evaluated.

Introduction

Traffic forecasting is an important topic in intelligent transportation research: accurate traffic flow prediction can help alleviate congestion and support travel planning and traffic management for drivers, operators, and decision-makers [Li, Junyi, et al., 2021]. Unexpected events [Fan et al., 2017], time-varying factors, and fixed factors all significantly shape the intricate temporal and spatial dependencies in traffic flow. Yu et al. [2017] note that ARIMA-type models capture only linear temporal structure in time-series data; they struggle to exploit relationships across data streams and are no longer adequate for today's applications. Furthermore, while conventional linear models, including the family of Kalman filtering approaches introduced and refined by Cao et al. [2020], have improved forecasting quality in some respects, their capacity to accommodate nonlinear traffic flow data remains limited and their forecasting time grows [Yang et al., 2017].

Conventional machine learning methods, including support vector regression (SVR) [Fu et al., 2016], the k-nearest neighbor (KNN) algorithm [Liu et al., 2021], and decision tree models [Nguyen et al., 2018], can mine the key patterns and rich detail hidden in traffic flow from large datasets [Chen et al., 2018] and effectively improve the traffic flow forecasting process.

The growth of large neural network architectures has advanced the use of artificial intelligence in traffic forecasting. While simple network architectures can improve traffic forecast accuracy [Ma et al., 2017], they suffer from drawbacks such as slow convergence and over-fitting [Feng et al., 2019]. Recurrent neural networks (RNN) [Zhang et al., 2018], long short-term memory networks (LSTM) [Li et al., 2017], and gated recurrent units (GRU) [Li, Linjia, et al., 2021] exploit their recurrent (self-loop) structure to learn time-series properties and improve forecast performance compared with classic neural network models. As a result, they are incorporated into many frameworks to predict traffic speed, travel time, traffic flow, and related quantities.

A Detailed Description of the Model

Graph Convolutional Network (GCN)

The transport system can naturally be represented as a graph of nodes and edges, so graph-based methods have been applied to dynamic shortest-path routing, traffic congestion assessment, and dynamic traffic assignment [Aghdam et al., 2021].

The most common strategy for analyzing such a graph is to define a spectral filter in the spectral domain [Guo et al., 2019] and build the convolution on the graph Laplacian matrix, yielding a spectral graph convolution model. We use localized spectral graph convolution with a polynomial filter to reduce the number of parameters and save computation time. However, computing powers of the Laplacian matrix is still expensive and high-dimensional, so the Chebyshev polynomial is introduced to approximate the K-order localized filter, which reduces the computational complexity from quadratic to linear [Azari et al., 2019].

As shown in Figure 1, the spectral graph convolution based on the Chebyshev polynomial approximation can gather information from the K-order local neighbors of the graph's vertices, fully accounting for each node's high-order neighborhood rather than only its single-hop neighborhood. In this section, the receptive field of the graph convolution is expanded by substituting the K-order local neighborhood for the single-hop neighborhood matrix, which extracts information from neighboring nodes more precisely.

Fig 1: K-hop neighbors of graph convolution
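To make the K-order localized filtering concrete, the following Python sketch shows one way a Chebyshev-polynomial graph convolution could be implemented with NumPy. The function names, the ReLU activation, and the random usage example are illustrative assumptions rather than the paper's actual code.

```python
import numpy as np

def scaled_laplacian(adj):
    """Scaled graph Laplacian: L_tilde = 2 L / lambda_max - I."""
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj                                    # combinatorial Laplacian L = D - A
    lam_max = np.linalg.eigvalsh(lap).max()            # largest eigenvalue of L
    return 2.0 * lap / lam_max - np.eye(adj.shape[0])

def cheb_graph_conv(x, adj, weights):
    """K-order Chebyshev graph convolution.

    x       : (num_nodes, in_feats) node feature matrix
    adj     : (num_nodes, num_nodes) adjacency matrix of the road graph
    weights : list of K arrays, each (in_feats, out_feats)
    """
    L = scaled_laplacian(adj)
    Tx = [x, L @ x]                                    # T_0(L)x = x, T_1(L)x = Lx
    for _ in range(2, len(weights)):
        Tx.append(2.0 * L @ Tx[-1] - Tx[-2])           # Chebyshev recurrence
    out = sum(t @ w for t, w in zip(Tx, weights))      # sum_k T_k(L) x theta_k
    return np.maximum(out, 0.0)                        # ReLU activation

# Hypothetical usage on a small random road graph (illustrative only)
rng = np.random.default_rng(0)
A = rng.integers(0, 2, size=(5, 5)); A = np.triu(A, 1); A = A + A.T
X = rng.normal(size=(5, 3))
W = [rng.normal(size=(3, 8)) for _ in range(3)]        # K = 3
H = cheb_graph_conv(X, A, W)                           # (5, 8) node embeddings
```

Because T_k(L) is a polynomial of degree k in the Laplacian, the k-th term mixes information from nodes up to k hops away, which corresponds to the K-order neighborhood illustrated in Figure 1.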

GCN-LSTM Structure

To capture the complicated spatial linkage and dynamic temporal correlation of real-world traffic data, we combine the GCN with a long short-term memory (LSTM) network. LSTM is an improved recurrent neural network (RNN), and when the training time series is long enough, LSTM outperforms ARIMA [Li, Yiqun, et al., 2021]. The fundamental unit of the LSTM hidden layer is a special memory cell rather than a typical neuron node. This memory cell allows LSTM to alleviate the gradient problems of standard RNNs while capturing the temporal correlation of traffic flow. The GCN component constructs a graph of the road sections' traffic statistics from a predefined graph representation [Sun et al., 2017]. At each time step it captures the spatial variation of the road sections in the road network by learning a representation of each section that incorporates the properties of the node's local neighbors; these time-varying feature representations are then fed into the LSTM model [Li, Tao, et al., 2021].
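A minimal PyTorch sketch of this pipeline is shown below, assuming a single-layer graph convolution with a pre-normalized adjacency matrix and an LSTM over the flattened per-step node embeddings. The class and parameter names (GCNLSTM, gcn_hidden, horizon, etc.) and all shapes are hypothetical and do not reproduce the authors' exact architecture.

```python
import torch
import torch.nn as nn

class GCNLSTM(nn.Module):
    """Minimal GCN-LSTM sketch: a per-step graph convolution followed by an LSTM."""

    def __init__(self, num_nodes, in_feats, gcn_hidden, lstm_hidden, horizon):
        super().__init__()
        self.num_nodes, self.horizon = num_nodes, horizon
        self.gcn_weight = nn.Linear(in_feats, gcn_hidden, bias=False)   # graph conv weights
        self.lstm = nn.LSTM(num_nodes * gcn_hidden, lstm_hidden, batch_first=True)
        self.readout = nn.Linear(lstm_hidden, num_nodes * horizon)

    def forward(self, x, adj_norm):
        # x: (batch, seq_len, num_nodes, in_feats); adj_norm: normalized adjacency (N, N)
        b, t, n, _ = x.shape
        h = torch.relu(adj_norm @ self.gcn_weight(x))   # spatial aggregation at every step
        h = h.reshape(b, t, n * h.shape[-1])            # flatten node embeddings per step
        out, _ = self.lstm(h)                           # temporal modelling over the sequence
        pred = self.readout(out[:, -1])                 # project the last hidden state
        return pred.reshape(b, self.horizon, self.num_nodes)

# Hypothetical usage (all sizes are illustrative assumptions)
model = GCNLSTM(num_nodes=156, in_feats=1, gcn_hidden=16, lstm_hidden=64, horizon=3)
speeds = torch.randn(8, 12, 156, 1)      # 12 historical steps for 156 road nodes
adj = torch.eye(156)                     # stand-in for the normalized road adjacency
forecast = model(speeds, adj)            # (8, 3, 156) multistep speed forecast
```

In this arrangement the graph convolution handles spatial aggregation at every time step, while the LSTM carries temporal state across steps and its final hidden state is projected to the multistep forecast.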

Experiment Results and Analysis

Analysis of Static Attributes

To assess its effectiveness, we compare the proposed GCN-LSTM model against several frequently used approaches. The frameworks are as follows:

  1. Historical average model (HA)
  2. Autoregressive integrated moving average model (ARIMA) with Kalman filter
  3. Support vector regression (SVR)
  4. Diffusion convolution recurrent neural network (DCRNN)
  5. GCN-LSTM (the proposed model)

HA, ARIMA, and SVR are traditional non-neural-network models; DCRNN is a deep learning model that can capture spatial information; and GCN-LSTM is a deep learning model that jointly incorporates the spatial characteristics and dynamic temporal dependencies of traffic data. The overall forecasting performance of the GCN-LSTM framework and the four baseline approaches is shown in Table 1 (in the Appendix). Three measures are used for comparison: root mean square error (RMSE), mean absolute error (MAE), and accuracy.
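As a rough sketch, the three metrics could be computed as follows; note that the text does not state the exact accuracy formula, so the normalized-Frobenius-norm definition used here (common in related traffic-forecasting work) is an assumption.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error over all nodes and time steps."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mae(y_true, y_pred):
    """Mean absolute error over all nodes and time steps."""
    return float(np.mean(np.abs(y_true - y_pred)))

def accuracy(y_true, y_pred):
    """Assumed definition: 1 - ||y_true - y_pred||_F / ||y_true||_F."""
    return float(1.0 - np.linalg.norm(y_true - y_pred) / np.linalg.norm(y_true))
```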

According to Table 1, for the 15-minute forecasting horizon the RMSE of the GCN-LSTM model drops by 2.06%, 33.37%, and 1.46% compared with the conventional HA, ARIMA, and SVR approaches, respectively, and its accuracy improves by 7.34% and 0.78% relative to the HA and SVR models. Because the data exhibit complicated spatiotemporal correlation and high-dimensional properties, HA, ARIMA, and SVR cannot compete with the other approaches; non-neural-network techniques are not well suited to network-wide time-series forecasting. When external attribute variables are taken into account, the RMSE of the attribute-augmented model (AST-GCN-LSTM), which incorporates all external characteristics, is 5.29% and 1.16% lower than that of the DCRNN and GCN-LSTM models, respectively, and its MAE is reduced by 7.36% and 1.21% relative to DCRNN and GCN-LSTM. Table 1 shows that, compared with conventional approaches and other deep-learning-based approaches, the method presented in this work achieves considerable gains, demonstrating its usefulness.

Analysis of External Attributes

Comparative experiments are conducted to evaluate the impact of different external feature attributes on traffic flow forecasting. Four experimental conditions are considered: adding static attribute features only, adding dynamic attribute features only, adding both static and dynamic external features, and adding no external features at all. Figure 2 depicts the outcomes: yellow corresponds to adding static attribute features, gray to adding dynamic attribute features, and blue to combining static and dynamic external attributes.

Fig 2: Experiment under different conditions

Figure 2 shows that when only dynamic attribute variables are considered, the RMSE of GCN-LSTM (dynamic) is 5.15% and 1.01% lower than that of DCRNN and the base GCN-LSTM framework, respectively, and its MAE is reduced by 7.34% and 1.18%. When only static attributes are considered, the RMSE of GCN-LSTM (static) is reduced by 5.15% and 0.85% relative to DCRNN and GCN-LSTM, and its MAE by 7.12% and 0.93%. When static and dynamic components are considered together, the RMSE of the resulting model is reduced by 5.29% and 1.16% relative to DCRNN and GCN-LSTM, respectively, and its MAE by 7.36% and 1.21%.

Figure 2 also shows that the model performs better when only dynamic attribute features are considered than when only static attribute features are considered, which underlines the importance of dynamic external information; performance is best when both static and dynamic attributes are used. In conclusion, taking external information into account has a positive impact on the model's predictions in real-world situations.
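One plausible way to realize the external-attribute augmentation described above is simply to concatenate static attributes (e.g., counts of points of interest around each road node) and time-varying dynamic attributes (e.g., weather) onto the speed features before they enter the graph convolution. The sketch below illustrates that idea under these assumptions; it is not the paper's exact augmentation unit, and all shapes are illustrative.

```python
import numpy as np

def augment_features(speed, static_attr, dynamic_attr):
    """Concatenate external attributes onto the traffic features.

    speed        : (seq_len, num_nodes, 1)      historical speed per node
    static_attr  : (num_nodes, s_dim)           e.g. POI counts, fixed over time
    dynamic_attr : (seq_len, num_nodes, d_dim)  e.g. weather per time step
    returns      : (seq_len, num_nodes, 1 + s_dim + d_dim)
    """
    seq_len = speed.shape[0]
    static_rep = np.repeat(static_attr[None, :, :], seq_len, axis=0)  # broadcast over time
    return np.concatenate([speed, static_rep, dynamic_attr], axis=-1)

# Hypothetical shapes: 12 time steps, 156 road nodes, 5 POI categories, 2 weather features
speed = np.random.rand(12, 156, 1)
poi = np.random.rand(156, 5)
weather = np.random.rand(12, 156, 2)
augmented = augment_features(speed, poi, weather)   # (12, 156, 8)
```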

Conclusions

In this work, dynamic attribute features are obtained by integrating an attribute-augmentation unit that encodes various external influences into the proposed GCN-LSTM model. After the feature vectors are augmented, a Chebyshev polynomial approximation of the spectral graph convolution is used to extract features. This lets the framework describe the spatial properties of traffic flow from the K-order local neighbors of the graph's nodes; applying the K-order local neighborhood matrix enlarges the receptive field of the graph convolution so that information is gathered from neighboring nodes more precisely. The extracted, time-varying feature representations are then fed into the LSTM model to capture the temporal dependency. The approach addresses a shortcoming of earlier traffic prediction models by evaluating the contribution of external attribute features and by comparing against several baseline models to validate the proposed algorithm, so that external influences on traffic movement are taken into account in full.

The findings demonstrate that the GCN-LSTM approach can effectively improve traffic prediction performance by considering the spatial association of road nodes and capturing the time dependency of traffic movement. Furthermore, the GCN-LSTM algorithm is suitable for road-network traffic forecasting, for mid- and long-term traffic forecasting, and for multistep forecasting.

Works Cited

Aghdam, Mahdi Yousefzadeh, et al. "Optimization of air traffic management efficiency based on deep learning enriched by the long short-term memory (LSTM) and extreme learning machine (ELM)." Journal of Big Data 8.1 (2021): 1-26.

Azari, Amin, et al. "Cellular traffic prediction and classification: A comparative evaluation of LSTM and ARIMA." International Conference on Discovery Science. Springer, Cham, 2019.

Cao, Miaomiao, Victor OK Li, and Vincent WS Chan. "A CNN-LSTM model for traffic speed prediction." 2020 IEEE 91st Vehicular Technology Conference (VTC2020-Spring). IEEE, 2020.

Chen, Cen, et al. "Exploiting Spatio-temporal correlations with multiple 3d convolutional neural networks for citywide vehicle flow prediction." 2018 IEEE international conference on data mining (ICDM). IEEE, 2018.

Fan, Dongfang, and Xiaoli Zhang. "Short-term traffic flow prediction method based on balanced binary tree and K-nearest neighbor nonparametric regression." International Conference on Modelling, Simulation and Applied Mathematics. 2017.

Feng, Xinxin, et al. "Adaptive multi-kernel SVM with spatial-temporal correlation for short-term traffic flow prediction." IEEE Transactions on Intelligent Transportation Systems 20.6 (2019): 2001-2013.

Fu, Rui, Zuo Zhang, and Li Li. "Using LSTM and GRU neural network methods for traffic flow prediction." 2016 31st Youth Academic Annual Conference of Chinese Association of Automation (YAC). IEEE, 2016.

Guo, Shengnan, et al. "Deep spatial-temporal 3D convolutional neural networks for traffic data forecasting." IEEE Transactions on Intelligent Transportation Systems 20.10 (2019): 3913-3926.

Li, Junyi, et al. "Transferability improvement in short-term traffic prediction using stacked LSTM network." Transportation Research Part C: Emerging Technologies 124 (2021): 102977.

Li, Linjia, et al. "A spatial-temporal approach for traffic status analysis and prediction based on Bi-LSTM structure." Modern Physics Letters B 35.31 (2021): 2150481.

Li, Tao, et al. "Short-term traffic congestion prediction with Conv–BiLSTM considering Spatio-temporal features." IET Intelligent Transport Systems 14.14 (2021): 1978-1986.

Li, Yaguang, et al. "Diffusion convolutional recurrent neural network: Data-driven traffic forecasting." arXiv preprint arXiv:1707.01926 (2017).

Li, Yiqun, et al. "A hybrid deep learning framework for long-term traffic flow prediction." IEEE Access 9 (2021): 11264-11271.

Liu, Jiayu, et al. "Method of evaluating and predicting traffic state of highway network based on deep learning." Journal of Advanced Transportation 2021 (2021).

Ma, Xiaolei, et al. "Learning traffic as images: a deep convolutional neural network for large-scale transportation network speed prediction." Sensors 17.4 (2017): 818.

Nguyen, Hoang, et al. "Deep learning methods in transportation domain: a review." IET Intelligent Transport Systems 12.9 (2018): 998-1004.

Nguyen, Tu. "Spatiotemporal tile-based attention-guided lists for traffic video prediction." arXiv preprint arXiv:1910.11030 (2019). https://arxiv.org/abs/1910.11030

Sun, Yunchuan, et al. "Discovering time-dependent shortest path on traffic graph for drivers towards green driving." Journal of Network and Computer Applications 83 (2017): 204-212.

Yang, Senayan, et al. "Ensemble learning for short-term traffic prediction based on gradient boosting machine." Journal of Sensors 2017 (2017).

Yu, Haiyang, et al. "Spatiotemporal recurrent convolutional networks for traffic prediction in transportation networks." Sensors 17.7 (2017): 1501.

Zhang, Chaoyun, and Paul Patras. "Long-term mobile traffic forecasting using deep Spatio-temporal neural networks." Proceedings of the Eighteenth ACM International Symposium on Mobile Ad Hoc Networking and Computing. 2018.

Appendix

Table 1

Performance comparison of different methods.

T (min)   Metrics    HA        ARIMA     SVR       DCRNN     GCN-LSTM
15        RMSE       4.2951    7.2406    4.1455    4.5000    4.1193
          MAE        2.7815    4.9824    2.6233    3.1700    2.7701
          Accuracy   0.7008    0.4463    0.7112    0.2913    0.7129
30        RMSE       4.2951    6.7899    4.1628    4.5600    4.1207
          MAE        2.7815    4.6765    2.6875    3.2300    2.7739
          Accuracy   0.7008    0.3845    0.7100    0.2970    0.7126
45        RMSE       4.2951    6.7852    4.1885    4.6000    4.1252
          MAE        2.7815    4.6734    2.7359    3.2700    2.7753
          Accuracy   0.7008    0.3847    0.7082    0.3021    0.7123
60        RMSE       4.2951    6.7708    4.2156    4.6400    4.1262
          MAE        2.7815    4.6655    2.7751    3.3100    2.7811
          Accuracy   0.7008    0.3851    0.7063    0.3069    0.7119
