Prediction for Bicycle Demand using Spatial-Temporal Graph Models

  • Jangwoo Park (Dept. of Artificial Intelligence Engineering, Sunchon National University)
  • Received : 2023.10.23
  • Accepted : 2023.12.21
  • Published : 2023.12.31

Abstract

Much recent research combines graph neural networks with recurrent neural networks to account for both temporal and spatial dependencies; graph neural networks in particular are an actively emerging area of study. Seoul's public bicycle rental service (known as Ttareungi) operates rental stations throughout the city, and the rentals at each station form a faithfully recorded time series. The rental records of each station show temporal characteristics such as periodicity over time, and regional characteristics are also expected to strongly influence rental demand. Such spatial correlations can be captured well by graph neural networks. In this study, we reconstructed the time series data of Seoul's bicycle rental service into a graph and developed a rental demand prediction model that combines a graph neural network with a recurrent neural network. The model takes into account temporal characteristics such as periodicity, regional characteristics, and the degree importance of each rental station. We confirmed that the degree importance of a rental station serves as an important factor in predicting rental demand.
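To make the described combination concrete, the following is a minimal sketch of one way to couple spatial and temporal modelling: a graph convolution (PyTorch Geometric's GCNConv) applied over the station graph at each time step, followed by a GRU over time for each station. The class name STGraphRNN, the layer sizes, the toy adjacency, and the random input tensor are illustrative assumptions for this sketch, not the paper's actual model or data.

# Minimal sketch of a spatial-temporal model: GCN over stations per time step,
# then a GRU over the resulting sequence for each station.
# All names, sizes, and the toy graph below are illustrative assumptions.
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv


class STGraphRNN(nn.Module):
    """Graph convolution per time step, followed by a per-station GRU."""

    def __init__(self, in_channels: int, gcn_hidden: int, rnn_hidden: int):
        super().__init__()
        self.gcn = GCNConv(in_channels, gcn_hidden)
        self.gru = nn.GRU(gcn_hidden, rnn_hidden, batch_first=True)
        self.head = nn.Linear(rnn_hidden, 1)  # next-step rental demand per station

    def forward(self, x_seq: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x_seq: [T, N, F] = time steps x stations x input features
        spatial = [torch.relu(self.gcn(x_t, edge_index)) for x_t in x_seq]
        h = torch.stack(spatial, dim=1)   # [N, T, gcn_hidden]
        out, _ = self.gru(h)              # temporal modelling per station
        return self.head(out[:, -1])      # [N, 1] predicted demand


# Toy usage: 4 stations, 12 hourly steps, 3 features per station (hypothetical).
num_stations, T, F = 4, 12, 3
x_seq = torch.randn(T, num_stations, F)
# Edges between "nearby" stations, listed in both directions (illustrative adjacency).
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]], dtype=torch.long)

model = STGraphRNN(in_channels=F, gcn_hidden=16, rnn_hidden=32)
pred = model(x_seq, edge_index)  # shape [num_stations, 1]
print(pred.shape)

In an actual setup along the lines of the abstract, the node features could additionally encode calendar periodicity and each station's degree importance, and the adjacency could be derived from station proximity or rental-pattern correlation; those choices are left open here.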
