References
- Seoul Bike, https://www.bikeseoul.com/
- V.E.Sathishkumar and Y.Cho. "A rule-based model for Seoul Bike sharing demand prediction using weather data." European Journal of Remote Sensing, Vol.53, No.sup1, pp.166-183, 2020. https://doi.org/10.1080/22797254.2020.1725789
- V.E.Sathishkumar, J.Park, and Y.Cho. "Seoul bike trip duration prediction using data mining techniques." IET Intelligent Transport Systems, Vol.14, No.11, pp.1465-1474, 2020. https://doi.org/10.1049/iet-its.2019.0796
- V.E.Sathishkumar, J.Park, and Y.Cho. "Using data mining techniques for bike sharing demand prediction in metropolitan city." Computer Communications, Vol.153, pp.353-366, 2020. https://doi.org/10.1016/j.comcom.2020.02.007
- C.Zhang, J.J.Q.Yu, and Y.Liu. "Spatial-temporal graph attention networks: A deep learning approach for traffic forecasting." IEEE Access, Vol.7, pp.166246-166256, 2019. https://doi.org/10.1109/ACCESS.2019.2953888
- X.Kong, W.Xin, X.Wei, P.Bao, J.Zhang, and W.Lu. "STGAT: Spatial-temporal graph attention networks for traffic flow forecasting." IEEE Access, Vol.8, pp.134363-134372, 2020. https://doi.org/10.1109/ACCESS.2020.3011186
- B.Yu, H.Yin, and Z.Zhu. "Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting." arXiv preprint arXiv:1709.04875, 2017.
- J.Zhu, Y.Song, L.Zhao, and H.Li. "A3T-GCN: Attention temporal graph convolutional network for traffic forecasting." ISPRS International Journal of Geo-Information, Vol.10, No.7, p.485, 2021.
- C.Ying, T.Cai, S.Luo, S.Zheng, G.Ke, D.He, T.Shen, and T.Y.Liu. "Do transformers really perform badly for graph representation?" Advances in Neural Information Processing Systems, Vol.34, pp.28877-28888, 2021.
- PyTorch Geometric, https://pytorch-geometric.readthedocs.io/en/latest/index.html
- T.N.Kipf, and M.Welling. "Semi-supervised classification with graph convolutional networks." arXiv preprint arXiv:1609.02907, 2016.
- P.Velickovic, G.Cucurull, A.Casanova, A.Romero, P.Lio, and Y.Bengio. "Graph attention networks." arXiv preprint arXiv:1710.10903, 2017.
- S.Brody, U.Alon, and E.Yahav. "How attentive are graph attention networks?" arXiv preprint arXiv:2105.14491, 2021.
- R.C.Staudemeyer and E.R.Morris. "Understanding LSTM: a tutorial into long short-term memory recurrent neural networks." arXiv preprint arXiv:1909.09586, 2019.
- K.Cho, B.van Merrienboer, C.Gulcehre, D.Bahdanau, F.Bougares, H.Schwenk, and Y.Bengio. "Learning phrase representations using RNN encoder-decoder for statistical machine translation." arXiv preprint arXiv:1406.1078, 2014.
- S.Guo, Y.Lin, N.Feng, C.Song, and H.Wan. "Attention based spatial-temporal graph convolutional networks for traffic flow forecasting." Proceedings of the AAAI Conference on Artificial Intelligence, Vol.33, No.1, pp.922-929, 2019.
- Power transform, Wikipedia, https://en.wikipedia.org/wiki/Power_transform
- sklearn, PowerTransformer, https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.PowerTransformer.html
- Y.Seo, M.Defferrard, P.Vandergheynst, and X.Bresson. "Structured sequence modeling with graph convolutional recurrent networks." Neural Information Processing: 25th International Conference, ICONIP 2018, Siem Reap, Cambodia, December 13-16, pp.362-373, 2018.
- J.Chen, X.Wang, and X.Xu. "GC-LSTM: Graph convolution embedded LSTM for dynamic network link prediction." Applied Intelligence, pp.1-16, 2022.
- Seoul Open Data Square, https://data.seoul.go.kr/