Acknowledgement
This research was supported by the "Regional Innovation Strategy (RIS)" program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (MOE) (2021RIS-002).
References
- P. Siano, "Demand response and smart grids: A survey," Renew Sustain Energy Rev., vol. 30, 2014, pp. 461-478. https://doi.org/10.1016/j.rser.2013.10.022
- J. Joo and J. Oh, "Efficient Grid-Independent ESS Control System by Prediction of Energy Production and Consumption," J. of the Korea Institute of Electronic Communication Sciences, vol. 14, no. 1, 2020, pp. 155-160.
- E. Kwak and C. Moon, "Analysis of Power System Stability by Deployment of Renewable Energy Resources," J. of the Korea Institute of Electronic Communication Sciences, vol. 16, no. 4, 2021, pp. 633-642.
- Z. Wang and T. Hong, "Reinforcement learning for building controls: The opportunities and challenges," Appl Energy, vol. 269, 2020, pp. 115036.
- S. Yang, H. O. Gao, and F. You, "Model predictive control for Demand- and Market-Responsive building energy management by leveraging active latent heat storage," Appl Energy, vol. 327, 2022, pp. 120054.
- D. Mariano-Hernández, L. Hernández-Callejo, A. Zorita-Lamadrid, O. Duque-Pérez, and F. Santos García, "A review of strategies for building energy management system: Model predictive control, demand side management, optimization, and fault detection & diagnosis," Journal of Building Engineering, vol. 33, 2021, pp. 101692.
- C. Henggeler Antunes, M. J. Alves, and I. Soares, "A comprehensive and modular set of appliance operation MILP models for demand response optimization," Appl Energy, vol. 320, 2022, pp. 119142.
- J. R. Vázquez-Canteli and Z. Nagy, "Reinforcement learning for demand response: A review of algorithms and modeling techniques," Appl Energy, vol. 235, 2019, pp. 1072-1089. https://doi.org/10.1016/j.apenergy.2018.11.002
- J. Schulman, F. Wolski, P. Dhariwal, A. Radford, and O. Klimov, "Proximal policy optimization algorithms," arXiv preprint arXiv:1707.06347, 2017.
- T. P. Lillicrap, J. J. Hunt, A. Pritzel, N. Heess, T. Erez, Y. Tassa, D. Silver, and D. Wierstra, "Continuous control with deep reinforcement learning," in Int. Conf. on Learning Representations (ICLR), San Juan, PR, USA, 2016, pp. 1-14.
- S. Fujimoto, H. van Hoof, and D. Meger, "Addressing function approximation error in actor-critic methods," in Proc. of the 35th Int. Conf. on Machine Learning (ICML), Stockholm, Sweden, 2018, pp. 1587-1596.
- A. Ajagekar and F. You, "Deep reinforcement learning based unit commitment scheduling under load and wind power uncertainty," IEEE Trans Sustain Energy, vol. 14, no. 2, 2023, pp. 790-803.
- R. Lu and S. H. Hong, "Incentive-based demand response for smart grid with reinforcement learning and deep neural network," Appl Energy, vol. 236, 2019, pp. 937-949. https://doi.org/10.1016/j.apenergy.2018.12.061
- R. Jin, Y. Zhou, C. Lu, and J. Song, "Deep reinforcement learning-based strategy for charging station participating in demand response," Appl Energy, vol. 328, 2022, pp. 120140.
- X. Kong, D. Kong, J. Yao, L. Bai, and J. Xiao, "Online pricing of demand response based on long short-term memory and reinforcement learning," Appl Energy, vol. 328, 2022, pp. 120140.
- J. R. Vázquez-Canteli, S. Dey, G. Henze, and Z. Nagy, "CityLearn: Standardizing Research in Multi-Agent Reinforcement Learning for Demand Response and Urban Energy Management," arXiv preprint arXiv:2012.10504, 2020.
- S. Jung, C. Sim, S. Park, and J. Kim, "A Novel Solar Heat Collection Device Prototype Using a Parabola Based on Solar Light Tracking," J. of the Korea Institute of Electronic Communication Sciences, vol. 11, no. 4, 2016, pp. 411-420. https://doi.org/10.13067/JKIECS.2016.11.4.411
- M. Kang, "Renewable Energy Generation Prediction Model using Meteorological Big Data," J. of the Korea Institute of Electronic Communication Sciences, vol. 18, no. 1, 2023, pp. 39-44.