Acknowledgment
This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF-2018R1D1A1B07045976) in 2018.
References
- M. Feurer and F. Hutter, "Hyperparameter optimization," in F. Hutter, L. Kotthoff, and J. Vanschoren (Eds.), Automated Machine Learning, pp.3-33, Springer, 2019.
- J. Bergstra, D. Yamins, and D. D. Cox, "Making a science of model search: Hyperparameter optimization in hundreds of dimensions for vision architectures," in Proc. of the 30th International Conference on Machine Learning, vol.28, pp.115-123, 2013. DOI: 10.5555/3042817.3042832
- J. Bergstra and Y. Bengio, "Random search for hyper-parameter optimization," Journal of Machine Learning Research, vol.13, pp.281-305, 2012. DOI: 10.5555/2188385.2188395
- J. Snoek, H. Larochelle, and R. P. Adams, "Practical Bayesian optimization of machine learning algorithms," Advances in Neural Information Processing Systems, vol.25, pp.2951-2959, 2012.
- E. Brochu, M. Cora, and N. de Freitas, "A tutorial on Bayesian optimization of expensive cost functions, with application to active user modeling and hierarchical reinforcement learning," Technical Report TR-2009-23, University of British Columbia, 2009.
- J. Wang, J. Xu, and X. Wang, "Combination of hyperband and Bayesian optimization for hyperparameter optimization in deep learning," arXiv preprint arXiv:1801.01596, 2018. https://arxiv.org/abs/1801.01596
- S. Falkner, A. Klein, and F. Hutter, "BOHB: Robust and efficient hyperparameter optimization at scale," Proceedings of Machine Learning Research, vol.80, pp.1437-1446, 2018.
- C. Harrington, "Practical guide to hyperparameters optimization for deep learning models," Deep Learning, 2018. https://blog.floydhub.com/guide-to-hyperparameters-search-for-deep-learning-models/
- Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner, "Gradient-based learning applied to document recognition," Proceedings of the IEEE, vol.86, no.11, pp.2278-2324, 1998.