References
- Smith C S. (2017) The Man Who Helped Turn Toronto into a High-Tech Hotbed. The New York Times. Retrieved 27 June 2017.
- Wu H, Shapiro J L. (2006) Does overfitting affect performance in estimation of distribution algorithms? Proceedings of the Conference on Genetic and Evolutionary Computation (GECCO). ACM, pp.433-434.
- Karystinos G N, Pados D A. (2000) On overfitting, generalization, and randomly expanded training sets. IEEE Transactions on Neural Networks, 11(5):1050. https://doi.org/10.1109/72.870038
- LeCun Y, Bengio Y, Hinton G. (2015) Deep learning. Nature, 521:436-444. https://doi.org/10.1038/nature14539
- Srivastava N, Hinton G, Krizhevsky A, et al. (2014) Dropout: a simple way to prevent neural networks from overfitting. Journal of Machine Learning Research, 15(1):1929-1958.
- Yip K Y, Gerstein M. (2009) Training set expansion: an approach to improving the reconstruction of biological networks from limited and uneven reliable interactions. Bioinformatics, 25(2):243-250.
- O'Shea T J, Corgan J, Clancy T C. (2016) Convolutional radio modulation recognition networks. Proceedings of the International Conference on Engineering Applications of Neural Networks, pp.213-226.