References
- Altmann, A., Toloşi, L., Sander, O., and Lengauer, T. (2010). Permutation importance: a corrected feature importance measure. Bioinformatics, 26(10):1340–1347.
- Chen, X., Liu, C.-T., Zhang, M., and Zhang, H. (2007). A forest-based approach to identifying gene and gene–gene interactions. Proceedings of the National Academy of Sciences, 104(49):19199–19203.
- Li, Y., Chen, C.-Y., and Wasserman, W. W. (2015). Deep feature selection: Theory and application to identify enhancers and promoters. Journal of Computational Biology, pages 1–15.
- Marbach, D., Schaffter, T., Mattiussi, C., and Floreano, D. (2009). Generating realistic in silico gene networks for performance assessment of reverse engineering methods. Journal of Computational Biology, 16(2):229–239.
- Montavon, G., Lapuschkin, S., Binder, A., Samek, W., and Müller, K.-R. (2017). Explaining nonlinear classification decisions with deep Taylor decomposition. Pattern Recognition, 65:211–222.
- Nair, V. and Hinton, G. E. (2010). Rectified linear units improve restricted Boltzmann machines. In Proceedings of the 27th International Conference on Machine Learning (ICML-10), pages 807–814.
- Nilsson, R., Peña, J. M., Björkegren, J., and Tegnér, J. (2007). Consistent feature selection for pattern recognition in polynomial time. Journal of Machine Learning Research, 8(Mar):589–612.
- Qi, Y. (2012). Random forest for bioinformatics. In Ensemble Machine Learning, pages 307–323. Springer.
- Roy, D., Murty, K. S. R., and Mohan, C. K. (2015). Feature selection using deep neural networks. In 2015 International Joint Conference on Neural Networks (IJCNN). IEEE.
- Rumelhart, D. E., Hinton, G. E., and Williams, R. J. (1988). Learning representations by back-propagating errors. Cognitive Modeling, 5(3):1.
- Saeys, Y., Inza, I., and Larrañaga, P. (2007). A review of feature selection techniques in bioinformatics. Bioinformatics, 23(19):2507–2517.
- Shrikumar, A., Greenside, P., and Kundaje, A. (2017). Learning important features through propagating activation differences. arXiv preprint arXiv:1704.02685.
- Srivastava, N., Hinton, G. E., Krizhevsky, A., Sutskever, I., and Salakhutdinov, R. (2014). Dropout: a simple way to prevent neural networks from overfitting. Journal of Machine Learning Research, 15(1):1929–1958.
- Stolovitzky, G., Monroe, D., and Califano, A. (2007). Dialogue on reverse-engineering assessment and methods: The DREAM of high-throughput pathway inference. Annals of the New York Academy of Sciences, 1115:11–22.
- Stolovitzky, G., Prill, R., and Califano, A. (2009). Lessons from the DREAM2 challenges. Annals of the New York Academy of Sciences, 1158:159–195.
- Strobl, C., Boulesteix, A.-L., Kneib, T., Augustin, T., and Zeileis, A. (2008). Conditional variable importance for random forests. BMC Bioinformatics, 9(1):307.
- Wang, M., Chen, X., and Zhang, H. (2010). Maximal conditional chi-square importance in random forests. Bioinformatics, 26(6):831–837.
- Zou, H. and Hastie, T. (2005). Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 67(2):301–320.