Application of Different Tools of Artificial Intelligence in Translation Language

  • Received : 2023.03.05
  • Published : 2023.03.30

Abstract

With progressive advancements in Artificial Intelligence (AI) and Deep Learning (DL), contributing significantly to Natural Language Processing (NLP), the accuracy and quality of Machine Translation (MT) have improved manifold. There is a debate, however, over whether it is now time for human translation to be deemed irrelevant or redundant. After all, human imperfections have always been taken care of by humanity's own creations. With the use of neural networks in machine translation, it has recently been claimed that intelligent systems can now translate on par with human translators. Yet AI is still not free of the problems associated with processing a language, let alone the intricacies and complexities typical of translation. Then there are the inherent biases built in while designing intelligent systems: how we design these systems depends on who we are, and so they absorb our one-sided worldviews and cultural experiences. Given the diversity of language structures and the cultures they represent, handling them with human proficiency by intelligent machines, even those with deep learning capabilities, looks highly unlikely, at least for now.
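
As a concrete illustration of the "neural networks in machine translation" referred to above, the sketch below implements the additive attention mechanism introduced by Bahdanau et al. (2015), which underlies many neural MT systems. It is a minimal, illustrative example rather than anything from the article itself: the dimensions, variable names, and random weights are assumptions chosen for readability, standing in for parameters that a real system would learn from large parallel corpora.

```python
# Minimal illustrative sketch (not the article's code) of additive attention,
# the alignment mechanism used in many neural machine translation systems.
# All sizes and weights below are toy assumptions; real systems learn them.
import numpy as np

rng = np.random.default_rng(0)

hidden = 8    # size of encoder/decoder hidden states (toy value)
src_len = 5   # number of words in the source sentence (toy value)

# Stand-ins for the encoder's hidden state at each source word
# and the decoder's current hidden state.
encoder_states = rng.normal(size=(src_len, hidden))
decoder_state = rng.normal(size=(hidden,))

# Parameters of the alignment model (random here, learned in practice).
W_enc = rng.normal(size=(hidden, hidden))
W_dec = rng.normal(size=(hidden, hidden))
v = rng.normal(size=(hidden,))

# Alignment scores: e_i = v . tanh(W_enc h_i + W_dec s)
scores = np.tanh(encoder_states @ W_enc.T + decoder_state @ W_dec.T) @ v

# Softmax turns the scores into attention weights over source positions.
weights = np.exp(scores - scores.max())
weights /= weights.sum()

# Context vector: a weighted sum of encoder states, used by the decoder
# when predicting the next target-language word.
context = weights @ encoder_states

print("attention weights over source words:", np.round(weights, 3))
print("context vector shape:", context.shape)
```

In a trained translator, the attention weights indicate which source words the decoder attends to while producing each target word, and the resulting context vector feeds a decoder network that emits the translation one word at a time.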

Keywords
