
Fake News Detection on YouTube Using Related Video Information (관련 동영상 정보를 활용한 YouTube 가짜뉴스 탐지 기법)

  • Junho Kim;Yongjun Shin;Hyunchul Ahn
    • Journal of Intelligence and Information Systems / v.29 no.3 / pp.19-36 / 2023
  • As advances in information and communication technology have made it easier for anyone to produce and disseminate information, a new problem has emerged: fake news, false information intentionally shared to mislead people. Initially spread mainly through text, fake news has gradually evolved and is now distributed in multimedia formats. Since its founding in 2005, YouTube has become the world's leading video platform, used by people worldwide. However, it has also become a primary channel for fake news, causing social problems. Various researchers have worked on detecting fake news on YouTube. Detection approaches are either content-based or background information-based, but content-based approaches dominate both conventional fake news research and YouTube fake news detection research. This study proposes a detection method based on background information rather than content. Specifically, we suggest detecting fake news by utilizing the related video information that YouTube provides: the original video and its related videos are vectorized with Doc2vec, an embedding technique, and the resulting vectors are classified by a CNN, a deep learning network. The empirical analysis shows that the proposed method outperforms the existing content-based approach to detecting fake news on YouTube. By helping to prevent the spread of highly contagious fake news on YouTube, the proposed method contributes to a safer and more reliable society.
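
A minimal sketch of the kind of pipeline the abstract describes, under stated assumptions: the original video's text and the text of its related videos are embedded with Doc2vec (gensim), the vectors are stacked, and a small 1D CNN (PyTorch) classifies the result. The toy data, field choices, and network architecture are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch: Doc2vec embeddings of the original video and its related videos,
# classified by a small CNN. Data and hyperparameters are illustrative only.
import numpy as np
import torch
import torch.nn as nn
from gensim.models.doc2vec import Doc2Vec, TaggedDocument

# Toy corpus: each "document" is the text (e.g. title/description) of a video.
corpus = [
    ("vid0", "breaking news miracle cure discovered overnight"),
    ("vid1", "official health agency briefing on vaccine safety"),
    ("vid2", "scientists respond to viral misinformation claims"),
]
tagged = [TaggedDocument(words=text.split(), tags=[tag]) for tag, text in corpus]
d2v = Doc2Vec(tagged, vector_size=64, window=3, min_count=1, epochs=50)

def video_matrix(original_text, related_texts):
    """Stack the original video's vector with its related videos' vectors
    into a (batch=1, n_videos, vector_size) tensor for the CNN."""
    vecs = [d2v.infer_vector(original_text.split())]
    vecs += [d2v.infer_vector(t.split()) for t in related_texts]
    return torch.from_numpy(np.stack(vecs)).float().unsqueeze(0)

class FakeNewsCNN(nn.Module):
    def __init__(self, n_videos=3):
        super().__init__()
        self.conv = nn.Conv1d(in_channels=n_videos, out_channels=16, kernel_size=3)
        self.pool = nn.AdaptiveMaxPool1d(1)
        self.fc = nn.Linear(16, 2)  # real vs. fake

    def forward(self, x):            # x: (batch, n_videos, vector_size)
        h = torch.relu(self.conv(x))
        h = self.pool(h).squeeze(-1)
        return self.fc(h)

model = FakeNewsCNN()
x = video_matrix(corpus[0][1], [corpus[1][1], corpus[2][1]])  # (1, 3, 64)
print(model(x).shape)  # torch.Size([1, 2])
```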

A Blockchain Network Construction Tool and its Electronic Voting Application Case (블록체인 자동화도구 개발과 전자투표 적용사례)

  • AING TECKCHUN;KONG VUNGSOVANREACH;Okki Kim;Kyung-Hee Lee;Wan-Sup Cho
    • The Journal of Bigdata / v.6 no.2 / pp.151-159 / 2021
  • Constructing a blockchain network is a cumbersome and time-consuming activity. To overcome these limitations, global IT companies such as Microsoft provide cloud-based blockchain services. In this paper, we propose a blockchain construction and management tool that enables blockchain developers, operators, and enterprises to deploy blockchains more easily on their own infrastructure. The tool is implemented using Hyperledger Fabric, one of the best-known private blockchain platforms, and Ansible, an open-source IT automation engine that supports network-wide deployment. Instead of complex and repetitive text commands, the tool provides a user-friendly web dashboard that allows users to seamlessly set up, deploy, and interact with a blockchain network. With the proposed solution, blockchain developers, operators, and researchers can build blockchain infrastructure more easily, saving time and cost. To verify the usefulness and convenience of the proposed tool, a blockchain network that conducts electronic voting was built and tested. Constructing a blockchain network, which normally requires writing more than ten configuration files and executing hundreds of lines of commands, can be replaced with simple input and click operations in the graphical user interface, improving user convenience and saving time. The proposed blockchain tool will be used to build trusted data infrastructure in various fields, such as food safety supply chains, in the future.
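
The paper's point is that a dashboard can replace hand-edited configuration files and long command sequences with Ansible-driven deployment. The sketch below shows, under stated assumptions, how a dashboard backend might invoke such a playbook from Python; the playbook and inventory file names and the variables are hypothetical, not the tool's actual interface.

```python
# Sketch of a dashboard backend calling Ansible to deploy a Hyperledger Fabric
# network. Playbook/inventory names and variables are illustrative assumptions.
import json
import subprocess

def deploy_fabric_network(org_count: int, peers_per_org: int,
                          inventory: str = "hosts.ini",
                          playbook: str = "deploy_fabric.yml") -> bool:
    """Run an Ansible playbook with the network parameters collected
    from the web dashboard, instead of hand-editing config files."""
    extra_vars = json.dumps({"org_count": org_count,
                             "peers_per_org": peers_per_org})
    result = subprocess.run(
        ["ansible-playbook", "-i", inventory, playbook,
         "--extra-vars", extra_vars],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        print(result.stderr)
    return result.returncode == 0

if __name__ == "__main__":
    # e.g. a two-organization voting network with two peers per organization
    ok = deploy_fabric_network(org_count=2, peers_per_org=2)
    print("deployment succeeded" if ok else "deployment failed")
```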

A Visitor Perception Study of Liaohe National Park Based on Big Data Visualization

  • Qi-Wei Jing;Zi-Yang Liu;Cheng-Kang Zheng
    • Journal of the Korea Society of Computer and Information / v.28 no.4 / pp.133-142 / 2023
  • National parks are one of the important types of protected-area management systems established by the IUCN and a management model for the effective conservation and sustainable use of natural and cultural heritage in countries around the world; they play important roles in conservation, scientific research, education, recreation, and community development. In the context of big data, this study takes China's Liaohe National Park, a representative example of global coastal wetlands, as a case study, using Python to collect tourists' travelogues and reviews from major OTA websites in China as the source data. The texts span 2015 to 2022 and comprise 2,998 reviews totaling 166,588 words. The results show that wildlife resources, natural landscape, wetland ecology, and the fishing and hunting culture of northern China are all reflected in visitors' perceptions of Liaohe National Park; visitors have strongly positive feelings toward the park, but there is still much room for improvement in supporting services and facilities, public education, and visitor experience and participation.
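
A minimal sketch of one text-mining step implied by the abstract: segmenting Chinese review text and counting high-frequency terms. The jieba segmenter, the stopword list, and the sample reviews are assumptions for illustration, not the study's actual data or pipeline.

```python
# Sketch: word segmentation and term-frequency counting over Chinese reviews.
from collections import Counter

import jieba  # third-party Chinese word segmentation library

SAMPLE_REVIEWS = [
    "辽河国家公园的湿地风光非常美，看到了很多候鸟。",
    "芦苇荡和红海滩让人印象深刻，但配套设施还需要改进。",
]

STOPWORDS = {"的", "了", "和", "很", "非常", "但"}

def top_terms(reviews, k=10):
    """Segment each review, drop stopwords and single characters,
    and return the k most frequent terms."""
    counts = Counter()
    for review in reviews:
        for token in jieba.lcut(review):
            token = token.strip()
            if len(token) > 1 and token not in STOPWORDS:
                counts[token] += 1
    return counts.most_common(k)

print(top_terms(SAMPLE_REVIEWS))
```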

A School-tailored High School Integrated Science Q&A Chatbot with Sentence-BERT: Development and One-Year Usage Analysis (인공지능 문장 분류 모델 Sentence-BERT 기반 학교 맞춤형 고등학교 통합과학 질문-답변 챗봇 -개발 및 1년간 사용 분석-)

  • Gyeongmo Min;Junehee Yoo
    • Journal of The Korean Association For Science Education / v.44 no.3 / pp.231-248 / 2024
  • This study developed a chatbot for first-year high school students, employing open-source software and a Korean Sentence-BERT model for AI-powered document classification. The chatbot uses the Sentence-BERT model to find the six Q&A pairs most similar to a student's query and presents them in a carousel format. The initial dataset, built from online resources, was refined and expanded based on student feedback and usability throughout the operational period. By the end of the 2023 academic year, the chatbot's dataset had grown to 30,819 entries, and 3,457 student interactions had been recorded. Analysis revealed that students tended to use the chatbot when prompted by teachers during classes and primarily during self-study sessions after school, with an average of 2.1 to 2.2 inquiries per session, mostly via mobile phones. Text mining showed that student inputs encompassed not only science-related queries but also aspects of school life such as assessment scope. Topic modeling with BERTopic, which is based on Sentence-BERT, categorized 88% of student questions into 35 topics, shedding light on common student interests. A year-end survey confirmed the efficacy of the carousel format and the chatbot's role in addressing curiosities beyond the integrated science learning objectives. This study underscores the importance of developing chatbots tailored for student use in public education and highlights their educational potential through long-term usage analysis.
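
A minimal sketch of the retrieval step the abstract describes, using the sentence-transformers library: encode the stored question texts and a student query with a Korean Sentence-BERT model and return the six most similar Q&A pairs. The model name and the sample Q&A data are placeholders, not the school's deployed configuration.

```python
# Sketch: Sentence-BERT semantic search returning the top-6 similar Q&A pairs.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("jhgan/ko-sroberta-multitask")  # example Korean SBERT

qa_pairs = [
    ("지구 온난화의 원인은 무엇인가요?", "온실가스 농도 증가가 주요 원인입니다."),
    ("중간고사 시험 범위가 어디까지인가요?", "통합과학 1단원부터 3단원까지입니다."),
    ("산화와 환원의 차이는 무엇인가요?", "산소를 얻으면 산화, 잃으면 환원입니다."),
]
question_embeddings = model.encode([q for q, _ in qa_pairs], convert_to_tensor=True)

def answer(query: str, top_k: int = 6):
    """Return up to top_k (question, answer, score) tuples, most similar first,
    e.g. for display as carousel cards."""
    query_embedding = model.encode(query, convert_to_tensor=True)
    hits = util.semantic_search(query_embedding, question_embeddings, top_k=top_k)[0]
    return [(qa_pairs[h["corpus_id"]][0], qa_pairs[h["corpus_id"]][1], h["score"])
            for h in hits]

for q, a, score in answer("온난화가 왜 일어나나요?"):
    print(f"{score:.2f}  Q: {q}  A: {a}")
```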

The Standard of Judgement on Plagiarism in Research Ethics and the Guideline of Global Journals for KODISA (KODISA 연구윤리의 표절 판단기준과 글로벌 학술지 가이드라인)

  • Hwang, Hee-Joong;Kim, Dong-Ho;Youn, Myoung-Kil;Lee, Jung-Wan;Lee, Jong-Ho
    • Journal of Distribution Science / v.12 no.6 / pp.15-20 / 2014
  • Purpose - In general, researchers try to abide by the code of research ethics, but many are not fully aware of plagiarism and unintentionally commit research misconduct when writing a research paper. This research aims to give researchers a clear and easy guideline that helps them avoid accidental plagiarism by addressing the issue, and it is expected to contribute to building a healthy climate and encouraging creative research among scholars. Research design, data, methodology & Results - Plagiarism is considered a form of research misconduct, along with fabrication and falsification. It is defined as the improper use of another author's ideas, language, process, or results without giving appropriate credit. Plagiarism has nothing to do with examining the truth or assessing the value of research data, processes, or results. Plagiarism is judged by whether a work conforms to widely accepted research ethics, including proper citation. Within academia, plagiarism goes beyond legal boundaries, encompassing any intentional wrongful appropriation of research created by another researcher. In short, plagiarism means taking other people's creative ideas, research models, hypotheses, methods, definitions, variables, images, tables, and graphs, and using them without reasonable attribution to their true sources. There are various types of plagiarism. Some classify plagiarism into idea plagiarism, text plagiarism, mosaic plagiarism, and idea distortion. Others hold that plagiarism includes the uncredited use of another person's work without appropriate citation, self-plagiarism (reusing part of one's own previous research without proper citation), duplicate publication (publishing one's own previous work under a different title), and unethical citation (using quoted parts of another person's research without proper citation, as if those parts were being cited by the current author). When an author wants to cite a passage that was itself drawn from another source, the author is supposed to indicate that the passage is re-cited; if it is difficult to state all the sources, the author may mention only the original source. Today, various disciplines are developing their own measures to address plagiarism issues, especially duplicate publication, by requiring researchers to clearly reveal the true sources whenever they refer to other research. Conclusions - Research misconduct, including plagiarism, has broad and unclear boundaries that allow ambiguous definitions and diverse interpretations. It seems difficult for researchers to understand clearly how to avoid plagiarism and how to cite others' work properly. However, if guidelines for detecting and avoiding plagiarism are developed with the characteristics of each discipline in mind (for example, the social sciences and the natural sciences might have different standards) and are shared among researchers, a consensus and common understanding of the issue is likely to emerge. In particular, since duplicate publication has appeared more frequently than plagiarism, academic institutions will need to provide advance warnings and screening in evaluation processes in order to reduce researchers' mistakes and prevent duplicate publication. What is critical for researchers is to clearly reveal the true sources according to common citation rules and to borrow only the necessary amount of others' research.

A Study on Gusadang Kim Nakhaeng's Writing for Ancestral Rites - Exploring the source of his appealing (구사당(九思堂) 김낙행(金樂行)의 제문(祭文) 연구(硏究) - 호소력의 근원에 대한 탐색 -)

  • Jeong, Si-youl
    • (The)Study of the Eastern Classic / no.59 / pp.93-120 / 2015
  • The purpose of this study is to explore the source of the appeal found in Gusadang Kim Nakhaeng's writing for ancestral rites. Gusadang was a Confucian scholar in Yeongnam during the 18th century and was praised for the scholarly virtues of jihaenghapil and silcheongunghaeng. Although Gusadang's writing for ancestral rites and his teacher Milam Lee Jaeui's letters were together given the special name 'gujemilchal', there has been almost no research on Gusadang's ritual writing. This study therefore selects for discussion three pieces of his writing for ancestral rites that are especially rich in emotional expression. Chapter 2, 'The Reconstruction of Memory from a Microscopic Perspective', presents the reason why Gusadang's writing for ancestral rites is recognized as work possessing real appeal. Writing for ancestral rites begins from memory that can be shared by the living and the dead, and in reconstructing anecdotes about the dead in detail on the stage of ritual writing, the writer's memory plays an important role. Chapter 3, 'The Rhetorical Reconstruction of Elevated Sensitivity', examines the rhetorical devices needed for ritual writing. Proper rhetoric is needed to elevate the dignity of the writing and arouse sympathy in readers. Although writing for ancestral rites is, by its formal character, supposed to express sadness, it should not end up as a mere outlet for emotion. Chapter 4 looks into 'The Descriptive Reconstruction of Lamenting Sentiment'. A clear descriptive focus is needed to turn the gesture of the living toward a being no longer in this world into an appealing story. While maintaining a distinct manner of description, Gusadang systematically organizes the noble character of the dead, the pitiable death, the precious bond of the past, and the longing of those left behind. Writing for ancestral rites is a field in which the death is mourned and the sadness of the living is reproduced in writing, and for such a text to work properly as ritual writing it must necessarily be appealing. This study finds that the appeal that gives life to ritual writing is grounded in authenticity.

Digital Archives of Cultural Archetype Contents: Its Problems and Direction (디지털 아카이브즈의 문제점과 방향 - 문화원형 콘텐츠를 중심으로 -)

  • Hahm, Han-Hee;Park, Soon-Cheol
    • Journal of the Korean BIBLIA Society for library and Information Science / v.17 no.2 / pp.23-42 / 2006
  • This is a study of the digital archives of Culturecontent.com, where 'Cultural Archetype Contents' are currently in service. One of the major purposes of the study is to point out problems in the current system and to propose improvements to the digital archives. The government launched a four-year project to develop cultural archetype content sources and establish related businesses, with the aim of enhancing the nation's competitiveness. More specifically, the project focuses on producing source materials of cultural archetype contents on the subjects of Korea's history, tradition, everyday life, arts, and general geographical books. Through this project, the government also intends to establish a proper distribution system for digitized culture contents and to manage copyright issues. This paper analyzes the digital archives system that stores the culture content data produced from 2002 to 2005 and evaluates the current system's weaknesses and strengths. The summary of our findings is as follows. First, the digital archives system does not contain a semantic search engine, so its full functionality lags. Second, similar data are not classified into the same categories but into different ones, confusing and inconveniencing users; users looking for source materials can be disappointed by the current distribution scheme. Our paper suggests a better digital archives system based on text mining technology, consisting of five intelligent processes: keyword search, summarization, clustering, classification, and topic tracking. The paper endeavors to develop the best technical environment for preserving and using culture content data. With the upgraded digital setting, users of culture content data will discover a world of new knowledge, and the technology introduced in this paper will lead to the highest achievable digital intelligence through a new framework.
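
A minimal sketch of two of the five proposed intelligent processes (keyword search and clustering) using scikit-learn; the sample records, TF-IDF features, and cluster count are illustrative assumptions rather than the system proposed in the paper.

```python
# Sketch: TF-IDF keyword search and k-means clustering over archive records.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

records = [
    "Joseon dynasty court rituals and royal protocols",
    "Traditional Korean everyday clothing and textiles",
    "Folk paintings and decorative arts of the late Joseon period",
    "Historical geography described in old Korean gazetteers",
]

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(records)

# Clustering: group similar archetype records together.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(matrix)
print("cluster labels:", kmeans.labels_)

# Keyword search: rank records by cosine similarity to a query.
query_vec = vectorizer.transform(["royal court rituals"])
scores = cosine_similarity(query_vec, matrix).ravel()
for idx in scores.argsort()[::-1]:
    print(f"{scores[idx]:.2f}  {records[idx]}")
```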

Twitter Issue Tracking System by Topic Modeling Techniques (토픽 모델링을 이용한 트위터 이슈 트래킹 시스템)

  • Bae, Jung-Hwan;Han, Nam-Gi;Song, Min
    • Journal of Intelligence and Information Systems / v.20 no.2 / pp.109-122 / 2014
  • People nowadays create a tremendous amount of data on social network services (SNS). In particular, the incorporation of SNS into mobile devices has resulted in massive amounts of data generation, greatly influencing society. This is an unmatched phenomenon in history, and we now live in the age of big data. SNS data qualify as big data in that the amount of data (volume), the data input and output speed (velocity), and the variety of data types (variety) are all satisfied. If one can discover the trend of an issue in SNS big data, this information can be used as an important new source for creating value, because it covers the whole of society. In this study, a Twitter Issue Tracking System (TITS) is designed and built to meet the needs of analyzing SNS big data. TITS extracts issues from Twitter texts and visualizes them on the web. The proposed system provides four functions: (1) it provides the topic keyword set corresponding to the daily ranking; (2) it visualizes the daily time-series graph of a topic over a month; (3) it shows the importance of a topic through a treemap based on a scoring system and frequency; and (4) it visualizes the daily time-series graph of keywords retrieved by keyword search. The present study analyzes the big data generated by SNS in real time. SNS big data analysis requires various natural language processing techniques, including stop-word removal and noun extraction, to process various unrefined forms of unstructured data. In addition, such analysis requires recent big data technology, such as the Hadoop distributed system or NoSQL, an alternative to relational databases, to rapidly process large amounts of real-time data. We built TITS on Hadoop to optimize big data processing, because Hadoop is designed to scale from a single node to thousands of machines. Furthermore, we use MongoDB, an open-source, document-oriented NoSQL database that provides high performance, high availability, and automatic scaling. Unlike relational databases, MongoDB has no schemas or tables, and its most important goals are data accessibility and data-processing performance. In the age of big data, visualization is attractive to the big data community because it helps analysts examine data easily and clearly. TITS therefore uses the d3.js library as its visualization tool. This library is designed for creating data-driven documents that bind the document object model (DOM) to data; interaction with the data is easy, and it is useful for managing real-time data streams with smooth animation. In addition, TITS uses Bootstrap, a collection of pre-configured style sheets and JavaScript plug-ins, to build the web system. The TITS graphical user interface (GUI), built with these libraries, can detect issues on Twitter in an easy and intuitive manner. The proposed work demonstrates the superiority of our issue detection techniques by matching detected issues with corresponding online news articles. The contributions of the present study are threefold. First, we suggest an alternative approach to real-time big data analysis, which has become an extremely important issue. Second, we apply a topic modeling technique used in various research areas, including library and information science (LIS), and on this basis confirm the utility of storytelling and time-series analysis. Third, we develop a web-based system and make it available for the real-time discovery of topics. The present study conducted experiments with nearly 150 million tweets collected in Korea during March 2013.
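
A minimal sketch of the topic-extraction step using gensim's LDA implementation; the tokenization, stopword list, and topic count are assumptions, and the real system additionally performs Korean noun extraction and runs on Hadoop with MongoDB storage.

```python
# Sketch: LDA topic modeling over tokenized tweets with gensim.
from gensim import corpora, models

tweets = [
    "subway fare increase announced for next month",
    "city council debates the subway fare hike",
    "heavy rain warning issued for the weekend",
    "weekend weather forecast heavy rain and wind",
]
stopwords = {"the", "for", "and", "of", "a"}

tokenized = [[w for w in t.lower().split() if w not in stopwords] for t in tweets]
dictionary = corpora.Dictionary(tokenized)
bow_corpus = [dictionary.doc2bow(doc) for doc in tokenized]

lda = models.LdaModel(bow_corpus, num_topics=2, id2word=dictionary,
                      passes=20, random_state=0)

# Daily topic keyword sets, e.g. for the ranking view in the dashboard.
for topic_id in range(lda.num_topics):
    print(topic_id, lda.show_topic(topic_id, topn=5))
```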

Open Digital Textbook for Smart Education (스마트교육을 위한 오픈 디지털교과서)

  • Koo, Young-Il;Park, Choong-Shik
    • Journal of Intelligence and Information Systems / v.19 no.2 / pp.177-189 / 2013
  • In smart education, digital textbooks play a very important role as the medium that learners face directly. Standardizing digital textbooks will promote the industrialization of digital textbooks for content providers and distributors as well as for learners and instructors. This study explores ways to standardize digital textbooks around the following three objectives. (1) Digital textbooks should serve as media for blended learning that supports both online and offline classes, should operate on a common EPUB viewer without a special dedicated viewer, and should utilize the existing framework of e-learning content and learning management. The reason for considering EPUB as the standard for digital textbooks is that it avoids specifying another standard for the book format and can take advantage of the industrial base of EPUB-standard content and its distribution structure. (2) Digital textbooks should provide a low-cost open-market service using currently available standard open software. (3) To provide appropriate learning feedback to students, digital textbooks should provide a foundation that accumulates and manages all learning activity information according to a standard infrastructure for educational big data processing. In this study, the digital textbook for a smart education environment is referred to as the open digital textbook. The components of the open digital textbook service framework are (1) digital textbook terminals such as smart pads, smart TVs, smartphones, and PCs; (2) a digital textbook platform that displays and runs digital content on those terminals; (3) a learning content repository in the cloud that maintains accredited learning content; (4) an app store through which learning content companies provide and distribute secondary learning content and learning tools; and (5) an LMS as a learning support and management tool that classroom teachers use to create instructional materials. In addition, all of the hardware and software implementing the smart education service should be located in the cloud to take advantage of cloud computing for efficient management and reduced expense. The open digital textbook for smart education can be seen as providing an e-book-style interface to the LMS for learners. In an open digital textbook, the representation of text, images, audio, video, equations, and so on is a basic function, but drawing, writing, and problem solving go beyond the capabilities of a simple e-book. Teacher-to-student, learner-to-learner, and team-to-team communication should be supported through the open digital textbook. To represent student demographics, portfolio information, and class information, the standards used in e-learning are desirable. To handle learner tracking information about the learner's activities for the LMS (Learning Management System), the open digital textbook must have a recording function and a function for communicating with the LMS. DRM protects various copyrights; currently, the DRM of an e-book is controlled by the corresponding book viewer, so if the open digital textbook accommodates the DRM schemes used by various e-book viewers, the implementation of redundant features can be avoided. Security and privacy functions are required to protect information about study and instruction from third parties. UDL (Universal Design for Learning) is a learning support function for those whose disabilities make it difficult to follow learning courses. 
The open digital textbook, based on the e-book standard EPUB 3.0, must (1) record learning activity log information and (2) communicate with the server to support learning activities. While these recording and communication functions, which are not defined in current standards, can be implemented in JavaScript and used in current EPUB 3.0 viewers, a strategy is needed for proposing them as part of the next-generation e-book standard or as a special standard (EPUB 3.0 for education). Future research will implement an open-source program following the proposed open digital textbook standard and present new educational services, including big data analysis.
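
The recording and communication functions are implemented in the paper as JavaScript inside the EPUB 3.0 viewer; the Python sketch below only illustrates the shape of such an exchange between a textbook client and an LMS. The endpoint URL and the xAPI-like payload fields are assumptions, not part of the proposed standard.

```python
# Sketch: send one learning-activity record from the textbook client to an LMS.
import datetime
import requests

LMS_ENDPOINT = "https://lms.example.org/api/activity"  # placeholder URL

def report_activity(learner_id: str, textbook_id: str, verb: str, page: int) -> bool:
    """POST one learning-activity record (e.g. 'read' a page) to the LMS."""
    statement = {
        "actor": learner_id,
        "verb": verb,  # e.g. "read", "answered", "highlighted"
        "object": {"textbook": textbook_id, "page": page},
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    response = requests.post(LMS_ENDPOINT, json=statement, timeout=5)
    return response.status_code == 200

if __name__ == "__main__":
    ok = report_activity("student-001", "integrated-science-1", "read", page=42)
    print("logged" if ok else "logging failed")
```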

A Critical Approach on Environmental Education Biased to Environmental Possibilism - From Clearing up the Cause to Problem-Solving Mechanism - (환경관리주의 환경교육에 대한 비판적 고찰 - 원인규명에서 해결기제로의 전환을 위하여 -)

  • Kim, Tae-Kyung
    • Hwankyungkyoyuk / v.18 no.3 s.28 / pp.59-74 / 2005
  • We cannot deny that Korean environmental education (EE) has developed largely on the basis of environmental possibilism (environmental management, or reformism). Three representative pieces of evidence can be given. First, since the 4th national curriculum of 1981, the philosophy of Korean EE has focused mainly on the dichotomy of human-techno-centrism versus eco-centrism, without considering alternative environmentalisms. Second, the concept of EE has not been distinguished from that of science education (some even say EE is a part of science education, although the two differ in many contextual respects). Third, the limits of possibilism, which even market economists have worried about, are scarcely mentioned in EE-related teaching materials. Possibilism tends to be accompanied by science- and economics-oriented approaches, and the dichotomy of human-techno-centrism and eco-centrism stems from the perspective of economic development and an over-committed belief in science. It is therefore fair to say that Korean EE has developed with a bias toward, or preference for, environmental possibilism, and this paper critically examines the two axes of possibilism, science and economics, and its dichotomy. Of course, EE and science education share much content, but their contextual differences must be recognized in order to establish an environmentalism suitable for EE and to overcome the science-dependent approach. Although the science-dependent approach and the dichotomy can provide tools for clearing up the causes of environmental problems, insisting that the fundamental causes lie in human faults, the over-use of ecological resources, and excessive economic development, this is by now an old-fashioned discourse and runs into unavoidable limits in debates about mechanisms for solving environmental problems. What matters most is to supply ways, or a thoughtful mechanism, for solving or coordinating environmental problems, not merely to search for their causes. Yet the scientific approach and the dichotomy based on possibilism have continuously reproduced cause-and-effect discourse in EE, so there is a strong need to move from the endless production of causes and effects to constructive alternatives, at least in the environmentalism of EE. The traditional dichotomy of human-techno-centrism and eco-centrism has not provided answers for our real society; it gives us only the causes and effects of environmental problems. Moreover, Korean EE textbooks, other teaching materials, and their teaching-learning processes contain no description of the limits of the capitalist market approach to environmental problems, even though the faults of the market approach, beyond its functional merits as an environmental management tool, have been demonstrated. We should therefore introduce alternative environmental philosophies in place of possibilism, such as the eco-socialism advocated by Schumacher and Bookchin, or Marxist environmentalism, to provide relative and comparative views of market-oriented thought such as commodification. In this respect, we also need to accept Oriental philosophy based on moderation (中庸) as another alternative, reflecting on the fact that monism has so far been treated as the representative Oriental philosophical environmentalism. 
Monism has played its role by providing concepts relative to the dichotomy of the Enlightenment, but it cannot be called the core concept for understanding Oriental environmentalism, nor can it be distinguished from Oriental philosophy itself, precisely because Oriental thought is basically monistic. The conceptual difference between EE and science education should therefore be recognized in the teaching-learning process on the basis of a philosophy of life (Philosophie des Lebens) grounded in epistemology. For this transformation, existentialism should be introduced into science education; only an existential science education based on phenomenology or interpretivism can become EE. At the same time, we need ways to overcome scientific foundationalism, the tradition that has kept science from standing on existentialism by formulating and framing almost all natural things and phenomena since the Enlightenment in the Western world, and that has malfunctioned by fixing the conception of science in essentialism. We should also introduce an integrated approach to science and society, such as STS, into EE. These are ways of overcoming environmental possibilism in EE.
