• Title/Summary/Keyword: modeling result data (모델링결과데이터)

Search results: 1,381 (processing time: 0.027 seconds)

A Study on the Development of High Sensitivity Collision Simulation with Digital Twin (디지털 트윈을 적용한 고감도 충돌 시뮬레이션 개발을 위한 연구)

  • Ki, Jae-Sug;Hwang, Kyo-Chan;Choi, Ju-Ho
    • Journal of the Society of Disaster Information / v.16 no.4 / pp.813-823 / 2020
  • Purpose: To maximize the safety and productivity of high-risk, high-cost work such as dismantling the facilities inside a reactor, we intend to simulate the work in advance using digital twin technology, which can closely mirror the specifications of the actual control equipment. Motion-control errors caused by the time gap between the precision control equipment and the simulation when applying digital twin technology can lead to hazards such as collisions between hazardous facilities and the control equipment. Prior research is needed to eliminate and control these situations. Method: Unity 3D is currently the most popular engine for developing simulations. However, the time correction inside the Unity 3D engine can introduce control errors. These errors can arise in many environments and may vary with the development environment, for example with system specifications. To demonstrate this, we developed a collision simulation using the Unity 3D engine, conducted collision experiments under various conditions, organized and analyzed the results, and derived tolerances for precision control equipment based on them. Result: In the collision-simulation experiments, the 1/1000-second time correction applied at each internal engine function call produced a per-step distance error in the movement control of the colliding objects, and this distance error was proportional to the velocity of the collision. Conclusion: A remote dismantling simulator using digital twin technology should therefore limit the speed of movement according to the precision required of the control devices, given the hardware and software environment and manual control.
In addition, the size of the modeling data (for example the system development environment, the hardware specifications, and the simulated control equipment and facilities), the available and acceptable error of the operational control equipment, and the speed required for the work must also be taken into account.
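
The reported proportionality between distance error and velocity can be sketched numerically (a minimal illustration with assumed names and values, not the authors' Unity code; the 1/1000 s default mirrors the time correction described above):

```python
# Sketch (not from the paper): positional error accumulated when a physics
# step is time-corrected by a fixed offset. With a 1/1000 s correction per
# internal function call, the per-step distance error is proportional to
# the object's velocity, matching the paper's reported relationship.

def step_error(velocity_mps: float, time_correction_s: float = 0.001) -> float:
    """Distance error (m) introduced by one time-corrected physics step."""
    return velocity_mps * time_correction_s

# Doubling the velocity doubles the per-step error -- the proportionality
# the collision experiments demonstrated.
errors = {v: step_error(v) for v in (1.0, 2.0, 4.0)}
```

Under such a model, a speed limit for the control equipment follows directly from the acceptable distance error divided by the known time correction.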

Exploring the Trend of Korean Creative Dance by Analyzing Research Topics : Application of Text Mining (연구주제 분석을 통한 한국창작무용 경향 탐색 : 텍스트 마이닝의 적용)

  • Yoo, Ji-Young;Kim, Woo-Kyung
    • Journal of Korea Entertainment Industry Association / v.14 no.6 / pp.53-60 / 2020
  • This study is based on the assumption that trends in phenomena and trends in research are contextually consistent. Its purpose is therefore to explore the trend of Korean creative dance through topic analysis of Korean creative dance research using text mining. To this end, 1,291 words extracted from 616 journal-article titles collected from a paper-search website were analyzed. Collection, refinement, and analysis of the data were all performed with R 3.6.0. The study found, first, that before the 2000s keywords representing the times were frequently used, and that Korean creative dance research was also conducted from the perspectives of education and physical training. Second, after the 2000s the frequency of keywords related to dance-troupe performances was high, but Choi Seung-hee was confirmed to remain in an important position in the study of Korean creative dance. Third, an analysis of the overall research topics showed that research on 'the art of Choi Seung-hee in the modern era' accounted for the highest proportion. Fourth, the hot topics rising as of 2000 were 'the performance activities of the National Dance Company' and 'the choreographic expression and utilization of traditional dance'. Since the recent performances of the National Dance Company advocate 'modernization based on tradition', this confirms that the trend of Korean creative dance since the 2000s has centered on the use of traditional dance motifs. Fifth, the cold topic falling as of 2000 was 'dance expression by age'; interest in this line of research was judged to have decreased because various dance styles have been mixed since Korean creative dance was established as a genre.
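
The hot/cold-topic notion can be sketched with a simple frequency-trend split (a Python toy with assumed titles and a year-2000 split, not the authors' R 3.6.0 pipeline):

```python
# Stdlib sketch of the hot/cold-topic idea: a keyword whose relative
# frequency rises after the split year is "hot", one that falls is "cold".
# The toy titles below are illustrative assumptions, not the study's data.

titles = [
    (1995, "dance education and physical training"),
    (1998, "dance expression by age"),
    (2005, "National Dance Company performance"),
    (2012, "traditional dance choreography and National Dance Company"),
]

def keyword_trend(keyword, data, split_year=2000):
    """Change in a keyword's relative title frequency across the split year."""
    before = [t for y, t in data if y < split_year]
    after = [t for y, t in data if y >= split_year]
    freq = lambda docs: sum(keyword in t for t in docs) / len(docs)
    return freq(after) - freq(before)  # > 0: hot topic, < 0: cold topic
```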

A case study of elementary school mathematics-integrated classes based on AI Big Ideas for fostering AI thinking (인공지능 사고 함양을 위한 인공지능 빅 아이디어 기반 초등학교 수학 융합 수업 사례연구)

  • Chohee Kim;Hyewon Chang
    • The Mathematical Education / v.63 no.2 / pp.255-272 / 2024
  • This study aims to design mathematics-integrated classes that cultivate artificial intelligence (AI) thinking and to analyze students' AI thinking within these classes. To do this, four classes were designed by integrating the AI4K12 Initiative's AI Big Ideas with the 2015 revised elementary mathematics curriculum. Three of the classes were implemented with 5th- and 6th-grade elementary school students. Leveraging the computational-thinking taxonomy and the components of AI thinking, a comprehensive framework for analyzing AI thinking was established. Using this framework, students' AI thinking during these classes was analyzed on the basis of classroom discourse and supplementary worksheets, and the results of the analysis were peer-reviewed by two researchers. The findings affirm the potential of mathematics-integrated classes to nurture students' AI thinking and underscore the viability of AI education for elementary school students. The classes, based on the AI Big Ideas, facilitated elementary students' understanding of AI concepts and principles, enhanced their grasp of mathematical content elements, and reinforced mathematical process aspects. Furthermore, activities that maintained structural consistency with previous problem-solving methods while applying them to new problems gave evidence of the potential for the transfer of AI thinking.

Analysis of the Impact of Generative AI based on Crunchbase: Before and After the Emergence of ChatGPT (Crunchbase를 바탕으로 한 Generative AI 영향 분석: ChatGPT 등장 전·후를 중심으로)

  • Nayun Kim;Youngjung Geum
    • Asia-Pacific Journal of Business Venturing and Entrepreneurship / v.19 no.3 / pp.53-68 / 2024
  • Generative AI is receiving a great deal of attention around the world, and ways to utilize it effectively in the business environment are being explored. In particular, since the public release of the ChatGPT service, which applies GPT-3.5, a large language model developed by OpenAI, generative AI has attracted even more attention and has had a significant impact on the entire industry. This study focuses on the emergence of generative AI, especially ChatGPT, to investigate its impact on the startup industry and to compare the changes that occurred before and after its appearance. By examining in detail how generative AI is being used in the startup industry and analyzing the impact of ChatGPT's emergence on that industry, the study aims to shed light on the actual application and impact of generative AI in the business environment. To this end, we collected company information on generative-AI-related startups that appeared before and after the ChatGPT announcement and analyzed changes in industry, business content, and investment information. Through keyword analysis, topic modeling, and network analysis, we identified trends in the startup industry and how the introduction of generative AI has transformed it. We found that the number of generative-AI-related startups has increased since the emergence of ChatGPT and, in particular, that both the total and the average amount of funding for such startups has increased significantly. We also found that various industries are attempting to apply generative-AI technology, and that the development of services and products such as enterprise applications and SaaS using generative AI has been actively pursued, influencing the emergence of new business models.
These findings confirm the impact of generative AI on the startup industry and contribute to our understanding of how the emergence of this innovative technology can change the business ecosystem.
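
The before/after comparison can be sketched as follows (hypothetical records and field names, not actual Crunchbase data):

```python
# Illustrative sketch of the study's comparison: split startup records at
# the ChatGPT public-release date and compare counts and average funding.
from datetime import date

CHATGPT_RELEASE = date(2022, 11, 30)

# Toy records -- the fields and values are assumptions for illustration.
startups = [
    {"founded": date(2021, 5, 1), "funding_musd": 3.0},
    {"founded": date(2023, 2, 1), "funding_musd": 12.0},
    {"founded": date(2023, 7, 1), "funding_musd": 8.0},
]

before = [s for s in startups if s["founded"] < CHATGPT_RELEASE]
after = [s for s in startups if s["founded"] >= CHATGPT_RELEASE]

def avg_funding(group):
    """Average funding (million USD) of a group of startup records."""
    return sum(s["funding_musd"] for s in group) / len(group) if group else 0.0
```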

Automatic Quality Evaluation with Completeness and Succinctness for Text Summarization (완전성과 간결성을 고려한 텍스트 요약 품질의 자동 평가 기법)

  • Ko, Eunjung;Kim, Namgyu
    • Journal of Intelligence and Information Systems / v.24 no.2 / pp.125-148 / 2018
  • Recently, as demand for big data analysis increases, cases of analyzing unstructured data and using the results are also increasing. Among the various types of unstructured data, text is used as a means of communicating information in almost all fields, and many analysts are interested in it because the amount of text data is very large and it is relatively easy to collect compared to other unstructured and structured data. Among the various text-analysis applications, active research topics include document classification, which assigns documents to predetermined categories; topic modeling, which extracts major topics from a large number of documents; sentiment analysis or opinion mining, which identifies emotions or opinions contained in texts; and text summarization, which condenses the main contents of one or several documents. In particular, text summarization is actively applied in business, for example in news-summary and privacy-policy-summary services. In academia, much research has been done on both the extractive approach, which selectively presents the main elements of a document, and the abstractive approach, which extracts the elements of a document and composes new sentences by combining them. However, techniques for evaluating the quality of automatically summarized documents have not progressed as much as automatic text summarization itself. Most existing studies on summarization quality evaluation manually summarized documents, used them as reference documents, and measured the similarity between the automatic summary and the reference document. Specifically, automatic summarization is performed on the full text using various techniques, and its quality is measured by comparison with the reference document, which serves as an ideal summary.
Reference documents are provided in two major ways. The most common is manual summarization, in which a person creates an ideal summary by hand. Since this method requires human intervention, it takes a great deal of time and cost, and the evaluation result may differ depending on who writes the summary. To overcome these limitations, attempts have been made to measure the quality of summary documents without human intervention. A representative recent attempt reduces the size of the full text and measures the similarity between the reduced full text and the automatic summary; in this method, the more the frequent terms of the full text appear in the summary, the better the quality of the summary is judged to be. However, since summarization essentially means condensing a large amount of content while minimizing omissions, a summary judged "good" on frequency alone is not necessarily a good summary in this essential sense. To overcome the limitations of these previous evaluation studies, this study proposes an automatic quality-evaluation method for text summarization based on the essential meaning of summarization. Specifically, succinctness is defined as an element indicating how little duplicated content there is among the sentences of the summary, and completeness is defined as an element indicating how little of the source content is missing from the summary. We propose a method for automatic quality evaluation of text summarization based on these two concepts.
To evaluate the practical applicability of the proposed methodology, 29,671 sentences were extracted from TripAdvisor hotel reviews, the reviews were summarized for each hotel, and the quality of the summaries was evaluated according to the proposed methodology. We also provide a way to integrate completeness and succinctness, which are in a trade-off relationship, into an F-score, and propose a method for performing optimal summarization by changing the sentence-similarity threshold.
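
The two proposed concepts can be sketched as follows (one possible interpretation using Jaccard word overlap and an assumed similarity threshold, not the authors' implementation):

```python
# Sketch of the completeness/succinctness idea: sentences are compared by
# Jaccard word overlap against a similarity threshold (0.3 is an assumed
# illustrative value), and the two scores combine into an F-score.

def jaccard(a: str, b: str) -> float:
    """Word-set overlap between two sentences."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def completeness(source, summary, thr=0.3):
    """Fraction of source sentences covered by some summary sentence."""
    covered = [any(jaccard(s, t) >= thr for t in summary) for s in source]
    return sum(covered) / len(source)

def succinctness(summary, thr=0.3):
    """1 minus the fraction of summary-sentence pairs that duplicate each other."""
    pairs = [(i, j) for i in range(len(summary)) for j in range(i + 1, len(summary))]
    if not pairs:
        return 1.0
    dup = sum(jaccard(summary[i], summary[j]) >= thr for i, j in pairs)
    return 1.0 - dup / len(pairs)

def f_score(source, summary, thr=0.3):
    """Harmonic mean integrating the trade-off between the two measures."""
    c, s = completeness(source, summary, thr), succinctness(summary, thr)
    return 0.0 if c + s == 0 else 2 * c * s / (c + s)
```

Raising the threshold makes coverage harder (lower completeness) but duplication rarer (higher succinctness), which is the trade-off the F-score balances.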

Performance Evaluation of Radiochromic Films and Dosimetry CheckTM for Patient-specific QA in Helical Tomotherapy (나선형 토모테라피 방사선치료의 환자별 품질관리를 위한 라디오크로믹 필름 및 Dosimetry CheckTM의 성능평가)

  • Park, Su Yeon;Chae, Moon Ki;Lim, Jun Teak;Kwon, Dong Yeol;Kim, Hak Joon;Chung, Eun Ah;Kim, Jong Sik
    • The Journal of Korean Society for Radiation Therapy / v.32 / pp.93-109 / 2020
  • Purpose: The radiochromic film (Gafchromic EBT3, Ashland Advanced Materials, USA) and the 3-dimensional analysis system Dosimetry CheckTM (DC, MathResolutions, USA) were evaluated for patient-specific quality assurance (QA) of helical tomotherapy. Materials and Methods: Depending on tumor position, three types of target, an abdominal tumor (130.6 ㎤), a retroperitoneal tumor (849.0 ㎤), and a whole-abdominal metastasis tumor (3131.0 ㎤), were applied to a humanoid phantom (Anderson Rando Phantom, USA). We established a total of 12 comparative treatment plans from four geometric beam-irradiation conditions: field widths (FW) of 2.5 cm and 5.0 cm, and pitches of 0.287 and 0.43. Ionization measurements (1D) and EBT3 measurements made by inserting the cheese phantom (2D) were compared with DC measurements, which reconstruct the 3D dose on CT images from beam-fluence log information. For the clinical feasibility evaluation of the DC, dose reconstruction was performed using the same cheese phantom as in the EBT3 method. The recalculated dose distributions quantitatively revealed the dose errors during actual irradiation on the same CT images, compared with the treatment plan. The thread effect, which can appear in helical tomotherapy, was analyzed by ripple amplitude (%). We also performed gamma-index analysis (DD: 3%, DTA: 3 mm, pass threshold: 95%) to check the pattern of the dose distribution. Results: Ripple-amplitude measurement showed the highest average, 23.1%, in the peritoneum tumor. In the radiochromic-film analysis, the absolute dose agreed on average to 0.9±0.4%, and the gamma-index analysis averaged 96.4±2.2% (passing rate >95%), although this could be limited for large targets such as the whole-abdominal metastasis tumor. In the DC analysis with the humanoid phantom at a FW of 5.0 cm, the average over the three regions was 91.8±6.4% for the 2D and 3D plans.
The three planes (axial, coronal, and sagittal) and the dose profile could be analyzed for the entire peritoneum tumor and the whole-abdominal metastasis target against the planned dose distributions. The dose errors based on the dose-volume histogram in the DC evaluations increased with FW and pitch. Conclusion: The DC method can perform dose-error analysis on 3D patient image data from the measured beam-fluence log information alone, without any additional dosimetry tools, for patient-specific QA. It also appears to have no limit on applicable tumor location and size; therefore, the DC could be useful for patient-specific QA during helical tomotherapy of large and irregular tumors.
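
The gamma-index criterion used above can be illustrated with a minimal 1-D sketch (the generic textbook form of the metric with toy values, not the commercial software's algorithm):

```python
# Minimal 1-D gamma-index sketch (3%/3 mm criteria as in the text). For
# each reference point, gamma is the minimum combined dose-difference /
# distance-to-agreement measure over the evaluated distribution; a point
# passes when gamma <= 1.
import math

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.03, dta=3.0):
    """Per-reference-point gamma; dd is fractional dose difference, dta in mm."""
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        g = min(
            math.sqrt(((ep - rp) / dta) ** 2 + ((ed - rd) / (dd * rd)) ** 2)
            for ep, ed in zip(eval_pos, eval_dose)
        )
        gammas.append(g)
    return gammas

def pass_rate(gammas):
    """Percentage of reference points with gamma <= 1 (passing rate)."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)
```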

Process Design of Carbon Dioxide Storage in the Marine Geological Structure: I. Comparative Analysis of Thermodynamic Equations of State using Numerical Calculation (이산화탄소 해양지중저장 처리를 위한 공정 설계: I. 수치계산을 통한 열역학 상태방정식의 비교 분석)

  • Huh, Cheol;Kang, Seong-Gil
    • Journal of the Korean Society for Marine Environment &amp; Energy / v.11 no.4 / pp.181-190 / 2008
  • In response to climate change and the Kyoto Protocol, and to reduce greenhouse-gas emissions, marine geological storage of CO2 is regarded as one of the most promising options. It consists of capturing CO2 from major point sources (e.g., power plants), transporting it to storage sites, and storing it in marine geological structures such as deep-sea saline aquifers. To design a reliable CO2 marine geological storage system, it is necessary to perform numerical process simulation using a thermodynamic equation of state. The purpose of this paper is to compare and analyze the relevant equations of state, including the ideal-gas, BWRS, PR, PRBM, and SRK equations. To evaluate their predictive accuracy, we compared numerical calculation results with reference experimental data. The ideal-gas and SRK equations of state failed to predict the density behavior above 29.85°C and 60 bar; in particular, they showed errors of up to 100% in the supercritical state. The BWRS equation of state failed to predict the density behavior between 60 and 80 bar near the critical temperature. On the other hand, the PR and PRBM equations of state showed good predictive capability in the supercritical state. Since the thermodynamic conditions of CO2 storage sites correspond to the supercritical state (above 31.1°C and 73.9 bar), we conclude that the PR and PRBM equations of state are recommended for designing a CO2 marine geological storage process.
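
A PR-based density calculation of the kind compared above can be sketched as follows (the standard textbook Peng-Robinson form with the usual correlations for CO2; the function and a Newton root-finder are illustrative choices, not taken from the paper):

```python
# Peng-Robinson density sketch for CO2 (textbook constants and correlations).
import math

R = 8.314462618  # J/(mol K)
TC, PC, OMEGA, M = 304.13, 7.3773e6, 0.2239, 0.04401  # CO2: K, Pa, -, kg/mol

def pr_density(T, P):
    """CO2 density (kg/m^3) at T (K), P (Pa) from the Peng-Robinson EOS."""
    a = 0.45724 * R**2 * TC**2 / PC
    b = 0.07780 * R * TC / PC
    kappa = 0.37464 + 1.54226 * OMEGA - 0.26992 * OMEGA**2
    alpha = (1 + kappa * (1 - math.sqrt(T / TC))) ** 2
    A = a * alpha * P / (R * T) ** 2
    B = b * P / (R * T)
    # Solve the PR cubic in Z by Newton iteration from the ideal-gas root.
    f = lambda Z: Z**3 - (1 - B) * Z**2 + (A - 3 * B**2 - 2 * B) * Z - (A * B - B**2 - B**3)
    df = lambda Z: 3 * Z**2 - 2 * (1 - B) * Z + (A - 3 * B**2 - 2 * B)
    Z = 1.0
    for _ in range(100):
        Z -= f(Z) / df(Z)
    return P * M / (Z * R * T)
```

At low pressure the result stays close to the ideal-gas density; the equations diverge, as the paper's comparison shows, only toward the critical and supercritical regions.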

A Performance Comparison of the Mobile Agent Model with the Client-Server Model under Security Conditions (보안 서비스를 고려한 이동 에이전트 모델과 클라이언트-서버 모델의 성능 비교)

  • Han, Seung-Wan;Jeong, Ki-Moon;Park, Seung-Bae;Lim, Hyeong-Seok
    • Journal of KIISE:Information Networking / v.29 no.3 / pp.286-298 / 2002
  • The Remote Procedure Call (RPC) has traditionally been used for inter-process communication (IPC) among processes in distributed computing environments. As distributed applications have become more and more complicated, the mobile-agent paradigm for IPC has emerged. Because several IPC paradigms now coexist, research evaluating and comparing their performance has recently appeared. However, the performance models used in previous research did not reflect real distributed computing environments correctly, because they did not consider the elements needed to provide security services. Since real distributed environments are open, they are very vulnerable to a variety of attacks; to execute applications securely, security services that protect applications and information against these attacks must be considered. In this paper, we evaluate and compare the performance of the Remote Procedure Call and the mobile agent as IPC paradigms. We examine the security services needed to execute applications securely and propose new performance models that take those services into account. Using Petri nets, we design performance models that describe an information-retrieval system spanning N database services. We compare the two paradigms by assigning numerical values to the parameters and measuring the execution time of each. The comparison of the two performance models with security services for secure communication shows that the execution time of the RPC model increases sharply because of the many communications between hosts under a strong cryptographic mechanism, whereas the execution time of the mobile-agent model increases only gradually because the mobile-agent paradigm reduces the quantity of communication between hosts.
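
The qualitative result can be sketched with a toy cost model (parameter values are assumptions for illustration, not the paper's Petri-net figures):

```python
# Toy cost model of the comparison described above: RPC pays an encrypted
# round trip per database, while the mobile agent pays one protected
# migration plus cheap local accesses.

def rpc_time(n_db, round_trip=1.0, crypto=0.5):
    """N encrypted request/response exchanges between hosts."""
    return n_db * (round_trip + crypto)

def agent_time(n_db, migration=2.0, crypto=0.5, local_access=0.1):
    """One protected migration, then N local database accesses."""
    return migration + crypto + n_db * local_access

# As N grows, RPC time rises sharply while the agent's rises only
# gradually, mirroring the paper's qualitative conclusion.
```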

A Survey of Yeosu Sado Dinosaur Tracksite and Utilization of Educational Materials using 3D Photogrammetry (3D 사진측량법을 이용한 여수 사도 공룡발자국 화석산지 조사 및 교육자료 활용방안)

  • Jo, Hyemin;Hong, Minsun;Son, Jongju;Lee, Hyun-Yeong;Park, Kyeong-Beom;Jung, Jongyun;Huh, Min
    • Journal of the Korean earth science society / v.42 no.6 / pp.662-676 / 2021
  • The Yeosu Sado dinosaur tracksite is well known for its many dinosaur tracks and for research on the gregarious behavior of dinosaurs. In addition, various geological and geographical heritage sites are distributed on Sado Island. However, educational field trips for students are very limited because of the island's restricted accessibility, time constraints due to the tides, and continuous weathering and damage. Therefore, this study aims to generate 3D models and images of the dinosaur tracks using the photogrammetric method, which has recently been used in various fields, and then to discuss the possibility of using them as paleontological research and educational content. Examination of the obtained 3D images and models confirmed the existence of footprints that had not previously been discovered or whose details could not be made out by the naked eye or in photographs. Even for previously discovered tracks, the 3D images could present details that photographs or interpretive drawings could not. In addition, a 3D model of the dinosaur tracks can be preserved as semi-permanent data, enabling various forms of utilization and preservation. Here we apply 3D printing and mobile augmented-reality content using the photogrammetric 3D models for a virtual field trip; models acquired by photogrammetry can be used in various educational-content fields that require 3D models.

Intelligent Brand Positioning Visualization System Based on Web Search Traffic Information : Focusing on Tablet PC (웹검색 트래픽 정보를 활용한 지능형 브랜드 포지셔닝 시스템 : 태블릿 PC 사례를 중심으로)

  • Jun, Seung-Pyo;Park, Do-Hyung
    • Journal of Intelligence and Information Systems / v.19 no.3 / pp.93-111 / 2013
  • As Internet and information technology (IT) continues to develop and evolve, the issue of big data has emerged at the foreground of scholarly and industrial attention. Big data is generally defined as data that exceed the range that can be collected, stored, managed and analyzed by existing conventional information systems and it also refers to the new technologies designed to effectively extract values from such data. With the widespread dissemination of IT systems, continual efforts have been made in various fields of industry such as R&D, manufacturing, and finance to collect and analyze immense quantities of data in order to extract meaningful information and to use this information to solve various problems. Since IT has converged with various industries in many aspects, digital data are now being generated at a remarkably accelerating rate while developments in state-of-the-art technology have led to continual enhancements in system performance. The types of big data that are currently receiving the most attention include information available within companies, such as information on consumer characteristics, information on purchase records, logistics information and log information indicating the usage of products and services by consumers, as well as information accumulated outside companies, such as information on the web search traffic of online users, social network information, and patent information. Among these various types of big data, web searches performed by online users constitute one of the most effective and important sources of information for marketing purposes because consumers search for information on the internet in order to make efficient and rational choices. Recently, Google has provided public access to its information on the web search traffic of online users through a service named Google Trends. 
Research that uses this web search traffic information to analyze the information-search behavior of online users is now receiving much attention in academia and industry. Studies using web search traffic information can be broadly classified into two fields. The first consists of empirical demonstrations that web search information can be used to forecast social phenomena, the purchasing power of consumers, the outcomes of political elections, and so on. The second uses web search traffic information to observe consumer behavior, for example by identifying the attributes of a product that consumers regard as important or by tracking changes in consumers' expectations; relatively little research has been completed in this field. In particular, to the best of our knowledge, hardly any brand-related studies have attempted to use web search traffic information to analyze the factors that influence consumers' purchasing activities. This study aims to demonstrate that consumers' web search traffic information can be used to derive the relations among brands and the relations between an individual brand and product attributes. When consumers search on the web, they may use a single keyword, but they also often input multiple keywords to seek related information (referred to as simultaneous searching). A consumer performs a simultaneous search either to compare two product brands and obtain information on their similarities and differences, or to acquire more in-depth information about a specific attribute of a specific brand. Web search traffic data show that the quantity of simultaneous searches using certain keywords increases as the relation between them is closer in the consumer's mind, so the relations between keywords can be derived by collecting this relational data and subjecting it to network analysis.
Accordingly, this study proposes a method of analyzing how brands are positioned by consumers and what relationships exist between product attributes and an individual brand, using simultaneous-search traffic information. It also presents case studies demonstrating the actual application of this method, with a focus on tablet PCs, an innovative product group.
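
The co-search idea can be sketched as follows (toy volumes and hypothetical brand and attribute names, not the study's Google Trends data):

```python
# Sketch of deriving brand relations from simultaneous-search volumes:
# higher co-search volume between two terms is read as a closer
# association in consumers' minds.

co_search = {
    ("BrandA", "BrandB"): 120,   # brand-vs-brand comparison searches
    ("BrandA", "battery"): 80,   # brand-vs-attribute searches
    ("BrandB", "display"): 95,
    ("BrandB", "battery"): 15,
}

def neighbors(term, edges):
    """Terms co-searched with `term`, strongest association first."""
    adj = []
    for (u, v), w in edges.items():
        if term in (u, v):
            adj.append((v if u == term else u, w))
    return sorted(adj, key=lambda x: -x[1])
```

In the study's terms, the ranked neighbor list of a brand is its position map: the closest brand is its main competitor, and the closest attributes are those consumers associate with it.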