• Title/Summary/Keyword: media technology


Development of simultaneous analytical method for investigation of ketamine and dexmedetomidine in feed (사료 내 케타민과 덱스메데토미딘의 잔류조사를 위한 동시분석법 개발)

  • Chae, Hyun-young;Park, Hyejin;Seo, Hyung-Ju;Jang, Su-nyeong;Lee, Seung Hwa;Jeong, Min-Hee;Cho, Hyunjeong;Hong, Seong-Hee;Na, Tae Woong
    • Analytical Science and Technology
    • /
    • v.35 no.3
    • /
    • pp.136-142
    • /
    • 2022
  • According to media reports, the carcasses of euthanized abandoned dogs were processed at high temperature and pressure into powder and then used as a feed material (meat and bone meal), raising the possibility that residues of ketamine and dexmedetomidine, the anesthetics used for euthanasia, remain in feed. Therefore, a simultaneous analysis method using QuEChERS combined with high-performance liquid chromatography coupled with electrospray ionization tandem mass spectrometry was developed for rapid residue analysis. The method developed in this study exhibited linearity of 0.999 or higher. Selectivity was evaluated by analyzing blank samples and samples spiked at the limit of quantitation: the MRM chromatograms of blank samples were compared with those of the spiked samples, and there were no interferences at the respective retention times of ketamine and dexmedetomidine. The instrumental detection and quantitation limits were 0.6 µg/L and 2 µg/L, respectively, and the limit of quantitation for the method was 10 µg/kg. Recovery tests on meat and bone meal, meat meal, and pet food gave recoveries of 80.48-98.63 % for ketamine with less than 5.00 % RSD, and 72.75-93.00 % for dexmedetomidine with less than 4.83 % RSD. When six feeds, including meat and bone meal, prepared at the time the raw material was distributed were collected and analyzed, 10.8 µg/kg of ketamine was detected in one meat and bone meal sample, while dexmedetomidine was found at a concentration below the limit of quantitation. It was confirmed that the detected sample had been distributed before the safety issue became known; thereafter, all the meat and bone meal made from the carcasses of euthanized abandoned dogs was recalled and completely discarded.
To ensure the safety of the meat and bone meal, 32 samples of meat and bone meal and compound feed were collected, and additional residue investigations were conducted for ketamine and dexmedetomidine. Neither compound was detected. However, this investigation confirmed that some animal drugs, such as anesthetics, can persist without decomposing even at high temperature and pressure; there is therefore a need for further investigation of other potentially hazardous substances not yet controlled in feed.
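
The recovery and RSD figures reported above are straightforward arithmetic over replicate spiked measurements. A minimal sketch of that calculation (the replicate values and the helper name here are hypothetical, not data from the study):

```python
from statistics import mean, stdev

def recovery_stats(measured_ug_kg, spiked_ug_kg):
    """Mean recovery (%) and relative standard deviation (%) for replicate spikes."""
    recoveries = [100.0 * m / spiked_ug_kg for m in measured_ug_kg]
    avg = mean(recoveries)
    rsd = 100.0 * stdev(recoveries) / avg  # RSD = sample SD / mean, as a percentage
    return round(avg, 2), round(rsd, 2)

# Hypothetical replicate results for a 10 ug/kg ketamine spike in meat and bone meal
print(recovery_stats([9.0, 9.5, 8.5, 9.0, 9.0], 10.0))
```

A method passes a typical acceptance check when the mean recovery falls in the target range (here 70-120 %) and the RSD stays under the stated ceiling.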

A Study on Consumer's Emotional Consumption Value and Purchase Intention about IoT Products - Focused on the preference of using EEG - (IoT 제품에 관한 소비자의 감성적 소비가치와 구매의도에 관한 연구 - EEG를 활용한 선호도 연구를 중심으로 -)

  • Lee, Young-ae;Kim, Seung-in
    • Journal of Communication Design
    • /
    • v.68
    • /
    • pp.278-288
    • /
    • 2019
  • The purpose of this study is to analyze the effects of perceived risk and convenience on purchase intention in the IoT market, and to analyze the moderating effect of emotional consumption value. Two products were selected from each of three product groups. The research proceeded in three steps: first, a theoretical review; second, a survey analysis, in which descriptive statistics, reliability analysis, and factor analysis were performed using SPSS; third, measurement of EEG changes during in-depth interviews and indirect product experience. The hypothesis testing confirmed that convenience of use of IoT products influences purchase intention. Risk was predicted to have a negative effect on purchase intention but was not significant in this study. This implies that, when purchasing IoT products, consumers tend to disregard monetary losses such as purchase, usage, and disposal costs. In-depth interviews and EEG analysis revealed a desire to purchase and try out IoT products due to the nature of the products, the novelty of new technology, and the vague expectation that they will benefit one's life. The aesthetic, symbolic, and pleasure factors, which are sub-elements of emotional consumption value, were found to have a great influence. This is consistent with previous research showing that emotional consumption value has a positive effect on purchase intention, and the in-depth interviews and EEG analyses yielded the same results. This study has revealed that emotional consumption value affects the intention to purchase IoT products; companies producing IoT products appear to need marketing that concentrates more on emotional consumption value.

Efficient plant regeneration through callus induction from the hypocotyl of Perilla frutescens L. var. Dayu ('다유들깨'품종의 하배축에서 캘러스를 통한 고효율 식물재분화)

  • Ruyue Xu;Ji-Hi Son;Hong-Gyu Kang;Hyeon-Jin Sun;Hyo-Yeon Lee
    • Journal of Plant Biotechnology
    • /
    • v.50
    • /
    • pp.248-254
    • /
    • 2023
  • This study was conducted to establish an efficient plant regeneration system in 'Dayu', a Korean variety of Perilla frutescens developed for seed oil production, in conjunction with the previously studied variety 'Namcheon'. The healthiest callus was formed on the hypocotyl explants cultured on a medium containing 0.1 mg/L NAA and 0.5 mg/L BA, outperforming the leaf and cotyledon samples. In both dark and long-day conditions, Dayu consistently exhibited significantly higher shoot regeneration rates compared with Namcheon. The highest shoot regeneration rates in Dayu were observed from the hypocotyl explants cultured on 0.1 mg/L NAA and 0.5 mg/L BA media, with shoot regeneration rates of 84.4% and 86.7% under dark and long-day conditions, respectively. Various combinations of plant growth regulators were tested to establish the optimal shoot regeneration conditions for Dayu hypocotyl explants. The results demonstrated that the highest shoot regeneration rate (90%) was achieved when 0.5 mg/L of BA was added to the medium without NAA. Among the regenerated shoots, 70.5% were normal plants, while 19.3% were abnormal. The addition of NAA or an increase in its concentration led to a higher occurrence of abnormal plants. After the regenerated shoots were transferred to 1/2 MS medium, roots were observed within 10-15 days. By day 30, they had developed into complete plants. The results obtained from the regeneration experiments with the perilla variety Dayu can valuably inform molecular breeding reliant on transformation techniques such as genome-editing and genetic modification technology.

Application of MicroPACS Using the Open Source (Open Source를 이용한 MicroPACS의 구성과 활용)

  • You, Yeon-Wook;Kim, Yong-Keun;Kim, Yeong-Seok;Won, Woo-Jae;Kim, Tae-Sung;Kim, Seok-Ki
    • The Korean Journal of Nuclear Medicine Technology
    • /
    • v.13 no.1
    • /
    • pp.51-56
    • /
    • 2009
  • Purpose: Recently, most hospitals have introduced PACS, and use of the system continues to expand. Meanwhile, a small-scale PACS called MicroPACS can already be operated using open source programs. The aim of this study is to demonstrate the utility of operating a MicroPACS as a substitute back-up device for conventional storage media such as CDs and DVDs, in addition to the full PACS already in use. This study describes how to set up a MicroPACS with open source programs and assesses its storage capability, stability, compatibility, and the performance of operations such as "retrieve" and "query". Materials and Methods: 1. To start with, we searched for open source software meeting the following criteria for establishing a MicroPACS: (1) it must run on the Windows operating system; (2) it must be freeware; (3) it must be compatible with the PET/CT scanner; (4) it must be easy to use; (5) it must not limit storage capacity; (6) it must support DICOM. 2. (1) To evaluate data storage performance, we compared the time spent backing up data with the open source software against optical discs (CDs and DVD-RAMs), and likewise compared the time needed to retrieve data from each. (2) To estimate work efficiency, we measured the time spent finding data on CDs, on DVD-RAMs, and in the MicroPACS; 7 technologists participated in this study. 3. To evaluate the stability of the software, we examined whether any data loss occurred while the system was maintained for a year; for comparison, we counted the errors found in 500 randomly selected CDs. Result: 1. Among 11 open source programs, we chose the Conquest DICOM Server, which uses MySQL as its database management system. 2. 
(1) Comparison of back-up and retrieval times (min) showed the following: DVD-RAM (5.13, 2.26) vs. Conquest DICOM Server (1.49, 1.19) for GE DSTE (p<0.001); CD (6.12, 3.61) vs. Conquest (0.82, 2.23) for GE DLS (p<0.001); CD (5.88, 3.25) vs. Conquest (1.05, 2.06) for SIEMENS. (2) The time (sec) needed to locate data was as follows: CD (156±46), DVD-RAM (115±21), and Conquest DICOM Server (13±6). 3. There was no data loss (0 %) over the year, during which 12,741 PET/CT studies were stored in 1.81 TB of storage. In the case of CDs, on the other hand, 14 errors occurred among the 500 CDs (2.8 %). Conclusions: We found that a MicroPACS could be set up with open source software and that its performance was excellent. The system built with open source proved more efficient and more robust than back-up processes using CDs or DVD-RAMs. We believe the MicroPACS can serve as an effective data storage device as long as its operators continue to develop and systematize it.
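
The reported back-up and retrieval timings can be restated as speedup factors of the MicroPACS over the optical media. A small sketch (the table values are taken from the abstract; the helper name is ours):

```python
# Back-up and retrieval times (minutes) reported in the study, per scanner
times = {
    "GE DSTE": {"DVD-RAM": (5.13, 2.26), "Conquest": (1.49, 1.19)},
    "GE DLS":  {"CD":      (6.12, 3.61), "Conquest": (0.82, 2.23)},
    "SIEMENS": {"CD":      (5.88, 3.25), "Conquest": (1.05, 2.06)},
}

def speedup(optical, conquest):
    """How many times faster the MicroPACS was, as (back-up, retrieval) factors."""
    return tuple(round(o / c, 1) for o, c in zip(optical, conquest))

for scanner, t in times.items():
    medium = next(k for k in t if k != "Conquest")
    print(scanner, medium, "->", speedup(t[medium], t["Conquest"]))
```

The back-up step benefits most (roughly 3x to 7x faster), which matches the paper's conclusion that the open-source server outperforms disc-based back-up.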


The Conceptual Intersection between the Old and the New and the Transformation of the Traditional Knowledge System (신구(新舊) 관념의 교차와 전통 지식 체계의 변용)

  • Lee, Haenghoon
    • The Journal of Korean Philosophical History
    • /
    • no.32
    • /
    • pp.215-249
    • /
    • 2011
  • This essay reflects on the modernity of Korea by examining the transformation of the traditional knowledge system from a historico-semantic perspective, focusing on the opposition and collision between the old and the new conceptions that occurred in the early period (1890~1910) of the acceptance of Western modern civilization. With its scientific success, trick of reason, Christianity, and evolutionary view of history, Western modernity regarded itself as the peak of civilization and forced non-Western societies into a world system in which they came to be considered 'barbarian (野蠻)' or 'half-enlightened (半開).' The East Asian civilization, which had its own history over many centuries, became degraded as a kind of delusion and a set of old-fashioned customs from which it ought to free itself. Western civilization presented itself as the exemplary future that East Asian peoples should achieve, while East Asian past traditions came to be conceived as unnecessary vestiges better wiped out. It can be said that East Asian modernization was established through the propagation and acceptance of the modern products of Western civilization rather than through preserving past experience while simultaneously pursuing the new. Accordingly, it is difficult to apply Koselleck's hypothesis directly to East Asian societies; while mapping out his Basic Concepts in History, he assumed that, in the so-called 'saddle period,' semantic struggle over concepts becomes active between past experience and the horizon of expectation about the future, and that concepts undergo 'temporalization,' 'democratization,' 'ideologization,' and 'politicization.' The struggle over the old and new conceptions in Korea was most noticeable in the opposition between the Neo-Confucian scholars of Hwangseongsinmun and the theorists of civilization of Doknipsinmun. 
The opposition and struggle demanded a change of understanding in every field, but there was a difference of opinion over the conception of the traditional knowledge system. For the theorists of civilization, 'the old (舊)' was not just 'past' and 'old-fashioned'; it was an obstacle to the building of a new civilization. For the Neo-Confucian scholars, on the other hand, it contained the possibility of regeneration (新); that is, they suggested finding a guide to tomorrow by taking lessons from the past. The traditional knowledge system lost its holy status of sacred learning (聖學) in the process of its change into a 'new learning (新學),' and religion and religious tradition also weakened. The traditional knowledge system could change itself into modern learning by accepting a scientific methodology that pursues objectivity and rationality. This transformation of the traditional knowledge system, 'the formation of the new learning from the old learning,' was accompanied by the intersection of the old and new conceptions. It is necessary to pay attention to the role played by the concept of Sil(hak) (實學), or Practical Learning, in this intersection. Various modern media published around the turn of the 20th century clearly show the multi-layered development of the old and new conceptions, and it is noticeable that 'Sil(hak)' as a conceptual frame of reference contributed to the transformation of the traditional knowledge system into the new learning. Although Silhak often designated, or was even considered equivalent to, Western learning, the Neo-Confucian scholars reinterpreted the concept of 'Silhak,' which the theorists of civilization had monopolized until then, and opened the way to changing the traditional knowledge system into the new learning. They re-appropriated the concept of Silhak and enabled it to be invested with values that were losing their status under the overwhelming weight of scientific technology. 
With the Japanese occupation of Korea by force, the attempt to transform the traditional knowledge system independently was obliged to reach its limit, but its theory of 'making new learning from the old' can be considered to overcome both the contradiction of Dongdoseogi (東道西器: the principle of preserving Eastern philosophy while accepting Western technology) and the de-subjectivity of the theory of civilization. While developing its own logic, the theory of Dongdoseogi was compelled to treat the indivisible (道 and 器) as divisible, though it tried to cope with a reality in which the principle of morality and that of competition were opposed to each other and the ideologies of 'evolution' and 'progress' prevailed. The theory of civilization, on the other hand, was not free from the criticism that it caused a crack in subjectivity by internalizing the West and cutting itself off from the traditional knowledge system.

Construction of Event Networks from Large News Data Using Text Mining Techniques (텍스트 마이닝 기법을 적용한 뉴스 데이터에서의 사건 네트워크 구축)

  • Lee, Minchul;Kim, Hea-Jin
    • Journal of Intelligence and Information Systems
    • /
    • v.24 no.1
    • /
    • pp.183-203
    • /
    • 2018
  • News articles are the most suitable medium for examining events occurring at home and abroad. In particular, as the development of information and communication technology has brought various kinds of online news media, news about events occurring in society has increased greatly. Automatically summarizing key events from massive amounts of news data would therefore help users survey many events at a glance, and building an event network based on the relevance of events would greatly help readers understand current events. In this study, we propose a method for extracting event networks from large news text data. To this end, we first collected Korean political and social articles from March 2016 to March 2017, and integrated synonyms, leaving only meaningful words, through preprocessing using NPMI and Word2Vec. Latent Dirichlet allocation (LDA) topic modeling was used to calculate the topic distribution by date, find peaks in each distribution, and detect events. A total of 32 topics were extracted from the topic modeling, and the time of occurrence of each event was deduced by looking at the point at which its topic distribution surged. As a result, a total of 85 events were detected, of which a final 16 events were retained after filtering with a Gaussian smoothing technique. We then calculated relevance scores between the detected events to construct the event network: using the cosine coefficient between co-occurring events, we computed the relevance between events and connected them. Finally, we set up the event network with each event as a vertex and the relevance score between events as the weight of the edge connecting the vertices. 
The event network constructed by our method helped us sort out the major events in the political and social fields in Korea over the last year in chronological order and, at the same time, identify which events are related to which. Our approach differs from existing event detection methods in that LDA topic modeling makes it easy to analyze large amounts of data and to identify relationships between events that were previously difficult to detect. In text preprocessing, we applied various text mining techniques and Word2Vec to improve the extraction accuracy of proper nouns and compound nouns, which has long been a difficulty in analyzing Korean texts. The event detection and network construction techniques in this study have the following advantages in practical application. First, LDA topic modeling, being unsupervised, can easily extract topics, topic words, and their distributions from huge amounts of data, and by using the date information of the collected news articles, the distribution by topic can be expressed as a time series. Second, by calculating relevance scores from the co-occurrence of topics, which is difficult to grasp with existing event detection, we can present the connections among events in a summarized form. The fact that the relevance-based event network proposed in this study was actually constructed in order of occurrence supports this, and the network also makes it possible to identify which event served as the starting point of a series of events. A limitation of this study is that LDA topic modeling yields different results depending on the initial parameters and the number of topics, and the topic and event names in the analysis results must be assigned by the subjective judgment of the researcher. 
Also, since each topic is assumed to be exclusive and independent, the relevance between topics is not taken into account. Subsequent studies need to calculate the relevance between events not covered in this study or between events belonging to the same topic.
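
The edge-construction step described above, cosine coefficients between event co-occurrence vectors thresholded into a network, can be sketched as follows (the occurrence matrix, event names, and threshold are illustrative assumptions, not the study's data):

```python
from math import sqrt

def cosine(u, v):
    """Cosine coefficient between two same-length occurrence vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def event_network(occurrence, threshold=0.5):
    """Connect event pairs whose date-occurrence vectors are similar enough.

    occurrence: {event_name: [0/1 per date]} -- 1 if the event's topic
    surged on that date. Returns weighted edges (event_a, event_b, score).
    """
    events = sorted(occurrence)
    edges = []
    for i, a in enumerate(events):
        for b in events[i + 1:]:
            score = cosine(occurrence[a], occurrence[b])
            if score >= threshold:
                edges.append((a, b, round(score, 2)))
    return edges

# Toy occurrence matrix over five dates (illustrative, not from the study)
occ = {
    "election": [1, 1, 0, 0, 0],
    "protest":  [1, 1, 1, 0, 0],
    "treaty":   [0, 0, 0, 1, 1],
}
print(event_network(occ))
```

Events that surge on overlapping dates get a high cosine score and become connected vertices; disjoint events yield a score of zero and stay unconnected.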

Twitter Issue Tracking System by Topic Modeling Techniques (토픽 모델링을 이용한 트위터 이슈 트래킹 시스템)

  • Bae, Jung-Hwan;Han, Nam-Gi;Song, Min
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.2
    • /
    • pp.109-122
    • /
    • 2014
  • People nowadays create a tremendous amount of data on Social Network Services (SNS). In particular, the incorporation of SNS into mobile devices has resulted in massive data generation, greatly influencing society. This is an unmatched phenomenon in history, and we now live in the age of Big Data. SNS data satisfies the defining conditions of Big Data: the amount of data (volume), the data input and output speed (velocity), and the variety of data types (variety). If one can discover the trend of an issue in SNS Big Data, this information can serve as an important new source for the creation of value, because it covers the whole of society. In this study, a Twitter Issue Tracking System (TITS) is designed and built to meet the need for analyzing SNS Big Data. TITS extracts issues from Twitter texts and visualizes them on the web. The proposed system provides the following four functions: (1) provide the topic keyword set corresponding to the daily ranking; (2) visualize the daily time-series graph of a topic over a month; (3) present the importance of a topic through a treemap based on a scoring system and frequency; (4) visualize the daily time-series graph of a keyword by searching for it. The present study analyzes the Big Data generated by SNS in real time. SNS Big Data analysis requires various natural language processing techniques, including stop-word removal and noun extraction, to process various unrefined forms of unstructured data. In addition, such analysis requires the latest big data technology to rapidly process large amounts of real-time data, such as the Hadoop distributed system or NoSQL, an alternative to relational databases. We built TITS on Hadoop to optimize big data processing, because Hadoop is designed to scale from single-node computing to thousands of machines. 
Furthermore, we use MongoDB, a NoSQL, open source, document-oriented database that provides high performance, high availability, and automatic scaling. Unlike existing relational databases, MongoDB has no schemas or tables; its most important goals are data accessibility and data processing performance. In the age of Big Data, visualization is attractive to the Big Data community because it helps analysts examine data easily and clearly; TITS therefore uses the d3.js library as its visualization tool. This library is designed for creating Data-Driven Documents that bind the document object model (DOM) to data, and its interaction features are useful for managing real-time data streams with smooth animation. In addition, TITS uses Bootstrap, a set of pre-configured style sheets and JavaScript plug-ins, to build the web system. The TITS graphical user interface (GUI) is designed with these libraries and can detect issues on Twitter in an easy and intuitive manner. The proposed work demonstrates the superiority of our issue detection techniques by matching detected issues with corresponding online news articles. The contributions of the present study are threefold. First, we suggest an alternative approach to real-time big data analysis, which has become an extremely important issue. Second, we apply a topic modeling technique used in various research areas, including Library and Information Science (LIS), and on this basis confirm the utility of storytelling and time-series analysis. Third, we develop a web-based system and make it available for the real-time discovery of topics. The present study conducted experiments with nearly 150 million tweets in Korea during March 2013.
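
Function (1) above, the daily topic-keyword ranking, reduces to per-day frequency counting once tweets have been tokenized and cleaned. A minimal sketch of that step (the sample tweets and function name are hypothetical; the actual system performs this at scale on Hadoop with MongoDB storage):

```python
from collections import Counter
from datetime import date

def daily_topic_ranking(tweets, top_n=3):
    """Rank topic keywords per day by frequency.

    tweets: iterable of (date, [keywords]) pairs, already past stop-word
    removal and noun extraction. Returns {date: [(keyword, count), ...]}.
    """
    per_day = {}
    for day, keywords in tweets:
        per_day.setdefault(day, Counter()).update(keywords)
    return {day: counts.most_common(top_n) for day, counts in per_day.items()}

# Hypothetical pre-tokenized tweets
sample = [
    (date(2013, 3, 1), ["election", "policy"]),
    (date(2013, 3, 1), ["election", "debate"]),
    (date(2013, 3, 2), ["weather"]),
]
print(daily_topic_ranking(sample))
```

The same per-day counts also feed functions (2) and (4): plotting one keyword's count across days yields the daily time-series graph.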

Suggestion of Urban Regeneration Type Recommendation System Based on Local Characteristics Using Text Mining (텍스트 마이닝을 활용한 지역 특성 기반 도시재생 유형 추천 시스템 제안)

  • Kim, Ikjun;Lee, Junho;Kim, Hyomin;Kang, Juyoung
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.3
    • /
    • pp.149-169
    • /
    • 2020
  • "The Urban Renewal New Deal project," one of the government's major national projects, aims to develop underdeveloped areas by investing 50 trillion won in 100 locations in the first year and 500 over the next four years, and it is drawing keen attention from the media and local governments. However, the project model fails to reflect the original characteristics of each area, as it divides project areas into five categories: Our Neighborhood Restoration, Housing Maintenance Support Type, General Neighborhood Type, Central Urban Type, and Economic Base Type. Judging from the keywords for successful urban regeneration in Korea ("resident participation," "regional specialization," "ministerial cooperation," and "public-private cooperation"), when local governments propose urban regeneration projects to the government, it is most important to accurately understand the characteristics of the city and to push the projects ahead in a way that suits those characteristics, with the help of local residents and private companies. In addition, considering gentrification, one of the side effects of urban regeneration projects, it is important to select and implement urban regeneration types suitable for the characteristics of the area. To supplement the limitations of the Urban Regeneration New Deal methodology, this study proposes a system that recommends urban regeneration types suitable for candidate sites by utilizing various machine learning algorithms, referring to the urban regeneration types of the '2025 Seoul Metropolitan Government Urban Regeneration Strategy Plan,' which was promoted based on regional characteristics. There are four types of urban regeneration in Seoul: Low-use Low-level Development, Abandonment, Deteriorated Housing, and Specialization of Historical and Cultural Resources (Shon and Park, 2017). 
To identify regional characteristics, approximately 100,000 text documents were collected for 22 regions where projects of the four urban regeneration types had been carried out. Using the collected data, we extracted key keywords for each region according to its urban regeneration type and conducted topic modeling to explore whether there were differences between the types. As a result, we confirmed that many topics related to real estate and the economy appeared in old residential areas, while declining and underdeveloped areas showed topics reflecting the characteristics of areas where industrial activity had been active in the past. In the case of historical and cultural resource areas, which contain traces of the past, many keywords related to the government appeared, so political topics and cultural topics arising from various events could be confirmed. Finally, low-use and underdeveloped areas showed many topics about real estate and accessibility; these are well-connected areas that mainly have the characteristics of regions where development is planned or likely. Furthermore, a model was implemented that proposes urban regeneration types tailored to regional characteristics for regions other than Seoul. Machine learning was used to implement the model, with training and test data randomly split at an 8:2 ratio. To compare performance across models, the input variables were prepared in two ways, as count vectors and as TF-IDF vectors, and five classifiers were applied: SVM (Support Vector Machine), Decision Tree, Random Forest, Logistic Regression, and Gradient Boosting, yielding a performance comparison across ten models in total. The best-performing model was Gradient Boosting with TF-IDF vector input, with an accuracy of 97 %. 
Therefore, the recommendation system proposed in this study is expected to recommend urban regeneration types based on the regional characteristics of new project sites in the process of carrying out urban regeneration projects.
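
As a dependency-free illustration of the classification idea, the sketch below vectorizes tokenized region descriptions with plain TF-IDF and recommends the type of the most similar training region by cosine similarity. Note this is a simplified stand-in, not the study's pipeline: the paper pairs TF-IDF vectors with a Gradient Boosting classifier, and all tokens, labels, and function names here are toy assumptions.

```python
from collections import Counter
from math import log, sqrt

def tfidf_vectors(docs):
    """TF-IDF vectors (raw tf * idf, idf = log(N/df)) for tokenized documents."""
    df = Counter()
    for doc in docs:
        df.update(set(doc))
    n = len(docs)
    idf = {t: log(n / df[t]) for t in df}
    return [{t: c * idf[t] for t, c in Counter(doc).items()} for doc in docs]

def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend_type(train_docs, train_labels, new_doc):
    """Return the urban-regeneration type of the most similar training region."""
    vecs = tfidf_vectors(train_docs + [new_doc])
    query = vecs[-1]
    scores = [cosine(query, v) for v in vecs[:-1]]
    return train_labels[scores.index(max(scores))]

# Toy tokenized region descriptions with illustrative labels
docs = [
    ["real", "estate", "housing", "old"],
    ["factory", "industry", "decline"],
    ["heritage", "culture", "history"],
]
labels = ["Deteriorated Housing", "Abandonment", "Historical-Cultural"]
print(recommend_type(docs, labels, ["old", "housing", "estate"]))
```

A production version would swap the nearest-neighbor step for a trained classifier, as the study does, but the vectorize-then-score shape of the pipeline is the same.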