• Title/Summary/Keyword: Systems Approach (시스템적 접근)


Effects of Customers' Relationship Networks on Organizational Performance: Focusing on Facebook Fan Page (고객 간 관계 네트워크가 조직성과에 미치는 영향: 페이스북 기업 팬페이지를 중심으로)

  • Jeon, Su-Hyeon;Kwahk, Kee-Young
    • Journal of Intelligence and Information Systems / v.22 no.2 / pp.57-79 / 2016
  • The number of users of the Social Network Service (SNS), one of the major social media channels, continues to grow. Following this trend, more companies are taking an interest in the platform and investing in it. SNS has received much attention as a tool for spreading and amplifying the messages a company wants to deliver to its customers, and it is recognized as an important channel for relationship marketing. Today's rapidly changing media environment allows companies to approach their customers in various ways. In particular, the rapidly developing social network service provides an environment in which customers can talk freely about products, and it also works as a channel that delivers customized information to customers. To succeed online, companies need to build not only the company-customer relationship but also the relationships among customers themselves. Responding to the online environment and the continuous development of technology, companies have tirelessly devised novel marketing strategies; especially as one-to-one marketing has become available, maintaining relationship marketing with customers has become more important. Among the many SNS platforms, Facebook, which many companies use as a communication channel, provides a fan page service that supports each company's business. A Facebook fan page is a platform on which events, information, and announcements can be shared with customers through text, video, and pictures. Companies open their own fan pages to publicize themselves and their businesses; such pages function as company websites and also have the character of brand communities, much like blogs.
As Facebook has become a major medium of communication with customers, companies recognize its importance as an effective marketing channel, but the business performance achieved through it still needs investigation. Although a Facebook fan page has enormous potential, including a user-to-user community function that other platforms lack, treating companies' fan pages as communities and analyzing them remains an incomplete line of research. This study explores the relationships among customers through the network of Facebook fan page users. Previous studies of corporate Facebook fan pages focused on finding effective operational directions by analyzing each company's usage; this study instead derives structural network variables by which customer commitment can be measured, applying social network analysis methodology, and empirically investigates how the structural characteristics of the network influence companies' business performance. From each company's fan page, the network of users who engaged in communication with the company is extracted as a one-mode, undirected, binary network in which users are the nodes and their relationships in marketing activities are the links. From this network, structural variables that capture the commitment of the customers who pressed "Like," commented on, or shared each company's Facebook marketing messages are derived by calculating density, global clustering coefficient, mean geodesic distance, and diameter. Using historical performance measures such as net income and the Tobin's Q indicator as outcome variables, the study investigates their influence on companies' business performance.
For this purpose, network data were collected for 54 KOSPI-listed companies that had posted more than 100 articles on their Facebook fan pages during the data collection period, and the network indicators of each company were derived. The performance-related indicators were calculated from the values posted on the DART website of the Financial Supervisory Service. From an academic perspective, this study suggests a new approach, based on social network analysis methodology, for researchers who study the business use of social media channels. From a practical perspective, it proposes more substantive marketing performance measures for companies that market through social media, and it is expected to provide a foundation for establishing smart business strategies using network indicators.
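The four structural indicators named in the abstract (density, global clustering coefficient, mean geodesic distance, diameter) can be computed directly from a one-mode undirected binary network. Below is a minimal pure-Python sketch of those definitions; the edge list and function name are illustrative, not the study's code, and the graph is assumed connected:

```python
from collections import deque

def network_indicators(edges):
    """Density, global clustering coefficient (transitivity), mean geodesic
    distance, and diameter of a connected undirected binary network."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    n = len(adj)
    m = sum(len(nb) for nb in adj.values()) // 2
    density = 2 * m / (n * (n - 1))
    # transitivity = (closed triples) / (connected triples)
    triples = sum(d * (d - 1) // 2 for d in (len(nb) for nb in adj.values()))
    closed = sum(1 for u in adj for v in adj[u] for w in adj[u]
                 if v < w and w in adj[v])          # counts each triangle 3x
    clustering = closed / triples if triples else 0.0
    # all-pairs shortest path lengths via BFS (binary network => unit weights)
    dists = []
    for s in adj:
        seen, q = {s: 0}, deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen[v] = seen[u] + 1
                    q.append(v)
        dists += [d for node, d in seen.items() if node != s]
    mean_geodesic = sum(dists) / len(dists)
    diameter = max(dists)
    return density, clustering, mean_geodesic, diameter
```

For example, a triangle of users a, b, c with a pendant user d attached to c gives density 2/3, transitivity 0.6, mean geodesic distance 4/3, and diameter 2.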

Acceleration of computation speed for elastic wave simulation using a Graphic Processing Unit (그래픽 프로세서를 이용한 탄성파 수치모사의 계산속도 향상)

  • Nakata, Norimitsu;Tsuji, Takeshi;Matsuoka, Toshifumi
    • Geophysics and Geophysical Exploration / v.14 no.1 / pp.98-104 / 2011
  • Numerical simulation in exploration geophysics provides important insights into subsurface wave propagation phenomena. Although elastic wave simulations take longer to compute than acoustic simulations, an elastic simulator can construct more realistic wavefields, including shear components; it is therefore suitable for exploring the responses of elastic bodies. To overcome the long duration of the calculations, we use a Graphic Processing Unit (GPU) to accelerate the elastic wave simulation. Because a GPU has many processors and a wide memory bandwidth, we can use it in a parallelised computing architecture. The GPU board used in this study is an NVIDIA Tesla C1060, which has 240 processors and a 102 GB/s memory bandwidth. Although NVIDIA's parallel computing architecture (CUDA) is available, we must optimise the usage of the different types of memory on the GPU device, and the sequence of calculations, to obtain a significant speedup of the computation. In this study, we simulate two-dimensional (2D) and three-dimensional (3D) elastic wave propagation using the Finite-Difference Time-Domain (FDTD) method on GPUs. In the wave propagation simulation, we adopt the staggered-grid method, one of the conventional FD schemes, since it achieves sufficient accuracy for numerical modelling in geophysics. Our simulator optimises memory usage on the GPU device to reduce data access times, using faster memory as much as possible; this is a key factor in GPU computing. By using one GPU device and optimising its memory usage, we reduced the computation time by a factor of more than 14 in the 2D simulation, and by more than six in the 3D simulation, compared with one CPU. Furthermore, by using three GPUs, we succeeded in accelerating the 3D simulation by a factor of 10.
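The velocity-stress staggered-grid scheme the paper accelerates can be illustrated in one dimension. The sketch below is a plain NumPy toy, not the paper's simulator (which is 2D/3D elastic and runs in CUDA on the GPU); grid size, wavelet, and material parameters are illustrative. An initial Gaussian velocity pulse splits into two half-amplitude pulses travelling at speed c = sqrt(mu/rho):

```python
import numpy as np

def fdtd_1d(nx=400, nt=150, dx=1.0, dt=0.4, rho=1.0, mu=1.0):
    """1D velocity-stress staggered-grid FDTD (CFL: sqrt(mu/rho)*dt/dx <= 1).

    v lives on integer grid points, s on half-integer points; the two
    update lines below are the leapfrog stencil that, on a GPU, maps
    naturally onto one thread per grid point."""
    x = np.arange(nx) * dx
    v = np.exp(-((x - 100.0) / 5.0) ** 2)   # initial Gaussian velocity pulse
    s = np.zeros(nx)                         # stress, initially at rest
    for _ in range(nt):
        s[:-1] += dt * mu * (v[1:] - v[:-1]) / dx   # stress update
        v[1:]  += dt / rho * (s[1:] - s[:-1]) / dx  # velocity update
    return v

wave = fdtd_1d()
```

Because each grid point's update depends only on immediate neighbours, memory access order (coalescing, use of fast shared memory) dominates the achievable GPU speedup, which is the optimisation the paper focuses on.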

Human Health Risk, Environmental and Economic Assessment Based on Multimedia Fugacity Model for Determination of Best Available Technology (BAT) for VOC Reduction in Industrial Complex (산업단지 VOC 저감 최적가용기법(BAT) 선정을 위한 다매체 거동모델 기반 인체위해성·환경성·경제성 평가)

  • Kim, Yelin;Rhee, Gahee;Heo, Sungku;Nam, Kijeon;Li, Qian;Yoo, ChangKyoo
    • Korean Chemical Engineering Research / v.58 no.3 / pp.325-345 / 2020
  • A methodology for determining the best available technology (BAT) for reducing volatile organic compounds (VOCs) in a petrochemical industrial complex is suggested, based on human health risk, environmental, and economic assessments built on a multimedia fugacity model. The fate and distribution of benzene, toluene, ethylbenzene, and xylene (BTEX), representing the VOCs emitted from the industrial complex in U-city, were predicted by the multimedia fugacity model. A media-integrated human health risk assessment and a sensitivity analysis were conducted to predict the human health risk of BTEX and to identify the critical variables with adverse effects on human health. In addition, environmental and economic assessments were conducted to determine the BAT for VOC reduction. BTEX remained largely in the soil medium (60%, 61%, 64%, and 63%, respectively), and xylene showed the highest proportion among BTEX in each environmental medium. Among the BAT candidates, absorption was excluded because of its high human health risk. The sensitivity analysis also identified the half-life and the exposure coefficient of each exposure route as the variables most strongly correlated with human health risk. Finally, considering the environmental and economic assessments, regenerative thermal oxidation, regenerative catalytic oxidation, bio-filtration, UV oxidation, and activated carbon adsorption were determined as the BAT for reducing VOCs in the petrochemical industrial complex. The suggested media-integrated BAT determination methodology can contribute to applying BAT in the workplace, managing discharge facilities efficiently, and operating an integrated environmental management system.
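The fugacity idea behind the paper's fate model can be illustrated with the simplest (Level I, equilibrium) case: at a common fugacity f, the amount of chemical in compartment i is Z_i·V_i·f, so the mass fraction per medium follows directly from the fugacity capacities Z and volumes V. The numbers below are made-up placeholders chosen so that soil dominates, echoing the abstract's finding; they are not the study's parameters, and the paper uses a more elaborate multimedia model:

```python
def level1_fractions(compartments):
    """Equilibrium (Level I fugacity) mass fraction per medium, given
    fugacity capacity Z (mol m^-3 Pa^-1) and volume V (m^3) per medium.
    At a shared fugacity f, mass in medium i is Z_i*V_i*f, so f cancels."""
    total = sum(Z * V for Z, V in compartments.values())
    return {name: Z * V / total for name, (Z, V) in compartments.items()}

# illustrative placeholder values only (not the paper's parameters)
media = {
    "air":      (4.0e-4, 1.0e10),
    "water":    (2.0e-1, 1.0e7),
    "soil":     (1.0e2,  1.0e5),
    "sediment": (5.0e1,  1.0e4),
}
fractions = level1_fractions(media)
```

With these placeholder capacities, soil holds about 60% of the mass at equilibrium, which is the kind of media distribution the abstract reports for BTEX.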

A Study of Radiation Exposure in Proton Therapy Facility (양성자치료기 가속기 시설에서의 작업종사자의 방사선 피폭 연구)

  • Lee, Sang-Hoon;Shin, Dong-Ho;Yoon, Myong-Geun;Shin, Jung-Wook;Rah, Jeong-Eun;Kwak, Jung-Won;Park, Sung-Yong;Shin, Kyung-Hwan;Lee, Doo-Hyun;Ahn, Sung-Hwan;Kim, Dae-Yong;Cho, Kwan-Ho;Lee, Se-Byeong
    • Progress in Medical Physics / v.20 no.1 / pp.37-42 / 2009
  • The proton therapy facility recently installed at the National Cancer Center in Korea generally produces a large amount of radiation near the cyclotron, owing to the secondary particles and radioisotopes generated by collisions between protons and nearby materials during acceleration. Although the radiation level from radioisotopes decreases over time, a radiation exposure problem remains, because workers assigned to maintain or repair the proton facility can easily be exposed to low-level radiation for long periods. In this paper, the working environment near the cyclotron, where the highest radiation exposure is expected, was studied by measuring the level of radiation and its duration, in order to establish an appropriate protective action guide. To do this, we measured the change of radiation in the graphite-based energy degrader, the efficiency of the transmitted beam, and the relative activation level of the transmission beam line. The results showed that although the level of radiation around the cyclotron and beam line during operation is much higher than in other radiation therapy facilities, the annual exposure rate, at 1~3 mSv/year, is below the limit recommended by law.
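The decay of activation-induced dose rate mentioned above follows the familiar half-life law, which also gives the waiting time before maintenance work can start at an acceptable dose rate. A generic sketch (the numbers in the assertions are illustrative; the paper reports measured values):

```python
import math

def dose_rate(d0, t, half_life):
    """Dose rate after time t, given initial rate d0 and the dominant
    radioisotope's half-life (use one consistent time unit throughout)."""
    return d0 * math.exp(-math.log(2.0) * t / half_life)

def wait_time(d0, target, half_life):
    """Time until the dose rate decays from d0 down to target."""
    return half_life * math.log(d0 / target) / math.log(2.0)
```

For instance, a dose rate halves every half-life, so dropping by a factor of eight takes exactly three half-lives.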


Improvement of Radar Rainfall Estimation Using Radar Reflectivity Data from the Hybrid Lowest Elevation Angles (혼합 최저고도각 반사도 자료를 이용한 레이더 강우추정 정확도 향상)

  • Lyu, Geunsu;Jung, Sung-Hwa;Nam, Kyung-Yeub;Kwon, Soohyun;Lee, Cheong-Ryong;Lee, Gyuwon
    • Journal of the Korean earth science society / v.36 no.1 / pp.109-124 / 2015
  • A novel approach, the hybrid surface rainfall (KNU-HSR) technique developed by Kyungpook National University, was utilized to improve radar rainfall estimation. The KNU-HSR technique estimates radar rainfall on a 2D hybrid surface consisting of the lowest radar bins that are immune to ground clutter contamination and significant beam blockage. Two HSR techniques, static and dynamic HSR, were compared and evaluated in this study. The static HSR technique uses a beam blockage map and a ground clutter map to build the hybrid surface, whereas the dynamic HSR technique additionally applies a quality index map derived from a fuzzy logic algorithm for real-time quality control. The performances of the two HSRs were evaluated by correlation coefficient (CORR), total ratio (RATIO), mean bias (BIAS), normalized standard deviation (NSD), and mean relative error (MRE) for ten rain cases. Dynamic HSR (CORR = 0.88, BIAS = -0.24 mm hr⁻¹, NSD = 0.41, MRE = 37.6%) outperformed static HSR without correction of the reflectivity calibration bias (CORR = 0.87, BIAS = -2.94 mm hr⁻¹, NSD = 0.76, MRE = 58.4%) on all skill scores. The dynamic HSR technique overestimates surface rainfall at near ranges and underestimates it at far ranges, owing to beam broadening and the increasing height of the radar beam. In terms of NSD and MRE, dynamic HSR shows the best results regardless of the distance from the radar. Static HSR significantly overestimates surface rainfall at weaker rainfall intensities, whereas the RATIO of dynamic HSR remains almost 1.0 over all ranges of rainfall intensity. After correcting the system bias of reflectivity, the NSD and MRE of dynamic HSR improve by about 20% and 15%, respectively.
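The verification scores used above can be computed from paired radar estimates and gauge observations. The sketch below uses the common definitions of these scores; the paper's exact formulas may differ in detail (e.g. normalization choices), so treat it as an assumption-laden illustration:

```python
import numpy as np

def skill_scores(radar, gauge):
    """CORR, RATIO, BIAS, NSD, and MRE for radar rainfall vs. gauge rainfall.
    Common definitions assumed; gauge values must be nonzero for MRE."""
    r, g = np.asarray(radar, float), np.asarray(gauge, float)
    return {
        "CORR":  float(np.corrcoef(r, g)[0, 1]),      # linear correlation
        "RATIO": float(r.sum() / g.sum()),            # total radar/gauge ratio
        "BIAS":  float(np.mean(r - g)),               # mean bias, mm/hr
        "NSD":   float(np.std(r - g) / np.mean(g)),   # normalized std. dev.
        "MRE":   float(np.mean(np.abs(r - g) / g) * 100.0),  # percent
    }
```

A perfect estimator gives CORR = 1, RATIO = 1, and BIAS = NSD = MRE = 0; a uniform factor-of-two overestimate doubles RATIO and puts MRE at 100%.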

Consumers Perceptions on Monosodium L-glutamate in Social Media (소셜미디어 분석을 통한 소비자들의 L-글루타민산나트륨에 대한 인식 조사)

  • Lee, Sooyeon;Lee, Wonsung;Moon, Il-Chul;Kwon, Hoonjeong
    • Journal of Food Hygiene and Safety / v.31 no.3 / pp.153-166 / 2016
  • The purpose of this study was to investigate consumers' perceptions of monosodium L-glutamate (MSG) in social media. Data were collected from Naver blogs and Naver web communities (a representative Korean portal site) and from media reports, including the comment sections, on the Yonhap News website (Korea's largest news agency). The results from Naver blogs and web communities showed that posts primarily mentioned reviews of restaurants using MSG, 'no MSG added' products, MSG safety, and methods of reducing MSG in food. When TV current affairs shows, newspapers, or TV news reported on the uses and side effects of MSG, search volume for MSG increased on both PC and mobile search engines; the increase was largest when TV current affairs shows covered it, and there were more periods of increased search volume on mobile than on PC. Comments below the Yonhap news articles mainly concerned the safety of MSG, criticism of low-quality foods, abuse of MSG, and distrust of the government. Labels of 'no MSG added' products on the market emphasize "MSG-free" even though the Joint FAO/WHO Expert Committee on Food Additives (JECFA) assigns MSG an acceptable daily intake (ADI) of 'not specified'. When consumers search for MSG (monosodium L-glutamate) or purchase food, they may therefore perceive 'no MSG added' products as better. Competent authorities, offices of education, and local governments provide guidelines based on a no-added-MSG principle, and these policies may also shape consumers' perceptions. TV and news programs can be powerful and effective consumer communication channels about MSG, particularly through mobile rather than PC. The media, including TV, should therefore report on monosodium L-glutamate responsibly, with information based on scientific evidence, so that consumers can obtain reliable information.

Personal Information Overload and User Resistance in the Big Data Age (빅데이터 시대의 개인정보 과잉이 사용자 저항에 미치는 영향)

  • Lee, Hwansoo;Lim, Dongwon;Zo, Hangjung
    • Journal of Intelligence and Information Systems / v.19 no.1 / pp.125-139 / 2013
  • Big data refers to data that cannot be processed with conventional data technologies. As smart devices and social network services produce vast amounts of data, big data has attracted much attention from researchers, and there is strong demand from governments and industries, since big data can create new value by drawing business insights from data. As various new technologies to process big data have been introduced, academic communities also show much interest in the domain, and notable advances related to big data technology have appeared in various fields. Big data technology makes it possible to access, collect, and save individuals' personal data, and it enables the analysis of huge amounts of data at lower cost and in less time than traditional methods allow. It can even detect personal information that people do not want to disclose. Therefore, people using information technology such as the Internet or online services have some level of privacy concern, and such feelings can hinder the continued use of information systems. For example, SNS offers various benefits, but users are sometimes highly exposed to privacy intrusions because they post too much personal information on it. Even when users post their personal information themselves, the data is sometimes not under their control: once private data is posted on the Internet, it can be transferred anywhere in a few clicks and can be abused to create a fake identity. In this way, privacy intrusion happens. This study investigates how perceived personal information overload in SNS affects users' risk perception and information privacy concerns, and it examines the relationship between those concerns and user resistance behavior. A survey approach and the structural equation modeling method are employed for data collection and analysis.
This study contributes meaningful insights for academic researchers and for policy makers who plan to develop guidelines for privacy protection. It shows that information overload on social network services significantly increases users' perceived privacy risk; in turn, perceived privacy risk leads to an increased level of privacy concern. If privacy concerns increase, users can form a negative or resistant attitude toward system use, and this resistance may lead them to discontinue using social network services. Furthermore, information overload affects privacy concerns through the mediation of perceived risk rather than directly, which implies that resistance to system use can be diminished by reducing users' perceived risk. Given that users' resistance behavior becomes salient when privacy concerns are high, measures to alleviate those concerns should be conceived. Academically, this study integrates traditional information overload theory and user resistance theory to investigate perceived privacy concerns in current IS contexts; little big data research has yet examined the technology with an empirical, behavioral approach, as the topic has only recently emerged. Practically, information overload connects to an increased level of perceived privacy risk and to discontinued use of the information system. To keep users from departing, organizations should develop systems in which private data can be controlled and managed with ease. The study suggests that actions to lower perceived risk and privacy concerns should be taken to sustain information systems continuance.
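The mediated path reported here (overload → perceived risk → privacy concern, with little direct overload → concern effect) can be illustrated outside a full SEM with two ordinary regressions on synthetic data. The coefficients, sample size, and seed below are simulated for illustration only; the study itself fits a structural equation model to survey data:

```python
import numpy as np

def ols(y, X):
    """OLS coefficients (intercept first) via least squares."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

rng = np.random.default_rng(42)
n = 5000
overload = rng.normal(size=n)
# mediator: perceived risk driven by overload (a-path, true slope 0.6)
risk = 0.6 * overload + rng.normal(scale=0.5, size=n)
# outcome: privacy concern driven mostly by risk (b-path 0.7),
# with only a weak direct effect of overload (c'-path 0.05)
concern = 0.7 * risk + 0.05 * overload + rng.normal(scale=0.5, size=n)

a = ols(risk, overload)[1]                              # overload -> risk
beta = ols(concern, np.column_stack([overload, risk]))
c_direct, b = beta[1], beta[2]                          # direct and b-path
indirect = a * b                                        # mediated effect
```

With enough data, the recovered indirect effect (a·b ≈ 0.42) dwarfs the direct effect, mirroring the "mediated rather than direct" finding.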

Stock Price Prediction by Utilizing Category Neutral Terms: Text Mining Approach (카테고리 중립 단어 활용을 통한 주가 예측 방안: 텍스트 마이닝 활용)

  • Lee, Minsik;Lee, Hong Joo
    • Journal of Intelligence and Information Systems / v.23 no.2 / pp.123-138 / 2017
  • Since the stock market is driven by traders' expectations, studies have been conducted to predict stock price movements by analyzing various sources of text data. Research has examined not only the relationship between text data and stock price fluctuations, but also trading strategies based on news articles and social media responses. Studies predicting stock price movements have applied classification algorithms to a term-document matrix constructed in the same way as in other text mining approaches. Because a document contains many words, it is better to select the words that contribute most when building the term-document matrix: based on word frequency, words of too little frequency or importance are removed, and words are also selected by measuring how much each contributes to classifying a document correctly. The conventional approach collects all the documents to be analyzed and selects the words that influence classification. In this study, we instead analyze the documents for each individual stock and select the words that are irrelevant to all categories as neutral words. We then extract the words around each selected neutral word and use them to generate the term-document matrix. The underlying idea is that stock movements are little related to the presence of the neutral words themselves, while the words surrounding a neutral word are more likely to affect stock price movements. The generated term-document matrix is then fed to an algorithm that classifies the stock price fluctuations. Concretely, we first removed stop words and selected neutral words for each stock, and we excluded, among the selected words, those that also appear in news articles about other stocks.
Through an online news portal, we collected four months of news articles on the top 10 stocks by market capitalization. We used three months of news data for training and applied the remaining month of articles to the model to predict the next day's stock price movements, building models with SVM, Boosting, and Random Forest. The stock market was open for a total of 80 days during the four months (2016/02/01 ~ 2016/05/31); the first 60 days were used as the training set and the remaining 20 days as the test set. The proposed neutral-word-based algorithm showed better classification performance than word selection based on sparsity. The suggested method differs from the conventional word extraction method in that it uses not only the news articles about the corresponding stock but also news about other stocks to determine which words to remove: it removes not only the words that appeared in both rises and falls but also the words common to news about other stocks. When prediction accuracies were compared, the suggested method was more accurate. The limitations of this study are that stock price prediction was framed as classifying rises and falls, and that the experiment covered only the top ten stocks, which do not represent the entire stock market. In addition, it is difficult to demonstrate investment performance, because stock price fluctuation and profit rate may differ.
Further research should therefore use more stocks and predict yields through trading simulation.
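The neutral-word idea can be sketched with toy documents: treat words that occur in every class as class-neutral, then take as features the non-neutral words found in a small window around each neutral-word occurrence. This is a deliberate simplification of the paper's procedure; the whitespace tokenizer, window size, and example sentences are all illustrative:

```python
from collections import Counter

def neutral_words(docs_by_class):
    """Words that appear in the documents of every class (simplified
    stand-in for the paper's neutral-word selection)."""
    vocabs = [set(w for doc in docs for w in doc.split())
              for docs in docs_by_class.values()]
    return set.intersection(*vocabs)

def context_features(doc, neutrals, window=1):
    """Count non-neutral words within +/- window of each neutral word;
    these counts form one row of the term-document matrix."""
    toks = doc.split()
    feats = Counter()
    for i, w in enumerate(toks):
        if w in neutrals:
            lo, hi = max(0, i - window), min(len(toks), i + window + 1)
            for j in range(lo, hi):
                if j != i and toks[j] not in neutrals:
                    feats[toks[j]] += 1
    return feats
```

In a toy corpus where "stock", "market", and "today" appear in both the up and down classes, only their context words (e.g. "falls") survive into the features, which is the effect the study is after.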

An Analysis of the Specialist's Preference for the Model of Park-Based Mixed-Use Districts in Securing Urban Parks and Green Spaces Via Private Development (민간개발 주도형 도시공원.녹지 확보를 위한 공원복합용도지구 모형에 대한 전문가 선호도 분석)

  • Lee, Jeung-Eun;Cho, Se-Hwan
    • Journal of the Korean Institute of Landscape Architecture / v.39 no.6 / pp.1-11 / 2011
  • The research aimed to verify the feasibility of a model of Park-Based Mixed-Use Districts (PBMUD) around large urban parks, as a way to secure privately developed urban parks through revision of the urban zoning system. The PBMUD is a type of urban zoning district in which park-oriented land use is mixed with residential, commercial, business, cultural, educational, and research uses. Delineated from and based on the new paradigm of landscape urbanism, the PBMUD is a new urban strategy to secure urban parks and to cultivate urban regeneration around parks and green spaces, enhancing the quality of the urban landscape and ameliorating urban environmental disasters such as climate change. This study performed a questionnaire survey and analysis after a review of the literature related to PBMUD, asking specialists in urban planning and landscape architecture, such as officials, researchers, and engineers, about their degree of preference. The conclusions are as follows. First, specialists prefer the PBMUD by a ratio of 79.3% for to 20.7% against, indicating the feasibility of the model. Second, the most-cited reasons for preference were the possibility of securing park space around urban parks and green spaces, assuring access to parks and communication among areas. Third, the main reasons for non-preference were a lack of understanding of the PBMUD and the problems of unprofitable laws and regulations related to urban planning and development; this suggests revising the related laws and regulations, such as the laws for the planning and use of national land and the laws for architecture. Fourth, the most preferred type of PBMUD mixed cultural use with park use; preference was lower, in order, for commercial, residential, business, and educational or research uses mixed with park use.
The number of mixed-use amenities within the park was found to be an indicator determining preference: the greater the number, the lower the preference frequency, especially for research and business uses. Fifth, more than 70% of respondents preferred a mixed-use ratio of 60% park use to 40% other urban uses. These results should help launch future research on revising zoning regulations in the laws for the planning and use of national land and in architectural law, as well as on criteria and indicators for subdivision planning related to a PBMUD model.

Derivation of Digital Music's Ranking Change Through Time Series Clustering (시계열 군집분석을 통한 디지털 음원의 순위 변화 패턴 분류)

  • Yoo, In-Jin;Park, Do-Hyung
    • Journal of Intelligence and Information Systems / v.26 no.3 / pp.171-191 / 2020
  • This study focused on digital music, one of the most valuable cultural assets of modern society, which occupies a particularly important position in the flow of the Korean Wave. Digital music data were collected from the "Gaon Chart," a well-established music chart in Korea: the ranking changes of the music that entered the chart over 73 weeks were gathered, patterns with similar characteristics were derived through time series cluster analysis, and a descriptive analysis was performed on the notable features of each pattern. The research process is as follows. First, in the data collection stage, time series data were collected to track the ranking changes of digital music. In the data processing stage, the collected data were matched with the rankings over time, and the music titles and artist names were processed. The analysis was then performed sequentially in two stages, exploratory analysis and explanatory analysis. The data collection period was limited to the period before the "music bulk buying" phenomenon, a reliability issue affecting music rankings in Korea: 73 weeks, with December 31, 2017 to January 6, 2018 as the first week and May 19, 2019 to May 25, 2019 as the last. The analysis targets were limited to digital music released in Korea. Unlike the private music charts serviced in Korea, the Gaon Chart is approved by government agencies and has basic reliability, so it can be considered to carry more public confidence than the ranking information provided by other services. The contents of the collected data are as follows.
For each piece of music that entered the top 100 on the chart within the collection period, the period and ranking, music title, artist name, album name, Gaon index, production company, and distribution company were collected. In total, 7,300 chart entries were identified over the 73 weeks. Because digital music frequently stays on the chart for more than two weeks, duplicates were removed in pre-processing: duplicated songs were identified and located through a duplicate check and then deleted, yielding a list of 742 unique songs for analysis out of the 7,300 entries. A total of 16 patterns were then derived through time series cluster analysis of the ranking changes. From these, two representative patterns were identified, "Steady Seller" and "One-Hit Wonder," and the two were further subdivided into five patterns in consideration of each song's survival period and ranking. The important characteristics of each pattern are as follows. First, the artist's superstar effect and the bandwagon effect were strong in the one-hit-wonder pattern; when consumers choose digital music, they are strongly influenced by these two effects. Second, the steady-seller pattern identified the songs that consumers have chosen for a very long time, and the patterns of the most-selected songs were examined against consumer needs. Contrary to popular belief, the mid-term steady-seller pattern, not the one-hit-wonder pattern, received the most choices from consumers.
Particularly noteworthy is that a "climbing the chart" phenomenon, contrary to the existing patterns, was confirmed within the steady-seller pattern. This study focuses on changes in music rankings over time, a relatively neglected area of digital music research, and it attempts a new approach by subdividing the patterns of ranking change rather than predicting the success and ranking of music.
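The pattern derivation described above can be sketched with a plain Euclidean k-means over equal-length ranking trajectories. The study's clustering method and trajectory handling are richer than this; the toy trajectories below merely contrast a flat "steady seller" shape (stays near the top) with a fast-dropping "one-hit wonder" shape:

```python
import numpy as np

def kmeans_trajectories(series, k=2, iters=20, seed=0):
    """Cluster equal-length ranking trajectories with Lloyd's algorithm."""
    X = np.asarray(series, float)
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # assign each trajectory to the nearest center (Euclidean distance)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute each center as the mean of its assigned trajectories
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# toy weekly-rank trajectories (rank 1 = top of the chart)
steady  = [[1, 2, 1, 2, 1], [2, 1, 2, 1, 2]]
one_hit = [[1, 50, 90, 100, 100], [2, 60, 95, 100, 100]]
labels, _ = kmeans_trajectories(steady + one_hit, k=2)
```

With trajectories this well separated, the two shapes fall into distinct clusters regardless of the random initialization, which is the kind of separation underlying the steady-seller vs. one-hit-wonder distinction.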