• Title/Summary/Keyword: stock

Search Result 4,955, Processing Time 0.038 seconds

A Study on the Critical Success Factors of Social Commerce through the Analysis of the Perception Gap between the Service Providers and the Users: Focused on Ticket Monster in Korea (서비스제공자와 사용자의 인식차이 분석을 통한 소셜커머스 핵심성공요인에 대한 연구: 한국의 티켓몬스터 중심으로)

  • Kim, Il Jung;Lee, Dae Chul;Lim, Gyoo Gun
    • Asia pacific journal of information systems
    • /
    • v.24 no.2
    • /
    • pp.211-232
    • /
    • 2014
  • Recently, there has been growing interest in social commerce using SNS (Social Networking Service), and its market is also expanding due to the popularization of smartphones, tablet PCs and other smart devices. Accordingly, various studies have been attempted, but most of the previous studies have been conducted from the perspective of users. The purpose of this study is to derive user-centered CSFs (Critical Success Factors) of social commerce from the previous studies and to analyze the gap in CSF perception between social commerce service providers and users. The CSF perception gap between the two groups shows that there is a difference between the ideal image the service providers hope for and the actual image the users have of social commerce companies. This study provides effective improvement directions for social commerce companies by presenting current business problems and plans for their solution. For this, the study selected Korea's representative social commerce business, Ticket Monster, as the target service provider; the company is dominant in sales and staff size and secured strong funding through an M&A by stock exchange with the US social commerce business Living Social, with Amazon.com as a shareholder, in August 2011. We gathered questionnaires from both service providers and users from October 22 to October 31, 2012 to conduct an empirical analysis. We surveyed 160 service providers at Ticket Monster and 160 social commerce users who had experience using the Ticket Monster service. Out of 320 surveys, 20 questionnaires that were unfit or unreliable were discarded, leaving 300 (150 service providers, 150 users) for this empirical study. The statistics were analyzed using SPSS 12.0. The implications of the empirical analysis are as follows. First, the two groups rank the importance of the social commerce CSFs differently. While service providers regard Price Economic as the most important CSF influencing purchasing intention, the users regard Trust as the most important one. This means that service providers have to utilize the unique strength of social commerce, which is earning customers' trust, rather than just focusing on selling products at a discounted price. Service providers need to enhance communication through SNS and play a vital role as trusted advisers who provide curation services and explain the value of products through information filtering. They also need to pay attention to preventing consumer harm from deceptive and false advertising, and should create a detailed compensation system for consumer damage caused by such problems; this can build strong ties with customers. Second, both service providers and users tend to consider Price Economic, Utility, Trust, and Word of Mouth Effect as the social commerce CSFs influencing purchasing intention. Accordingly, users expect price and economic benefits when using social commerce, and service providers should be able to offer individualized discount benefits through diverse methods using social network services. From the aspect of usefulness, service providers are required to make users aware of the time savings, efficiency, and convenience of using social commerce.
Therefore, it is necessary to increase the usefulness of social commerce through new management strategies, such as strengthening the website's search engine, facilitating payment through a shopping basket, and package delivery. Trust, as mentioned before, is the most important variable in consumers' minds, so it must be managed for sustainable operation. If trust in social commerce falls because of consumer harm caused by false or exaggerated advertising, it could negatively affect the image of the social commerce industry in general. Instead of advertising with famous celebrities and spending excessively on marketing, the social commerce industry should use the word-of-mouth effect among users through social network services, the major marketing method of early social commerce. The word-of-mouth effect arising from consumers spontaneously acting as self-marketers not only reduces advertising costs for the service provider but also provides a basis for offering discounted prices to consumers; in this context, the word-of-mouth effect should be managed as a CSF of social commerce. Third, trade safety was not derived as one of the CSFs. Recently, as e-commerce such as social commerce and Internet shopping has grown in a variety of forms, the importance of trade safety on the Internet has also increased, but in this study trade safety was not evaluated as a CSF of social commerce by either group. This study judges that this is because both the service provider group and the user group perceive that a reliable PG (Payment Gateway) handles e-payment for Internet transactions. Accordingly, both groups feel that social commerce businesses can differentiate themselves through their websites and the products and services they sell, but see little difference between businesses in their e-payment systems. In other words, trade safety is perceived as a natural, basic, universal service. Fourth, service providers should intensify communication with users through social network services, the major marketing method of social commerce, and make use of the word-of-mouth effect among users, which, as noted above, both reduces advertising costs and supports discounted pricing; in this context, the word-of-mouth effect should be managed as a CSF of social commerce. In this paper, the characteristics of social commerce are limited to five independent variables; if an additional study is conducted with more varied independent variables, more in-depth results could be derived. In addition, this research targets social commerce service providers and users; however, considering that social commerce is a two-sided market, deriving CSFs through an analysis of the perception gap between social commerce service providers and their advertising clients would be worth addressing in a follow-up study.
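As a purely hypothetical illustration of how such a perception gap could be tested (the paper reports only that SPSS 12.0 was used, not the specific tests), the Python sketch below runs independent-samples t-tests on invented importance ratings for the four CSFs across two groups of 150 respondents; all ratings and names here are made up.

```python
# Hypothetical perception-gap check: compare providers' and users' mean
# importance ratings per CSF with independent-samples t-tests.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 150  # respondents per group, as in the study
csf_names = ["Price Economic", "Utility", "Trust", "Word of Mouth Effect"]

# Invented 7-point importance ratings for providers and users.
providers = {name: rng.integers(4, 8, n) for name in csf_names}
users = {name: rng.integers(3, 8, n) for name in csf_names}

for name in csf_names:
    t, p = stats.ttest_ind(providers[name], users[name])
    print(f"{name:22s} t={t:6.2f}  p={p:.3f}")
```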

Evaluation on the Immunization Module of Non-chart System in Private Clinic for Development of Internet Information System of National Immunization Programme in Korea (국가 예방접종 인터넷정보시스템 개발을 위한 의원정보시스템의 예방접종 모듈 평가연구)

  • Lee, Moo-Sik;Lee, Kun-Sei;Lee, Seok-Gu;Shin, Eui-Chul;Kim, Keon-Yeop;Na, Bak-Ju;Hong, Jee-Young;Kim, Yun-Jeong;Park, Sook-Kyung;Kim, Bo-Kyung;Kwon, Yun-Hyung;Kim, Young-Taek
    • Journal of agricultural medicine and community health
    • /
    • v.29 no.1
    • /
    • pp.65-75
    • /
    • 2004
  • Objectives: Immunization has been one of the most effective measures for preventing infectious diseases. Keeping the immunization rate high and monitoring it continuously is an important part of national infectious disease prevention policy. To do this, the Korean CDC introduced the National Immunization Registry Program (NIRP), which has been implemented since 2000 at Public Health Centers (PHC). The National Immunization Registry Program will be nearly complete once vaccination data are shared, connected and transferred between the public and private sectors. The aim of this study was to evaluate the immunization module of the non-chart system used in private clinics against the health information system of the public health centers (made by POSDATA Co., Ltd.) and the immunization registry program (made by BIT Computer Co., Ltd.). Methods: The analysis and survey were done by specialists in the medical, health, and health information fields from November 2001 to January 2002. We analyzed and made recommendations about the immunization module of the non-chart system in private clinics. Results and Conclusions: To improve the immunization module, the system should be revised across various functions such as receipt and registration, preliminary medical examination, reference and inquiry, registration of vaccines, printing of various sheets, transfer of vaccination data, issuance of vaccination certificates, reminder and recall, statistical calculation, and management of vaccine stock. An accurate assessment of the current immunization module in each private non-chart system is needed, and further studies will be necessary to keep the system accurate under changing health policy related to the national immunization program. We hope that the results of this study contribute to establishing the National Immunization Registry Program.

When do we use the Recycling Autograft in Limb Salvage Surgery? (사지구제술에서 언제 재활용 자가골 이식술이 유용한가?)

  • Kim, Jae-Do;Jang, Jae-Ho;Cho, Yool;Kim, Ji-Youn;Chung, So-Hak
    • The Journal of the Korean bone and joint tumor society
    • /
    • v.14 no.2
    • /
    • pp.95-105
    • /
    • 2008
  • Purpose: To identify which procedure is best for recycling autograft according to the resection and reconstruction type and recycling method, and thus when recycling autograft is useful in limb salvage surgery. Materials and Methods: We treated fifty-eight patients (34 male, 24 female; age range 5 to 74 years, mean age 36.5 years) who had malignant musculoskeletal tumors with recycling autograft (47 patients with extracorporeal irradiation, 11 patients with pasteurization) from December 1995 to February 2006. The resection and reconstruction types were 3 cases of fragmentary, 8 intercalary, 23 rAPC (recycling Autograft-Prosthesis Composite), 18 osteoarticular, 5 total joint and 1 soft tissue (Achilles tendon). The results were evaluated by radiologic union at the junctional site, functional score by the Musculoskeletal Tumor Society score, and complications according to the resection and reconstruction type and recycling method. Results: Junctional union was obtained at 15.0 months with extracorporeal irradiation and 12.6 months with pasteurization. The mean time to radiologic union was 6.0 months in fragmentary, 12.8 months in intercalary, 10 months in rAPC, 23.3 months in osteoarticular and 15.6 months in total joint reconstructions. The functional score was 65.5% in fragmentary, 60.8% in intercalary, 62.8% in rAPC (except pelvis), 66.0% in osteoarticular and 66.6% in total joint reconstructions. We experienced 1 infection and 1 protrusio acetabuli with pasteurization (18.1%) and 22 other complications (3 deep infections, 8 nonunions, 2 fractures, 2 epiphyseal problems, 5 joint instabilities, 2 local recurrences) with extracorporeal irradiation (46.8%). By reconstruction type, there were 3 complications (3 nonunions) in intercalary (37.5%), 9 complications (4 nonunions, 1 deep infection, 1 periprosthetic fracture, 1 epiphyseal problem, 1 local recurrence, 1 protrusio acetabuli) in rAPC (50.0%), 6 complications (2 deep infections, 2 nonunions, 1 epiphyseal problem, 1 pathologic fracture) in osteoarticular (33.3%), 5 complications (5 joint instabilities) in total joint (100%) and 1 complication (1 local recurrence) in soft tissue (100%). Conclusion: In our experience, among the resection and reconstruction types, fragmentary and intercalary reconstructions may have several advantages such as good radiologic and functional results and a low complication rate. rAPC appears useful in cases with insufficient residual bone stock. Pasteurization may also have advantages over extracorporeal irradiation.

Visualizing the Results of Opinion Mining from Social Media Contents: Case Study of a Noodle Company (소셜미디어 콘텐츠의 오피니언 마이닝결과 시각화: N라면 사례 분석 연구)

  • Kim, Yoosin;Kwon, Do Young;Jeong, Seung Ryul
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.4
    • /
    • pp.89-105
    • /
    • 2014
  • Since the emergence of the Internet, social media with highly interactive Web 2.0 applications has provided very user-friendly means for consumers and companies to communicate with each other. Users routinely publish content expressing their opinions and interests in social media such as blogs, forums, chat rooms, and discussion boards, and this content is released in real time on the Internet. For that reason, many researchers and marketers regard social media content as a source of information for business analytics, and many studies have reported results on mining business intelligence from social media content. In particular, opinion mining and sentiment analysis, techniques to extract, classify, understand, and assess the opinions implicit in text content, are frequently applied to social media content analysis because they emphasize determining sentiment polarity and extracting authors' opinions. A number of frameworks, methods, techniques and tools have been presented by these researchers. However, we found some weaknesses in these methods, which are often technically complicated and not sufficiently user-friendly for supporting business decisions and planning. In this study, we attempted to formulate a more comprehensive and practical approach to opinion mining with visual deliverables. First, we described the entire cycle of practical opinion mining using social media content, from the initial data gathering stage to the final presentation session. Our proposed approach consists of four phases: collecting, qualifying, analyzing, and visualizing. In the first phase, analysts have to choose the target social media. Each target medium requires a different way to gain access: open APIs, search tools, DB-to-DB interfaces, purchased content, and so on. The second phase is pre-processing to generate useful material for meaningful analysis. If garbage data are not removed, the results of social media analysis will not provide meaningful and useful business insights; to clean social media data, natural language processing techniques should be applied. The next step is the opinion mining phase, where the cleansed social media content set is analyzed. The qualified data set includes not only user-generated content but also content identification information such as creation date, author name, user id, content id, hit counts, review or reply, favorite, etc. Depending on the purpose of the analysis, researchers or data analysts can select a suitable mining tool. Topic extraction and buzz analysis are usually related to market trend analysis, while sentiment analysis is used for reputation analysis; there are also various applications such as stock prediction, product recommendation, and sales forecasting. The last phase is visualization and presentation of the analysis results. The major purpose of this phase is to explain the results and help users comprehend their meaning; therefore, to the extent possible, deliverables from this phase should be simple, clear and easy to understand, rather than complex and flashy. To illustrate our approach, we conducted a case study on a leading Korean instant noodle company. We targeted the leading company, NS Food, with 66.5% of market share; the firm has kept the No. 1 position in the Korean "Ramen" business for several decades.
We collected a total of 11,869 pieces of content, including blogs, forum posts and news articles. After collecting the social media content data, we generated instant-noodle-business-specific language resources for data manipulation and analysis using natural language processing. In addition, we classified the content into more detailed categories such as marketing features, environment, reputation, etc. In these phases, we used free software such as the tm, KoNLP, ggplot2 and plyr packages of the R project. As a result, we presented several useful visualization outputs, such as domain-specific lexicons, volume and sentiment graphs, topic word clouds, heat maps, valence tree maps, and other visualized images, providing vivid, full-colored examples using open library software packages of the R project. Business actors can detect at a glance which areas are weak, strong, positive, negative, quiet or loud. The heat map shows the movement of sentiment or volume in a category-by-time matrix, where color density indicates intensity over time periods. The valence tree map, one of the most comprehensive and holistic visualization models, should be very helpful for analysts and decision makers to quickly understand the "big picture" business situation in a hierarchical structure, since a tree map can present buzz volume and sentiment in one visualized result for a certain period. This case study offers real-world business insights from market sensing and demonstrates to practical-minded business users how they can use these types of results for timely decision making in response to on-going changes in the market. We believe our approach can provide a practical and reliable guide to opinion mining with visualized results that are immediately useful, not just in the food industry but in other industries as well.
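As a toy illustration of the four-phase pipeline (the paper itself used R packages such as tm, KoNLP, ggplot2 and plyr), the Python sketch below collects a few invented posts, qualifies them with simple pre-processing, scores them against a made-up sentiment lexicon, and visualizes daily net sentiment; every post, word, and value in it is hypothetical.

```python
# Toy collect -> qualify -> analyze -> visualize pipeline with invented data.
import re
from collections import Counter
import matplotlib.pyplot as plt

posts = [
    {"date": "2014-01-02", "text": "The new ramen flavor is great, really tasty!"},
    {"date": "2014-01-02", "text": "Too salty and the price went up. Disappointed."},
    {"date": "2014-01-03", "text": "Great value, my kids love it."},
]
lexicon = {"great": 1, "tasty": 1, "love": 1, "value": 1,
           "salty": -1, "disappointed": -1}

def qualify(text):                       # pre-processing: lowercase, tokenize
    return re.findall(r"[a-z']+", text.lower())

def polarity(tokens):                    # simple lexicon-based score
    return sum(lexicon.get(t, 0) for t in tokens)

daily = Counter()
for post in posts:
    daily[post["date"]] += polarity(qualify(post["text"]))

# Visualize: daily net sentiment as a bar chart (a stand-in for the heat map
# and tree map deliverables described in the paper).
plt.bar(list(daily), list(daily.values()))
plt.title("Net sentiment by day")
plt.show()
```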

Studies on the Amylase Production by Bacteria (세균(細菌)에 의(依)한 Amylase생산(生産)에 관한 연구(硏究))

  • Park, Yoon-Joong
    • Applied Biological Chemistry
    • /
    • v.13 no.2
    • /
    • pp.153-170
    • /
    • 1970
  • 1. Isolation and identification of amylase-producing bacteria. The powerful strains A-12 and S-8 were isolated from air and soil, respectively, after screening a large number of amylase-producing bacteria. Their bacterial characteristics were investigated, and all characteristics of strains A-12 and S-8 were found to be similar to Bac. subtilis of Bergey's manual except for acid formation from a few carbohydrates and citrate utilization; i.e., strain A-12 is negative for citrate utilization and for acid formation from arabinose and xylose, and S-8 is negative for acid formation from xylose. 2. Amylase production in liquid cultures with solid materials. Several conditions for amylase production by strain A-12 in stationary cultures were studied. The results obtained are as follows. (1) The optimum conditions are: temperature 35°C, initial pH 6.5 to 7.0 and incubation time 3 to 4 days. (2) Amylase production is not affected by the preservation period of the stock cultures. (3) Among the various solid materials, defatted soybean is found to be the best for amylase production. However, alkali treatment of the defatted soybean gives no effect, contrary to the case of defatted rapeseed. The addition of soluble starch to the alkali extract of defatted soybean increases amylase production. (4) Addition of up to 1% ethanol to carbon-deficient media improves amylase production, whereas this effect is not found in the case of carbon-rich media. (5) Amylase production can be increased 2.5 times when 10% defatted soybean is admixed to cheaply available wheat bran. (6) An excellent effect on amylase production is found when 20% wheat bran is admixed to defatted dry milk, which is a poor medium; the activity is $D^{40^{\circ}}_{30'}$ 7,000 (L.S.V. 1,800) in 10% medium. (7) No significant effect is observed from the addition of various inorganic salts. 3. Amylase production in solid cultures. Several conditions for amylase production by strain A-12 in wheat bran cultures were studied, and the results obtained are as follows. (1) The optimum conditions are: temperature 33°C, incubation time 2 days, added water content 150 to 175% and medium thickness 1.5 cm; the activity is $D^{40^{\circ}}_{30'}$ 36,000 (L.S.V. 15,000). (2) No significant effect is found from the addition of various organic and inorganic substances.

Preparation of Powdered Smoked-Dried Mackerel Soup and Its Taste Compounds (고등어분말수우프의 제조 및 정미성분에 관한 연구)

  • LEE Eung-Ho;OH Kwang-Soo;AHN Chang-Bum;CHUNG Bu-Gil;BAE You-Kyung;HA Jin-Hwan
    • Korean Journal of Fisheries and Aquatic Sciences
    • /
    • v.20 no.1
    • /
    • pp.41-51
    • /
    • 1987
  • This study was carried out to prepare powdered smoked-dried mackerel which can be used as a soup base, and to examine the storage stability and taste compounds of the products. Raw mackerel were filleted, boiled for 10 minutes and pressed to remove lipids, and then soaked in an extract solution of skipjack meat. The soaked mackerel were smoked three times to 10-12% moisture content at 80°C for 8 hours, and the smoked-dried mackerel were pulverized to 50 mesh. Finally, the powdered smoked-dried mackerel was packed in a laminated film bag (PET/Al foil/CPP: 5 μm/15 μm/70 μm, 15×17 cm) with air (product C), nitrogen (product N) or an oxygen absorber (product O), and then stored at room temperature for 100 days. The moisture and crude lipid contents of the powdered smoked-dried mackerel were 11.3-12.3% and 12%, respectively, and the water activity was 0.52-0.56; these values showed little change during storage. The pH, VBN and amino nitrogen content increased slowly during storage. Hydrophilic and lipophilic brown pigment formation tended to increase in product (C) and showed little change in products (N) and (O). The TBA value, peroxide value and carbonyl value of products (N) and (O) were lower than those of product (C). The major fatty acids of the products were 16:0, 18:1, 22:6, 18:0 and 20:5; polyenoic acids decreased, while saturated and monoenoic acids increased during processing and storage. The IMP content in the products was 420.2-454.2 mg/100 g and decreased slightly over the storage period. The major non-volatile organic acids in the products were lactic acid, succinic acid and α-ketoglutaric acid. Among free amino acids and related compounds, the major ones were histidine, alanine, hydroxyproline, lysine, glutamic acid and anserine, which accounted for 80.8% of total free amino acids. The taste compounds of the powdered smoked-dried mackerel were free amino acids and related compounds (1,279.4 mg/100 g), non-volatile organic acids (948.1 mg/100 g), nucleotides and related compounds (672.8 mg/100 g), total creatinine (430.4 mg/100 g), betaine (86.6 mg/100 g) and a small amount of TMAO. An appropriate extraction condition for preparing soup stock from the powdered smoked-dried mackerel is 100°C for 1 minute. Judging from the results of taste and sensory evaluation, it is concluded that the powdered smoked-dried mackerel can be used as a natural flavoring substance in preparing soups and broths.

A Study on Risk Parity Asset Allocation Model with XGBoost (XGBoost를 활용한 리스크패리티 자산배분 모형에 관한 연구)

  • Kim, Younghoon;Choi, HeungSik;Kim, SunWoong
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.1
    • /
    • pp.135-149
    • /
    • 2020
  • Artificial intelligence is changing the world, and the financial market is no exception. Robo-advisors are being actively developed, making up for the weaknesses of traditional asset allocation methods and replacing the parts that are difficult for them. They make automated investment decisions with artificial intelligence algorithms and are used with various asset allocation models such as the mean-variance model, the Black-Litterman model and the risk parity model. The risk parity model is a typical risk-based asset allocation model focused on the volatility of assets; it structurally avoids concentration of investment risk, so it is stable in the management of large funds and has been widely used in the financial field. XGBoost is a parallel tree-boosting method, an optimized gradient boosting model designed to be highly efficient and flexible. It not only scales to billions of examples in limited-memory environments but also learns very fast compared to traditional boosting methods, and it is frequently used in various fields of data analysis. In this study, we therefore propose a new asset allocation model that combines the risk parity model with the XGBoost machine learning model. The model uses XGBoost to predict the risk of assets and applies the predicted risk to the covariance estimation process. Because an optimized asset allocation model estimates investment proportions from historical data, there are estimation errors between the estimation period and the actual investment period, and these errors adversely affect the optimized portfolio's performance. This study aims to improve the stability and portfolio performance of the model by predicting the volatility of the next investment period and reducing the estimation errors of the optimized asset allocation model. As a result, it narrows the gap between theory and practice and proposes a more advanced asset allocation model. For the empirical test of the suggested model, we used Korean stock market price data for a total of 17 years, from 2003 to 2019. The data sets are composed of the energy, finance, IT, industrial, material, telecommunication, utility, consumer, health care and staples sectors. We accumulated predictions using a moving-window method with 1,000 in-sample and 20 out-of-sample observations, producing a total of 154 rebalancing back-testing results. We analyzed portfolio performance in terms of cumulative rate of return, and the long test period provided a large number of samples. Compared with the traditional risk parity model, this experiment recorded improvements in both cumulative return and reduction of estimation errors. The total cumulative return is 45.748%, about 5% higher than that of the risk parity model, and the estimation errors are reduced in 9 out of 10 industry sectors. The reduction of estimation errors increases the stability of the model and makes it easier to apply in practical investment. The results of the experiment showed improvement of portfolio performance by reducing the estimation errors of the optimized asset allocation model. Many financial and asset allocation models are limited in practical investment because of the most fundamental question of whether the past characteristics of assets will continue into the future in a changing financial market.
However, this study not only takes advantage of traditional asset allocation models but also supplements the limitations of traditional methods and increases stability by predicting the risks of assets with a state-of-the-art algorithm. There are various studies on parametric estimation methods to reduce estimation errors in portfolio optimization; we also suggest a new method to reduce estimation errors in the optimized asset allocation model using machine learning. This study is therefore meaningful in that it proposes an advanced artificial intelligence asset allocation model for the fast-developing financial markets.
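To make the combination concrete, here is a minimal sketch under assumptions not taken from the paper: synthetic returns, xgboost's XGBRegressor as the volatility forecaster, and scipy for the equal-risk-contribution solve. It predicts each asset's next-window volatility, rescales the covariance diagonal to those predictions, and computes risk parity weights.

```python
import numpy as np
from scipy.optimize import minimize
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
n_days, n_assets = 1000, 5
# Synthetic daily returns with increasing volatility across assets.
returns = rng.normal(0.0, 0.01, size=(n_days, n_assets)) * (1 + np.arange(n_assets))

# 1) Forecast each asset's next-window volatility from its lagged volatility.
window = 20
def rolling_vol(r):
    return np.array([r[i:i + window].std() for i in range(len(r) - window)])

pred_vol = np.empty(n_assets)
for j in range(n_assets):
    vol = rolling_vol(returns[:, j])
    X, y = vol[:-1].reshape(-1, 1), vol[1:]        # feature: current vol, target: next vol
    model = XGBRegressor(n_estimators=100, max_depth=3, learning_rate=0.1)
    model.fit(X[:-1], y[:-1])                      # hold out the last point
    pred_vol[j] = model.predict(X[-1:])[0]         # forecast for the next window

# 2) Rescale the historical covariance so its diagonal matches the forecasts,
#    keeping the historical correlation structure.
hist_cov = np.cov(returns[-252:], rowvar=False)
scale = pred_vol / np.sqrt(np.diag(hist_cov))
cov = hist_cov * np.outer(scale, scale)

# 3) Equal-risk-contribution (risk parity) weights on the adjusted covariance.
def risk_parity_weights(cov):
    n = cov.shape[0]
    def objective(w):
        port_var = w @ cov @ w
        rc = w * (cov @ w)                         # each asset's risk contribution
        return np.sum((rc - port_var / n) ** 2)    # push contributions toward equality
    res = minimize(objective, np.full(n, 1.0 / n),
                   bounds=[(0.0, 1.0)] * n,
                   constraints=({"type": "eq", "fun": lambda w: w.sum() - 1.0},))
    return res.x

print(np.round(risk_parity_weights(cov), 4))
```

Rescaling only the diagonal is one simple way to inject predicted volatilities while keeping the historical correlation structure; the paper's actual covariance-adjustment and feature set may differ.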

Sentiment Analysis of Movie Review Using Integrated CNN-LSTM Model (CNN-LSTM 조합모델을 이용한 영화리뷰 감성분석)

  • Park, Ho-yeon;Kim, Kyoung-jae
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.4
    • /
    • pp.141-154
    • /
    • 2019
  • Internet technology and social media are growing rapidly, and data mining technology has evolved to enable unstructured document representations in a variety of applications. Sentiment analysis is an important technology that can distinguish poor from high-quality content through the text data about products, and it has proliferated within text mining. Sentiment analysis mainly analyzes people's opinions in text data by assigning them to predefined categories such as positive and negative. It has been studied in various directions in terms of accuracy, from simple rule-based to dictionary-based approaches using predefined labels, and it is one of the most active research areas in natural language processing and text mining. Real online reviews are openly available to others, so they are not only easy to collect but also affect business. In marketing, real-world information from customers is gathered from websites rather than surveys. Whether a website's posts are positive or negative is reflected in customer response and sales, so firms try to identify this information. However, many reviews on a website are of uneven quality and difficult to classify. Earlier studies in this research area used review data from the Amazon.com shopping mall, while recent studies use data on stock market trends, blogs, news articles, weather forecasts, IMDB, Facebook, etc. However, accuracy remains limited because sentiment calculations change according to the subject, paragraph, sentiment lexicon direction, and sentence strength. This study aims to classify sentiment polarity into positive and negative categories and to increase the prediction accuracy of polarity analysis using the IMDB review data set. First, for text classification related to sentiment analysis, we adopt popular machine learning algorithms such as NB (Naive Bayes), SVM (support vector machines), XGBoost, RF (random forest), and gradient boosting as comparative models. Second, deep learning has demonstrated the ability to extract complex, discriminative features from data; representative algorithms are CNN (convolutional neural networks), RNN (recurrent neural networks), and LSTM (long short-term memory). CNN can be used similarly to BoW when processing a sentence in vector format, but it does not consider sequential attributes of the data. RNN handles order well because it takes the temporal information of the data into account, but it suffers from the long-term dependency problem; LSTM is used to solve this problem. For comparison, CNN and LSTM were chosen as simple deep learning models, and in addition to classical machine learning algorithms, CNN, LSTM, and the integrated model were analyzed. Although the algorithms have many parameters, we examined the relationship between parameter values and accuracy to find the optimal combination, and tried to understand how and why the models work well for sentiment analysis. This study proposes an integrated CNN and LSTM algorithm to extract the positive and negative features in text. The reasons for combining these two algorithms are as follows. CNN can extract features for classification automatically by applying convolution layers and massively parallel processing, whereas LSTM is not capable of highly parallel processing.
Like faucets, the LSTM has input, output, and forget gates that can be opened and closed at the desired time; these gates have the advantage of placing memory blocks on hidden nodes. The memory block of the LSTM may not store all the data, but it compensates for the CNN's inability to model long-term sequential dependencies. Furthermore, when LSTM is used on top of the CNN's pooling layer, the model has an end-to-end structure, so that spatial and temporal features can be learned simultaneously. The combined CNN-LSTM achieved 90.33% accuracy; it is slower than CNN but faster than LSTM, and the presented model was more accurate than the other models. In addition, the word embedding layer can be improved by training the kernel step by step. CNN-LSTM can mitigate the weaknesses of each model, and the end-to-end structure has the advantage of improving learning layer by layer. Based on these reasons, this study tries to enhance the classification accuracy of movie reviews using the integrated CNN-LSTM model.
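For illustration, a minimal Keras sketch of such a CNN-LSTM classifier on the IMDB review set is shown below; the layer sizes, epochs, and other hyperparameters are illustrative assumptions, not the settings reported in the paper.

```python
from tensorflow.keras import layers, models
from tensorflow.keras.datasets import imdb
from tensorflow.keras.preprocessing.sequence import pad_sequences

vocab_size, max_len = 10000, 200
(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=vocab_size)
x_train = pad_sequences(x_train, maxlen=max_len)
x_test = pad_sequences(x_test, maxlen=max_len)

model = models.Sequential([
    layers.Embedding(vocab_size, 128),           # word embedding layer
    layers.Conv1D(64, 5, activation="relu"),     # local n-gram feature extraction
    layers.MaxPooling1D(4),                      # shorten the sequence before the LSTM
    layers.LSTM(64),                             # model the remaining sequential structure
    layers.Dense(1, activation="sigmoid"),       # positive / negative polarity
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, batch_size=128,
          validation_data=(x_test, y_test))
```

Pooling before the LSTM is the usual reason this hybrid trains faster than a plain LSTM: the recurrent layer sees a shorter sequence of convolutional features rather than every token.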

Fish Stock Assessment by Hydroacoustic Methods and its Applications - I - Estimation of Fish School Target Strength - (음향에 의한 어족생물의 자원조사 연구 - I - 어군반사강도의 추정 -)

  • Lee, Dae-Jae;Shin, Hyeong-Il;Shin, Hyong-Ho
    • Journal of the Korean Society of Fisheries and Ocean Technology
    • /
    • v.31 no.2
    • /
    • pp.142-152
    • /
    • 1995
  • A combined bottom trawl and hydroacoustic survey was conducted using the training ship Oshoro Maru of Hokkaido University in November of 1989-1992 and the training ship Nagasaki Maru of Nagasaki University in April 1994, in the East China Sea. The aim of the investigations was to collect target strength data of fish schools for the biomass estimation of fish in the survey area. The hydroacoustic survey was performed using a scientific echo sounder system operating at three frequencies of 25, 50 and 100 kHz with a microcomputer-based echo integrator. Fish samples were collected by bottom trawling, and during the trawl surveys the openings of the otter boards and net mouth were measured. The target strength of a fish school was estimated from the relationship between the volume backscattering strength for the depth strata of bottom trawling and the weight per unit volume of trawl catches. A portion of the trawl catches was preserved frozen on board, target strength measurements for the defrosted samples of ten species were conducted in a laboratory tank, and the relationship between target strength and fish weight was examined. In order to investigate the effect of the swimbladder on target strength, the volume of the swimbladder of white croaker, Argyrosomus argentatus, sampled by bottom trawling was measured by directly removing the gas in the swimbladder with a syringe on board. The results obtained can be summarized as follows: 1. The relationship between the mean volume backscattering strength ($SV$, dB) for the depth strata of trawl hauls and the weight per unit volume of trawl catches ($C$, kg/m³) was expressed by the following equations: 25 kHz: $SV = -29.8 + 10\log(C)$; 50 kHz: $SV = -32.4 + 10\log(C)$; 100 kHz: $SV = -31.7 + 10\log(C)$. The mean target strength estimates for the three frequencies of 25, 50 and 100 kHz derived from these equations were -29.8 dB/kg, -32.4 dB/kg and -31.7 dB/kg, respectively. 2. The relationship between target strength and body weight for the fish samples of ten species collected by trawl surveys was expressed by the following equations: 25 kHz: $TS = -34.0 + 10\log(W^{2/3})$; 100 kHz: $TS = -37.8 + 10\log(W^{2/3})$. The mean target strength estimates for the two frequencies of 25 and 100 kHz derived from these equations were -34.0 dB/kg and -37.8 dB/kg, respectively. 3. The representative target strength values for demersal fish populations of the East China Sea at the two frequencies of 25 and 100 kHz were estimated to be -31.4 dB/kg and -33.8 dB/kg, respectively. 4. The ratio of the equivalent radius of the swimbladder to the body length of white croaker was 0.089, and the volume of the swimbladder was estimated to be approximately 10% of the total body volume.
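As a brief numeric illustration (not taken from the paper), the Python snippet below applies the reported regressions: it converts a measured mean volume backscattering strength into fish density using the 25 kHz per-kilogram target strength, and evaluates the weight-based TS equation; the input values of -55 dB and 0.5 kg are invented.

```python
import math

def density_from_sv(sv_db, ts_per_kg_db):
    """Fish density C (kg/m^3) from SV = TS_kg + 10*log10(C)."""
    return 10 ** ((sv_db - ts_per_kg_db) / 10)

def ts_from_weight(w_kg, a=-34.0):
    """Per-fish target strength TS = a + 10*log10(W^(2/3)), 25 kHz coefficients."""
    return a + 10 * math.log10(w_kg ** (2 / 3))

print(density_from_sv(-55.0, -29.8))   # hypothetical SV of -55 dB -> about 3.0e-3 kg/m^3
print(ts_from_weight(0.5))             # a 0.5 kg fish -> about -36.0 dB
```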

Animal Infectious Diseases Prevention through Big Data and Deep Learning (빅데이터와 딥러닝을 활용한 동물 감염병 확산 차단)

  • Kim, Sung Hyun;Choi, Joon Ki;Kim, Jae Seok;Jang, Ah Reum;Lee, Jae Ho;Cha, Kyung Jin;Lee, Sang Won
    • Journal of Intelligence and Information Systems
    • /
    • v.24 no.4
    • /
    • pp.137-154
    • /
    • 2018
  • Animal infectious diseases, such as avian influenza and foot-and-mouth disease, occur almost every year and cause huge economic and social damage to the country. To prevent this, the quarantine authorities have made various human and material efforts, but the infectious diseases have continued to occur. Avian influenza was first reported in 1878 and has risen to a national issue due to its high lethality. Foot-and-mouth disease is considered the most critical animal infectious disease internationally. In nations where the disease has not spread, it is recognized as an economic or political disease because it restricts international trade by complicating the import of processed and non-processed livestock, and quarantine is costly. In a society where the whole nation is connected as one living zone, there is no way to fully prevent the spread of infectious disease. Hence, there is a need to detect occurrences of the disease and take action before it spreads. For both human and animal infectious diseases, an epidemiological investigation of confirmed cases is implemented as soon as a diagnosis is confirmed, and measures are taken to prevent the spread of disease according to the investigation results. The foundation of an epidemiological investigation is figuring out where a case has been and whom it has contacted. From a data perspective, this can be defined as an action taken to predict the cause of a disease outbreak, the outbreak location, and future infections by collecting and analyzing geographic data and relational data. Recently, attempts have been made to develop infectious disease prediction models using Big Data and deep learning technology, but model-building studies and case reports are still scarce. KT and the Ministry of Science and ICT have been carrying out big data projects since 2014 as part of national R&D projects to analyze and predict the routes of livestock-related vehicles. To prevent animal infectious diseases, the researchers first developed a prediction model based on regression analysis using vehicle movement data. After that, a more accurate prediction model was constructed using machine learning algorithms such as logistic regression, Lasso, support vector machine and random forest. In particular, the prediction model for 2017 added the risk of diffusion to facilities, and the performance of the model was improved by tuning the hyper-parameters of the modeling in various ways. The confusion matrix and ROC curve show that the model constructed in 2017 is superior to the earlier machine learning model. The difference between the 2016 and 2017 models is that the later model also used visit information for facilities such as feed factories and slaughterhouses, and the bird livestock information, which had been limited to chicken and duck, was expanded to goose and quail. In addition, an explanation of the results was added in 2017 to help the authorities make decisions and to establish a basis for persuading stakeholders. This study reports an animal infectious disease prevention system constructed on the basis of hazardous vehicle movement, farm and environment Big Data. The significance of this study is that it describes the evolution of a prediction model using Big Data that is used in the field, and the model is expected to become more complete if virus types are taken into consideration. This will contribute to data utilization and analysis model development in related fields. In addition, we expect the system constructed in this study to provide earlier and more effective prevention.
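As a hedged sketch of the kind of farm-level risk classification described above (not the project's actual pipeline, features, or data), the Python example below trains logistic regression and random forest classifiers on synthetic vehicle-visit features and reports the ROC AUC and confusion matrix used for evaluation in the study; every feature and coefficient is invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_farms = 2000
# Hypothetical farm features: risky-vehicle visits, nearby outbreaks, and
# flock size; infection risk is made to rise with contact intensity.
X = np.column_stack([
    rng.poisson(3, n_farms),        # risky-vehicle visits in the last week
    rng.poisson(1, n_farms),        # outbreaks within 3 km
    rng.normal(10, 3, n_farms),     # flock size (arbitrary scale)
])
logit = -3 + 0.4 * X[:, 0] + 1.2 * X[:, 1]
y = rng.random(n_farms) < 1 / (1 + np.exp(-logit))   # simulated infection labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for model in (LogisticRegression(max_iter=1000),
              RandomForestClassifier(n_estimators=200, random_state=0)):
    model.fit(X_tr, y_tr)
    prob = model.predict_proba(X_te)[:, 1]
    print(type(model).__name__,
          "AUC:", round(roc_auc_score(y_te, prob), 3),
          "\n", confusion_matrix(y_te, model.predict(X_te)))
```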