• Title/Summary/Keyword: Information Systems Evaluation (정보시스템 평가)

Search Results: 8,549 (processing time: 0.039 seconds)

A Study of Anomaly Detection for ICT Infrastructure using Conditional Multimodal Autoencoder (ICT 인프라 이상탐지를 위한 조건부 멀티모달 오토인코더에 관한 연구)

  • Shin, Byungjin;Lee, Jonghoon;Han, Sangjin;Park, Choong-Shik
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.3
    • /
    • pp.57-73
    • /
    • 2021
  • Maintenance and failure prevention through anomaly detection of ICT infrastructure are becoming increasingly important. System monitoring data is multidimensional time series data, which makes it difficult to account simultaneously for the characteristics of multidimensional data and those of time series data. When dealing with multidimensional data, correlations between variables must be considered, and existing approaches such as probabilistic and linear models and distance-based methods degrade due to the curse of dimensionality. In addition, time series data are typically preprocessed with sliding windows and time series decomposition for autocorrelation analysis; these techniques further increase the dimensionality of the data and therefore need to be supplemented. Anomaly detection is a long-standing research field: statistical methods and regression analysis were used in the early days, and there are now active studies applying machine learning and artificial neural networks. Statistically based methods are difficult to apply when data are non-homogeneous and do not detect local outliers well. Regression-based methods learn a regression model based on parametric statistics and detect anomalies by comparing predicted and actual values; their performance deteriorates when the model is not robust or when the data contain noise or outliers, which imposes the restriction that training data must be free of noise and outliers. An autoencoder, an artificial neural network trained to reproduce its input as closely as possible, has many advantages over existing probabilistic and linear models, cluster analysis, and supervised learning: it can be applied to data that do not satisfy probability-distribution or linearity assumptions.
In addition, it can be trained unsupervised, without labeled data. However, autoencoders still have limitations in identifying local outliers in multidimensional data, and the characteristics of time series data greatly increase the dimensionality of the input. In this study, we propose a Conditional Multimodal Autoencoder (CMAE) that improves anomaly detection performance by considering local outliers and time series characteristics. First, we applied a Multimodal Autoencoder (MAE) to address the limitations in identifying local outliers in multidimensional data. Multimodal models are commonly used to learn from different types of input, such as voice and images; the different modalities share the autoencoder's bottleneck and thereby learn cross-modal correlations. In addition, a Conditional Autoencoder (CAE) was used to learn the characteristics of time series data effectively without increasing the dimensionality of the data. Conditional inputs are typically categorical variables, but in this study time was used as the condition so that periodicity could be learned. The proposed CMAE model was verified by comparison with a Unimodal Autoencoder (UAE) and a Multimodal Autoencoder (MAE). The reconstruction performance of the proposed and comparison models was confirmed for 41 variables. Reconstruction performance differs by variable: in all three models the loss is small for the Memory, Disk, and Network modalities, indicating good reconstruction. The Process modality showed no significant difference across the three models, while the CPU modality showed the best performance in CMAE. ROC curves were prepared to evaluate the anomaly detection performance of the proposed and comparison models, and AUC, accuracy, precision, recall, and F1-score were compared. On all indicators, performance was in the order CMAE, MAE, UAE.
In particular, recall was 0.9828 for CMAE, confirming that it detects almost all anomalies. The accuracy of the model also improved, to 87.12%, and the F1-score was 0.8883, which is considered suitable for anomaly detection. In practical terms, the proposed model has an additional advantage beyond performance improvement: techniques such as time series decomposition and sliding windows add procedures that must be managed, and the resulting dimensional increase can slow inference. The proposed model avoids these drawbacks, making it easy to apply to practical tasks in terms of inference speed and model management.
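The reconstruction-error scoring that underlies all three autoencoder variants can be sketched as follows. This is a minimal, self-contained illustration, not the paper's network: a tiny linear autoencoder trained on synthetic two-modality monitoring data, with time encoded as sine/cosine features appended as the conditional input, and anomalies flagged when reconstruction error exceeds a percentile threshold learned on normal data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monitoring data: two "modalities" (say, CPU and Memory), each with
# 3 variables and a shared daily periodic component. Hypothetical stand-in data.
T = 1000
t = np.arange(T)
phase = 2 * np.pi * t / 100.0            # pretend one "day" = 100 samples
base = np.sin(phase)[:, None]
cpu = base + 0.1 * rng.normal(size=(T, 3))
mem = 0.5 * base + 0.1 * rng.normal(size=(T, 3))
cond = np.stack([np.sin(phase), np.cos(phase)], axis=1)  # time as conditional input

X = np.hstack([cpu, mem, cond])          # modalities share one bottleneck
d_in, d_hid = X.shape[1], 2

# Tiny linear autoencoder trained by full-batch gradient descent (a sketch only).
W1 = 0.1 * rng.normal(size=(d_in, d_hid))
W2 = 0.1 * rng.normal(size=(d_hid, d_in))
lr = 0.01
for _ in range(2000):
    H = X @ W1
    err = H @ W2 - X
    gW2 = H.T @ err / T
    gW1 = X.T @ (err @ W2.T) / T
    W1 -= lr * gW1
    W2 -= lr * gW2

def score(x):
    """Anomaly score = reconstruction error over the data variables (not the condition)."""
    xhat = (x @ W1) @ W2
    return float(np.mean((xhat[:6] - x[:6]) ** 2))

train_scores = np.array([score(x) for x in X])
threshold = np.quantile(train_scores, 0.99)   # flag the worst 1% as anomalous

# An injected outlier: the CPU variables spike far outside the learned pattern.
anomaly = X[0].copy()
anomaly[:3] += 5.0
print(score(anomaly) > threshold)   # expected: True
```

The conditional input here simply travels through the bottleneck with the data, which is the mechanism by which periodicity can be learned without widening the window.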

Estimation of GARCH Models and Performance Analysis of Volatility Trading System using Support Vector Regression (Support Vector Regression을 이용한 GARCH 모형의 추정과 투자전략의 성과분석)

  • Kim, Sun Woong;Choi, Heung Sik
    • Journal of Intelligence and Information Systems
    • /
    • v.23 no.2
    • /
    • pp.107-122
    • /
    • 2017
  • Volatility of stock market returns is a measure of investment risk. It plays a central role in portfolio optimization, asset pricing, and risk management, as well as in most theoretical financial models. Engle (1982) presented a pioneering paper on stock market volatility that explains the time-variant characteristics embedded in stock market return volatility. His model, Autoregressive Conditional Heteroscedasticity (ARCH), was generalized by Bollerslev (1986) as the GARCH models. Empirical studies have shown that GARCH models describe well the fat-tailed return distributions and volatility clustering phenomena appearing in stock prices. The parameters of GARCH models are generally estimated by maximum likelihood estimation (MLE) based on the standard normal density. However, since Black Monday in 1987, stock market prices have become very complex and exhibit considerable noise, and recent studies have begun to apply artificial intelligence approaches to estimating the GARCH parameters as a substitute for MLE. This paper presents an SVR-based GARCH process and compares it with the MLE-based GARCH process for estimating the parameters of GARCH models, which are known to forecast stock market volatility well. The kernel functions used in the SVR estimation process are linear, polynomial, and radial. We analyzed the suggested models with the KOSPI 200 Index, which comprises 200 blue-chip stocks listed on the Korea Exchange. We sampled KOSPI 200 daily closing values from 2010 to 2015, giving 1,487 observations; 1,187 days were used to train the suggested GARCH models and the remaining 300 days served as test data. First, symmetric and asymmetric GARCH models were estimated by MLE. We forecasted KOSPI 200 Index return volatility, and the MSE metric shows better results for asymmetric GARCH models such as E-GARCH and GJR-GARCH. This is consistent with the documented non-normal return distribution characteristics of fat tails and leptokurtosis.
Compared with the MLE estimation process, SVR-based GARCH models outperform the MLE methodology in forecasting KOSPI 200 Index return volatility, although the polynomial kernel shows exceptionally low forecasting accuracy. We suggest an Intelligent Volatility Trading System (IVTS) that utilizes the forecasted volatility. The IVTS entry rules are as follows: if tomorrow's forecasted volatility increases, buy volatility today; if it decreases, sell volatility today; if the forecasted direction does not change, hold the existing buy or sell position. IVTS is assumed to buy and sell historical volatility values. This is somewhat unrealistic, because historical volatility values themselves cannot be traded, but our simulation results remain meaningful because the Korea Exchange introduced a volatility futures contract in November 2014 that traders can now use. The trading systems with SVR-based GARCH models show higher returns than MLE-based GARCH over the test period. Profitable trade percentages of MLE-based GARCH IVTS models range from 47.5% to 50.0%, while those of SVR-based GARCH IVTS models range from 51.8% to 59.7%. MLE-based symmetric S-GARCH shows a +150.2% return versus +526.4% for SVR-based symmetric S-GARCH; MLE-based asymmetric E-GARCH shows -72% versus +245.6% for the SVR-based version; and MLE-based asymmetric GJR-GARCH shows -98.7% versus +126.3% for the SVR-based version. The linear kernel shows higher trading returns than the radial kernel. The best SVR-based IVTS performance is +526.4%, against +150.2% for the best MLE-based IVTS; SVR-based GARCH IVTS also trades more frequently. This study has some limitations. Our models are based solely on SVR, and other artificial intelligence models should be explored for better performance. We also do not consider costs incurred in the trading process, including brokerage commissions and slippage.
Moreover, IVTS trading performance is unrealistic in that historical volatility values are used as the traded object. Accurate forecasting of stock market volatility is essential in real trading as well as in asset pricing models. Further studies on other machine learning-based GARCH models can provide better information for stock market investors.
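The GARCH(1,1) recursion that both estimation approaches target can be sketched directly. The parameter values below are illustrative, not the paper's KOSPI 200 estimates; the closing comment shows how the SVR reformulation recasts the same recursion as a regression problem.

```python
import numpy as np

rng = np.random.default_rng(1)

# GARCH(1,1): sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]
# Illustrative parameters (alpha + beta < 1 ensures stationarity).
omega, alpha, beta = 0.00001, 0.08, 0.90
T = 1500
r = np.zeros(T)
sigma2 = np.zeros(T)
sigma2[0] = omega / (1 - alpha - beta)          # unconditional variance
for t in range(1, T):
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.normal()    # return with conditional volatility

# One-step-ahead volatility forecast from the last observation:
forecast_var = omega + alpha * r[-1] ** 2 + beta * sigma2[-1]
print(np.sqrt(forecast_var))

# The SVR reformulation regresses sigma2[t] on (r[t-1]**2, sigma2[t-1]) and fits
# that mapping with a kernel machine instead of MLE, e.g.
#   sklearn.svm.SVR(kernel="linear").fit(X_svr, y_svr)   # sketch only
X_svr = np.column_stack([r[:-1] ** 2, sigma2[:-1]])
y_svr = sigma2[1:]
```

Replacing the linear recursion with a kernel regression is what lets the SVR variant absorb departures from the normal-likelihood assumptions that hurt MLE on noisy post-1987 data.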

The Effect of Corporate SNS Marketing on User Behavior: Focusing on Facebook Fan Page Analytics (기업의 SNS 마케팅 활동이 이용자 행동에 미치는 영향: 페이스북 팬페이지 애널리틱스를 중심으로)

  • Jeon, Hyeong-Jun;Seo, Bong-Goon;Park, Do-Hyung
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.1
    • /
    • pp.75-95
    • /
    • 2020
  • With the growth of social networks, various forms of SNS have emerged. Driven by motivations such as interactivity, information exchange, and entertainment, the number of SNS users is also growing rapidly. Facebook is the main SNS channel, and companies have started using Facebook pages as a public relations channel. To this end, companies began by securing large fan bases in the early stages of operation, and as a result the number of corporate Facebook fans has recently grown to millions. From a corporate perspective, Facebook is attractive because it makes it easy to reach desired customers: it provides an efficient advertising platform based on the vast data it holds, and ad targeting can use demographic characteristics, behavior, or contact information, so that information is exposed to the intended audience and results are obtained more effectively. Companies can also reshape and communicate their brand image to customers through content. This study was conducted on Facebook advertising data and can be of great help to practitioners in the online advertising industry. For this reason, the independent variables used in the research were selected based on content characteristics that practitioners actually care about. Recently, the goal of corporate Facebook page operation has moved beyond securing fans to branding and, further, to communicating with key customers. The main figures for this assessment are Facebook's 'Like', 'Comment', 'Share', and click counts, which are the dependent variables of this study. To measure outcomes, consumer responses are set as measurable key performance indicators (KPIs), and a strategy is set and executed to achieve them.
Here, the KPIs use Facebook's ad metrics 'reach', 'impressions', 'likes', 'shares', 'comments', 'clicks', and 'CPC', depending on the situation. Achieving these figures requires prior consideration of content production, and in this study the independent variables were organized into three sets of content-production considerations. The effects of content material, content structure, and message style on Facebook user behavior were analyzed using regression analysis. Content material concerns the content's difficulty, company relevance, and daily-life involvement. According to existing research, how content attracts users' interest is very important, and content can be divided into informative and interesting content: informative content relates to the brand, where information exchange with users matters, while interesting content is defined as posts unrelated to the brand, such as entertaining videos or anecdotes. Based on this, this study started from the assumption that difficulty, company relevance, and daily-life involvement affect the dependent variables. In addition, previous studies have found that content type affects Facebook user activity depending on the combination of photos and text used in the content; building on this, the use of actual photos and hashtags was also examined as independent variables. Finally, we focused on the advertising message. Previous studies found that the effect of advertising messages on users differs depending on whether they are narrative or non-narrative, and further that message intimacy matters. In this study, we examined whether Facebook users' behavior differs depending on the language and formality of the message. For the dependent variables, 'Likes' and total click counts were set from every user action on the content.
In this study, we defined each independent variable from the existing research literature and analyzed its effect on the dependent variables. For 'Likes', factors such as 'self-association', 'real-life use', 'material difficulty', 'real-life involvement', and a 'scale × difficulty' interaction were significant. For total clicks, variables such as 'self-connection', 'real-life involvement', and a 'formality × attention' interaction were shown to have significant effects. Through these results, it is expected that presenting a content strategy optimized for the purpose of the content can contribute to the operation and production strategies of corporate Facebook operators and content producers.
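The regression setup described above can be sketched with ordinary least squares on synthetic data; the feature names and effect sizes below are hypothetical stand-ins for the study's content variables, not its actual estimates.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical content features per post (difficulty, company relevance,
# daily-life involvement) and a synthetic "Likes" outcome. Illustrative only.
n = 200
difficulty = rng.normal(size=n)
relevance = rng.normal(size=n)
involvement = rng.normal(size=n)
likes = 100 + 5 * relevance + 8 * involvement - 3 * difficulty \
        + rng.normal(scale=2, size=n)

# OLS via least squares: design matrix with an intercept column.
X = np.column_stack([np.ones(n), difficulty, relevance, involvement])
coef, *_ = np.linalg.lstsq(X, likes, rcond=None)
print(dict(zip(["intercept", "difficulty", "relevance", "involvement"],
               coef.round(2))))
```

The fitted coefficients recover the planted signs (negative for difficulty, positive for relevance and involvement), which mirrors how significance of each content factor on engagement would be read off in the study.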

Implementation Strategy for the Elderly Care Solution Based on Usage Log Analysis: Focusing on the Case of Hyodol Product (사용자 로그 분석에 기반한 노인 돌봄 솔루션 구축 전략: 효돌 제품의 사례를 중심으로)

  • Lee, Junsik;Yoo, In-Jin;Park, Do-Hyung
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.3
    • /
    • pp.117-140
    • /
    • 2019
  • As the aging of society accelerates and various social problems related to vulnerable elderly people are raised, the need for effective elderly care solutions to protect the health and safety of the elderly generation is growing. Recently, more and more people are using smart toys equipped with ICT technology for elderly care. In particular, log data collected through smart toys are highly valuable as quantitative and objective indicators in areas such as policy-making and service planning. However, research related to smart toys has been limited to topics such as smart toy development and validation of smart toy effectiveness; in other words, there is a dearth of research that derives insights from log data collected through smart toys and uses them for decision making. This study analyzes log data collected from a smart toy and derives insights for improving the quality of life of elderly users. Specifically, we performed a user-profiling analysis and derived the mechanism of quality-of-life change based on usage behavior. First, in the user-profiling analysis, two important dimensions for classifying elderly groups were derived from five factors of the elderly users' daily-living management: 'Routine Activities' and 'Work-out Activities'. Based on these dimensions, hierarchical cluster analysis and K-Means clustering were performed to classify the elderly users into three groups. Through the profiling analysis, the demographic characteristics and smart toy usage behavior of each group were identified. Second, stepwise regression was performed to elicit the mechanism of quality-of-life change. Interaction, content usage, and indoor activity were found to affect improvements in the depression and lifestyle of the elderly.
In addition, the study identified user evaluation of smart toy performance and satisfaction with the smart toy as mediators of the relationship between usage behavior and quality-of-life change. The specific mechanisms are as follows. First, interaction between the smart toy and the elderly was found to improve depression, mediated by attitudes toward the smart toy: 'satisfaction toward the smart toy', a variable that affects improvement of the elderly's depression, changes how users evaluate smart toy performance, and it is the interaction with the smart toy that positively affects this evaluation. These results can be interpreted as elderly users with a desire for emotional stability interacting actively with the smart toy and assessing it positively, greatly appreciating its effectiveness. Second, content usage was confirmed to directly improve lifestyle without passing through other variables: elderly users who make heavy use of the content provided by the smart toy improve their lifestyle, and this effect occurs regardless of the user's attitude toward the smart toy. Third, the log data show that a high degree of indoor activity improves both the lifestyle and the depression of the elderly. The more indoor activity, the better the lifestyle, and this effect also occurs regardless of attitude toward the smart toy. In addition, elderly users with a high degree of indoor activity are satisfied with the smart toy, which leads to improvement in depression; conversely, elderly users who prefer outdoor activities, or who are less active due to health problems, are unlikely to be satisfied with the smart toy and do not obtain the depression-improving effect. In summary, three groups of elderly users were identified based on their activities, and the important characteristics of each type were identified.
In addition, this study sought to identify the mechanism by which elderly users' smart toy usage behavior affects their actual lives, and to derive user needs and insights.
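The K-Means step of the profiling analysis can be sketched on the two derived dimensions; the three synthetic clusters below are hypothetical stand-ins for the study's three user groups.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two derived activity dimensions per user ('Routine Activities',
# 'Work-out Activities'); three well-separated synthetic groups of 50 users.
centers = np.array([[1.0, 1.0], [4.0, 1.0], [2.5, 4.0]])
X = np.vstack([c + 0.3 * rng.normal(size=(50, 2)) for c in centers])

def kmeans(X, k, iters=100):
    # Deterministic farthest-point initialization keeps one seed per cluster.
    cent = [X[0]]
    for _ in range(k - 1):
        d = np.min(np.linalg.norm(X[:, None] - np.array(cent)[None], axis=2), axis=1)
        cent.append(X[int(d.argmax())])
    cent = np.array(cent)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - cent[None, :, :], axis=2)
        lab = d.argmin(axis=1)
        new = np.array([X[lab == j].mean(axis=0) if np.any(lab == j) else cent[j]
                        for j in range(k)])
        if np.allclose(new, cent):      # converged
            break
        cent = new
    return lab, cent

labels, cent = kmeans(X, k=3)
print(sorted(np.bincount(labels).tolist()))
```

In the study the cluster labels would then be cross-tabulated with demographics and usage behavior to characterize each group.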

A Study on the Characteristics and Management Plan of Old Big Trees in the Sacred Natural Sites of Handan City, China (중국 한단시 자연성지 내 노거수의 특성과 관리방안)

  • Xi, Su-Ting;Shin, Hyun-Sil
    • Journal of the Korean Institute of Traditional Landscape Architecture
    • /
    • v.41 no.2
    • /
    • pp.35-45
    • /
    • 2023
  • First, the spatial distribution characteristics of old big trees were analyzed with ArcGIS by combining basic information, such as species and tree ages, compiled by the local landscaping bureau of Handan City. The species composition, age distribution, ownership status, growth status, and diversity were comprehensively analyzed. Statistically, Styphnolobium, Acacia, Gleditsia, and Albizia of the Fabaceae accounted for the majority, with Sophora japonica the largest share; Sophora japonica is widely and intensively distributed across the counties and districts of Handan City. By age and distribution, the old big trees over 1,000 years old were mainly Sophora japonica, Zelkova serrata, Juniperus chinensis, Morus australis Koidz., Dalbergia hupeana Hance, Ceratonia siliqua L., Pistacia chinensis, and Platycladus orientalis. Second, the various states of old big trees were investigated, along with the protection management system, process, and benefits, and the protection of old big trees was found to be closely related to their growth environment. Currently, the main driving force behind protection is the veneration of old big trees: by investing the trees with sacredness and sublimating the natural character nature gave them into a guiding consciousness for social activities, nature's "beauty" and humanity's "goodness" are combined. The state of protection is closely related to the degree of interaction with the surrounding environment and to the participation of diverse cultures and actors.
As old big trees interact continuously with their surroundings over their long growth, natural sanctuaries appear to have formed around them through the voluntary establishment of a "natural-cultural-landscape" system involving bottom-up and top-down processes across regions, cultures, and actors. Third, China has focused on protecting and restoring old big trees, but the protection management system is weak because the historical and cultural values, plant-diversity significance, and social values of old big trees are not comprehensively considered in the management process. Three groups of indicators, namely regional characteristics of the space, property and protection characteristics, and value characteristics, can be used in evaluating the natural characteristics of old big trees, which are highly valuable for traditional consciousness management, resource protection practice, belief system construction, and the realization of life-community values. A systematic management system is needed to determine whether they can be protected and developed over the long term. Fourth, as the perception of protected areas is not yet mature in China, the "natural sanctuary" should be treated as an important research topic in establishing the nature reserve system. Natural sanctuary management, which centers on bottom-up community participation, is a strong complement to China's current top-down nature reserve management. On this basis, the protection of old big trees should be incorporated into the nature reserve system as a form of nature reserve called a natural monument, and residents around nature reserves should be among the main agents of biodiversity conservation.

Evaluation of Radiation Exposure to Medical Staff except Nuclear Medicine Department (핵의학 검사 시행하는 환자에 의한 병원 종사자 피폭선량 평가)

  • Lim, Jung Jin;Kim, Ha Kyoon;Kim, Jong Pil;Jo, Sung Wook;Kim, Jin Eui
    • The Korean Journal of Nuclear Medicine Technology
    • /
    • v.20 no.2
    • /
    • pp.32-35
    • /
    • 2016
  • Purpose: The goal of this study is to determine the extent to which medical staff outside the Nuclear Medicine Department are exposed to radiation from patients undergoing nuclear medicine examinations. Materials and Methods: A total of 250 patients (Bone scan 100, Myocardial SPECT 100, PET/CT 50) were involved from July to October 2015, and the dose rate from each patient was measured twice: first right after injection of the radiopharmaceutical, and again after the examination. Results: For the bone scan, dose rates were 0.0278 ± 0.0036 mSv/h after injection and 0.0060 ± 0.0018 mSv/h after examination (3 h 52 min after injection on average). For myocardial SPECT, dose rates were 0.0245 ± 0.0027 mSv/h after injection and 0.0123 ± 0.0041 mSv/h after examination (2 h 09 min after injection on average). Lastly, for PET/CT, the dose rate was 0.0439 ± 0.0087 mSv/h after examination (68 minutes after injection on average). Conclusion: Compared against the Nuclear Safety Commission Act, there was no significant harmful effect from exposure to patients who had been administered radiopharmaceuticals. However, we should still strive to keep the ALARA (as low as reasonably achievable) principle for radiation protection.
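A quick decay calculation puts the bone scan numbers in context. Assuming the tracer is Tc-99m (half-life about 6.01 h, the usual bone scan radionuclide, though the abstract does not name it), physical decay alone would predict a higher residual dose rate than was measured, implying additional biological clearance:

```python
import math

# Physical-decay-only prediction for the bone scan case: starting from the
# 0.0278 mSv/h dose rate right after injection, what would pure radioactive
# decay of Tc-99m (half-life ~6.01 h, an assumption not stated in the
# abstract) leave after the average 3 h 52 min delay?
half_life_h = 6.01
elapsed_h = 3 + 52 / 60
rate_0 = 0.0278  # mSv/h, measured after injection

rate_physical = rate_0 * 0.5 ** (elapsed_h / half_life_h)
print(round(rate_physical, 4))  # ~0.0178 mSv/h

# The measured post-examination value was 0.0060 mSv/h, well below this
# prediction: biological excretion removes activity faster than decay alone.
```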

Dose Verification Study of Brachytherapy Plans Using Monte Carlo Methods and CT Images (CT 영상 및 몬테칼로 계산에 기반한 근접 방사선치료계획의 선량분포 평가 방법 연구)

  • Cheong, Kwang-Ho;Lee, Me-Yeon;Kang, Sei-Kwon;Bae, Hoon-Sik;Park, So-Ah;Kim, Kyoung-Joo;Hwang, Tae-Jin;Oh, Do-Hoon
    • Progress in Medical Physics
    • /
    • v.21 no.3
    • /
    • pp.253-260
    • /
    • 2010
  • Most brachytherapy treatment planning systems employ a dosimetry formalism based on the AAPM TG-43 report, which does not appropriately consider tissue heterogeneity. In this study we aimed to set up a simple Monte Carlo-based verification platform for intracavitary high-dose-rate brachytherapy (IC-HDRB) plans, focusing particularly on the robustness of direct Monte Carlo dose calculation using material and density information derived from CT images. CT images of slab phantoms and of a uterine cervical cancer patient were used for brachytherapy plans based on the Plato (Nucletron, Netherlands) brachytherapy planning system. Monte Carlo simulations were implemented using the parameters from the Plato system and compared with EBT film dosimetry and conventional dose computations. The EGSnrc-based DOSXYZnrc code was used for the Monte Carlo simulations. Each ¹⁹²Ir source of the afterloader was approximately modeled as a parallelepiped inside the converted CT data set, whose voxel size was 2 × 2 × 2 mm³. Brachytherapy dose calculations based on TG-43 showed good agreement with the Monte Carlo results in homogeneous media whose density was close to water, but there were significant errors in high-density materials. For the patient case, the A- and B-point dose differences were less than 3%, while the mean dose discrepancy was as much as 5%. Conventional dose computation methods might underdose targets by not accounting for the effects of high-density materials. The proposed platform was shown to be feasible and to have good dose calculation accuracy. One should be careful when confirming a plan using a conventional brachytherapy dose computation method, and an independent dose verification system such as the one developed in this study can be helpful.
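Comparing a TG-43 grid against a Monte Carlo grid voxel by voxel reduces to a percent-difference map; the grids below are synthetic placeholders, not the study's doses.

```python
import numpy as np

rng = np.random.default_rng(4)

# Two dose grids on the same voxel geometry: a "Monte Carlo" reference and a
# conventional calculation that deviates from it by ~3% per voxel (synthetic).
mc = 1.0 + 0.1 * rng.random((20, 20, 20))             # reference dose grid
tg43 = mc * (1.0 + 0.03 * rng.normal(size=mc.shape))  # calc with ~3% deviations

# Voxel-wise percent difference relative to the Monte Carlo reference.
pct_diff = 100.0 * (tg43 - mc) / mc
print(round(float(np.mean(np.abs(pct_diff))), 2), "% mean absolute difference")
```

Point-dose comparisons (such as at the A and B points) are just this map sampled at specific voxels, while the "mean dose discrepancy" in the abstract corresponds to an average over a region of interest.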

Earthquake Monitoring : Future Strategy (지진관측 : 미래 발전 전략)

  • Chi, Heon-Cheol;Park, Jung-Ho;Kim, Geun-Young;Shin, Jin-Soo;Shin, In-Cheul;Lim, In-Seub;Jeong, Byung-Sun;Sheen, Dong-Hoon
    • Geophysics and Geophysical Exploration
    • /
    • v.13 no.3
    • /
    • pp.268-276
    • /
    • 2010
  • The Earthquake Hazard Mitigation Law entered into force in March 2009. Under the law, the obligation to monitor the effect of earthquakes on facilities was extended to many organizations, such as gas companies and local governments. Based on the estimate of the National Emergency Management Agency (NEMA), the number of free-surface acceleration stations would expand to more than 400. The advent of the internet protocol and more simplified operation have allowed quick and easy installation of seismic stations. In addition, the dynamic range of seismic instruments has continuously improved, enough to evaluate damage intensity and to issue alarms directly for earthquake hazard mitigation. For direct visualization of damage intensity and affected area, Real Time Intensity COlor Mapping (RTICOM) is explained in detail; RTICOM is used to retrieve the essential quantity for damage evaluation, Peak Ground Acceleration (PGA). Destructive earthquake damage is usually due to surface waves, which follow just behind the S wave, and the peak amplitude of the surface wave can be pre-estimated from the amplitude and frequency content of the first-arrival P wave. An Earthquake Early Warning (EEW) system is conventionally defined as estimating local magnitude from the P wave. The status of EEW is reviewed, and its application is illustrated with a ShakeMap of the Odesan earthquake. In terms of rapidity, the earthquake announcements of the Korea Meteorological Administration (KMA) could be dramatically improved by the adoption of EEW. To realize hazard mitigation, EEW should be applied to crucial local facilities such as nuclear power plants and fragile semiconductor plants; distributed EEW is introduced with the example of its application to the Uljin earthquake. For both nationwide and locally distributed EEW applications, all relevant information needs to be shared in real time.
The plan for extending the Korea Integrated Seismic System (KISS) is briefly explained with a view to future cooperation in data sharing and utilization.
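The PGA-to-intensity mapping behind an RTICOM-style display can be sketched with one published regression; the Wald et al. (1999) relation below is an example choice, not necessarily the one used by the system described here.

```python
import math

def pga_to_intensity(pga_gal):
    """Map peak ground acceleration (gal = cm/s^2) to instrumental intensity.

    Uses the Wald et al. (1999) relation MMI = 3.66*log10(PGA) - 1.66, valid
    roughly for MMI 5-8. An RTICOM-style map would color-code such values
    per station in real time; the choice of regression is an assumption here.
    """
    return 3.66 * math.log10(pga_gal) - 1.66

for pga in (10, 50, 100, 250):
    print(pga, round(pga_to_intensity(pga), 1))
```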

Anisotropic radar crosshole tomography and its applications (이방성 레이다 시추공 토모그래피와 그 응용)

  • Kim Jung-Ho;Cho Seong-Jun;Yi Myeong-Jong
    • Korean Society of Earth and Exploration Geophysicists: Conference Proceedings
    • /
    • 2005.09a
    • /
    • pp.21-36
    • /
    • 2005
  • Although the main geology of Korea consists of granite and gneiss, it is not uncommon to encounter anisotropy phenomena in crosshole radar tomography even when the basement is crystalline rock. To solve the anisotropy problem, we have developed and continuously upgraded an anisotropic inversion algorithm that assumes heterogeneous elliptic anisotropy and reconstructs three kinds of tomograms: tomograms of maximum and minimum velocity, and of the direction of the symmetry axis. In this paper, we discuss the developed algorithm and introduce some case histories of the application of anisotropic radar tomography in Korea. The first two case histories were conducted for infrastructure construction, with the main objective of locating cavities in limestone; the last two were performed in granite and gneiss areas. The anisotropy in the granite area was caused by fine fissures aligned in the same direction, while that in the gneiss and limestone areas was caused by the alignment of the constituent minerals. Through these case histories we show that the anisotropic characteristics themselves give additional important information for understanding the internal state of basement rock. In particular, the anisotropy ratio, defined by the normalized difference between maximum and minimum velocity, as well as the direction of maximum velocity, are helpful in interpreting the borehole radar tomogram.
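The elliptic-anisotropy quantities the paper inverts for can be sketched as follows; the velocity parameterization and the mean-based normalization of the anisotropy ratio are common conventions assumed here, and the numbers are illustrative.

```python
import math

def velocity(theta_deg, v_max, v_min, axis_deg):
    """Direction-dependent velocity under elliptic anisotropy.

    theta_deg is the ray direction, axis_deg the symmetry (fast) axis.
    One common parameterization; the paper's exact formulation may differ.
    """
    psi = math.radians(theta_deg - axis_deg)
    return math.sqrt((v_max * math.cos(psi)) ** 2 + (v_min * math.sin(psi)) ** 2)

def anisotropy_ratio(v_max, v_min):
    """Normalized max-min velocity difference (normalized here by the mean,
    an assumed convention)."""
    return (v_max - v_min) / ((v_max + v_min) / 2)

# Illustrative values for the three reconstructed quantities per cell:
v_max, v_min, axis = 120.0, 100.0, 30.0
print(round(velocity(30.0, v_max, v_min, axis), 1))   # along the fast axis
print(round(anisotropy_ratio(v_max, v_min), 3))
```

In an anisotropic tomogram each cell carries all three parameters, so the ratio and the fast-axis direction can be mapped alongside velocity to reveal aligned fissures or mineral fabric.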

Estimation of the Accuracy of Genomic Breeding Value in Hanwoo (Korean Cattle) (한우의 유전체 육종가의 정확도 추정)

  • Lee, Seung Soo;Lee, Seung Hwan;Choi, Tae Jeong;Choy, Yun Ho;Cho, Kwang Hyun;Choi, You Lim;Cho, Yong Min;Kim, Nae Soo;Lee, Jung Jae
    • Journal of Animal Science and Technology
    • /
    • v.55 no.1
    • /
    • pp.13-18
    • /
    • 2013
  • This study was conducted to estimate Genomic Estimated Breeding Values (GEBV) using the Genomic Best Linear Unbiased Prediction (GBLUP) method in a Hanwoo (Korean native cattle) population, with a view to adopting genomic selection in the national Hanwoo evaluation system. Carcass weight (CW), eye muscle area (EMA), backfat thickness (BT), and marbling score (MS) were investigated in 552 Hanwoo progeny-tested steers at the Livestock Improvement Main Center. Animals were genotyped with the Illumina BovineHD BeadChip (777K SNPs). For the statistical analysis, a Genomic Relationship Matrix (GRM) was formulated from the genotypes, and the accuracy of the GEBV was estimated by 10-fold cross-validation. The accuracies estimated by cross-validation were between 0.915 and 0.957. In 534 progeny-tested steers, the maximum differences in GEBV accuracy compared to the conventional EBV for the CW, EMA, BT, and MS traits were 9.56%, 5.78%, 5.78%, and 4.18%, respectively. In 3,674 pedigree-traced bulls, the maximum increases in GEBV accuracy for CW, EMA, BT, and MS were 13.54%, 6.50%, 6.50%, and 4.31%, respectively. This shows that implementing genomic pre-selection of candidate calves for meat production traits could improve genetic gain by increasing accuracy and reducing the generation interval in the Hanwoo genetic evaluation system used to select proven bulls.
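The GRM construction at the heart of GBLUP can be sketched with VanRaden's first method on a tiny synthetic genotype matrix (the paper's data are 552 animals × 777K SNPs; whether this exact GRM variant was used is an assumption).

```python
import numpy as np

rng = np.random.default_rng(5)

# Tiny synthetic genotype matrix: 0/1/2 allele counts per animal per SNP,
# drawn under Hardy-Weinberg proportions. Illustrative scale only.
n_animals, n_snps = 30, 500
p = rng.uniform(0.1, 0.9, n_snps)                    # allele frequencies
M = rng.binomial(2, p, size=(n_animals, n_snps)).astype(float)

# VanRaden method 1: center genotypes by twice the allele frequency and
# scale so the diagonal of G averages ~1 under HWE.
Z = M - 2 * p
denom = 2 * np.sum(p * (1 - p))
G = Z @ Z.T / denom

print(round(float(np.mean(np.diag(G))), 2))
```

In GBLUP this G replaces the pedigree-based numerator relationship matrix in the mixed-model equations, and the cross-validation accuracy reported in the abstract is typically the correlation between predicted GEBV and the reference phenotype or EBV in the held-out fold.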