• Title/Summary/Keyword: System Optimization


The Study on New Radiating Structure with Multi-Layered Two-Dimensional Metallic Disk Array for Shaping Flat-Topped Element Pattern (구형 빔 패턴 형성을 위한 다층 이차원 원형 도체 배열을 갖는 새로운 방사 구조에 대한 연구)

  • 엄순영;스코벨레프;전순익;최재익;박한규
    • The Journal of Korean Institute of Electromagnetic Engineering and Science
    • /
    • v.13 no.7
    • /
    • pp.667-678
    • /
    • 2002
  • In this paper, a new radiating structure with a multi-layered two-dimensional metallic disk array was proposed for shaping a flat-topped element pattern. It is an infinite periodic planar array structure with metallic disks finitely stacked above radiating circular waveguide apertures. The theoretical analysis was performed in detail using rigorous full-wave analysis, based on modal representations for the fields in the partial regions of the array structure and for the currents on the metallic disks. The final system of linear algebraic equations was derived using the orthogonality of vector wave functions, the mode-matching method, boundary conditions, and Galerkin's method, and the unknown modal coefficients needed to calculate the array characteristics were determined by the Gauss elimination method. The application of the algorithm was demonstrated in an array design for shaping flat-topped element patterns of $\pm 20^{\circ}$ beam width in Ka-band. The optimal design parameters, normalized by a wavelength for general applications, are presented; they were obtained through an optimization process based on simulation and design experience. A Ka-band experimental breadboard with nineteen symmetric elements was fabricated to compare simulation results with experimental results. The metallic disk array structure stacked above the radiating circular waveguide apertures was realized using an ion-beam deposition method on thin polymer films. The calculated and measured element patterns of the breadboard were in very close agreement within the beam scanning range. Side lobes and grating lobes were analyzed, and a blindness phenomenon, which may be caused by the multi-layered metallic disk structure at broadside, was discussed. The input VSWR of the breadboard was less than 1.14, and its gains measured at 29.0 GHz, 29.5 GHz, and 30.0 GHz were 10.2 dB, 10.0 dB, and 10.7 dB, respectively. The experimental and simulation results showed that the proposed multi-layered metallic disk array structure can shape an efficient flat-topped element pattern.
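The abstract states that the unknown modal coefficients are found by solving the final linear system with the Gauss elimination method. As an illustration only (not the authors' code), a minimal Gaussian elimination solver with partial pivoting might look like:

```python
import numpy as np

def gauss_eliminate(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        # pivot: bring the largest-magnitude entry of column k to row k
        p = k + np.argmax(np.abs(A[k:, k]))
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # back substitution
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x
```

In practice the paper's system is complex-valued and much larger, but the elimination step is the same idea.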

Optimization of Microbial Production of Ethanol from Carbon Monoxide (미생물을 이용한 일산화탄소로부터 에탄올 생산공정 최적화)

  • 강환구;이충렬
    • KSBB Journal
    • /
    • v.17 no.1
    • /
    • pp.73-79
    • /
    • 2002
  • A method to optimize the microbial production of ethanol from CO using Clostridium ljungdahlii was developed. A kinetic parameter study on CO conversion with Clostridium ljungdahlii was carried out, and a maximum CO conversion rate of 37.14 mmol/L-hr-O.D. and a $K_m$ of 0.9516 atm were obtained. A two-stage fermentation method, consisting of a cell growth stage and an ethanol production stage, was observed to be effective for producing ethanol. When the pH was shifted from 5.5 to 4.5 and ammonium solution was supplied to the culture media as a nitrogen source at the ethanol production stage, the concentration of ethanol produced increased to 20 times that without the shift. Ethanol production from CO in a fermenter with Clostridium ljungdahlii was optimized; the concentration of ethanol produced was 45 g/L and the maximum ethanol productivity was 0.75 g ethanol/L-hr.
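The reported maximum conversion rate and $K_m$ suggest a Michaelis-Menten-type rate law. Assuming that form (the abstract does not give the exact expression), the specific CO uptake rate can be sketched as:

```python
def co_uptake_rate(p_co, v_max=37.14, k_m=0.9516):
    """Michaelis-Menten-type specific CO uptake rate.

    p_co  : CO partial pressure [atm]
    v_max : maximum conversion rate [mmol/L-hr-O.D.] (value from the paper)
    k_m   : half-saturation constant [atm] (value from the paper)
    """
    return v_max * p_co / (k_m + p_co)
```

By construction, the rate reaches half of `v_max` when the CO partial pressure equals `k_m`.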

Optimization of Human Thrombopoietin Production in Insect Cells Using Baculovirus Expression System (베큘로 바이러스 발현 시스템에 의한 곤충세포에서의 인간 트롬보포이에틴 생산 최적화)

  • 고여욱;손미영;박상규;안혜경;박승국;박명환;양재명
    • KSBB Journal
    • /
    • v.13 no.2
    • /
    • pp.181-186
    • /
    • 1998
  • In order to obtain high-level production of recombinant human thrombopoietin (rhTPO) in the insect cell line BTI-TN-5B1-4 (TN5), conditions for optimal rhTPO expression, such as multiplicity of infection (MOI), cell density at infection, harvesting time, and type of culture method as well as growth media, were determined. When TN5 cells were cultured in an anchorage-dependent state in a 60-mm dish, a cell density of $2\times10^6$ cells, an MOI of 10, and harvesting the culture media at 72 hr post-infection were the conditions for the highest rhTPO production. High production of rhTPO was also achieved by using EXPRESS FIVE serum-free media rather than SF900II serum-free media. Anchorage-dependent TN5 cells were adapted to suspension culture when grown in the presence of heparin. TN5 cells were successfully cultured at 0.2 L scale in suspension culture without aggregation. When TN5 cells were cultured in a suspension state, a cell density of $0.6\times10^6$ cells/mL, an MOI of 1, and harvesting the culture media at 72 hr post-infection gave the highest yield of rhTPO.


Game Theoretic Optimization of Investment Portfolio Considering the Performance of Information Security Countermeasure (정보보호 대책의 성능을 고려한 투자 포트폴리오의 게임 이론적 최적화)

  • Lee, Sang-Hoon;Kim, Tae-Sung
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.3
    • /
    • pp.37-50
    • /
    • 2020
  • Information security has become an important issue worldwide. Various information and communication technologies, such as the Internet of Things, big data, cloud, and artificial intelligence, are developing, and the need for information security is increasing. Although the necessity of information security is expanding with the development of information and communication technology, interest in information security investment is insufficient. In general, measuring the effect of information security investment is difficult, so appropriate investment is not being practiced, and organizations are decreasing their information security investment. In addition, since the types and specifications of information security measures are diverse, it is difficult to compare and evaluate information security countermeasures objectively, and there is a lack of decision-making methods for information security investment. To develop an organization, policies and decisions related to information security are essential, and measuring the effect of information security investment is necessary. Therefore, this study proposes a method of constructing an investment portfolio for information security measures using game theory and derives an optimal defence probability. Using a two-person game model, the information security manager and the attacker are assumed to be the players, and the information security countermeasures and information security threats are assumed to be their respective strategies. A zero-sum game, in which the sum of the players' payoffs is zero, is assumed, and we derive a solution of a mixed-strategy game in which a strategy is selected according to a probability distribution over strategies. In the real world, various types of information security threats exist, so multiple information security measures should be considered to maintain an appropriate information security level for information systems.
We assume that the defence ratio of the information security countermeasures is known, and we derive the optimal solution of the mixed-strategy game using linear programming. The contributions of this study are as follows. First, we conduct the analysis using real performance data of information security measures. Information security managers can use the methodology suggested in this study to make practical decisions when establishing an investment portfolio for information security countermeasures. Second, the investment weight of each information security countermeasure is derived. Since we derive the weight of each measure, not just whether it has been invested in, it is easy to construct an information security investment portfolio when investment decisions must consider a number of countermeasures. Finally, the optimal defence probability can be found after constructing the investment portfolio. Information security managers can measure the specific investment effect by selecting countermeasures that fit the organization's information security investment budget. Numerical examples are also presented and the computational results are analyzed. Based on the performance of three information security countermeasures, Firewall, IPS, and Antivirus, data related to information security measures are collected to construct a portfolio. The defence ratio of each countermeasure is created using a uniform distribution, and the coverage of its performance is derived based on the report of each countermeasure.
According to numerical examples that considered Firewall, IPS, and Antivirus as information security countermeasures, the investment weights of Firewall, IPS, and Antivirus are optimized to 60.74%, 39.26%, and 0%, respectively. The result shows that the defence probability of the organization is maximized to 83.87%. When the methodology and examples of this study are used in practice, information security managers can consider various types of information security measures, and the appropriate investment level of each measure can be reflected in the organization's budget.
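The defender's side of a mixed-strategy zero-sum game can be written as a linear program: maximize the game value v subject to the expected payoff against every attacker strategy being at least v. A sketch with SciPy (the payoff matrix and test values below are hypothetical, not the paper's data):

```python
import numpy as np
from scipy.optimize import linprog

def solve_defender_mixed_strategy(payoff):
    """payoff[i, j]: defender's payoff (e.g. defence effectiveness) when
    countermeasure i faces threat j.  Returns (weights, game value)."""
    m, n = payoff.shape
    # variables: x_1..x_m (mixing weights) and v (game value); minimize -v
    c = np.r_[np.zeros(m), -1.0]
    # for every threat j: -(sum_i x_i * payoff[i, j]) + v <= 0
    A_ub = np.c_[-payoff.T, np.ones(n)]
    b_ub = np.zeros(n)
    # weights must sum to one
    A_eq = np.r_[np.ones(m), 0.0].reshape(1, -1)
    b_eq = [1.0]
    bounds = [(0, None)] * m + [(None, None)]
    res = linprog(c, A_ub, b_ub, A_eq, b_eq, bounds=bounds)
    return res.x[:m], res.x[m]
```

For example, in the classic matching-pennies payoff matrix the optimal defender strategy mixes both options equally and the game value is zero.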

Optimal Operation of Gas Engine for Biogas Plant in Sewage Treatment Plant (하수처리장 바이오가스 플랜트의 가스엔진 최적 운영 방안)

  • Kim, Gill Jung;Kim, Lae Hyun
    • Journal of Energy Engineering
    • /
    • v.28 no.2
    • /
    • pp.18-35
    • /
    • 2019
  • The Korea District Heating Corporation operates a 1,500 kW gas engine generator using $4,500 m^3/day$ of biogas generated from the sewage treatment plant of the Nanji Water Recycling Center. However, actual operating experience with the biogas power plant is insufficient, and due to a lack of accumulated technology and know-how, frequent breakdowns and stoppages of the gas engine cause considerable economic loss. Therefore, it is necessary to prepare fundamental technical measures for stable operation of the power plant. In this study, a series of process problems of the gas engine plant using the biogas generated in the sewage treatment plant of the Nanji Water Recycling Center were identified, and actual operation was optimized by minimizing the problems at each step. To purify the gas, which is the main cause of failure stops, quality standards for the adsorption capacity of the activated carbon were established through component analysis and adsorption tests on the activated carbon currently in use. In addition, standards for the replacement cycle of activated carbon to minimize impurities, a strengthened measurement period for hydrogen sulfide, localization of activated carbon, and strengthened and improved plant operation standards were applied to actual operation. As a result, the operating performance of gas engine #1 increased by 530% and that of the second engine by 250%. In addition, improvement of the vent line equipment reduced the work process and increased normal operation time and the operation rate. In terms of economic efficiency, it also showed a sales increase of KRW 77,000/year. By applying the strengthened and improved operating standards, it is possible to reduce stoppages of the biogas plant and increase its utilization rate, which is judged to be a sound operational plan.

Anomaly Detection for User Action with Generative Adversarial Networks (적대적 생성 모델을 활용한 사용자 행위 이상 탐지 방법)

  • Choi, Nam woong;Kim, Wooju
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.3
    • /
    • pp.43-62
    • /
    • 2019
  • At one time, the anomaly detection field was dominated by methods that determined whether an abnormality existed based on statistics derived from specific data. This methodology was possible because data dimensions were simple in the past, so classical statistical methods worked effectively. However, as data characteristics have grown complex in the era of big data, it has become more difficult to accurately analyze and predict data generated throughout industry in the conventional way. Therefore, supervised learning algorithms based on SVMs and decision trees were used. However, supervised learning-based models can only predict test data accurately when the classes are balanced, and most data generated in industry has imbalanced classes, so the predicted results are not always valid when a supervised learning model is applied. To overcome these drawbacks, many studies now use unsupervised learning-based models that are not influenced by class distribution, such as autoencoders or generative adversarial networks. In this paper, we propose a method to detect anomalies using generative adversarial networks. AnoGAN, introduced by Schlegl et al. (2017), is a classification model that performs anomaly detection on medical images; it is composed of a convolutional neural network and is used in the detection field. On the other hand, research on sequence-data anomaly detection using generative adversarial networks is scarce compared with image data. Li et al. (2018) proposed a model using LSTM, a type of recurrent neural network, to classify anomalies in numerical sequence data, but it has not been applied to categorical sequence data, nor has the feature matching method of Salimans et al. (2016).
This suggests that a number of studies remain to be attempted on anomaly detection for sequence data through generative adversarial networks. To learn the sequence data, the generative adversarial network is built from LSTMs: the generator's two stacked LSTMs use 32-dim and 64-dim hidden unit layers, and the discriminator's LSTM uses a 64-dim hidden unit layer. In existing work on anomaly detection for sequence data, the entropy of the probability of the actual data is used to derive anomaly scores, but in this paper, as mentioned earlier, anomaly scores are derived using the feature matching technique. In addition, the process of optimizing the latent variables was designed with an LSTM to improve model performance. The modified generative adversarial model was more accurate than the autoencoder in all experiments in terms of precision and was approximately 7% higher in accuracy. In terms of robustness, the generative adversarial network also performed better than the autoencoder: because generative adversarial networks can learn the data distribution from real categorical sequence data, they are unaffected by a single normal pattern, whereas the autoencoder is not. The robustness test showed that the accuracy of the autoencoder was 92% and that of the generative adversarial network was 96%; in terms of sensitivity, the autoencoder reached 40% and the generative adversarial network 51%. Experiments were also conducted to show how much performance changes due to differences in the latent-variable optimization structure; as a result, sensitivity improved by about 1%. These results offer a new perspective on optimizing latent variables, which had been treated as relatively insignificant.
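The feature-matching anomaly score described above can be sketched in outline. The arrays below are hypothetical placeholders for the LSTM discriminator's intermediate features; combining a reconstruction residual with a feature-matching term follows the AnoGAN convention, and the weighting is an assumption, not necessarily the paper's exact formula:

```python
import numpy as np

def anomaly_score(x, x_rec, f_x, f_rec, lam=0.1):
    """AnoGAN-style anomaly score with a feature-matching term.

    x, x_rec   : real sequence and its generator reconstruction
    f_x, f_rec : discriminator feature vectors for each
    lam        : mixing weight (hypothetical value)
    """
    residual = np.sum(np.abs(x - x_rec))    # reconstruction error
    matching = np.sum(np.abs(f_x - f_rec))  # feature-matching error
    return (1 - lam) * residual + lam * matching
```

A sequence the generator can reconstruct well (with matching discriminator features) scores near zero; anomalous sequences score higher.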

A Recidivism Prediction Model Based on XGBoost Considering Asymmetric Error Costs (비대칭 오류 비용을 고려한 XGBoost 기반 재범 예측 모델)

  • Won, Ha-Ram;Shim, Jae-Seung;Ahn, Hyunchul
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.1
    • /
    • pp.127-137
    • /
    • 2019
  • Recidivism prediction has been a subject of constant research by experts since the early 1970s, but it has become more important as crimes committed by recidivists steadily increase. In particular, in the 1990s, after the US and Canada adopted the 'Recidivism Risk Assessment Report' as a decisive criterion during trials and parole screening, research on recidivism prediction became more active. In the same period, empirical studies on recidivism factors were also started in Korea. Although most recidivism prediction studies have so far focused on the factors of recidivism or the accuracy of recidivism prediction, it is important to minimize the misclassification cost, because recidivism prediction has an asymmetric error cost structure. In general, the cost of misclassifying people who will not recidivate as likely to recidivate is lower than the cost of misclassifying people who will recidivate as unlikely to: the former only adds monitoring costs, while the latter incurs social and economic costs. Therefore, in this paper, we propose a recidivism prediction model based on XGBoost (eXtreme Gradient Boosting; XGB) that considers asymmetric error costs. In the first step of the model, XGB, recognized as a high-performance ensemble method in the field of data mining, was applied, and its results were compared with various prediction models such as LOGIT (logistic regression analysis), DT (decision trees), ANN (artificial neural networks), and SVM (support vector machines). In the next step, the threshold is optimized to minimize the total misclassification cost, a weighted average of the FNE (False Negative Error) and FPE (False Positive Error). To verify the usefulness of the model, it was applied to a real recidivism prediction dataset. As a result, the XGB model not only showed better prediction accuracy than the other models but also reduced the misclassification cost most effectively.
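The threshold-optimization step can be sketched as a grid search over classification thresholds that weights false negatives (missed recidivists) more heavily than false positives. The cost weights and data below are hypothetical, not the paper's:

```python
import numpy as np

def optimal_threshold(y_true, p_pred, cost_fn=5.0, cost_fp=1.0):
    """Pick the threshold that minimizes total misclassification cost.

    y_true  : binary labels (1 = recidivated)
    p_pred  : predicted recidivism probabilities from the classifier
    cost_fn : cost of a false negative (hypothetical weight)
    cost_fp : cost of a false positive (hypothetical weight)
    """
    thresholds = np.linspace(0.01, 0.99, 99)
    costs = []
    for t in thresholds:
        y_hat = (p_pred >= t).astype(int)
        fn = np.sum((y_true == 1) & (y_hat == 0))
        fp = np.sum((y_true == 0) & (y_hat == 1))
        costs.append(cost_fn * fn + cost_fp * fp)
    return thresholds[int(np.argmin(costs))]
```

With a high false-negative cost, the chosen threshold shifts below 0.5, flagging more borderline cases as likely recidivists.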

Economic Impact of HEMOS-Cloud Services for M&S Support (M&S 지원을 위한 HEMOS-Cloud 서비스의 경제적 효과)

  • Jung, Dae Yong;Seo, Dong Woo;Hwang, Jae Soon;Park, Sung Uk;Kim, Myung Il
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.10 no.10
    • /
    • pp.261-268
    • /
    • 2021
  • Cloud computing is a computing paradigm in which users utilize computing resources in a pay-as-you-go manner. In a cloud system, resources can be dynamically scaled up and down on demand, so the total cost of ownership can be reduced. Modeling and Simulation (M&S) is a well-known simulation-based method for obtaining engineering analyses and results through CAE software without actual experiments. In general, M&S is utilized in Finite Element Analysis (FEA), Computational Fluid Dynamics (CFD), Multibody Dynamics (MBD), and optimization. The M&S work procedure is divided into pre-processing, analysis, and post-processing steps. Pre/post-processing are GPU-intensive jobs consisting of 3D modeling via CAE software, whereas analysis is CPU- or GPU-intensive. Because a general-purpose desktop needs a long time to analyze complicated 3D models, CAE software requires a high-end CPU- and GPU-based workstation; in other words, executing M&S requires high-performance computing resources. To mitigate the cost of equipping such substantial computing resources, we propose the HEMOS-Cloud service, an integrated cloud and cluster computing environment. The HEMOS-Cloud service provides CAE software and computing resources to users who want to experience M&S in business or academia. In this paper, the economic ripple effect of the HEMOS-Cloud service was analyzed using inter-industry (input-output) analysis. The estimated results, using expert-guided coefficients, are a production inducement effect of KRW 7.4 billion, a value-added effect of KRW 4.1 billion, and an employment-inducing effect of 50 persons per KRW 1 billion.
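In input-output analysis, the production inducement effect is conventionally computed with the Leontief inverse, x = (I - A)^(-1) d, where A is the technical coefficient matrix and d is the final-demand change. A sketch with hypothetical coefficients (the paper's actual coefficient matrix is not given in the abstract):

```python
import numpy as np

def production_inducement(A, delta_demand):
    """Leontief inter-industry model: total production induced across all
    sectors by a final-demand change, x = (I - A)^(-1) d.

    A            : technical coefficient matrix (hypothetical values in test)
    delta_demand : final-demand change vector per sector
    """
    n = A.shape[0]
    # solving (I - A) x = d is numerically preferable to forming the inverse
    return np.linalg.solve(np.eye(n) - A, delta_demand)
```

The induced production exceeds the direct demand change because each sector's output requires intermediate inputs from the others.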

Are you a Machine or Human?: The Effects of Human-likeness on Consumer Anthropomorphism Depending on Construal Level (Are you a Machine or Human?: 소셜 로봇의 인간 유사성과 소비자 해석수준이 의인화에 미치는 영향)

  • Lee, Junsik;Park, Do-Hyung
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.1
    • /
    • pp.129-149
    • /
    • 2021
  • Recently, interest in social robots that can interact socially with humans is increasing. Thanks to the development of ICT, it has become easier for social robots to provide personalized services and emotional connection to individuals, and the role of social robots is drawing attention as a means to address modern social problems and the resulting decline in individuals' quality of life. Along with this interest, the spread of social robots is also increasing significantly. Many companies are introducing robot products to target various markets, but so far no clear trend is leading the market. Accordingly, there are more and more attempts to differentiate robots through the design of social robots. In particular, anthropomorphism has been an important topic in social robot design, and many approaches have attempted to anthropomorphize social robots to produce positive effects. However, there is a lack of research that systematically describes the mechanism by which anthropomorphism of social robots is formed. Most existing studies have focused on verifying the positive effects of the anthropomorphism of social robots on consumers. In addition, the formation of anthropomorphism may vary depending on an individual's motivation or temperament, but few studies have examined this. A vague understanding of anthropomorphism makes it difficult to derive optimal design points for shaping the anthropomorphism of social robots. The purpose of this study is to verify the mechanism by which the anthropomorphism of social robots is formed. This study confirmed the effect of the human-likeness of social robots (within subjects) and the construal level of consumers (between subjects) on the formation of anthropomorphism through an experiment with a 3×2 mixed design.
Research hypotheses on the mechanism by which anthropomorphism is formed were presented, and the hypotheses were verified by analyzing data from a sample of 206 people. The first hypothesis is that the higher the human-likeness of the robot, the higher the level of anthropomorphism for the robot. Hypothesis 1 was supported by a one-way repeated-measures ANOVA and a post hoc test. The second hypothesis is that the effect of human-likeness on the level of anthropomorphism will differ depending on the construal level of consumers. First, this study predicts that the difference in the level of anthropomorphism as human-likeness increases will be greater under the high construal condition than under the low construal condition. Second, if the robot has no human-likeness, there will be no difference in the level of anthropomorphism according to construal level. Third, if the robot has low human-likeness, the low construal level condition will make the robot more anthropomorphic than the high construal level condition. Finally, if the robot has high human-likeness, the high construal level condition will make the robot more anthropomorphic than the low construal level condition. We performed a two-way repeated-measures ANOVA to test these hypotheses and confirmed that the interaction effect of human-likeness and construal level was significant. Further analysis examining the interaction effect specifically also supported our hypotheses. The analysis shows that the human-likeness of the robot increases the level of anthropomorphism of social robots, and that the effect of human-likeness on anthropomorphism varies depending on the construal level of consumers. This study has implications in that it explains the mechanism by which anthropomorphism is formed by considering human-likeness, a design attribute of social robots, and construal level, an individual's way of thinking.
We expect the findings of this study to serve as a basis for design optimization for forming anthropomorphism in social robots.

K-DEV: A Borehole Deviation Logging Probe Applicable to Steel-cased Holes (철재 케이싱이 설치된 시추공에서도 적용가능한 공곡검층기 K-DEV)

  • Yoonho, Song;Yeonguk, Jo;Seungdo, Kim;Tae Jong, Lee;Myungsun, Kim;In-Hwa, Park;Heuisoon, Lee
    • Geophysics and Geophysical Exploration
    • /
    • v.25 no.4
    • /
    • pp.167-176
    • /
    • 2022
  • We designed a borehole deviation survey tool applicable to steel-cased holes, K-DEV, and developed a prototype for depths down to 500 m, aiming to develop our own equipment for securing deep subsurface characterization technologies. K-DEV is equipped with sensors that provide digital output with verified high performance, and it is compatible with the logging winch systems used in Korea. The K-DEV prototype has a nonmagnetic stainless steel housing with an outer diameter of 48.3 mm, which has been laboratory-tested for water resistance up to 20 MPa and for durability by running it into a 1-km-deep borehole. We confirmed the operational stability and data repeatability of the prototype by constantly logging up and down to a depth of 600 m. A high-precision micro-electro-mechanical system (MEMS) gyroscope was used as the gyro sensor for the K-DEV prototype, which is crucial for azimuth determination in cased holes. Additionally, we devised an accurate trajectory survey algorithm employing unscented Kalman filtering and data fusion for optimization. A borehole test with K-DEV and a commercial logging tool produced sufficiently similar results. Furthermore, the issue of error accumulation due to the MEMS gyro's drift over time was successfully overcome by compensating with stationary measurements of the same attitude at the wellhead before and after logging, as demonstrated by results nearly identical to those in the open hole. We believe these test applications confirmed the K-DEV development methodology, the operational stability, and the data reliability of the prototype.
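The drift compensation described above can be sketched under the assumption of a linear drift model; the abstract only states that stationary wellhead measurements of the same attitude before and after logging were used, so the linear interpolation here is an illustrative assumption:

```python
import numpy as np

def compensate_drift(t, azimuth, az_start, az_end):
    """Remove gyro drift from logged azimuths using two stationary
    wellhead measurements of the same attitude: az_start before logging
    and az_end after.  Assumes the drift grows linearly in time.

    t       : measurement times over the logging run
    azimuth : logged azimuth values [deg]
    """
    drift_rate = (az_end - az_start) / (t[-1] - t[0])  # deg per unit time
    return azimuth - drift_rate * (t - t[0])
```

Since the tool's true attitude at the wellhead is identical before and after the run, any difference between the two stationary readings is attributable to accumulated drift.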