• Title/Summary/Keyword: 정보 공정성 (informational fairness)

Search Results: 1,477

A Study on Optimal PID Controller Design Ensuring Absolute Stability (절대안정도를 보장하는 최적 PID 제어기 설계에 관한 연구)

  • Cho, Joon-Ho
    • Journal of Convergence for Information Technology
    • /
    • v.11 no.2
    • /
    • pp.124-129
    • /
    • 2021
  • In this paper, an optimal PID controller design that guarantees absolute stability is proposed. The procedure first determines whether the process model includes a delay time; if it does, the delay is approximated by the Pade approximation method. Next, the open-loop transfer function of the process model and the controller is obtained, and the absolute stability interval is calculated by the Routh-Hurwitz criterion. In the last step, the optimal Proportional-Integral-Derivative (PID) parameter values are calculated using a genetic algorithm that searches within the interval obtained in the previous step. As a result, it was confirmed that the proposed method guarantees stability and, in designing an optimal controller, is superior to the existing method in the performance index. Studying a compensation method for the delay time in the future is expected to yield even better performance indices.
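As a rough illustration of the stability-screening step, the sketch below builds a Routh array and checks that every first-column entry is positive; the polynomial coefficients are hypothetical stand-ins, not taken from the paper's process model or controller.

```python
import numpy as np

def routh_hurwitz_stable(coeffs):
    """Return True if all roots of the polynomial (descending coefficients,
    leading coefficient assumed positive) lie in the left half-plane,
    using the first-column sign test of the Routh array."""
    c = np.trim_zeros(np.asarray(coeffs, dtype=float), 'f')
    n = len(c)
    m = (n + 1) // 2
    table = np.zeros((n, m))
    table[0, :len(c[0::2])] = c[0::2]   # row of even-indexed coefficients
    table[1, :len(c[1::2])] = c[1::2]   # row of odd-indexed coefficients
    for i in range(2, n):
        for j in range(m - 1):
            a, b = table[i - 2, 0], table[i - 1, 0]
            if b == 0:
                b = 1e-12               # epsilon substitution for a zero pivot
            table[i, j] = (b * table[i - 2, j + 1] - a * table[i - 1, j + 1]) / b
    return bool(np.all(table[:, 0] > 0))

# hypothetical closed-loop characteristic polynomials
print(routh_hurwitz_stable([1, 3, 3, 1]))   # (s + 1)^3 -> True (stable)
print(routh_hurwitz_stable([1, 1, 1, 10]))  # first-column sign change -> False
```

In the paper's flow, a sweep of this check over candidate PID gains would delimit the absolute stability interval inside which the genetic algorithm then searches.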

A Study on the Multilateral Discussion Trends of Service Trade Agreement (서비스무역규범의 다자간 논의 동향 고찰)

  • Jeong, Hee-Jin;Jang, Eun-Hee
    • Journal of Convergence for Information Technology
    • /
    • v.12 no.4
    • /
    • pp.270-277
    • /
    • 2022
  • Today, owing to the servicization of the economy, the volume and standing of global trade in services have gradually increased. Stable and fair trade can be achieved through solid international trade law. Multilateral discussions on service trade agreements had been stagnant, but have recently shown considerable outcomes. The General Agreement on Trade in Services (GATS) deals with the various service trade barriers in member countries that hinder free trade in services. Recently, a group of WTO members established the 「Joint Initiative on Services Domestic Regulation」, which aims to ensure that licensing and qualification requirements and procedures, as well as technical standards, do not constitute unnecessary barriers to services trade. This study seeks to understand the types and statistics of service trade barriers that affect actual service transactions, and at the same time to examine the progress of multilateral discussions on service trade agreements aimed at resolving those barriers.

Extraction of Primary Factors Influencing Dam Operation Using Factor Analysis (요인분석 통계기법을 이용한 댐 운영에 대한 영향 요인 추출)

  • Kang, Min-Goo;Jung, Chan-Yong;Lee, Gwang-Man
    • Journal of Korea Water Resources Association
    • /
    • v.40 no.10
    • /
    • pp.769-781
    • /
    • 2007
  • Factor analysis is usually employed to reduce the quantity of data and to summarize information on a system or phenomenon. In this methodology, variables are grouped into several factors in consideration of their statistical characteristics, and the results are used to drop variables that carry less weight than the others. In this study, factor analysis was applied to extract the primary factors influencing multi-dam system operation in the Han River basin, which contains two multi-purpose dams, Soyanggang Dam and Chungju Dam, whose water supply has been operated in an integrated manner during the water use season. To perform the factor analysis, the variables related to the operation of the two dams were first gathered and divided into five groups (Soyanggang Dam: inflow, hydropower production, storage management, storage, and past operation results; Chungju Dam: inflow, hydropower production, water demand, storage, and past operation results). Then, considering their statistical properties, some of the gathered variables were chosen and grouped into five factors: hydrological condition, past dam operation, dam operation in the normal season, water demand, and downstream dam operation. To check the appropriateness and applicability of the factors, a multiple regression equation was constructed using the factors as explanatory variables, and the factors were compared with the terms of the objective function used for operating water resources optimally in a river basin. The results of these two checks showed that the suggested approach provides satisfactory results, and the extracted primary factors are expected to be useful for making dam operation schedules that consider both future situations and previous results.
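As a toy illustration of the grouping step (the paper's dam data are not reproduced here), the sketch below fits a two-factor model to synthetic variables driven by two latent factors; variables group onto a factor according to the magnitude of their loadings.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# synthetic stand-ins for operation variables (hypothetical): six observed
# variables generated from two latent factors plus small noise
latent = rng.normal(size=(200, 2))
loadings = np.array([[1.0, 0.0], [0.9, 0.1], [0.8, 0.0],
                     [0.0, 1.0], [0.1, 0.9], [0.0, 0.8]])
X = latent @ loadings.T + 0.1 * rng.normal(size=(200, 6))

fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(X)
# each column of the estimated loading matrix shows which factor a
# variable belongs to; low-loading variables would be candidates to drop
print(np.round(fa.components_, 2))
```

In the paper's setting, the rows of the recovered loading matrix would correspond to the five interpreted factors (hydrological condition, past operation, and so on).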

The Comparative Understanding between Red Ginseng and White Ginsengs, Processed Ginsengs (Panax ginseng C. A. Meyer) (홍삼과 백삼의 비교 고찰)

  • Nam, Ki-Yeul
    • Journal of Ginseng Research
    • /
    • v.29 no.1
    • /
    • pp.1-18
    • /
    • 2005
  • Ginseng Radix, the root of Panax ginseng C. A. Meyer, has been used in East Asia for 2,000 years as a tonic and restorative promoting health and longevity. Two varieties are commercially available: white ginseng (Ginseng Radix Alba) is produced by air-drying the root, while red ginseng (Ginseng Radix Rubra) is produced by steaming the root followed by drying. The heat processing gives these two varieties somewhat different properties. During the heat processing used to prepare red ginseng, catabolic enzymes are inactivated, which prevents deterioration of ginseng quality; antioxidant-like substances that inhibit lipid peroxide formation increase; and gelatinization of starch improves gastrointestinal absorption. Moreover, studies of the changes in ginsenoside composition caused by the different processing of ginseng roots have been undertaken. The results showed that red ginseng differs from white ginseng in lacking acidic malonyl-ginsenosides: the heating procedure degrades the thermally unstable malonyl-ginsenosides into the corresponding neutral ginsenosides, and the steaming process also causes degradation or transformation of neutral ginsenosides. Ginsenosides Rh2, Rh4, Rs3, Rs4 and Rg5, found only in red ginseng, are known to be hydrolyzed products derived from the original saponins by heat processing and are responsible for inhibitory effects on the growth of cancer cells through the induction of apoptosis. 20(S)-ginsenoside Rg3 is also formed in red ginseng and has been shown to exhibit vasorelaxation, antimetastatic, and anti-platelet-aggregation activities.
Recently, red ginseng steamed at high temperature was shown to give enhanced yields of ginsenosides Rg3 and Rg5, which are characteristic of red ginseng. Additionally, among the non-saponin constituents, panaxytriol was found to be structurally transformed during the preparation of red ginseng from a polyacetylenic alcohol (panaxydol) showing cytotoxicity; likewise, maltol, an antioxidant Maillard product, is formed from maltose, and arginyl-fructosyl-glucose, an amino acid derivative, from arginine and maltose. Regarding comparative in vitro and in vivo biological activities, red ginseng has been reported to show more potent antioxidant and anticarcinogenic effects and a greater ameliorative effect on blood circulation than white ginseng. In Oriental medicine, the ability of red ginseng to supplement deficiency (허) is considered relatively stronger than that of white ginseng, but few comparative clinical studies are known. Further preclinical and clinical investigations are needed to establish the differences in indications and efficacies between red and white ginseng on the basis of Oriental medicine.

Opportunity Tree Framework Design For Optimization of Software Development Project Performance (소프트웨어 개발 프로젝트 성능의 최적화를 위한 Opportunity Tree 모델 설계)

  • Song Ki-Won;Lee Kyung-Whan
    • The KIPS Transactions:PartD
    • /
    • v.12D no.3 s.99
    • /
    • pp.417-428
    • /
    • 2005
  • Today, IT organizations perform projects with a vision related to marketing and financial profit. Realizing that vision means improving the project-performing ability in terms of QCD, and organizations have made great efforts toward this objective through process improvement. Large companies such as IBM, Ford, and GE have achieved over 80% of their success through business process re-engineering using information technology, rather than through the improvement effect of computerization alone. Collecting, analyzing, and managing data on performed projects is important for achieving the objective, but quantitative measurement is difficult because software is invisible and the effect and efficiency caused by process change cannot be identified visibly; therefore, it is not easy to extract an improvement strategy. This paper measures and analyzes project performance, focusing on organizations' external effectiveness and internal efficiency (Quality, Delivery, Cycle time, and Waste). Based on the measured project performance scores, an OT (Opportunity Tree) model was designed for optimizing project performance. The design process is as follows. First, meta data are derived from projects and analyzed by a quantitative GQM (Goal-Question-Metric) questionnaire. Then, the project performance model is designed with the data obtained from the questionnaire, and the organization's performance score for each area is calculated. The value is revised by integrating the measured scores for each area with vision weights from all stakeholders (CEO, middle managers, developers, investors, and customers). Through this, routes for improvement are presented and an optimized improvement method is suggested. Existing methods for improving the software process have been highly effective at the level of individual processes, but somewhat unsatisfactory as a structural means of developing and systematically managing strategies by applying the processes to projects.
The proposed OT model provides a solution to this problem. The OT model is useful for providing an optimal improvement method in line with the organization's goals, and it can reduce the risks that may occur in the course of process improvement when applied with the proposed methods. In addition, satisfaction with the improvement strategy can be increased by obtaining vision-weight input from all stakeholders through the qualitative questionnaire and reflecting it in the calculation. The OT model is also useful for optimizing market expansion and financial performance by controlling Quality, Delivery, Cycle time, and Waste.
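The vision-weight revision step can be sketched as a simple weighted average; the stakeholder groups, weights, and scores below are hypothetical placeholders, not values from the paper.

```python
# hypothetical vision weights elicited from stakeholders (sum to 1.0)
weights = {"CEO": 0.4, "middle_manager": 0.3, "developer": 0.3}

# hypothetical measured performance scores per area, per stakeholder view
area_scores = {
    "quality":  {"CEO": 80, "middle_manager": 70, "developer": 60},
    "delivery": {"CEO": 90, "middle_manager": 85, "developer": 80},
}

def revised_score(area):
    """Revise an area's score by integrating stakeholder vision weights."""
    return sum(weights[s] * area_scores[area][s] for s in weights)

for area in area_scores:
    print(area, revised_score(area))  # quality 71.0, delivery 85.5
```

The revised per-area scores would then feed the Opportunity Tree to rank candidate improvement routes.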

A Study on the Types of Dispute and its Solution through the Analysis on the Disputes Case of Franchise (프랜차이즈 분쟁사례 분석을 통한 분쟁의 유형과 해결에 관한 연구)

  • Kim, Kyu Won;Lee, Jae Han;Lim, Hyun Cheol
    • The Korean Journal of Franchise Management
    • /
    • v.2 no.1
    • /
    • pp.173-199
    • /
    • 2011
  • In the franchise system, a franchisee depends on the franchisor's overall system, such as know-how and management support, and the two parties do not start from the same position in economic or information power, because the franchisor controls and supports the franchisee through its selling and management style. For this reason, unfair trading, such as the franchisor excessively controlling and restricting the franchisee, can occur, and the side effects caused by those who draw franchisees into scam trades have negatively influenced the development of the franchise industry and the national economy. The purpose of this study is therefore to prevent unfair trade against franchisees by understanding the causes and problems of disputes between franchisors and franchisees, focusing on the dispute cases submitted to the Korea Fair Trade Mediation Agency, and to seek ways to secure the transparency of the recruitment process and the fairness of the franchise management process. The results of the case analysis are as follows. First, affiliation contracts should be based on the franchisor's accurate public disclosure statement and the franchisee's full understanding of it. Second, the franchisor needs to use its past experience and investigated data when recruiting franchisees. Third, when making a contract with a franchisee, the franchisor has to confirm the business area by checking it with the franchisee in person. Fourth, since the contract is central to the affiliation, stipulating the possible sources of dispute in the contract reduces disputes. Fifth, extensive investigation and attention are needed from the government to protect the rights and interests of both franchisor and franchisee and to prevent disputes by identifying their causes and providing more practical solutions.

Detection of Phantom Transaction using Data Mining: The Case of Agricultural Product Wholesale Market (데이터마이닝을 이용한 허위거래 예측 모형: 농산물 도매시장 사례)

  • Lee, Seon Ah;Chang, Namsik
    • Journal of Intelligence and Information Systems
    • /
    • v.21 no.1
    • /
    • pp.161-177
    • /
    • 2015
  • With the rapid evolution of technology, the size, number, and types of databases have increased concomitantly, so data mining approaches face many challenging database applications. One such application is the discovery of fraud patterns from agricultural product wholesale transaction records. The agricultural product wholesale market in Korea is huge, and vast numbers of transactions are made every day. The demand for agricultural products continues to grow, and the use of electronic auction systems raises the operational efficiency of the wholesale market. Certainly, the number of unusual transactions is assumed to increase in proportion to the trading volume, and an unusual transaction is often the first sign of fraud. However, it is very difficult to identify and detect these transactions and the corresponding frauds in the agricultural product wholesale market, because the types of fraud are more intelligent than ever before. Fraud can be detected by verifying the overall transaction records manually, but this requires a significant amount of human resources and is ultimately not practical. Fraud can also be revealed by a victim's report or complaint, but there are usually no victims in agricultural product wholesale fraud because it is committed by collusion between an auction company and an intermediary wholesaler. Nevertheless, transaction records must be monitored continuously and efforts made to prevent fraud, because fraud not only disturbs the fair trade order of the market but also rapidly reduces the market's credibility. Applying data mining in such an environment is very useful, since it can properly discover unknown fraud patterns or features from a large volume of transaction data.
The objective of this research is to empirically investigate the factors necessary to detect fraudulent transactions in an agricultural product wholesale market by developing a data-mining-based fraud detection model. One major fraud is the phantom transaction, a colluding transaction in which the seller (auction company or forwarder) and buyer (intermediary wholesaler) pretend to fulfill a transaction by recording false data in the online transaction processing system without actually selling products, while the seller receives money from the buyer. This leads to overstatement of sales performance and illegal money transfers, which reduces the credibility of the market. This paper reviews the environment of the wholesale market, including the types of transactions, the roles of market participants, and the various types and characteristics of fraud, and introduces the whole process of developing the phantom transaction detection model. The process consists of four modules: (1) data cleaning and standardization, (2) statistical data analysis such as distribution and correlation analysis, (3) construction of a classification model using a decision-tree induction approach, and (4) verification of the model in terms of hit ratio. We collected real data from six associations of agricultural producers in metropolitan markets. The final model, built with a decision-tree induction approach, revealed that the monthly average trading price of an item offered by forwarders is a key variable in detecting phantom transactions. The verification procedure also confirmed the suitability of the results. However, even though the performance of the model is satisfactory, sensitive issues remain in improving the classification accuracy and the conciseness of the rules. One such issue is the robustness of the data mining model: data mining is very much data-oriented, so its models tend to be very sensitive to changes in data or circumstances.
It is therefore evident that this non-robustness requires continuous remodeling as data or circumstances change. We hope that this paper suggests valuable guidelines to organizations and companies that consider introducing or constructing a fraud detection model in the future.
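The decision-tree module (step 3) and the hit-ratio check (step 4) can be sketched on synthetic data as below; the feature names and the rule that abnormally low offered prices mark phantom transactions are illustrative assumptions, not findings reproduced from the paper's dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
n = 1000
# hypothetical features: monthly average trading price offered by a
# forwarder, and traded quantity
avg_price = rng.normal(100.0, 20.0, n)
quantity = rng.normal(50.0, 10.0, n)
# synthetic label: pretend phantom transactions cluster at low prices
is_phantom = (avg_price < 75.0).astype(int)

X = np.column_stack([avg_price, quantity])
X_tr, X_te, y_tr, y_te = train_test_split(X, is_phantom, random_state=0)

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
hit_ratio = clf.score(X_te, y_te)  # the paper verifies its model by hit ratio
print(round(hit_ratio, 3))
```

On real transaction data the tree would instead be induced from the cleaned, standardized variables of modules (1) and (2).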

Base Practice Importance Analysis by Software Process Assessors' Characteristics (소프트웨어 프로세스 심사원 특성에 따른 기본 프랙티스의 중요도 분석)

  • Lee, Jong-Moo;Yoo, Young-Kwan;Kim, Gil-Jo;Chun, In-Geol
    • Korean Business Review
    • /
    • v.13
    • /
    • pp.179-193
    • /
    • 2000
  • It is not simple to manage software quality, because the software development process and the product itself are very complex. Recently ISO/IEC 15504, the international standard for software process improvement, capability determination, and development, was completed and applied in many local trials, and the results of those registered trials are being reported. The first step of a software process assessment is to examine whether its base practices are performed and the level to which they are achieved. As far as assessment responsibility and credibility are concerned, assessment is generally performed by team-based assessors, so the construction of the assessment team and its preferences regarding the importance of practices greatly affect the credibility and objectivity of the assessment. In this paper, we analyze a SPICE-based software process assessment trial by comparing the base practice importance ratings of the assessed processes, expressed during a real assessment, with the final rating results. Survey data were collected from the assessors and interviewees engaged in the SPICE trial performed according to ISO/IEC 15504, and the final data analysis was derived using the factor analysis method. We are convinced that the results of this paper can enhance the credibility of software process assessment by providing objective and rational criteria and preference information for assessment team construction and base practice importance in the future.


Automatic Drawing and Structural Editing of Road Lane Markings for High-Definition Road Maps (정밀도로지도 제작을 위한 도로 노면선 표시의 자동 도화 및 구조화)

  • Choi, In Ha;Kim, Eui Myoung
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.39 no.6
    • /
    • pp.363-369
    • /
    • 2021
  • High-definition road maps are used as basic infrastructure for autonomous vehicles, so the latest road information must be reflected quickly. However, the drawing and structural editing of high-definition road maps are currently performed manually, and generating the road lane markings, the main construction target, takes the longest time. In this study, the point cloud of road lane markings, whose color types (white, blue, and yellow) were predicted by the PointNet model pre-trained in previous studies, was used as input data. Based on this point cloud, a methodology was proposed for automatically drawing and structurally editing the layer of road lane markings. To verify the usability of the 3D vector data constructed through the proposed methodology, its accuracy was analyzed according to the quality inspection criteria for high-definition road maps. In the positional accuracy test, the RMSE (Root Mean Square Error) of the horizontal and vertical errors was within 0.1 m, verifying suitability. In the structural editing accuracy test, the structural editing accuracy for the type and kind of road lane markings was 88.235%, and the usability was verified. Therefore, the methodology proposed in this study can efficiently construct vector data of road lanes for high-definition road maps.
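The positional check amounts to an RMSE comparison against reference coordinates; the vertices below are hypothetical sample points, while the 0.1 m threshold is the quality criterion cited in the abstract.

```python
import numpy as np

def rmse(a, b):
    """Root mean square error between two coordinate arrays."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.sqrt(np.mean((a - b) ** 2)))

# hypothetical reference vs. automatically drawn lane vertices (x, y, z in metres)
ref  = np.array([[10.00, 5.00, 1.00], [12.00, 5.10, 1.02]])
auto = np.array([[10.05, 5.03, 1.01], [11.96, 5.12, 1.00]])

horizontal = rmse(ref[:, :2], auto[:, :2])
vertical = rmse(ref[:, 2], auto[:, 2])
print(horizontal <= 0.1 and vertical <= 0.1)  # True: within the 0.1 m criterion
```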

A Study on the Field Data Applicability of Seismic Data Processing using Open-source Software (Madagascar) (오픈-소스 자료처리 기술개발 소프트웨어(Madagascar)를 이용한 탄성파 현장자료 전산처리 적용성 연구)

  • Son, Woohyun;Kim, Byoung-yeop
    • Geophysics and Geophysical Exploration
    • /
    • v.21 no.3
    • /
    • pp.171-182
    • /
    • 2018
  • We performed seismic field data processing using open-source software (Madagascar) to verify whether it is applicable to field data, which have a low signal-to-noise ratio and high uncertainty in velocities. Madagascar, based on Python, is considered well suited to the development of processing technologies thanks to its capabilities for multidimensional data analysis and reproducibility. However, this open-source software has not been widely used for field data processing so far because of its complicated interface and data structure system. To verify the effectiveness of Madagascar on field data, we applied it to a typical seismic data processing flow: data loading, geometry build-up, F-K filtering, predictive deconvolution, velocity analysis, normal moveout correction, stacking, and migration. The field data for the test were acquired in the Gunsan Basin, Yellow Sea, using a streamer consisting of 480 channels and 4 air-gun arrays. The results at each processing step were compared with those processed with Landmark's ProMAX (SeisSpace R5000), a commercial processing package. Madagascar showed relatively high efficiency in data I/O and management as well as reproducibility, and it performed quick and exact calculations in some automated procedures such as stacking velocity analysis. There were no remarkable differences in the results after applying the signal enhancement flows of the two packages. For the deeper part of the subsurface image, however, the commercial software produced better results than the open-source software, simply because it offers various de-multiple flows and provides an interactive processing environment for delicate processing work that Madagascar lacks.
Considering that many researchers around the world are developing various data processing algorithms for Madagascar, we can expect open-source software such as Madagascar to be widely used even for commercial-level processing, with the strengths of expandability, cost effectiveness, and reproducibility.
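One step of the flow above, F-K dip filtering, can be sketched generically in numpy; this is an illustrative stand-in, not Madagascar's actual implementation, and the velocity cutoff and gather dimensions are hypothetical.

```python
import numpy as np

def fk_dip_filter(data, dt, dx, vmin):
    """Minimal F-K dip filter sketch: zero out energy whose apparent
    velocity |f/k| falls below vmin (steeply dipping linear noise).
    data: (nt, nx) shot gather, dt in seconds, dx in metres."""
    spectrum = np.fft.fft2(data)
    nt, nx = data.shape
    f = np.fft.fftfreq(nt, dt)[:, None]   # temporal frequency axis (Hz)
    k = np.fft.fftfreq(nx, dx)[None, :]   # spatial wavenumber axis (1/m)
    with np.errstate(divide="ignore", invalid="ignore"):
        v_app = np.abs(f / np.where(k == 0.0, np.nan, k))
        keep = np.isnan(v_app) | (v_app >= vmin)  # keep k = 0 and fast dips
    return np.real(np.fft.ifft2(spectrum * keep))

# hypothetical gather: 64 samples at 4 ms, 32 traces at 12.5 m spacing
gather = np.zeros((64, 32))
filtered = fk_dip_filter(gather, dt=0.004, dx=12.5, vmin=1500.0)
print(filtered.shape)  # (64, 32)
```

In Madagascar the equivalent operation would be expressed as a reproducible flow of its own programs rather than hand-written FFT code.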