• Title/Summary/Keyword: Performance Measure Approach

A Comparative Study on Reliability Index and Target Performance Measure Based Probabilistic Structural Design Optimizations (신뢰도지수와 목표성능치에 기반한 확률론적 구조설계 최적화기법에 대한 비교연구)

  • 양영순;이재옥
    • Proceedings of the Computational Structural Engineering Institute Conference
    • /
    • 2000.10a
    • /
    • pp.32-39
    • /
    • 2000
  • Probabilistic structural design optimization is characterized by so-called probabilistic constraints, which introduce a permissible probability of violation. It is preferred to deterministic design optimization because unpredictable inherent uncertainties and randomness in structural and environmental properties are taken into account quantitatively. In this paper, the well-known reliability index based MPFP (Most Probable Failure Point) search approach and the newly introduced target performance measure based MPTP (Minimum Performance Target Point) search approach are summarized and compared. The comparison focuses on the number of iterations required for the evaluation of probabilistic constraints, and a technique for improvement which removes exhaustive iterations is presented as well. A 10-bar truss problem is examined as an illustration.

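The reliability index based MPFP search contrasted in this abstract typically builds on the Hasofer-Lind-Rackwitz-Fiessler (HL-RF) iteration in standard normal space. A minimal sketch of that iteration follows; the linear limit state is an illustrative example, not the paper's 10-bar truss:

```python
import numpy as np

def hlrf(g, grad_g, n_dim, tol=1e-8, max_iter=100):
    """HL-RF iteration: searches the Most Probable Failure Point (MPFP)
    in standard normal space and returns the reliability index beta."""
    u = np.zeros(n_dim)
    for _ in range(max_iter):
        gv, gr = g(u), grad_g(u)
        # Project onto the linearized limit state gv + gr . (u_new - u) = 0
        u_new = (gr @ u - gv) / (gr @ gr) * gr
        converged = np.linalg.norm(u_new - u) < tol
        u = u_new
        if converged:
            break
    return np.linalg.norm(u), u  # beta = distance from origin to MPFP

# Illustrative linear limit state g(u) = 3 - u1 - u2 (failure when g < 0)
g = lambda u: 3.0 - u[0] - u[1]
grad = lambda u: np.array([-1.0, -1.0])
beta, mpfp = hlrf(g, grad, 2)
```

For a linear limit state the iteration converges in one step; nonlinear limit states are where the iteration counts compared in the paper start to differ.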
Performance evaluation of quality management activity using activity based costing (활동중심원가계산을 이용한 품질관리활동의 성과평가)

  • 이홍우;이진춘
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.7 no.1
    • /
    • pp.1-9
    • /
    • 2002
  • Although Quality Management (QM) is a key determinant of corporate success, as shown in Japanese cases, its performance has not been translated into the context of profitability, which would be a useful managerial measure. Quality cost theory is a different attempt to measure quality management performance on a financial scale, but it lacks a reasonable measure. This study suggests a new approach to measuring the performance of quality management using ABC (Activity-Based Costing) and explains its usefulness with a case study.

A Study of Effects of Stock Option on Firm's Performance (주식매수선택권이 기업성과에 미친 영향에 대한 연구)

  • Shin, Yeon-Soo
    • The Journal of Information Technology
    • /
    • v.9 no.4
    • /
    • pp.75-85
    • /
    • 2006
  • This study tests the influence of stock option granting information on a firm's performance. The central issue in stock options is that agency cost, which arises between managers and shareholders, is an important determinant of long-term performance. Many studies therefore concentrate on diminishing agency cost and develop substitute tools to measure it. An event study of stock options analyzes returns around the event date; event studies provide estimation periods and cumulative returns. Announcements of stock options are generally associated with positive abnormal returns in the short term, but do not show a positive effect in the long term. It is important to investigate the responses of stocks to the new information contained in stock option announcements, and therefore to study long-term performance in the case of stock options. The event-time portfolio approach includes the CAR, BHAR, and WR models, while the calendar-time portfolio approach includes the three-factor model, the four-factor model, the CTAR model, and the RATS model. This study develops and arranges these two approaches to evaluating performance, the event-time portfolio approach and the calendar-time portfolio approach.

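The CAR model named in the abstract cumulates market-model abnormal returns over an event window. A minimal sketch with synthetic data (the returns below are hypothetical, not actual stock option announcement data):

```python
import numpy as np

def cumulative_abnormal_return(stock, market, est, event):
    """Market-model event study: fit alpha and beta by OLS on the
    estimation window, then sum abnormal returns over the event window."""
    beta, alpha = np.polyfit(market[est], stock[est], 1)
    abnormal = stock[event] - (alpha + beta * market[event])
    return abnormal.sum()

# Synthetic example: the stock follows the market model exactly in the
# estimation window, then earns +1% abnormal return on each of 3 event days.
market = np.array([0.010, -0.020, 0.015, 0.005, -0.010, 0.020, 0.000, 0.010])
stock = 0.001 + 1.2 * market
stock[5:] += 0.01
car = cumulative_abnormal_return(stock, market, slice(0, 5), slice(5, 8))
```

BHAR differs only in the aggregation step, compounding buy-and-hold returns instead of summing daily abnormal returns.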
Assessment of the effect of biofilm on the ship hydrodynamic performance by performance prediction method

  • Farkas, Andrea;Degiuli, Nastia;Martic, Ivana
    • International Journal of Naval Architecture and Ocean Engineering
    • /
    • v.13 no.1
    • /
    • pp.102-114
    • /
    • 2021
  • Biofouling represents an important problem in the shipping industry since it increases surface roughness. Most ships in the current world fleet do not have a good coating condition, which is a significant problem given the strict rules regarding ship energy efficiency. Therefore, the importance of the control and management of hull and propeller fouling is highlighted by the International Maritime Organization, and maintenance schedule optimization has become a valuable energy-saving measure. Adequate implementation of this measure requires accurate prediction of the effects of biofouling on the hydrodynamic characteristics. Although the computational fluid dynamics approach, based on the modified wall function approach, has imposed itself as one of the most promising tools for this prediction, it requires significant computational time. During maintenance schedule optimization, however, it is important to rapidly predict the effect of biofouling on the ship hydrodynamic performance. In this paper, the effect of biofilm on the ship hydrodynamic performance is studied using the proposed performance prediction method for three merchant ships. The applicability of this method is demonstrated by comparing the results obtained with the proposed performance prediction method and with the computational fluid dynamics approach. The comparison shows that the highest relative deviation is lower than 4.2% for all propulsion characteristics, lower than 1.5% for propeller rotation rate, and lower than 5.2% for delivered power. Thus, a practical tool is developed for estimating the effect of biofouling with lower fouling severity on the ship hydrodynamic performance.

a linear system approach

  • 이태억
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 1993.10a
    • /
    • pp.118-118
    • /
    • 1993
  • We consider a discrete event dynamic system called a periodic job shop, where an identical mixture of items called a minimal part set (MPS) is repetitively produced in the same processing order and the primary performance measure is the cycle time. The precedence relationships among events (starts of operations) are represented by a directed graph with recurrent structure. When each operation starts as soon as all its preceding operations complete (earliest starting), the occurrences of events are modeled as a linear system in a special algebra called minimax algebra. By investigating the eigenvalues and the eigenvectors, we develop conditions on the directed graph under which a stable steady state or a finite eigenvector exists. We demonstrate that each finite eigenvector, characterized as a finite linear combination of a class of eigenvectors, is the minimum among all feasible schedules and that an identical schedule pattern repeats every MPS. We develop an efficient algorithm to find, among such schedules, one that minimizes a secondary performance measure related to work-in-process inventory. As a by-product of the linear system approach, we also propose a way of characterizing stable steady states of a class of discrete event dynamic systems.

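Under earliest starting, the event timing recursion is linear in max-plus algebra, and the steady-state cycle time is the max-plus eigenvalue of the timing matrix. A minimal power-iteration sketch on a hypothetical two-machine example (not the paper's job shop):

```python
import numpy as np

NEG_INF = -np.inf  # max-plus "zero": no arc between the events

def maxplus_matvec(A, x):
    """Max-plus product: (A (x) x)_i = max_j (A[i, j] + x[j])."""
    return np.max(A + x[None, :], axis=1)

def cycle_time(A, n_iter=200):
    """Estimate the max-plus eigenvalue (steady-state cycle time) as the
    per-step growth of the event times x(k+1) = A (x) x(k)."""
    x = np.zeros(A.shape[0])
    for _ in range(n_iter):
        x_prev, x = x, maxplus_matvec(A, x)
    return np.max(x - x_prev)

# Hypothetical timing matrix: A[i, j] is the delay from event j to event i.
# Cycles: self-loop at event 2 with mean 4; cycle 1->2->1 with mean 2.5.
A = np.array([[NEG_INF, 3.0],
              [2.0,     4.0]])
rate = cycle_time(A)
```

The eigenvalue equals the maximum cycle mean of the precedence graph, which is why the cycle time is read off from the graph's critical circuit.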
Utterance Verification Using Search Confusion Rate and Its N-Best Approach

  • Kim, Kyu-Hong;Kim, Hoi-Rin;Hahn, Min-Soo
    • ETRI Journal
    • /
    • v.27 no.4
    • /
    • pp.461-464
    • /
    • 2005
  • Recently, a variety of confidence measures for utterance verification has been studied to improve speech recognition performance by rejecting out-of-vocabulary inputs. Most of the conventional confidence measures for utterance verification are based primarily on hypothesis testing or an approximated posterior probability, and their performances depend on the robustness of an alternative hypothesis or the prior probability. We introduce a novel confidence measure called a search confusion rate (SCR), which does not require an alternative hypothesis or the approximation of posterior probability. Our confusion-based approach shows better performance in additive noise-corrupted speech as well as in clean speech.

Development of a Personalized Similarity Measure using Genetic Algorithms for Collaborative Filtering

  • Lee, Soojung
    • Journal of the Korea Society of Computer and Information
    • /
    • v.23 no.12
    • /
    • pp.219-226
    • /
    • 2018
  • Collaborative filtering has been the most popular approach for recommending items in online recommender systems. However, collaborative filtering is known to suffer from the data sparsity problem. As a simple way to overcome this problem, the Jaccard index has been combined with existing similarity measures in the literature. We analyze the performance of such combinations in various data environments. We also find optimal weights for the factors in the combination using a genetic algorithm to formulate a similarity measure. Furthermore, optimal weights are searched for each user independently, in order to reflect each user's different rating behavior. The performance of the resulting personalized similarity measure is examined using two datasets with different data characteristics. It is overall superior to previous measures in terms of recommendation and prediction quality, regardless of the characteristics of the data environment.
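The combination analyzed here can be sketched as a weighted mix of the Jaccard index and a base similarity such as Pearson correlation; the fixed weight below stands in for the per-user weight the paper tunes with a genetic algorithm:

```python
import numpy as np

def jaccard(u, v):
    """Jaccard index on the sets of rated items (NaN = unrated)."""
    ru, rv = ~np.isnan(u), ~np.isnan(v)
    union = np.sum(ru | rv)
    return np.sum(ru & rv) / union if union else 0.0

def pearson(u, v):
    """Pearson correlation over co-rated items only."""
    both = ~np.isnan(u) & ~np.isnan(v)
    if both.sum() < 2:
        return 0.0
    a, b = u[both], v[both]
    if a.std() == 0 or b.std() == 0:
        return 0.0
    return float(np.corrcoef(a, b)[0, 1])

def combined_similarity(u, v, w=0.5):
    """Weighted combination; in the paper's setting w would be
    searched per user by the genetic algorithm."""
    return w * jaccard(u, v) + (1.0 - w) * pearson(u, v)

# Two users' rating vectors over four items (NaN = unrated)
u = np.array([4.0, 5.0, np.nan, 3.0])
v = np.array([4.0, 5.0, 2.0, np.nan])
sim = combined_similarity(u, v, w=0.5)
```

The Jaccard factor rewards overlap in *which* items were rated, which is what restores discrimination under sparse data.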

Performance Evaluation of FMS Using Generalized Stochastic Petri Nets (Generalized Stochastic 페트리네트를 이용한 유연생산시스템의 성능평가)

  • 서경원;박용수;박홍성;김종원
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 1994.10a
    • /
    • pp.653-657
    • /
    • 1994
  • A symbolic performance analysis approach for flexible manufacturing systems (FMS) can be formulated based on the integration of Petri nets (PN) and the moment generating function (MGF) concept. In this method, generalized stochastic Petri nets are used to define performance models for FMS; an MGF-based approach for evaluating stochastic PN is then used to derive the performance parameters of the PN; and finally the system performance is calculated. A GSPN model of a machine cell is shown to illustrate the proposed method for evaluating such performance indices as production rate, utilization, work-in-process, and lead time. The major advantage of this method over existing performance evaluations of FMS is the ability to compute symbolic solutions for performance. Finally, future research toward automating performance measurement for GSPN models of FMS is discussed.

An Application of Divisia Decomposition Analysis to the Measurement of Thermal Efficiency Improvement of Power Generation (화력발전소 효율개선 측정에 대한 디비지아분해기법의 적용)

  • Choi, Ki-Hong
    • Environmental and Resource Economics Review
    • /
    • v.9 no.5
    • /
    • pp.811-827
    • /
    • 2000
  • Since improved thermal efficiency reduces capacity requirements and energy costs, electricity producers often treat thermal efficiency as a measure of management or economic performance. The conventional measure of the thermal efficiency of a fossil-fuel generation system is the ratio of total electricity generation to the simple sum of energy inputs. As a refined approach, we present a novel thermal efficiency measure using the concept of the Divisia index number. Application of this approach to the Korean power sector shows improvement of thermal efficiency of 1.1% per year during 1970-1998. This is higher than the 0.9% improvement per year given by the conventional method. The difference is attributable to the effect of fuel substitution. In the Divisia decomposition context, we also show the limitations of the popular Törnqvist index formula and the superiority of the Sato-Vartia formula.

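The contrast between the conventional measure and the Divisia approach can be sketched with a Törnqvist (discrete Divisia) index of fuel-specific efficiency changes. The fuels, efficiencies, and shares below are hypothetical, and the paper ultimately favors the Sato-Vartia weighting over this Törnqvist form:

```python
import math

def conventional_efficiency(output, fuel_inputs):
    """Conventional thermal efficiency: output over the simple sum of fuels."""
    return output / sum(fuel_inputs)

def tornqvist_efficiency_change(eff0, eff1, share0, share1):
    """Törnqvist (discrete Divisia) index of efficiency change: log-changes
    of fuel-specific efficiencies weighted by average fuel shares."""
    log_change = sum(0.5 * (s0 + s1) * math.log(e1 / e0)
                     for e0, e1, s0, s1 in zip(eff0, eff1, share0, share1))
    return math.exp(log_change) - 1.0  # growth rate of aggregate efficiency

# Hypothetical two-fuel system: both unit efficiencies improve by 2%,
# with unchanged fuel shares, so aggregate efficiency grows by exactly 2%.
eff_y0 = [0.35, 0.40]
eff_y1 = [e * 1.02 for e in eff_y0]
shares = [0.6, 0.4]
growth = tornqvist_efficiency_change(eff_y0, eff_y1, shares, shares)
```

When the fuel mix shifts toward the more efficient fuel, the conventional ratio rises even with no unit-level improvement; the Divisia decomposition is what separates that substitution effect out.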
Learning Free Energy Kernel for Image Retrieval

  • Wang, Cungang;Wang, Bin;Zheng, Liping
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.8 no.8
    • /
    • pp.2895-2912
    • /
    • 2014
  • Content-based image retrieval has been the most important technique for managing huge collections of images. The fundamental yet highly challenging problem in this field is how to measure content-level similarity based on low-level image features. The primary difficulties lie in the great variance within images, e.g., background, illumination, viewpoint, and pose. Intuitively, an ideal similarity measure should be able to adapt to the data distribution, discover and highlight content-level information, and be robust to these variances. Motivated by these observations, in this paper we propose a probabilistic similarity learning approach. We first model the distribution of low-level image features and derive the free energy kernel (FEK), i.e., a similarity measure, based on the distribution. Then, we propose a learning approach for the derived kernel, under the criterion that the kernel outputs high similarity for images sharing the same class label and low similarity for those that do not. The advantages of the proposed approach, in comparison with previous approaches, are threefold. (1) With the ability inherited from probabilistic models, the similarity measure can adapt well to the data distribution. (2) Benefiting from the content-level hidden variables within the probabilistic models, the similarity measure is able to capture content-level cues. (3) It fully exploits class labels in the supervised learning procedure. The proposed approach is extensively evaluated on two well-known databases. It achieves highly competitive performance on most experiments, which validates its advantages.