• Title/Summary/Keyword: Performance Measure Approach


An Efficient Taguchi Approach for the Performance Optimization of Health, Safety, Environment and Ergonomics in Generation Companies

  • Azadeh, Ali;Sheikhalishahi, Mohammad
    • Safety and Health at Work
    • /
    • v.6 no.2
    • /
    • pp.77-84
    • /
    • 2015
  • Background: A unique framework for performance optimization of generation companies (GENCOs) based on health, safety, environment, and ergonomics (HSEE) indicators is presented. Methods: To rank this sector of industry, a combination of data envelopment analysis (DEA), principal component analysis (PCA), and the Taguchi method is used for all branches of GENCOs. These methods are applied in an integrated manner to measure the performance of GENCOs. The preferred model among DEA, PCA, and Taguchi is selected based on sensitivity analysis and the maximum correlation between rankings. To achieve the stated objectives, noise is introduced into the input data. Results: The results show that Taguchi outperforms the other methods. Moreover, a comprehensive experiment is carried out to identify the most influential factor for ranking GENCOs. Conclusion: The approach developed in this study could be used for continuous assessment and improvement of GENCOs' performance in supplying energy with respect to HSEE factors. The results of such studies would help managers gain a better understanding of weak and strong points in terms of HSEE factors.
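
A short sketch of the selection rule described above (choose the method whose rankings correlate most strongly with the others once noise is injected into the inputs) is given below. The scoring functions and data are hypothetical placeholders, not the paper's actual DEA, PCA, or Taguchi implementations.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
X = rng.random((10, 4))  # hypothetical HSEE indicators for 10 GENCO branches

def rank_by(scores):
    """Return 1-based ranks, best (highest score) first."""
    return scores.argsort()[::-1].argsort() + 1

# Placeholder scoring functions standing in for the DEA, PCA, and Taguchi scores.
methods = {
    "DEA":     lambda d: d.mean(axis=1),
    "PCA":     lambda d: d @ np.linalg.svd(d - d.mean(0), full_matrices=False)[2][0],
    "Taguchi": lambda d: -10 * np.log10((1.0 / np.maximum(d, 1e-9) ** 2).mean(axis=1)),
}

# Inject noise into the inputs, rank the branches with each method, and keep the
# method whose ranking has the highest average Spearman correlation with the others
# (one reading of the "maximum correlation between rankings" criterion).
noisy = X + rng.normal(scale=0.05, size=X.shape)
ranks = {name: rank_by(f(noisy)) for name, f in methods.items()}
avg_corr = {
    name: np.mean([spearmanr(r, other)[0]
                   for oname, other in ranks.items() if oname != name])
    for name, r in ranks.items()
}
print(max(avg_corr, key=avg_corr.get), avg_corr)
```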

A Rational Development Way of Performance Evaluation Measures for the Informatization Program by means of the Coincidence Analysis (부합성 분석을 이용한 정보화지원사업 성과평가지표의 합리적 도출 방안)

  • Choi Jeom-Ki;Park Il-Kyu;Kim Sang-Hoon
    • Journal of Information Technology Applications and Management
    • /
    • v.13 no.3
    • /
    • pp.145-179
    • /
    • 2006
  • The measurement of informatization program effectiveness continues to be a central concern of both academics and practitioners. However, most studies have leaned too heavily toward testing the validity and reliability of evaluation measures in order to strengthen theoretical rigor. This approach has been criticized as unsuitable for evaluating the effectiveness of informatization programs because it lacks practical relevance. The paper addresses the problem in four steps. 1) Through a literature review of conceptual models of IS effectiveness and of domestic and foreign informatization program cases, we establish an evaluation framework consisting of evaluation phases, 6 evaluation areas, and 22 evaluation items, and develop evaluation measures for each item. 2) Different researchers have addressed different attributes of measures; by comparing them, 3 mutually exclusive criteria for coincidence analysis (usability of measurement, reliability of measurement, and validity of content) are identified. 3) Data are gathered from respondents in informatization program-related organizations such as the policy agency, management agency, support agency, and recipient agencies. 4) The degree of coincidence of each evaluation measure is analyzed through a coincidence analysis procedure that includes a reliability test, an arithmetical test, and a criterion validity test. The academic implication of this study is that researchers can gain insight into coincidence analysis as an approach to deciding on performance indicators for informatization programs. There is also a practical implication: the procedure offers a potential means of improving the practical relevance of informatization program evaluation.
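
The reliability test in step 4 is, in survey settings like this, commonly an internal-consistency check; the sketch below computes Cronbach's alpha on hypothetical respondent ratings. The abstract does not name the statistic actually used, so this choice is an assumption for illustration.

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for an (n_respondents, n_items) rating matrix."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                          # number of items
    item_var = ratings.var(axis=0, ddof=1)        # variance of each item
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return (k / (k - 1)) * (1 - item_var.sum() / total_var)

# Hypothetical 5-point ratings from 6 respondents on 4 evaluation measures.
ratings = [[4, 5, 4, 4],
           [3, 4, 3, 3],
           [5, 5, 4, 5],
           [2, 3, 2, 2],
           [4, 4, 5, 4],
           [3, 3, 3, 4]]
print(f"alpha = {cronbach_alpha(ratings):.3f}")  # values above ~0.7 are usually considered acceptable
```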


Economic Design of $\bar{X}$ Control Chart Using a Surrogate Variable (대용변수를 이용한 $\bar{X}$ 관리도의 경제적 설계)

  • Lee, Tae-Hoon;Lee, Jae-Hoon;Lee, Min-Koo;Lee, Joo-Ho
    • Journal of Korean Society for Quality Management
    • /
    • v.37 no.2
    • /
    • pp.46-57
    • /
    • 2009
  • The traditional approach to the economic design of control charts is based on the assumption that a process is monitored using a performance variable. However, various types of automatic test equipment recently introduced as part of factory automation usually measure surrogate variables instead of performance variables, which are costly to measure. In this article we propose a model for the economic design of a control chart that uses a surrogate variable highly correlated with the performance variable. The optimum values of the design parameters are determined by maximizing the total average income per cycle time. Numerical studies are performed to compare the proposed $\bar{X}$ control chart with the traditional model using the examples in Panagos et al. (1985).
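
Economic design of an $\bar{X}$ chart typically searches over the sample size n, the sampling interval h, and the control limit coefficient k to optimize an economic objective. The sketch below does a coarse grid search over a deliberately simplified income function; the cost constants and income model are hypothetical stand-ins, not the model of Panagos et al. (1985) or the surrogate-variable extension proposed in the paper.

```python
import itertools
import numpy as np
from scipy.stats import norm

def avg_income_per_hour(n, h, k, delta=1.0, revenue=100.0, sample_cost=1.0,
                        false_alarm_cost=50.0, repair_cost=200.0, failure_rate=0.02):
    """Toy expected income per hour for an X-bar chart design (n, h, k); illustrative only."""
    alpha = 2 * norm.sf(k)                                   # false-alarm probability per sample
    beta = norm.cdf(k - delta * np.sqrt(n)) - norm.cdf(-k - delta * np.sqrt(n))
    detection_delay = h / (1.0 - beta)                       # expected time to signal after a shift
    cycle = 1.0 / failure_rate + detection_delay             # in-control time + out-of-control time
    sampling_cost = sample_cost * n * (cycle / h)            # cost of all samples taken in a cycle
    false_alarm_total = false_alarm_cost * alpha * (1.0 / failure_rate) / h
    return (revenue * cycle - sampling_cost - false_alarm_total - repair_cost) / cycle

grid = itertools.product(range(2, 11),                       # sample size n
                         np.arange(0.5, 4.01, 0.5),          # sampling interval h (hours)
                         np.arange(2.0, 3.51, 0.25))         # control limit coefficient k
best_n, best_h, best_k = max(grid, key=lambda p: avg_income_per_hour(*p))
print(f"best design: n={best_n}, h={best_h}, k={best_k}")
```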

Character Image Retrieval Using Color and Shape Information (컬러와 모양 정보를 이용한 캐릭터 이미지 검색)

  • 이동호;유광석;김회율
    • Journal of Broadcast Engineering
    • /
    • v.5 no.1
    • /
    • pp.50-60
    • /
    • 2000
  • In this paper, we propose a new composite feature, consisting of both color and shape information, that is suitable for character image retrieval. The approach extracts shape-based information using Zernike moments computed from the Y image in the YCbCr color space. Zernike moments yield shape-based features that are invariant to rotation, translation, and scaling. We also extract color-based information from the DCT coefficients of the Cr and Cb images. The approach reflects properties of human vision well and is suitable for web applications such as large image database systems and animation, since a high retrieval rate is achieved using only 36 features. In experiments, the method is applied to 3,834 character images. We confirmed its effectiveness for character image retrieval using ANMRR (Average Normalized Modified Retrieval Rank), the evaluation measure used for the MPEG-7 color descriptor, and BEP (Bull's Eye Performance), the evaluation measure used for shape descriptors.
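
A minimal sketch of this kind of composite feature is shown below, using the mahotas library for Zernike moments and a 2-D DCT for the chroma channels. The Zernike degree, the number of DCT coefficients kept, and the channel handling are assumptions; the paper's exact 36-dimensional recipe is not specified in the abstract.

```python
import cv2
import numpy as np
import mahotas
from scipy.fft import dctn

def composite_feature(bgr_image, zernike_degree=8, n_dct_coeffs=8):
    """Shape feature from Zernike moments of Y, color feature from low-frequency
    DCT coefficients of Cb and Cr (an illustrative sketch, not the paper's exact recipe)."""
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
    y, cr, cb = cv2.split(ycrcb)

    # Shape: Zernike moment magnitudes are rotation invariant; translation and scale
    # invariance come from centering the disc and normalizing by its radius.
    radius = min(y.shape) // 2
    shape_feat = mahotas.features.zernike_moments(y, radius, degree=zernike_degree)

    # Color: keep a few low-frequency 2-D DCT coefficients of each chroma channel.
    def low_freq_dct(channel, k):
        coeffs = dctn(channel.astype(float), norm="ortho")
        return coeffs[:k, :k].ravel()[:k]

    color_feat = np.concatenate([low_freq_dct(cb, n_dct_coeffs),
                                 low_freq_dct(cr, n_dct_coeffs)])
    return np.concatenate([shape_feat, color_feat])

# Usage: features = composite_feature(cv2.imread("character.png"))
# Retrieval then ranks database images by the distance between feature vectors.
```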


Evaluating website performance using formula and balanced scorecard methods

  • Hou, Liyao;Hong, Jong-Yi;Suh, Eui-Ho
    • Korea Society of Management Information Systems: Conference Proceedings (한국경영정보학회 학술대회논문집)
    • /
    • 2007.06a
    • /
    • pp.764-769
    • /
    • 2007
  • This paper proposes a new model for evaluating the effectiveness of websites using balanced scorecard (BSC) and weighted-formula-based methods. First, we use the BSC method to find the cause-and-effect relationships between the measures and website activities; the proposed model evaluates website performance from six perspectives: business value, operational excellence, customer value, website interface, management and maintenance, and learning and innovation. Next, we use a formula-based approach to identify what makes website performance low, developing the evaluation formula by investigating website users. Finally, case studies of two well-known websites are given to show how the method can be used. The evaluation model can not only evaluate website performance but also suggest how to improve it.
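
The weighted-formula part of such a model is essentially a weighted aggregation of perspective scores. The sketch below shows one plausible form with hypothetical weights and scores; the abstract does not give the actual formula, so treat the numbers and structure as assumptions.

```python
# Hypothetical BSC-style perspective scores (0-100) and weights for one website.
scores = {
    "business value": 72, "operational excellence": 85, "customer value": 64,
    "website interface": 90, "management and maintenance": 70, "learning and innovation": 55,
}
weights = {
    "business value": 0.25, "operational excellence": 0.15, "customer value": 0.20,
    "website interface": 0.15, "management and maintenance": 0.10, "learning and innovation": 0.15,
}

assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights should sum to one

overall = sum(weights[p] * scores[p] for p in scores)          # weighted overall score
weakest = min(scores, key=lambda p: weights[p] * scores[p])     # candidate area for improvement
print(f"overall score: {overall:.1f}, weakest weighted perspective: {weakest}")
```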


A review of cognitive orientation to daily occupational performance with stroke

  • Ahn, Si-Nae
    • Physical Therapy Rehabilitation Science
    • /
    • v.6 no.4
    • /
    • pp.202-207
    • /
    • 2017
  • Objective: Client-centered therapy emphasizes the client's own decisions about occupations that are meaningful to them. The Cognitive Orientation to daily Occupational Performance (CO-OP) is an occupation-oriented, problem-solving approach. The purpose of this study was to describe the goals and intervention protocols of CO-OP in those affected by stroke. Design: A systematic review. Methods: Using the EBSCOhost, PubMed, and ProQuest databases, we searched for studies published in the past decade that utilized the CO-OP intervention. An initial search revealed 71,171 potential articles. After applying our search criteria to screen titles, abstracts, and full texts, we included 7 articles that met our inclusion and exclusion criteria. We used the patient, intervention(s), comparison, outcome (PICO) method to analyze the 7 selected studies and analyzed the frequency of goals and intervention protocols. Results: The 7 articles that met our selection criteria included participants with nearly normal cognitive function from inpatient and outpatient rehabilitation facilities. CO-OP was used for 237 goals; the most frequent goals were instrumental activities of daily living. The training procedure used 3 types of self-selected goals in the activities; one of the goals was not trained but only evaluated, to determine the generation effect. The most common outcome measurements were the Canadian Occupational Performance Measure and the Performance Quality Rating Scale. Conclusions: This study provides information about the effectiveness of CO-OP and about selecting an appropriate evaluation tool to assess the efficiency of the intervention. It suggests that treatment with CO-OP in occupational therapy is effective and outlines common protocols.

Evaluation of Regional Knowledge Innovation System in China: An Economic Framework Based on Dynamic Slacks-based Approach

  • CHIU, Sheng-Hsiung;LIN, Tzu-Yu
    • The Journal of Asian Finance, Economics and Business
    • /
    • v.6 no.3
    • /
    • pp.141-149
    • /
    • 2019
  • This paper proposes a knowledge innovation performance model based on dynamic data envelopment analysis with a slacks-based measure, for evaluating the effectiveness of knowledge innovation activities in 30 Chinese regions from 2010 to 2016. In recent years, China has paid increasing attention to knowledge innovation activities, as central and local governments have pushed ahead with innovation projects through heavy investment despite the difficulties. Decision-makers are usually interested in judging knowledge innovation performance relative to a target benchmark, exploring whether one provincial administrative region performs better than others and/or whether economic growth benefits greatly from knowledge innovation activities. To obtain managerial insight into this issue from a comprehensively designed performance evaluation model, knowledge innovation activity is conceptualized as an intertemporal production process. Invention patents and regional gross product are taken as desirable outputs, highlighting the needs of the knowledge economy. The empirical results show that knowledge innovation has a positive effect on economic development. At the same time, decision-makers should pay attention to the economic effect of patent type and quality. The government should then encourage new technical applications with greater commercial value from a market-oriented perspective, in order to benefit the most from the innovation process in the short run.
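
The paper's model is a dynamic slacks-based DEA. As a much simpler illustration of how DEA efficiency scores are obtained from inputs and outputs, the sketch below solves a static input-oriented CCR model with scipy's linear programming routine; the data are made up and the formulation is not the dynamic SBM used in the paper.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o.
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                    # minimize theta
    A_ub, b_ub = [], []
    for i in range(m):                            # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):                            # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    bounds = [(None, None)] + [(0, None)] * n     # theta free, lambdas >= 0
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds, method="highs")
    return res.x[0]

# Hypothetical data: 5 regions, inputs = (R&D spending, researchers), outputs = (patents, GRP).
X = np.array([[3.0, 2.0], [5.0, 4.0], [2.0, 3.0], [6.0, 7.0], [4.0, 3.0]])
Y = np.array([[4.0, 9.0], [7.0, 12.0], [3.0, 8.0], [8.0, 14.0], [6.0, 11.0]])
for o in range(len(X)):
    print(f"region {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```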

High-performance computing for SARS-CoV-2 RNAs clustering: a data science-based genomics approach

  • Oujja, Anas;Abid, Mohamed Riduan;Boumhidi, Jaouad;Bourhnane, Safae;Mourhir, Asmaa;Merchant, Fatima;Benhaddou, Driss
    • Genomics & Informatics
    • /
    • v.19 no.4
    • /
    • pp.49.1-49.11
    • /
    • 2021
  • Genomic data now constitutes one of the fastest-growing datasets in the world. By 2025 it is expected to become the fourth largest source of Big Data, mandating adequate high-performance computing (HPC) platforms for processing. With the latest unprecedented and unpredictable mutations in severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the research community is in crucial need of ICT tools to process SARS-CoV-2 RNA data, e.g., by classifying (clustering) it and thus assisting in tracking virus mutations and predicting future ones. In this paper, we present an HPC-based SARS-CoV-2 RNA clustering tool. We adopt a data science approach, from data collection through analysis to visualization. In the analysis step, we show how our clustering approach leverages HPC and the longest common subsequence (LCS) algorithm. The approach uses the Hadoop MapReduce programming paradigm and adapts the LCS algorithm in order to efficiently compute the length of the LCS for each pair of SARS-CoV-2 RNA sequences, which are extracted from the U.S. National Center for Biotechnology Information (NCBI) Virus repository. The computed LCS lengths are used to measure the dissimilarities between RNA sequences in order to identify clusters. In addition, we present a comparative study of LCS performance under varying workloads and different numbers of Hadoop worker nodes.
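
The core pairwise computation is the LCS length, which then yields a dissimilarity between two sequences. A minimal single-machine sketch is shown below; the dissimilarity normalization is a common choice, not necessarily the paper's exact formula, and the Hadoop MapReduce distribution of the pair computations is omitted.

```python
def lcs_length(a: str, b: str) -> int:
    """Classic O(len(a)*len(b)) dynamic program for the longest common subsequence length."""
    prev = [0] * (len(b) + 1)
    for ch_a in a:
        curr = [0] * (len(b) + 1)
        for j, ch_b in enumerate(b, start=1):
            curr[j] = prev[j - 1] + 1 if ch_a == ch_b else max(prev[j], curr[j - 1])
        prev = curr
    return prev[-1]

def dissimilarity(a: str, b: str) -> float:
    """1 - LCS/max(len): 0 when one sequence is a subsequence of the other (a common normalization)."""
    return 1.0 - lcs_length(a, b) / max(len(a), len(b))

# Toy RNA fragments standing in for full SARS-CoV-2 genomes.
seqs = {"s1": "AUGGCUACG", "s2": "AUGGCAACG", "s3": "CCGUUAGGA"}
for x in seqs:
    for y in seqs:
        if x < y:
            print(x, y, round(dissimilarity(seqs[x], seqs[y]), 3))
```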

Robust model matching design using normalized left coprime factorization approach

  • Hanajima, Naohiko;Eisaka, Toshio;Yanagita, Yoshiho;Tsuchiya, Takeshi;Tagawa, Ryozaburo
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings (제어로봇시스템학회 학술대회논문집)
    • /
    • 1993.10b
    • /
    • pp.360-365
    • /
    • 1993
  • In this paper, we propose a new design procedure for Robust Model Matching (RMM) using the Normalized Left Coprime Factorization (NLCF) approach. RMM aims at reducing the sensitivity of a given control system, but standard design procedures do not address robust stability. We therefore apply the robust stability condition based on NLCF to the RMM procedure. We first formulate RMM using the robust stability condition of the NLCF approach and then propose the new RMM procedure. The point is that the condition includes the sensitivity measure of the RMM. In the proposed procedure, a cost function is derived from the condition and solved by the H$_{\infty}$ control technique. Finally, we show a design example and check its performance.
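
For a plant P and controller C, the NLCF-based robust stability condition is usually expressed through the margin b(P, C), the reciprocal of the H$_{\infty}$ norm of the closed-loop transfer matrix [I; C](I+PC)^{-1}[I P]. The sketch below approximates this margin for scalar transfer functions by gridding the frequency axis; it illustrates the stability measure only, not the paper's RMM design procedure or an H$_{\infty}$ synthesis, and the plant and controller are hypothetical.

```python
import numpy as np

# Scalar example plant P(s) = 1/(s+1) and constant controller C(s) = 2 (hypothetical).
P = lambda s: 1.0 / (s + 1.0)
C = lambda s: 2.0

def ncf_stability_margin(P, C, w_min=1e-3, w_max=1e3, n=2000):
    """Approximate b(P, C) = 1 / sup_w sigma_max([1; C](1+PC)^(-1)[1 P]) on a log frequency grid."""
    worst = 0.0
    for w in np.logspace(np.log10(w_min), np.log10(w_max), n):
        s = 1j * w
        p, c = P(s), C(s)
        M = (1.0 / (1.0 + p * c)) * np.array([[1.0, p],
                                              [c, c * p]])
        worst = max(worst, np.linalg.svd(M, compute_uv=False)[0])
    return 1.0 / worst

print(f"b(P, C) ~= {ncf_stability_margin(P, C):.3f}")  # larger values mean a larger robustness margin
```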


A new demosaicing method based on trilateral filter approach (세방향 필터 접근법에 기반한 새로운 디모자익싱 기법)

  • Kim, Taekwon;Kim, Kiyun
    • Journal of Korea Society of Digital Industry and Information Management
    • /
    • v.11 no.4
    • /
    • pp.155-164
    • /
    • 2015
  • In this paper, we propose a new color interpolation method based on a trilateral filter approach, which not only preserves the high-frequency components (image edges) while interpolating the missing raw data of a color image (Bayer pattern), but is also immune to image noise and better preserves the detail of the low-frequency components. The method applies a gradient term to the low-frequency components of the image signal, preserving the high-frequency components and the low-frequency detail through a measure of similarity among adjacent pixels. We also apply Gaussian smoothing to the interpolated image data for robustness to noise. We compare conventional demosaicing algorithms and the proposed algorithm on 10 test images in terms of hue MAD, saturation MAD, and CPSNR for objective evaluation, and verify the performance of the proposed algorithm.
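
A trilateral filter extends the bilateral filter by adding a third weighting term. The sketch below computes per-neighbor weights from spatial distance, intensity similarity, and local gradient similarity for a single pixel neighborhood; the specific kernel forms and parameters are hypothetical, as the abstract does not give the paper's exact weighting or its integration into the demosaicing pipeline.

```python
import numpy as np

def trilateral_weights(patch, sigma_s=1.5, sigma_r=10.0, sigma_g=5.0):
    """Weights for the neighbors of the center pixel of a square patch,
    combining spatial, intensity (range), and gradient-similarity terms."""
    h, w = patch.shape
    cy, cx = h // 2, w // 2
    ys, xs = np.mgrid[0:h, 0:w]

    spatial = np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma_s ** 2))
    range_term = np.exp(-((patch - patch[cy, cx]) ** 2) / (2 * sigma_r ** 2))

    gy, gx = np.gradient(patch.astype(float))
    grad_mag = np.hypot(gy, gx)
    grad_term = np.exp(-((grad_mag - grad_mag[cy, cx]) ** 2) / (2 * sigma_g ** 2))

    weights = spatial * range_term * grad_term
    return weights / weights.sum()

# Toy 5x5 luminance patch with a vertical edge; the filtered value is the weighted mean,
# which stays close to the center pixel's side of the edge.
patch = np.array([[10, 10, 10, 200, 200]] * 5, dtype=float)
w = trilateral_weights(patch)
print("filtered center value:", float((w * patch).sum()))
```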