• Title/Summary/Keyword: measure matrix

Search Results: 477

Development of An Instrument to Measure Hope for the Cancer Patients (암환자 간호를 위한 희망 측정도구 개발)

  • 김달숙;이소우
    • Journal of Korean Academy of Nursing
    • /
    • v.28 no.2
    • /
    • pp.441-456
    • /
    • 1998
  • The purpose of this study was to develop a reliable and valid instrument to measure hope in Korean cancer patients. This Hope Scale (Kim & Lee Hope Scale; KLHS) was developed from a comprehensive review of the literature, covering both the critical universal attributes of basic (generalized) and specific hope and the particular characteristics that vary with culture and situation. Initially, 60 items were generated from three sources: 36 items from the Q-sample used in Kim's 1992 study, 21 representative statements from the remaining Q-population of that study, and 3 items related to a newly discovered category (death and dying) that emerged from a new qualitative study of 20 cancer patients using 10 open-ended questions. First, 3 items were eliminated on the critique of content-validity experts (highly experienced nurses and nursing professors). Then 4 items were eliminated in consideration of corrected item-total correlation coefficients and the theoretical framework of this study. After that, 14 items were eliminated by comparing, within each factor, the two or three items judged by the research team to carry the same meaning, using factor loadings and communalities. The final Hope Scale consists of 39 items. Psychometric evaluation was conducted on 492 adults (104 cancer patients and 388 adults who imagined being cancer patients), ranging in age from 18 to 76. The results revealed high internal consistency (Cronbach's alpha = .9351). Principal component factor analysis with varimax rotation yielded 8 factors with eigenvalues above 1.0. Referring to the eigenvalues, the percentage of variance explained (>60%), the reproduced correlation matrix, and our theoretical framework, we decided that the eight-factor solution best represented the hope dimensions.
The eight factors were "confidence in possibility of cure", "sense of internal satisfaction", "being in communion", "meaning of life", "Korean hope perspectives", "belief in God", "self-confidence", and "self-worth". Among these, "confidence in possibility of cure", "sense of internal satisfaction", and "Korean hope perspectives" were identified as hope dimensions not found in the Nowotny Hope Scale or the Herth Hope Scale. There was a significant negative correlation (r = -.4736) between this hope scale and the Beck Hopelessness Scale (BHS), and a significant positive correlation (r = .3685) between this hope scale and the Life Orientation Test (LOT), which indicate convergent and discriminant validity. Hope scores ranged from 71 to 244, with a mean of 171.97 (SD = 28.16).
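The internal-consistency statistic reported above (Cronbach's alpha = .9351) is straightforward to compute from an item-response matrix. A minimal sketch follows; the 100 x 39 score matrix is synthetic data invented for illustration, not the study's responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic example: 100 respondents answering 39 items that all load
# on one shared latent "hope" factor, plus item-level noise
rng = np.random.default_rng(0)
trait = rng.normal(size=(100, 1))
scores = trait + 0.5 * rng.normal(size=(100, 39))
alpha = cronbach_alpha(scores)
print(round(alpha, 3))  # high alpha, since all items share one factor
```

Because every synthetic item reflects the same latent trait, alpha comes out close to 1, mirroring the high coefficient the study reports for its 39-item scale.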


A Study on the Compression and Major Pattern Extraction Method of Origin-Destination Data with Principal Component Analysis (주성분분석을 이용한 기종점 데이터의 압축 및 주요 패턴 도출에 관한 연구)

  • Kim, Jeongyun;Tak, Sehyun;Yoon, Jinwon;Yeo, Hwasoo
    • The Journal of The Korea Institute of Intelligent Transport Systems
    • /
    • v.19 no.4
    • /
    • pp.81-99
    • /
    • 2020
  • Origin-destination (OD) data have been collected and utilized for demand analysis and service design in various fields such as public transportation and traffic operation. As the utilization of big data becomes important, there is a growing need to store raw origin-destination data for big data analysis. However, it is not practical to store and analyze the raw data over long periods, since the data size grows with the square of the number of collection points. To overcome this storage limitation and enable long-period pattern analysis, this study proposes a methodology for compressing origin-destination data and analyzing the compressed data. The proposed methodology is applied to public transit data from Sejong and Seoul. We first measure the reconstruction error and the data size for each truncated matrix. Then, to determine the range of principal components that excludes random fluctuation, we measure the level of regularity based on the covariance coefficients of the demand data reconstructed from each range of principal components. Based on the distribution of the covariance coefficients, we identify the range of principal components that covers the regular demand: 1~60 for Sejong and 1~80 for Seoul.
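The compress-then-reconstruct step described above can be sketched with a truncated principal-component decomposition. The toy demand matrix below (days x OD pairs, three underlying regular patterns plus noise) is invented for illustration, not the Sejong/Seoul data:

```python
import numpy as np

# Toy OD demand: 50 days x 200 OD pairs, driven by 3 regular patterns + noise
rng = np.random.default_rng(1)
patterns = rng.normal(size=(3, 200))
weights = rng.normal(size=(50, 3))
demand = weights @ patterns + 0.1 * rng.normal(size=(50, 200))

# PCA via SVD of the mean-centered matrix
mean = demand.mean(axis=0)
U, s, Vt = np.linalg.svd(demand - mean, full_matrices=False)

k = 3                                  # keep the leading principal components
compressed = U[:, :k] * s[:k]          # scores: 50 x k instead of 50 x 200
reconstructed = compressed @ Vt[:k] + mean

# Reconstruction error drops sharply once k covers the regular patterns
err = np.linalg.norm(demand - reconstructed) / np.linalg.norm(demand)
print(f"relative reconstruction error with k={k}: {err:.3f}")
```

Storing `compressed` (plus the shared components `Vt[:k]` and `mean`) in place of the raw matrix is what makes long-period archiving tractable; sweeping `k` and watching the reconstruction error is the truncation analysis the abstract describes.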

Development of Integrated Accessibility Measurement Algorithm for the Seoul Metropolitan Public Transportation System (서울 대도시권 대중교통체계의 통합 시간거리 접근성 산출 알고리즘 개발)

  • Park, Jong Soo;Lee, Keumsook
    • Journal of the Korean Regional Science Association
    • /
    • v.33 no.1
    • /
    • pp.29-41
    • /
    • 2017
  • This study proposes an integrated accessibility measurement algorithm, applies it to the Seoul Metropolitan public transportation system consisting of bus and subway networks, and analyzes the results. We construct a public transportation network graph linking the bus and subway networks and take the time distance as the link weight in the graph. We develop a time-distance algorithm to measure the time distance between each pair of transit stations based on the T-card transaction database. The average travel time between nodes is computed via the shortest-path algorithm applied to the time-distance matrix, which is obtained from the average speed of each transit route in the T-card transaction database; the walking time between nodes is also taken into account where walking is involved. The integrated time-distance accessibility of each node in the Seoul Metropolitan public transportation system is computed from the T-card data of 2013. We compare the results with those of the bus-only and subway-only systems and analyze the spatial patterns. This study is the first attempt to measure integrated time-distance accessibility for the Seoul Metropolitan public transportation system, which consists of 16,277 nodes with 600 bus routes and 16 subway lines.
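The shortest-path step on a time-weighted transit graph can be sketched with Dijkstra's algorithm. The five-link mini-network and its travel times below are invented for illustration (the study's actual graph has 16,277 nodes):

```python
import heapq

def time_distances(graph, source):
    """Dijkstra: minimum travel time (minutes) from source to every node.

    graph: dict mapping node -> list of (neighbor, link_time) pairs, where
    link_time is in-vehicle or walking time on that link.
    """
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical mini-network: bus links A-B-C, subway links A-D-C, transfer walk B-D
graph = {
    "A": [("B", 10), ("D", 4)],
    "B": [("C", 12), ("D", 3)],
    "D": [("C", 9)],
}
dist = time_distances(graph, "A")
print(dist["C"])  # 13.0: A -> D -> C beats the bus-only path A -> B -> C (22)
```

A node's integrated time-distance accessibility can then be summarized as the average of such shortest travel times over all other nodes, in line with the measure the abstract describes.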

Applications of Regularized Dequantizers for Compressed Images (압축된 영상에서 정규화 된 역양자화기의 응용)

  • Lee, Gun-Ho;Sung, Ju-Seung;Song, Moon-Ho
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.39 no.5
    • /
    • pp.11-20
    • /
    • 2002
  • Based on regularization principles, we propose a new dequantization scheme for DCT-based transform coding that reduces blocking artifacts and minimizes quantization error. Conventional image dequantization simply multiplies the received quantized DCT coefficients by the quantization matrix, thereby accepting quantization noise as large as half the quantizer step size for each DCT coefficient (in the DCT domain). Our approach rests on two basic constraints: the quantization error is bounded to ±(quantizer spacing/2), and there should be no high-frequency components corresponding to discontinuities across the block boundaries of the image. Through regularization, the proposed dequantization scheme sharply reduces blocking artifacts in decoded images while guaranteeing that the dequantized DCT coefficients remain consistent with the received quantized coefficients. The proposed scheme is evaluated against the standard JPEG, MPEG-1, and H.263 (with Annex J deblocking filter) decoding processes. The experimental results show visual improvements as well as numerical improvements in terms of the peak signal-to-noise ratio (PSNR) and a blockiness measure (BM) defined in the paper.
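The quantization constraint above (error bounded to ±Q/2) can be sketched as a projection step: form a regularized estimate, then clip each DCT coefficient back into its admissible quantization cell. The 8x8 example and the simple frequency-damping prior below are simplified assumptions for illustration, not the paper's full regularization:

```python
import numpy as np

rng = np.random.default_rng(2)
Q = np.full((8, 8), 16.0)              # hypothetical flat quantization matrix
coeffs = rng.normal(scale=40.0, size=(8, 8))
q = np.round(coeffs / Q)               # quantized indices sent by the encoder

# Conventional dequantization: multiply indices by the quantization matrix
naive = q * Q

# Regularized dequantization (sketch): damp high-frequency coefficients to
# suppress block-boundary discontinuities, then project the result back into
# the admissible cell [qQ - Q/2, qQ + Q/2]
u, v = np.meshgrid(np.arange(8), np.arange(8), indexing="ij")
damp = 1.0 / (1.0 + 0.05 * (u + v))    # simple smoothness prior in DCT domain
regularized = np.clip(naive * damp, naive - Q / 2, naive + Q / 2)

# The projection guarantees the result stays within half a quantizer step
assert np.all(np.abs(regularized - naive) <= Q / 2 + 1e-9)
```

The final clip is the key consistency step: whatever smoothing prior is used, the output still decodes to the same quantized indices the encoder produced.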

A Study of Quality Metrics Process Design Methodology for Field Application Encryption under Network Security Environment (네트워크 보안 환경에서의 현장적용 중심 암호품질 만족도 평가 메트릭스 설계 프로세스)

  • Noh, SiChoon;Kim, Jeom goo
    • Convergence Security Journal
    • /
    • v.15 no.5
    • /
    • pp.29-35
    • /
    • 2015
  • Network security encryption is of two types: the first is point-to-point encryption, and the second is link encryption. Security quality attributes are the system security quality requirements in a networked environment; quality attributes must be observable and measurable. If the quality requirements can be presented as exact figures, they can be defined by setting specific quality objectives. Among the quality attributes, functional requirements are requirements for the service functions that can be obtained through encryption, while non-functional requirements are requirements on the service quality that can be obtained through encryption. The encryption quality evaluation system proposed in this study derives these two groups of functional and non-functional requirements. When the evaluation indices in the same category are calculated, an associated quality measure for each aspect is created. The quality matrix uses two-factor analysis of the evaluation for the associated quality measurements. The quality requirements are calculated from the two groups of functional and non-functional requirements, and the results are obtained by analyzing the trend of the average assessment values. Used in this way, network security encryption can be configured on the basis of quality management.

Evaluation of the Impact on Manufacturing Temperature and Time in the Production Process of Bio-composites (바이오복합재료 제조 공정시 제조온도 및 시간에 의한 영향 평가)

  • Park, Sang-Yong;Han, Gyu-Seong;Kim, Hee-Soo;Yang, Han-Seung;Kim, Hyun-Joong
    • Journal of the Korean Wood Science and Technology
    • /
    • v.33 no.1 s.129
    • /
    • pp.29-37
    • /
    • 2005
  • The main objective of this research was to evaluate the effects of manufacturing temperature and time on the thermoplastic matrix polymer and the reinforcing filler, rice husk flour (RHF), when bio-composites are manufactured. To evaluate the effect of manufacturing temperature on the rice husk flour, the flour was held for 10 minutes to 2 hours at $220^{\circ}C$ and then compounded with polypropylene (PP) and low-density polyethylene (LDPE) to manufacture the bio-composites, whose mechanical properties were measured. As the holding time at $220^{\circ}C$ increased, the tensile and impact strength decreased due to thermal degradation of the main components of the rice husk flour. Thermogravimetric analysis (TGA) was used to measure the weight loss caused by the actual manufacturing temperature: the thermoplastic polymers showed scarcely any weight change, whereas the rice husk flour and the bio-composites showed an increasing rate of weight loss over the 2-hour period at a constant $220^{\circ}C$. Therefore, proper settings of manufacturing temperature and time are critically important to prevent the loss of mechanical properties induced by high-temperature manufacturing.

A CF-based Health Functional Recommender System using Extended User Similarity Measure (확장된 사용자 유사도를 이용한 CF-기반 건강기능식품 추천 시스템)

  • Sein Hong;Euiju Jeong;Jaekyeong Kim
    • Journal of Intelligence and Information Systems
    • /
    • v.29 no.3
    • /
    • pp.1-17
    • /
    • 2023
  • With the recent rapid development of ICT (Information and Communication Technology) and the popularization of digital devices, the size of the online market continues to grow. As a result, we live in a flood of information, and customers face information-overload problems that require much time and money to resolve when selecting products. A personalized recommender system has therefore become an essential methodology for addressing such issues. Collaborative Filtering (CF) is the most widely used recommender system. Traditional recommender systems mainly utilize quantitative data such as rating values, resulting in poor recommendation accuracy, since quantitative data cannot fully reflect user preferences. To solve this problem, studies that reflect qualitative data, such as review contents, are actively being conducted. In this study, text mining was used to quantify user review contents. General CF consists of the following three steps: user-item matrix generation, Top-N neighborhood group search, and Top-K recommendation list generation. We propose a recommendation algorithm that applies an extended similarity measure, which utilizes quantified review contents in addition to user rating values. After calculating review similarity by applying the TF-IDF, Word2Vec, and Doc2Vec techniques to the review contents, the extended similarity is created by combining the user rating similarity with the review similarity. To verify this, we used user ratings and review data from the "Health and Personal Care" category of the e-commerce site Amazon. The proposed recommendation model using the extended similarity measure showed superior performance to the traditional model using only rating-based similarity. In addition, among the text mining techniques, the similarity obtained with TF-IDF showed the best performance when used in the neighborhood group search and recommendation list generation steps.
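The extended-similarity idea (combining rating similarity with quantified review similarity) can be sketched with a hand-rolled TF-IDF and cosine similarity. The two users, their ratings and reviews, and the mixing weight `alpha` below are all invented for illustration:

```python
import math
from collections import Counter

def tfidf(docs):
    """Return a tf-idf weight dict (term -> weight) per document."""
    n = len(docs)
    toks = [d.lower().split() for d in docs]
    df = Counter(t for ts in toks for t in set(ts))
    idf = {t: math.log((1 + n) / (1 + df[t])) + 1 for t in df}  # smoothed idf
    return [{t: c * idf[t] for t, c in Counter(ts).items()} for ts in toks]

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rating_similarity(r1, r2):
    """Cosine similarity over co-rated items (rating dicts: item -> score)."""
    common = [i for i in r1 if i in r2]
    if not common:
        return 0.0
    return cosine({i: r1[i] for i in common}, {i: r2[i] for i in common})

# Hypothetical users: rating dicts plus concatenated review texts
ratings_u, ratings_v = {"A": 5, "B": 3, "C": 4}, {"A": 4, "B": 2, "D": 5}
review_u = "great vitamin helped my energy and sleep"
review_v = "vitamin improved energy but poor sleep"

vec_u, vec_v = tfidf([review_u, review_v])
alpha = 0.5  # assumed mixing weight between rating and review similarity
extended = alpha * rating_similarity(ratings_u, ratings_v) \
         + (1 - alpha) * cosine(vec_u, vec_v)
print(round(extended, 3))
```

In the study's pipeline this `extended` value would replace the plain rating similarity when searching for the Top-N neighborhood group; the review term weights could equally come from Word2Vec or Doc2Vec embeddings instead of TF-IDF.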

Generating Phylogenetic Tree of Homogeneous Source Code in a Plagiarism Detection System

  • Ji, Jeong-Hoon;Park, Su-Hyun;Woo, Gyun;Cho, Hwan-Gue
    • International Journal of Control, Automation, and Systems
    • /
    • v.6 no.6
    • /
    • pp.809-817
    • /
    • 2008
  • Program plagiarism is widespread due to intelligent software tools and the global Internet environment. Consequently, the detection of plagiarized source code and software is becoming important, especially in the academic field. Though numerous studies have been reported on detecting plagiarized pairs of codes, we cannot find any profound work on understanding the underlying mechanisms of plagiarism. In this paper, we study the evolutionary process of source codes, regarding the plagiarism procedure as evolutionary steps of source codes. The final goal of our paper is to reconstruct a tree depicting the evolution process of the source code. To this end, we extend a well-known bioinformatics approach, local alignment, to detect regions of similar code with an adaptive scoring matrix. The asymmetric code similarity based on local alignment can be considered one of the main contributions of this paper. The phylogenetic tree, or evolution tree, of source codes can be reconstructed using this asymmetric measure. To show the effectiveness and efficiency of the phylogeny construction algorithm, we conducted experiments with more than 100 real source codes obtained from the East-Asia ICPC (International Collegiate Programming Contest). Our experiments showed that the proposed algorithm is quite successful in reconstructing the evolutionary direction, which enables us to identify plagiarized codes more accurately and reliably. The phylogeny construction algorithm is also successfully implemented on top of the plagiarism detection module of an automatic program evaluation system.
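The local-alignment step can be sketched with a token-level Smith-Waterman. The scoring values and the asymmetric normalization below (alignment score divided by the length of one code, so sim(a, b) generally differs from sim(b, a)) are simplified assumptions, not the paper's adaptive scoring matrix:

```python
def local_align(a, b, match=2, mismatch=-1, gap=-1):
    """Smith-Waterman local alignment score over two token sequences."""
    m, n = len(a), len(b)
    H = [[0] * (n + 1) for _ in range(m + 1)]
    best = 0
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(0, H[i - 1][j - 1] + s,
                          H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

def asym_similarity(a, b, match=2):
    """Score normalized by |a| only, so sim(a, b) != sim(b, a) in general."""
    return local_align(a, b) / (match * len(a)) if a else 0.0

# Hypothetical token streams from two source files (the second copies the
# first and adds extra code, mimicking a plagiarized descendant)
original = "for i in range n : total += x [ i ]".split()
suspect = ("import sys " + "for i in range n : total += x [ i ] print total").split()

print(round(asym_similarity(original, suspect), 2))  # 1.0: all of `original` found in `suspect`
print(asym_similarity(suspect, original) < 1.0)      # extra code lowers the reverse score
```

This directionality is what lets a phylogeny algorithm orient edges: the code that is fully contained in the other is the likelier ancestor.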

Heat or radiofrequency plasma glow discharge treatment of a titanium alloy stimulates osteoblast gene expression in the MC3T3 osteoprogenitor cell line

  • Rapuano, Bruce E.;Hackshaw, Kyle;Macdonald, Daniel E.
    • Journal of Periodontal and Implant Science
    • /
    • v.42 no.3
    • /
    • pp.95-104
    • /
    • 2012
  • Purpose: The purpose of this study was to determine whether increasing the Ti6Al4V surface oxide negative charge through heat ($600^{\circ}C$) or radiofrequency plasma glow discharge (RFGD) pretreatment, with or without a subsequent coating with fibronectin, stimulated osteoblast gene marker expression in the MC3T3 osteoprogenitor cell line. Methods: Quantitative real-time polymerase chain reaction was used to measure changes over time in the mRNA levels for osteoblast gene markers, including alkaline phosphatase, bone sialoprotein, collagen type I (${\alpha}1$), osteocalcin, osteopontin and parathyroid hormone-related peptide (PTH-rP), and the osteoblast precursor genes Runx2 and osterix. Results: Osteoprogenitors began to differentiate earlier on disks that were pretreated with heat or RFGD. The pretreatments increased gene marker expression in the absence of a fibronectin coating. However, pretreatments increased osteoblast gene expression for fibronectin-coated disks more than uncoated disks, suggesting a surface oxide-mediated specific enhancement of fibronectin's bioactivity. Heat pretreatment had greater effects on the mRNA expression of genes for PTH-rP, alkaline phosphatase and osteocalcin while RFGD pretreatment had greater effects on osteopontin and bone sialoprotein gene expression. Conclusions: The results suggest that heat and RFGD pretreatments of the Ti6Al4V surface oxide stimulated osteoblast differentiation through an enhancement of (a) coated fibronectin's bioactivity and (b) the bioactivities of other serum or matrix proteins. The quantitative differences in the effects of the two pretreatments on osteoblast gene marker expression may have arisen from the unique physico-chemical characteristics of each resultant oxide surface. Therefore, engineering the Ti6Al4V surface oxide to become more negatively charged can be used to accelerate osteoblast differentiation through fibronectin-dependent and independent mechanisms.

Dual-Channel Acoustic Event Detection in Multisource Environments Using Nonnegative Tensor Factorization and Hidden Markov Model (비음수 텐서 분해 및 은닉 마코프 모델을 이용한 다음향 환경에서의 이중 채널 음향 사건 검출)

  • Jeon, Kwang Myung;Kim, Hong Kook
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.54 no.1
    • /
    • pp.121-128
    • /
    • 2017
  • In this paper, we propose a dual-channel acoustic event detection (AED) method using nonnegative tensor factorization (NTF) and a hidden Markov model (HMM) in order to improve the detection accuracy of AED in multisource environments. The proposed method first detects multiple acoustic events by utilizing channel gains obtained from the NTF technique applied to dual-channel input signals. After that, an HMM-based likelihood ratio test is carried out to verify the detected events using the channel gains. The detection accuracy of the proposed method is measured by F-measures under 9 different multisource conditions and compared with those of conventional AED methods based on the Gaussian mixture model and nonnegative matrix factorization. The experiments show that the proposed method outperforms the conventional methods under all the multisource conditions.
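The F-measure used above to score detection accuracy is the harmonic mean of precision and recall over detected events; the event counts below are invented for illustration:

```python
def f_measure(true_positives: int, false_positives: int, false_negatives: int) -> float:
    """Harmonic mean of precision and recall for event detection."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return 2 * precision * recall / (precision + recall)

# Hypothetical tally for one multisource test condition:
# 80 events correctly detected, 10 spurious detections, 20 missed events
score = f_measure(80, 10, 20)
print(round(score, 3))  # → 0.842: precision 0.889, recall 0.8
```

Computing one such score per multisource condition (9 in the study) gives the per-condition comparison against the GMM- and NMF-based baselines.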