• Title/Summary/Keyword: Bartlett's test

29 search results

Score Tests for Overdispersion

  • Kim, Choong-Rak;Jeong, Mee-Seon;Yang, Mee-Yeong
    • Journal of the Korean Statistical Society, v.23 no.1, pp.207-216, 1994
  • Count data are often overdispersed, and an appropriate test for the existence of overdispersion is necessary. In this paper we derive a score test based on the extended quasi-likelihood and the pseudo-likelihood after adjusting for the Bartlett factor. We also compare it with the Levene (1960)-type F test suggested by Ganio and Schafer (1992).
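
The authors' statistic comes from the extended quasi-likelihood with a Bartlett-factor adjustment, which is not reproduced here. As a rough, generic illustration of a score test for overdispersion, the sketch below implements the standard Dean-Lawless statistic for a Poisson GLM (an assumption of this example, not the paper's adjusted test):

```python
import numpy as np
from scipy.stats import norm
import statsmodels.api as sm

def poisson_overdispersion_score_test(y, X):
    """Dean-Lawless score test: H0 Var(y) = mu vs. Var(y) = mu(1 + tau*mu)."""
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    mu = fit.mu                                   # fitted Poisson means
    stat = ((y - mu) ** 2 - y).sum() / np.sqrt(2.0 * (mu ** 2).sum())
    return stat, norm.sf(stat)                    # one-sided p-value, N(0,1) under H0

# Toy check on deliberately overdispersed (negative binomial) counts
rng = np.random.default_rng(1)
X = sm.add_constant(rng.normal(size=200))
y = rng.negative_binomial(2, 0.4, size=200)
print(poisson_overdispersion_score_test(y, X))
```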


A Portmanteau Test Based on the Discrete Cosine Transform (이산코사인변환을 기반으로 한 포트맨토 검정)

  • Oh, Sung-Un;Cho, Hye-Min;Yeo, In-Kwon
    • The Korean Journal of Applied Statistics, v.20 no.2, pp.323-332, 2007
  • We present a new type of portmanteau test in the frequency domain, derived from the discrete cosine transform (DCT). For a stationary time series, the DCT coefficients are asymptotically independent and their variances are linear combinations of the autocovariances. For white noise, the covariance matrix of the DCT coefficients is diagonal, with every diagonal element equal to the variance of the series. A simple way to test independence is therefore to divide the DCT coefficients into two or three parts and compare their sample variances. We also test the slope in a linear regression model whose response variables are the absolute values or squares of the coefficients. Simulation results show that the proposed tests have much higher power than the Ljung-Box test in most of our experiments.
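
As a hedged sketch of the two ideas just described (not the authors' exact statistics), one can split the DCT coefficients and compare variances with an F-test, or regress the squared coefficients on their index and test whether the slope is zero:

```python
import numpy as np
from scipy.fft import dct
from scipy import stats

def dct_variance_test(x):
    """Split the DCT coefficients in half and compare sample variances (two-sided F-test)."""
    c = dct(np.asarray(x, float), norm='ortho')[1:]   # drop the mean (DC) term
    half = len(c) // 2
    f = np.var(c[:half], ddof=1) / np.var(c[half:], ddof=1)
    df1, df2 = half - 1, len(c) - half - 1
    p = 2 * min(stats.f.sf(f, df1, df2), stats.f.cdf(f, df1, df2))
    return f, p

def dct_slope_test(x):
    """Regress squared DCT coefficients on their index; under white noise the slope is zero."""
    c = dct(np.asarray(x, float), norm='ortho')[1:]
    res = stats.linregress(np.arange(len(c)), c ** 2)
    return res.slope, res.pvalue
```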

A comparison of tests for homoscedasticity using simulation and empirical data

  • Anastasios Katsileros;Nikolaos Antonetsis;Paschalis Mouzaidis;Eleni Tani;Penelope J. Bebeli;Alex Karagrigoriou
    • Communications for Statistical Applications and Methods, v.31 no.1, pp.1-35, 2024
  • The assumption of homoscedasticity is one of the most crucial assumptions for many parametric tests used in the biological sciences. The aim of this paper is to compare the empirical probability of type I error and the power of ten parametric and two non-parametric tests for homoscedasticity, using simulations under different distributions, numbers of groups, numbers of samples per group, variance ratios and significance levels, as well as empirical data from an agricultural experiment. According to the findings of the simulation study, when there is no violation of the assumption of normality and the groups have equal variances and equal numbers of samples, the Bhandary-Dai, Cochran's C, Hartley's Fmax, Levene (trimmed mean) and Bartlett tests are considered robust. The Levene (absolute and square deviations) tests show a high probability of type I error with small sample sizes, and this probability increases as the number of groups rises. When the data groups display a non-normal distribution, researchers should use the Levene (trimmed mean), O'Brien and Brown-Forsythe tests. On the other hand, if the assumption of normality is not violated but diagnostic plots indicate unequal variances between groups, researchers are advised to use the Bartlett, Z-variance, Bhandary-Dai and Levene (trimmed mean) tests. Assessing the tests considered, the most well-rounded choice is Levene's test (trimmed mean), which provides satisfactory type I error control and relatively high power. For the scenarios considered, the two non-parametric tests are not recommended. In conclusion, it is suggested to check for normality first and to consider the number of samples per group before choosing the most appropriate test for homoscedasticity.
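
Several of the tests compared here have standard SciPy implementations; a minimal illustration with toy data (not the paper's simulation code):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
groups = [rng.normal(0.0, sd, size=30) for sd in (1.0, 1.0, 2.0)]

# Bartlett's test: powerful under normality, sensitive to non-normality
print(stats.bartlett(*groups))

# Brown-Forsythe: Levene's test with median centering
print(stats.levene(*groups, center='median'))

# Levene's test with 10%-trimmed means (the paper's most well-rounded choice)
print(stats.levene(*groups, center='trimmed', proportiontocut=0.1))
```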

A Study on Evaluation Model for Usability of Research Data Service (연구데이터 서비스의 유용성 평가 모형 연구)

  • Park, Jin Ho;Ko, Young Man;Kim, Hyun Soo
    • Journal of the Korean Society for Information Management, v.36 no.4, pp.129-159, 2019
  • The purpose of this study is to develop an evaluation model for the usability of research data services, from two angles: the usefulness of the research data service itself and usability based on research data use experience. First, various cases of evaluating the usability of data services are examined, and four rating scales and 20 measuring indicators for research data services are derived through comparative analysis. To verify the validity and reliability of the rating scales and measuring indicators, the study surveyed 164 potential research data users. The KMO measure and Bartlett's test were used for the validity test, and principal component analysis with Varimax rotation was used for component analysis of the measuring indicators. The results show that the four intrinsic rating scales satisfy the KMO and Bartlett validity criteria; a single component was extracted, which verifies the validity of the measuring indicators of the current rating scale. However, the analysis of the 12 user-experience-based measuring indicators identified two components, classified as rating scales of utilization level and participation level, respectively. Cronbach's alpha was 0.6 or higher for all six rating scales.
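
The KMO measure and Bartlett's test of sphericity used here are available in the third-party factor_analyzer package; a minimal sketch with hypothetical data (real survey responses would replace the random matrix):

```python
import numpy as np
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

# Hypothetical stand-in: 164 respondents x 20 indicators
rng = np.random.default_rng(0)
responses = pd.DataFrame(rng.normal(size=(164, 20)))

chi2, p = calculate_bartlett_sphericity(responses)  # H0: correlation matrix is the identity
kmo_per_item, kmo_total = calculate_kmo(responses)
print(f"Bartlett chi2={chi2:.1f} (p={p:.4f}), overall KMO={kmo_total:.2f}")
```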

Validity and Reliability of Adversity Quotient Profile for Measuring Overcoming of Adversity among Nurses in Korea (역경지수(Adversity Quotient Profile) 도구의 타당성 및 신뢰성 검증 -종합병원 간호사 중심으로-)

  • An, Ji-Yeon;Woo, Hae-Young;Song, Jung-Hee;Kim, Hye-Jin
    • Journal of the Korea Academia-Industrial cooperation Society, v.15 no.4, pp.2285-2294, 2014
  • Stoltz's Adversity Quotient Profile (AQP) is a tool for estimating how well individuals overcome adversity. The purpose of this study is to test the reliability and validity of the AQP. The participants were 297 nurses at a tertiary hospital in Korea, and data were collected from August to October 2013. The Resilience Scale (RS) was used as the criterion for the AQP. Cronbach's alpha, item-total correlation, exploratory factor analysis, correlation analysis, and ANOVA were used. Factor analysis yielded four factors explaining 56.256% of the total variance: factor 1 (Control, 10.7%), factor 2 (Ownership, 13.7%), factor 3 (Reach, 17.4%) and factor 4 (Endurance, 14.9%). The internal consistency was acceptable (Cronbach's alpha=.84). All four factors were positively correlated with the RS. The AQP is a reliable and valid instrument for measuring the adversity quotient of nurses in Korea.
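
Cronbach's alpha, the reliability coefficient reported here, has a simple closed form; a minimal from-scratch sketch:

```python
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```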

Genetic Variation in the Natural Populations of Abies holophylla Max. Based on RAPD Analysis (RAPD 분석(分析)에 의한 전나무 천연집단(天然集團)의 유전변이(遺傳變異))

  • Kim, In Sik;Hyun, Jung Oh
    • Journal of Korean Society of Forest Science, v.88 no.3, pp.408-418, 1999
  • On the basis of RAPD analysis, the genetic diversity and structure of natural populations of Abies holophylla were estimated by the AMOVA procedure. The average percentage of polymorphic markers was 71.9%. Most variation existed among individuals within populations (80.2%). Genetic differentiation among populations (${\Phi}_{ST}$) was 0.198. When the populations were grouped into two regions (i.e., the Taebaek and Sobaek Mountain Regions), 8.5% of the total genetic variation was explained by regional differences. The heterogeneity of molecular variance among populations was investigated with Bartlett's test, which revealed that the populations of Mt. Taebaek and Mt. Gariwang were more heterogeneous. Generally, the populations of the Taebaek Mountain Region were more heterogeneous than those of the Sobaek Mountain Region. Finally, the applicability of AMOVA to population genetic studies was discussed in comparison with other widely used measures of genetic differentiation.
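
For reference, Bartlett's statistic for homogeneity of variances across k groups has a standard closed form; the sketch below computes it from scratch (it matches scipy.stats.bartlett) for per-group samples:

```python
import numpy as np
from scipy.stats import chi2

def bartlett_test(groups):
    """Bartlett's chi-squared test of H0: equal variances across groups."""
    n = np.array([len(g) for g in groups])
    s2 = np.array([np.var(g, ddof=1) for g in groups])
    k, N = len(groups), n.sum()
    sp2 = ((n - 1) * s2).sum() / (N - k)                     # pooled variance
    stat = (N - k) * np.log(sp2) - ((n - 1) * np.log(s2)).sum()
    C = 1 + (np.sum(1 / (n - 1)) - 1 / (N - k)) / (3 * (k - 1))
    stat /= C                                                # Bartlett's correction
    return stat, chi2.sf(stat, df=k - 1)
```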


Validity and Reliability of Daily Life Stress Scale for College Students (대학생 일상생활 스트레스 측정 도구의 타당도와 신뢰도)

  • Park, Jeong-Hye;Kang, Se-Won
    • Journal of the Korea Academia-Industrial cooperation Society, v.22 no.5, pp.423-432, 2021
  • This study developed a scale to measure college students' perceived stress levels for adverse events that may occur in their daily lives, and confirmed its validity and reliability. The scale was developed in accordance with DeVellis' scale development guidelines. Data were collected from 1,242 students at a local university in 2020. The collected sample was randomly split into two groups. Group A (N=620) was used for an initial examination of item performance, exploratory factor analysis, the multitrait-multimethod matrix, criterion-related validity, and item reliability; group B (N=622) was used for confirmatory factor analysis and a reliability re-test. As a result, a final scale of 33 items and eight factors was developed. The KMO value was 0.92 and Bartlett's test of sphericity was significant (χ²=12532.42, p<.001); eight factors had initial eigenvalues of 1.0 or higher; the cumulative variance explained was 71.5%, and the communality of each item was 0.56 or higher. The reliability of the overall scale was Cronbach's alpha 0.94, and the sub-factors' alphas ranged from 0.78 to 0.90. These findings suggest that the scale developed in this study would be useful for measuring college students' daily life stress.
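
The communalities and cumulative explained variance reported here can be read directly off a factor loading matrix; a minimal sketch, assuming standardized items and orthogonally rotated factors:

```python
import numpy as np

def communality_summary(loadings):
    """loadings: items x factors matrix of factor loadings."""
    L = np.asarray(loadings, dtype=float)
    h2 = (L ** 2).sum(axis=1)              # communality of each item
    cum_var = (L ** 2).sum() / L.shape[0]  # cumulative proportion of variance explained
    return h2, cum_var
```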

Factor Analysis for Exploratory Research in the Distribution Science Field (유통과학분야에서 탐색적 연구를 위한 요인분석)

  • Yim, Myung-Seong
    • Journal of Distribution Science, v.13 no.9, pp.103-112, 2015
  • Purpose - This paper aims to provide a step-by-step approach to factor analytic procedures, such as principal component analysis (PCA) and exploratory factor analysis (EFA), and to offer a guideline for factor analysis. Authors have argued that the results of PCA and EFA are substantially similar, and have asserted that PCA is the more appropriate technique because it produces easily interpreted results that are likely to be the basis of better decisions. For these reasons, many researchers have used PCA instead of EFA. However, these techniques are clearly different: PCA should be used for data reduction, whereas EFA is tailored to identify an underlying factor structure - the set of latent factors that cause the measured variables to covary. Thus, a guideline and procedures for factor analysis are needed; to date, the two techniques have been indiscriminately misused. Research design, data, and methodology - This research conducted a literature review, summarized the meaningful and consistent arguments, and drew up guidelines and suggested procedures for rigorous EFA. Results - PCA can be used instead of common factor analysis when all measured variables have high communality; however, common factor analysis is recommended for EFA. First, researchers should evaluate the sample size and check for sampling adequacy before conducting factor analysis; if these conditions are not satisfied, the subsequent steps cannot proceed. The sample size must be at least 100, with communality above 0.5 and a minimum subject-to-item ratio of at least 5:1, with a minimum of five items in the EFA. Next, Bartlett's sphericity test and the Kaiser-Meyer-Olkin (KMO) measure should be assessed for sampling adequacy: the chi-square value for Bartlett's test should be significant, and a KMO above 0.8 is recommended. The factor analysis itself is composed of three stages. The first stage determines the extraction technique; generally, maximum likelihood (ML) or principal axis factoring (PAF) gives the best results, and the choice between the two hinges on data normality (ML requires normally distributed data; PAF does not). The second stage determines the number of factors to retain, which is best done by combining three methods: eigenvalues greater than 1.0, the scree plot test, and the variance extracted. The last stage is to select one of two rotation methods, orthogonal or oblique: if theory suggests the factors are correlated with each other, the oblique method should be selected because it allows correlated factors; otherwise, the orthogonal method can be used. Conclusions - Recommendations are offered for the best factor analytic practice for empirical research.
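
A condensed sketch of this workflow, assuming the third-party factor_analyzer package (thresholds and data hypothetical):

```python
import numpy as np
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

def run_efa(data, oblique=True):
    """data: respondents x items array; follows the staged checks described above."""
    # Sampling adequacy: Bartlett's test significant, KMO > 0.8
    chi2, p = calculate_bartlett_sphericity(data)
    _, kmo = calculate_kmo(data)
    if p >= 0.05 or kmo <= 0.8:
        raise ValueError("sampling adequacy not met; do not proceed with EFA")

    # Number of factors: eigenvalue > 1.0 rule (also inspect a scree plot)
    eigvals = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))
    n_factors = int((eigvals > 1.0).sum())

    # Extraction (ML assumes normality; 'minres' and 'principal' are alternatives)
    # and rotation (oblique if the factors are expected to correlate)
    fa = FactorAnalyzer(n_factors=n_factors, method='ml',
                        rotation='oblimin' if oblique else 'varimax')
    fa.fit(data)
    return fa.loadings_
```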

Moderating Effect of Value on Relationship between Foodservice Quality and Satisfaction at Family Restaurant in the Eastern Part of Chonnam (전남 동부권 패밀리레스토랑 음식서비스질과 만족의 관계에 미치는 가치의 조절효과)

  • Kang Jong-Heon;Lee Jun-Ho
    • Korean Journal of Food and Cookery Science, v.20 no.6 s.84, pp.581-588, 2004
  • The purpose of this study was to test the moderating effect of value on the relationship between foodservice quality and satisfaction in family restaurants. Accordingly, the study used a questionnaire covering 18 items on foodservice quality, one item on overall satisfaction, and one item on value, as well as respondents' characteristics. The results were as follows. The KMO and Bartlett's test statistics showed that the data were well suited to factor analysis. Factor analysis, average variance extracted estimates, and shared variance showed that the convergent and discriminant validity of the four factors was supported, and Cronbach's alpha showed that their internal consistency was supported. Satisfaction was found to be influenced by the interaction between the four service quality factors and value, rather than by either service quality or value directly. Finally, the results indicated that a high level of service quality might not lead to customer satisfaction because of the moderating effect of value.
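
Moderation of this kind is commonly tested with an interaction term in a regression model; a generic statsmodels sketch (variable and column names hypothetical, not the authors' model):

```python
import statsmodels.formula.api as smf

def moderation_test(df):
    """df needs 'satisfaction', 'quality' and 'value' columns.
    Value moderates the quality -> satisfaction link if the interaction is significant."""
    model = smf.ols('satisfaction ~ quality * value', data=df).fit()
    return model.params['quality:value'], model.pvalues['quality:value']
```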

Adaptive Noise Reduction Algorithm for an Image Based on a Bayesian Method

  • Kim, Yeong-Hwa;Nam, Ji-Ho
    • Communications for Statistical Applications and Methods, v.19 no.4, pp.619-628, 2012
  • Noise reduction is an important issue in image processing because noise lowers the quality of the original image. The basic difficulty is that the noise and the signal are not easily distinguished. Simple smoothing is the most basic procedure for removing noise; however, its weakness is that feature areas are blurred at the same time. In this research, we measure the degree of noise relative to the degree of image features and propose a Bayesian noise reduction method based on the maximum a posteriori (MAP) principle. Simulation results show that the proposed adaptive noise reduction algorithm using Bayesian MAP provides good performance regardless of the level of noise variance.
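
As a generic illustration of feature-adaptive MAP shrinkage (the classic local Wiener estimate under a local Gaussian model, not the paper's exact algorithm):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def map_denoise(img, noise_var, size=5):
    """Shrink each pixel toward its local mean: strongly in flat areas
    (low local signal variance), weakly near edges and features."""
    img = np.asarray(img, dtype=float)
    local_mean = uniform_filter(img, size)
    local_sq_mean = uniform_filter(img ** 2, size)
    signal_var = np.maximum(local_sq_mean - local_mean ** 2 - noise_var, 0.0)
    gain = signal_var / (signal_var + noise_var)   # MAP/Wiener shrinkage factor
    return local_mean + gain * (img - local_mean)
```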