• Title/Summary/Keyword: Models, statistical


An analysis of the signaling effect of FOMC statements (미 연준 통화정책방향 의결문의 시그널링 효과 분석)

  • Woo, Shinwook;Chang, Youngjae
    • The Korean Journal of Applied Statistics
    • /
    • v.33 no.3
    • /
    • pp.321-334
    • /
    • 2020
  • The US Federal Reserve (Fed) has decided to cut interest rates. Looking at the wording of FOMC statements around periods of policy change, we can see that the Fed has been communicating with markets through changes in word choice. However, analyzing the wording of the statements through context has been criticized as subjective and limited to qualitative analysis. In this paper, we evaluate the signaling effect of FOMC statements, building on previous research. We analyze decision-making characteristics from a text-mining viewpoint and try to predict future changes in the policy stance by capturing changes in expression between statements. For this purpose, decision tree and neural network models are used. The analysis suggests that dissimilarity indicators between statements can be used to predict future policy changes, and that the US Federal Reserve has systematically implemented policy signaling through its statements.
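
A minimal sketch of the statement-comparison idea described above: vectorize consecutive statements, turn each pair into a dissimilarity indicator, and fit a decision tree on those indicators. The statement texts, labels, and single-feature design below are hypothetical placeholders, not the paper's corpus or feature set.

```python
# Sketch: dissimilarity between consecutive FOMC statements as a predictor of
# policy change. All texts and labels are hypothetical placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.tree import DecisionTreeClassifier

statements = [  # chronologically ordered statement texts (placeholders)
    "economic activity has been rising at a strong rate",
    "economic activity has been rising at a moderate rate",
    "growth of economic activity has slowed",
]
policy_changed = [0, 1]  # did policy change after each consecutive pair? (placeholder)

tfidf = TfidfVectorizer().fit_transform(statements)
# one dissimilarity indicator per consecutive pair: 1 - cosine similarity
dissim = [1.0 - cosine_similarity(tfidf[i], tfidf[i + 1])[0, 0]
          for i in range(len(statements) - 1)]

clf = DecisionTreeClassifier(max_depth=2).fit([[d] for d in dissim], policy_changed)
print(clf.predict([[0.5]]))  # predicted policy-change flag for a new pair
```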

Exploitation of the Dose/Time-Response Relationship for a New Measure of DNA Repair in the Single-Cell Gel Electrophoresis (Comet) Assay

  • Kim, Byung-Soo;Edler, Lutz;Park, Jin-Joo;Fournier, Dietrich Von;Haase, Wulf;Sautter-Bihl, Mare-Luise;Hagmuller, Egbert;Gotzes, Florian;Thielmann, Heinz Walter
    • Toxicological Research
    • /
    • v.20 no.2
    • /
    • pp.89-100
    • /
    • 2004
  • The comet assay (also called the single-cell gel electrophoresis assay) has been widely used for detecting DNA damage and repair in individual cells. Since the conventional methods of evaluating comet assay data using frequency statistics are unsatisfactory, we developed a new quantitative measure of DNA damage/repair that is based on all information residing in the dose/time-response curves of a comet experiment. Blood samples were taken from 25 breast cancer patients before undergoing radiotherapy. The comet assay was performed under alkaline conditions using isolated lymphocytes. Tail DNA, tail length, tail moment and tail inertia of the comet were measured for each patient at four doses of $\gamma$-rays (0, 2, 4 and 8 Gy) and at four time points after irradiation (0, 10, 20 and 30 min) using 100 cells each. The resulting three-dimensional dose-time response surface was modeled by multiple regression, and the second derivative on dose and time, termed 2D, was determined. A software module was programmed in SAS/AF to compute 2D values. We applied the new method successfully to data obtained from cancer patients to be assessed for their radiation sensitivity. We computed the 2D values for the four damage measures, i.e., tail moment, tail length, tail DNA and tail inertia, and examined the pairwise correlation coefficients of 2D both on the log scale and the unlogged scale. 2D values based on tail moment and tail DNA showed a high correlation; therefore, these two damage measures can be used interchangeably as far as DNA repair is concerned. 2D values based on tail inertia have a correlation profile different from the other 2D values, which may reflect different facets of DNA damage/repair. Using the dose-time response surface, other statistical models, e.g., the proportional hazards model, become applicable for data analysis. The 2D approach can be applied to all DNA repair measures, i.e., tail moment, tail length, tail DNA and tail inertia, and appears to be superior to conventional evaluation methods as it integrates all data of the dose/time-response curves of a comet assay.
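
The 2D measure described above is a second derivative of a regression surface fitted over dose and time. A minimal sketch of that computation, assuming a quadratic response surface and synthetic data (the paper's SAS/AF module is not reproduced here); for this polynomial form, the mixed second derivative with respect to dose and time is simply the coefficient of the dose-by-time term:

```python
# Sketch: fit a quadratic dose-time response surface by multiple regression
# and read off the mixed second derivative d^2 y / (d dose d time).
# The quadratic form and toy data are illustrative assumptions.
import numpy as np

doses = np.array([0, 2, 4, 8], dtype=float)      # Gy
times = np.array([0, 10, 20, 30], dtype=float)   # min after irradiation
D, T = np.meshgrid(doses, times)
d, t = D.ravel(), T.ravel()
# synthetic damage measure (e.g. tail moment) with a true d*t coefficient of 0.1
y = 0.5 * d + 0.1 * d * t - 0.02 * t + np.random.default_rng(0).normal(0, 0.1, d.size)

# design matrix for y = b0 + b1*d + b2*t + b3*d*t + b4*d^2 + b5*t^2
X = np.column_stack([np.ones_like(d), d, t, d * t, d**2, t**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
two_d = beta[3]  # for this surface the mixed second derivative is the d*t coefficient
print(f"2D = {two_d:.3f}")
```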

Selection of bandwidth for local linear composite quantile regression smoothing (국소 선형 복합 분위수 회귀에서의 평활계수 선택)

  • Jhun, Myoungshic;Kang, Jongkyeong;Bang, Sungwan
    • The Korean Journal of Applied Statistics
    • /
    • v.30 no.5
    • /
    • pp.733-745
    • /
    • 2017
  • Local composite quantile regression is a useful non-parametric regression method widely used for its high efficiency. Kernel-based data smoothing methods are typically used in the estimation process, and their performance depends largely on the smoothing parameter rather than on the kernel. The $L_2$-norm is generally used as the criterion for evaluating the performance of the regression function, and many studies have been conducted on the selection of smoothing parameters that minimize the mean squared error (MSE) or mean integrated squared error (MISE). In this paper, we explored the optimal selection of the smoothing parameter, which determines the performance of non-parametric regression models, for local linear composite quantile regression. As evaluation criteria for the choice of smoothing parameter, we used the mean absolute error (MAE) and mean integrated absolute error (MIAE), which have not been researched extensively due to mathematical difficulties. We proved the uniqueness of the optimal smoothing parameter based on MAE and MIAE. Furthermore, we compared the optimal smoothing parameter based on the proposed criteria (MAE and MIAE) with that based on existing criteria (MSE and MISE). The properties of the proposed method were investigated through simulation studies in various situations.
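
A minimal sketch of bandwidth selection by an integrated-absolute-error criterion on simulated data. For brevity, a local linear mean smoother stands in for the paper's composite quantile estimator, and the MIAE is approximated by averaging absolute errors against the known truth on a grid:

```python
# Sketch: choose the bandwidth h that minimizes a Monte-Carlo estimate of the
# mean integrated absolute error (MIAE) on simulated data. A local linear
# *mean* smoother stands in for the composite quantile estimator.
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 200))
f = lambda u: np.sin(2 * np.pi * u)          # true regression function
y = f(x) + rng.normal(0, 0.3, x.size)

def local_linear(x0, x, y, h):
    """Local linear fit at x0 with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]

grid = np.linspace(0.01, 0.99, 50)
best_h, best_miae = None, np.inf
for h in (0.02, 0.05, 0.1, 0.2, 0.4):
    fit = np.array([local_linear(u, x, y, h) for u in grid])
    miae = np.mean(np.abs(fit - f(grid)))    # absolute-error criterion vs. truth
    if miae < best_miae:
        best_h, best_miae = h, miae
print(best_h, round(best_miae, 3))
```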

Development of climate change uncertainty assessment method for projecting the water resources (기후변화에 따른 수자원 전망의 불확실성 평가기법 개발)

  • Lee, Moon-Hwan;So, Jae-Min;Bae, Deg-Hyo
    • Journal of Korea Water Resources Association
    • /
    • v.49 no.8
    • /
    • pp.657-671
    • /
    • 2016
  • Water resources are expected to change spatially and temporally due to global climate change. Quantitative assessment of changes in water availability and appropriate water resources management measures are needed for adaptation. However, there are large uncertainties in climate change impact assessments on water resources, so a technique for evaluating these uncertainties quantitatively is required. The objectives of this study are to develop a climate change uncertainty assessment method and to apply it. Five RCMs (HadGEM3-RA, RegCM4, MM5, WRF, and RSM), five statistical post-processing methods (SPP), and two hydrological models (HYM) were used for the evaluation. The uncertainty analysis showed that the RCM was the largest source of uncertainty in spring, summer, and autumn (29.3~68.9%), while the hydrological model was the largest source in winter (46.5%). The method makes it possible to analyze changes in total uncertainty attributable to a specific RCM, SPP, or HYM, and is expected to provide a way to reduce the total uncertainty.
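
A minimal sketch of one common way to apportion uncertainty across the stages of such a model chain: run all RCM x SPP x HYM combinations and compare each stage's between-group variance to the total variance (an ANOVA-style main-effect share). The runoff-change values below are random placeholders, not the paper's projections:

```python
# Sketch: apportion the spread of projections across the model chain
# (RCM x statistical post-processing x hydrological model). Placeholder data.
import itertools
import numpy as np
import pandas as pd

rcms = ["HadGEM3-RA", "RegCM4", "MM5", "WRF", "RSM"]
spps = [f"SPP{i}" for i in range(1, 6)]   # hypothetical post-processing labels
hyms = ["HYM1", "HYM2"]                    # hypothetical hydrological-model labels
rng = np.random.default_rng(2)

rows = [(r, s, h, rng.normal(0, 1)) for r, s, h in itertools.product(rcms, spps, hyms)]
df = pd.DataFrame(rows, columns=["rcm", "spp", "hym", "runoff_change"])

total_var = df["runoff_change"].var(ddof=0)
for stage in ["rcm", "spp", "hym"]:
    # between-group variance of stage means, as a share of total variance
    between = df.groupby(stage)["runoff_change"].mean().var(ddof=0)
    print(f"{stage}: {100 * between / total_var:.1f}% of total variance")
```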

Estimation of Software Development Efforts and Schedule Based on A Ballpark Schedule Estimation Table (개략적 일정추정표 기반 소프트웨어 개발노력과 일정 추정)

  • Park, Young-Mok
    • Journal of Internet Computing and Services
    • /
    • v.8 no.4
    • /
    • pp.105-117
    • /
    • 2007
  • In order to succeed in a bid or a development project, the project manager should estimate its cost and schedule accurately in the early stage of the project. Usually, the nominal schedule of a project can be derived from a rule of thumb, the first-order estimation practice, or a ballpark schedule estimation table. But the rule-of-thumb models for nominal schedule estimation vary widely, and the first-order estimation practice does not provide sufficient information, so they do not help much in deciding the proper development effort and schedule for a particular size of project. This paper presents a statistical regression model for deciding the development effort and schedule of a project using the ballpark schedule estimation table. First, we redefine the terms suggested in the ballpark schedule estimation table: shortest possible schedule, efficient schedule, and nominal schedule. Next, we investigate the relationship between development effort and schedule. Finally, we suggest a model for estimating the development effort and a more accurate schedule for software sizes not presented in the ballpark schedule estimation table. The experimental results show that the proposed regression model decreases the mean magnitude of relative error by up to 2%, and that the model can estimate the development effort and schedule for a particular size of software.
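
A minimal sketch of the kind of regression involved: fit a power-law schedule model, schedule = a * effort^b, to (effort, schedule) pairs read off an estimation table, and score it with the mean magnitude of relative error (MMRE). The table values below are made-up placeholders, not the paper's data:

```python
# Sketch: power-law regression of schedule on effort, scored by MMRE.
# Effort/schedule pairs are placeholders for ballpark-table entries.
import numpy as np

effort = np.array([24, 48, 96, 190, 380], dtype=float)   # person-months (placeholder)
schedule = np.array([8, 10, 13, 17, 22], dtype=float)    # months (placeholder)

# fit log(schedule) = log(a) + b * log(effort) by least squares
b, log_a = np.polyfit(np.log(effort), np.log(schedule), 1)
predict = lambda e: np.exp(log_a) * e ** b

# mean magnitude of relative error over the fitted points
mmre = np.mean(np.abs(schedule - predict(effort)) / schedule)
print(f"schedule = {np.exp(log_a):.2f} * effort^{b:.2f}, MMRE = {mmre:.3f}")
```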


Sensitivity Analysis for Unit Module Development of Hybrid tube Structural System (복합 튜브 구조시스템의 단위 모듈 개발에 대한 민감도 해석)

  • Lee, Yeon-Jong;Park, Sung-Soo
    • Journal of the Korea institute for structural maintenance and inspection
    • /
    • v.22 no.1
    • /
    • pp.167-175
    • /
    • 2018
  • This research deals with the mechanical characteristics and behavior of tube structural systems. Conventional theory and case models were investigated and reviewed, and the suitability, best location, and optimal shape of the unit module system were examined. Considering material variables that increase or decrease stiffness in hybrid tube structural systems, the study carried out an analysis adapting statistical concepts. Specifically, it examined the effect on reducing horizontal displacement and the shear-lag phenomenon, with the aim of providing basic data for the design and study of future high-rise hybrid structural systems. The results show that the framed-tube structural system does not effectively cope with the horizontal behavior of high-rise buildings. Comparing materials under varying resistance factors and lateral loads in the hybrid tube structural system, bracing was identified as the key factor in lateral behavior. In terms of the material quantity ratio of the framed-tube structural system, the sensitivity affecting horizontal displacement is greater for beams than for columns; in the braced-tube structural system, bracing appeared most sensitive in comparison with the material quantity ratios of columns and beams.

Derivation of Probability Plot Correlation Coefficient Test Statistics and Regression Equation for the GEV Model based on L-moments (L-모멘트 법 기반의 GEV 모형을 위한 확률도시 상관계수 검정 통계량 유도 및 회귀식 산정)

  • Ahn, Hyunjun;Jeong, Changsam;Heo, Jun-Haeng
    • Journal of Korean Society of Disaster and Security
    • /
    • v.13 no.1
    • /
    • pp.1-11
    • /
    • 2020
  • One of the important problems in statistical hydrology is to estimate an appropriate probability distribution for a given sample of data. For this problem, a goodness-of-fit test is conducted based on the similarity between the estimated probability distribution and an assumed theoretical probability distribution. The probability plot correlation coefficient (PPCC) test is one such goodness-of-fit method; it has high rejection power and is simple to apply. In this study, PPCC test statistics were derived for generalized extreme value (GEV) distribution models based on L-moments, and these statistics were expressed as multiple and nonlinear regression equations for usability. To review the rejection power of the newly proposed method, Monte Carlo simulations were performed with other goodness-of-fit tests, including the existing PPCC test. The results showed that the PPCC-A test proposed in this study has better rejection power than the other methods, including the existing PPCC test. The new method is expected to help in estimating an appropriate probability distribution model.
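
A minimal sketch of how a PPCC statistic is formed: correlate the ordered sample with fitted GEV quantiles evaluated at plotting positions. Here scipy's maximum-likelihood fit stands in for the paper's L-moment fit, and the Cunnane plotting position is an illustrative choice:

```python
# Sketch: PPCC statistic for a GEV sample = correlation between ordered data
# and fitted GEV quantiles at plotting positions. MLE replaces L-moments here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
sample = np.sort(stats.genextreme.rvs(c=-0.1, loc=100, scale=20,
                                      size=50, random_state=rng))

n = sample.size
i = np.arange(1, n + 1)
p = (i - 0.4) / (n + 0.2)                      # Cunnane plotting position

c, loc, scale = stats.genextreme.fit(sample)   # MLE in place of an L-moment fit
q = stats.genextreme.ppf(p, c, loc=loc, scale=scale)

ppcc = np.corrcoef(sample, q)[0, 1]
print(f"PPCC = {ppcc:.4f}")  # compared against a critical value in the test
```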

Least Cost and Optimum Mixing Programming by Yulmu Mixture Noodle (율무국수를 이용한 최소가격/최적배합 프로그래밍)

  • Kim, Sang-Soo;Kim, Byung-Yong;Hahm, Young-Tae;Shin, Dong-Hoon
    • Korean Journal of Food Science and Technology
    • /
    • v.31 no.2
    • /
    • pp.385-390
    • /
    • 1999
  • Noodles were made using a combination of yulmu, wheat, and water through a mixture design. Statistical models of yulmu noodle were constructed by analysing tensile stress, color $(L^{*})$, and sensory evaluation, together with other constraints. Comparing the linear and non-linear models, the linearity in the values of tensile stress, lightness $(L^{*})$, and sensory evaluation showed that each component acted separately, without interactions. A trace plot of each component's effect on the response indicated that increasing the amount of yulmu enhanced the tensile stress of the noodle while degrading the $L^{*}$ value and the sensory evaluation score. Within the range satisfying the conditions on tensile stress, $L^{*}$ value, and sensory evaluation, the optimum mixture ratio of yulmu : wheat : water was 2.27% : 66.28% : 28.45% based on least-cost linear programming. At this ratio, the least cost was 9.924, and the estimated responses were a tensile stress of 2.234 N and an $L^{*}$ of 82.39. Finally, the potential responses as affected by the mixture ratio of yulmu, wheat, and water were screened using Excel.
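
A minimal sketch of the least-cost linear programming step: minimize ingredient cost subject to the three fractions summing to one and staying inside feasible ranges. The costs and bounds below are assumed placeholders, not the paper's constraints:

```python
# Sketch: least-cost mixture by linear programming. Costs and feasible
# ranges are placeholders, not the paper's constraint set.
from scipy.optimize import linprog

cost = [30.0, 8.0, 0.0]        # yulmu, wheat, water: cost per unit (placeholder)
A_eq = [[1, 1, 1]]             # yulmu + wheat + water fractions sum to 1
b_eq = [1.0]
bounds = [(0.02, 0.30),        # yulmu fraction range (assumed)
          (0.50, 0.80),        # wheat fraction range (assumed)
          (0.20, 0.35)]        # water fraction range (assumed)

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x, res.fun)          # optimal mixture ratio and its least cost
```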


Effect of Genetic-Environmental Interaction on Quality of Wheat (소맥(小麥) 품질특성(品質特性)의 유전(遺傳) 및 환경적(環境的) 변이(變異))

  • Chang, Hak-Gil;Kim, Chang-Sik;Hah, Duk-Mo;Shin, Hyo-Sun
    • Korean Journal of Food Science and Technology
    • /
    • v.18 no.1
    • /
    • pp.31-37
    • /
    • 1986
  • Seven cultivars of hard and soft wheat were evaluated by regression analysis for five bread quality characteristics to determine varietal response to environments. Following the models of Eberhart and Russell, the regression coefficients were used as the measure of adaptability and the determination coefficients as the measure of stability. Phenotypic, genotypic, and environmental correlation coefficients were estimated for the six characters tested in this experiment. Statistical analyses confirmed the strong influence of environment on the five bread quality characteristics. A significant positive correlation exists between protein content, sedimentation value, Pelshenke value, and specific loaf volume. High heritability was found for sedimentation value ($h^2=0.747$), protein content ($h^2=0.557$), and specific loaf volume ($h^2=0.551$).
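
A minimal sketch of the Eberhart and Russell approach referenced above: regress each cultivar's quality score on an environmental index (the per-environment mean over cultivars), taking the slope as adaptability and the coefficient of determination as stability. The scores below are placeholders:

```python
# Sketch: Eberhart-Russell-style stability analysis on placeholder data.
# Rows = cultivars, columns = environments (hypothetical quality scores).
import numpy as np

scores = np.array([
    [10.2, 11.5,  9.8, 12.1],
    [ 9.5, 12.0,  9.0, 13.0],
    [10.8, 10.9, 10.5, 11.2],
])
env_index = scores.mean(axis=0)    # environmental index per environment

for k, y in enumerate(scores):
    b, a = np.polyfit(env_index, y, 1)           # slope = adaptability
    r2 = np.corrcoef(env_index, y)[0, 1] ** 2    # R^2 = stability
    print(f"cultivar {k}: adaptability b = {b:.2f}, stability R^2 = {r2:.2f}")
```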


Comparison of the accuracy of digitally fabricated polyurethane model and conventional gypsum model

  • Kim, So-Yeun;Lee, So-Hyoun;Cho, Seong-Keun;Jeong, Chang-Mo;Jeon, Young-Chan;Yun, Mi-Jung;Huh, Jung-Bo
    • The Journal of Advanced Prosthodontics
    • /
    • v.6 no.1
    • /
    • pp.1-7
    • /
    • 2014
  • PURPOSE. The accuracy of a gypsum model (GM), fabricated using a conventional silicone impression technique, was compared with that of a polyurethane model (PM), fabricated using the iTero$^{TM}$ digital impression system. MATERIALS AND METHODS. A maxillary first molar artificial tooth was selected as the reference tooth. The GMs were fabricated from a silicone impression of the reference tooth, and the PMs from a digital impression (n=9 in each group). The reference tooth and experimental models were scanned using the 3Shape Convince$^{TM}$ scan system. Each GM and PM image was superimposed on the registered reference model (RM), and 2D images were obtained. The discrepancies at the points registered on the superimposed images were measured and defined as the GM-RM group and the PM-RM group. Statistical analysis was performed using Student's t-test (${\alpha}=0.05$). RESULTS. A comparison of the absolute values of the discrepancies revealed a significant difference between the two groups only at the occlusal surface, where the GM group showed a smaller mean discrepancy than the PM group. Significant differences between the GM-RM group and the PM-RM group were observed at the margins (points a and f), the mesial mid-axial wall (point b), and the occlusal surface (points c and d). CONCLUSION. Under the conditions examined, the digitally fabricated polyurethane model showed a tendency toward a reduced size at the margin compared to the reference tooth. The conventional gypsum model showed a smaller discrepancy on the occlusal surface than the polyurethane model.
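
A minimal sketch of the reported statistical comparison: an independent two-sample Student's t-test on the absolute discrepancies of the two groups at one measurement point. The discrepancy values are placeholders (only the group size, n=9, follows the paper):

```python
# Sketch: two-sample t-test on absolute model discrepancies at one point.
# The values are placeholders; only n = 9 per group matches the paper.
import numpy as np
from scipy import stats

gm_rm = np.abs(np.array([12, 15,  9, 14, 11, 13, 10, 12, 14], dtype=float))  # placeholder
pm_rm = np.abs(np.array([21, 18, 25, 19, 23, 20, 22, 24, 18], dtype=float))  # placeholder

t, p = stats.ttest_ind(gm_rm, pm_rm)
print(f"t = {t:.2f}, p = {p:.4f}")  # significant at alpha = 0.05 if p < 0.05
```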