• Title/Summary/Keyword: trend algorithm

Search Results: 435

3D imaging of fracture aperture density distribution for the design and assessment of grouting works (절리 암반내 그라우팅 설계 및 성과 판단을 위한 절리틈새 밀도 분포의 3차원 영상화 연구)

  • Kim, Jung-Yul;Kim, Yoo-Sung;Nam, Ji-Yeon
    • Proceedings of the Korean Geotechnical Society Conference / 2004.03b / pp.113-120 / 2004
  • Grouting works in fractured rock are performed to reinforce the ground and/or to block groundwater flow at the foundations of dams, bridges and other structures. For an efficient grouting design, prior knowledge of the fracture pattern of the underground area to be grouted is very important. In practice, the aperture sizes of the open fractures that will be filled with grouting material are particularly valuable information. Thus, the main purpose of this study is to develop a new technique (so-called "GenFT") that can form a three-dimensional image of the fracture aperture density distribution from Televiewer data. To this end, the study focuses on (1) estimating the aperture size of each fracture automatically from the Televiewer time image, (2) mapping the extension of fracture planes on a given section, (3) evaluating the aperture density distribution on the section using both the aperture size and the fracture plane mapping result of each fracture, and (4) developing an algorithm that can transfer these results to any arbitrary (vertical and/or horizontal) section around the borehole. Since 3D imaging here means a strategy that forms an image from arbitrarily subdivided 2D sections with aperture density distribution, it helps avoid ambiguities in fracture pattern interpretation and will therefore be of practical use not only for the design and assessment of grouting works but also for various other engineering works. Examples of field experiments are illustrated. This technique may well reflect a future trend in underground surveying.


The NHPP Bayesian Software Reliability Model Using Latent Variables (잠재변수를 이용한 NHPP 베이지안 소프트웨어 신뢰성 모형에 관한 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • Convergence Security Journal / v.6 no.3 / pp.117-126 / 2006
  • Bayesian inference and model selection methods for software reliability growth models are studied. Software reliability growth models are used in the testing stages of software development to model the error content and the time intervals between software failures. In this paper, multiple integration is avoided by using Gibbs sampling, a Markov chain Monte Carlo method, to compute the posterior distribution. Bayesian inference for general order statistics models in software reliability with diffuse prior information is studied, together with a model selection method. For model determination and selection, goodness of fit (the error sum of squares) and trend tests are explored. The methodology developed in this paper is exemplified with a software reliability data set randomly generated from a Weibull distribution (shape 2, scale 5) using the Minitab (version 14) statistical package.

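The trend tests used above for model determination can be illustrated with the Laplace trend factor, a standard check for reliability growth in failure data. This is a minimal sketch with made-up failure times, not the paper's analysis:

```python
import math

def laplace_factor(failure_times, T):
    """Laplace trend factor for failure times observed on (0, T].

    u < 0 suggests reliability growth (inter-failure times lengthening),
    u > 0 suggests deterioration, and u near 0 a trend-free process.
    """
    n = len(failure_times)
    mean_time = sum(failure_times) / n
    return (mean_time - T / 2) / (T * math.sqrt(1.0 / (12 * n)))

# Failures bunched early in the observation window -> negative factor.
u = laplace_factor([5, 12, 18, 22, 25], T=100)
```

A strongly negative factor (here about -2.6) would support a reliability growth model for the data set.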

Keyword Network Analysis about the Trends of Social Welfare Researches - focused on the papers of KJSW during 1979~2015 - (사회복지학 연구동향에 관한 키워드 네트워크 분석 - 「한국사회복지학」 게재논문(1979-2015)을 중심으로 -)

  • Kam, Jeong Ki;Kam, Mi Ah;Park, Mi Hee
    • Korean Journal of Social Welfare / v.68 no.2 / pp.185-211 / 2016
  • This study analyzes keyword networks of the papers published in the Korean Journal of Social Welfare, issued by the Korean Academy of Social Welfare, from 1979 to 2015. It aims to investigate the trends of social welfare research in Korea by dividing the period into two: 1979-2000 and 2001-2015. It shows the trends in three respects: methodologies, subjects, and intellectual structures. To identify the intellectual structure, it calculates centrality indices based on the co-appearance frequency of keywords. It also derives values that explain the relationship structure of the keywords by using the pathfinder algorithm, and finally visualizes the intellectual structures with the NodeXL program. Some implications of the findings of these analyses are discussed at the end.

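The centrality-from-co-appearance step described above can be sketched in a few lines. The keywords below are illustrative placeholders, not terms from the study, and degree centrality stands in for the fuller set of indices the authors compute:

```python
from collections import defaultdict
from itertools import combinations

# Each paper contributes co-appearance edges between its keywords
# (keyword sets here are invented for illustration).
papers = [
    {"poverty", "welfare policy", "inequality"},
    {"welfare policy", "social service"},
    {"poverty", "inequality"},
]

cooccurrence = defaultdict(int)
for keywords in papers:
    for a, b in combinations(sorted(keywords), 2):
        cooccurrence[(a, b)] += 1  # sorted() keeps each pair in one order

# Degree centrality: how many distinct keywords each term co-appears with.
degree = defaultdict(int)
for a, b in cooccurrence:
    degree[a] += 1
    degree[b] += 1
```

On this toy input, "welfare policy" is the most central term because it co-appears with all three other keywords.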

High-performance Pipeline Architecture for Modified Booth Multipliers (Modified Booth 곱셈기를 위한 고성능 파이프라인 구조)

  • Kim, Soo-Jin;Cho, Kyeong-Soon
    • Journal of the Institute of Electronics Engineers of Korea SD / v.46 no.12 / pp.36-42 / 2009
  • This paper proposes a high-performance pipeline architecture for modified Booth multipliers. The proposed multiplier circuits are based on the modified Booth algorithm and pipelining, the two most widely used techniques for accelerating multiplication. In order to implement optimally pipelined multipliers, many kinds of experiments were conducted. The experimental results show that the speed improvement outweighs the area penalty, and this trend becomes more pronounced as the number of pipeline stages increases. It is also important to insert the pipeline registers at the proper positions. We described the proposed modified Booth multiplier circuits in Verilog HDL and synthesized the gate-level circuits using a 0.13 um standard cell library. The resulting multiplier circuits show better performance than others. Since they operate in the GHz range, they can be used in application systems requiring extremely high performance, such as optical communication systems.
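The radix-4 (modified) Booth recoding at the heart of such multipliers can be sketched behaviorally in software. This is an illustration of the recoding scheme only, not the paper's pipelined Verilog design:

```python
def booth_radix4_multiply(a, b, width=16):
    """Multiply a by b using radix-4 (modified) Booth recoding.

    The multiplier b is scanned in overlapping 3-bit groups; each group
    (b[i+1], b[i], b[i-1]) recodes to a digit b[i-1] + b[i] - 2*b[i+1]
    in {-2, -1, 0, 1, 2}, and the partial products digit * a * 4**(i//2)
    are summed. b must fit in `width` bits as a two's-complement value.
    """
    mask = (1 << width) - 1
    ub = b & mask          # two's-complement bit pattern of b
    product = 0
    prev = 0               # the implicit b[-1] = 0 bit
    for i in range(0, width, 2):
        b0 = (ub >> i) & 1
        b1 = (ub >> (i + 1)) & 1
        digit = b0 + prev - 2 * b1
        product += digit * a * (4 ** (i // 2))
        prev = b1
    return product
```

Because the recoded digits reconstruct the multiplier exactly, the result matches ordinary multiplication for any operands in range, while roughly halving the number of partial products compared with radix-2 recoding.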

A Clock Skew Minimization Technique Considering Temperature Gradient (열 기울기를 고려한 클락 스큐 최소화 기법)

  • Ko, Se-Jin;Lim, Jae-Ho;Kim, Ki-Young;Kim, Seok-Yoon
    • Journal of the Institute of Electronics Engineers of Korea SD / v.47 no.7 / pp.30-36 / 2010
  • Due to the scaling of process parameters, the density of chips has been increasing. This trend increases not only the temperature on chips but also the temperature gradient across distances. In this paper, we propose a balanced skew tree generation technique for minimizing the clock skew that is affected by temperature gradients on chips. We calculate the interconnect delay using the Elmore delay equation, and find the optimal balanced clock tree by modifying the clock trees generated through the DME (Deferred Merge Embedding) algorithm. We implemented the proposed technique in C for performance evaluation. The experimental results show that the delay at the clock insertion point caused by the temperature gradient can be lowered below 54%, and we confirm that the skew is remarkably decreased after applying the proposed technique.
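The Elmore delay calculation mentioned above is straightforward to sketch for a simple case. For an RC chain from driver to sink, each resistor's delay contribution is its resistance times all capacitance downstream of it; the segment values below are illustrative, not from the paper:

```python
def elmore_delay_chain(segments):
    """Elmore delay at the end of an RC chain.

    segments: list of (R, C) tuples ordered from driver to sink.
    Delay = sum over each resistor R_i of R_i * (capacitance at or
    downstream of segment i).
    """
    delay = 0.0
    total_downstream = sum(c for _, c in segments)
    for r, c in segments:
        delay += r * total_downstream
        total_downstream -= c  # segment i's cap is upstream of segment i+1
    return delay

# Two segments of 100 ohm / 1 pF each:
# delay = 100*(1p + 1p) + 100*(1p) = 300 ps.
d = elmore_delay_chain([(100, 1e-12), (100, 1e-12)])
```

In the temperature-aware setting of the paper, the resistances would vary with local temperature, which is what perturbs the skew of an otherwise balanced tree.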

A study on integrating and discovery of semantic based knowledge model (의미 기반의 지식모델 통합과 탐색에 관한 연구)

  • Chun, Seung-Su
    • Journal of Internet Computing and Services / v.15 no.6 / pp.99-106 / 2014
  • In recent years, generation and analysis methods for semantic knowledge models have been proposed, combining natural language and formal language processing with artificial intelligence algorithms. Such semantic knowledge models have been used effectively for decision trees and problem solving in specific contexts, and they support static generation, regression analysis, trend analysis with behavioral models, and simulation for macroeconomic forecasting, especially in complex systems and social network analysis. In this sense, this study proposes formal methods and algorithms for integrating knowledge models with topic models derived from text mining. First, a method is presented for automatically converting a keyword map derived from text mining into a knowledge map and integrating it into the semantic knowledge model. This paper then proposes an algorithm for projecting a significant topic map from the keyword map onto a semantically equivalent model. The integrated semantic knowledge model is thus made available.

A Study on the Utility of Statistical Power Balance Method for Efficient Electromagnetic Analysis of Large and Complex Structures (복잡한 대형 구조물의 효율적인 전자파 해석을 위한 통계적인 PWB 방법의 유용성에 관한 연구)

  • Lee, Young-Seung;Park, Seung-Keun
    • The Journal of Korean Institute of Electromagnetic Engineering and Science / v.24 no.2 / pp.189-197 / 2013
  • With the trend of technological advances in electronic communications and the advent of ubiquitous environments, the density of electronic equipment in our surroundings is increasing significantly. It is hence of great importance to study numerically efficient and fast algorithms for complex and large structures in order to identify the electromagnetic compatibility and interference characteristics of the equipment installed in them. This paper introduces a statistics-based power balance method (PWB) for the analysis of such problems and considers its practical utility. A 2-dimensional lossy rectangular cavity is numerically revisited to clarify its relationship with classical deterministic analysis solutions based on Maxwell's equations. It is shown that the statistical assumptions and analysis results of the power balance method correspond to the volume average over the deterministic domain. This statistical power balance approach is a sufficiently practical alternative for the electromagnetic analysis of complex and large environments, since full-wave analysis methods face severe computational limits in such situations.

Infinite Failure NHPP Software Mixture Reliability Growth Model Base on Record Value Statistics (기록값 통계량에 기초한 무한고장 NHPP 소프트웨어 혼합 신뢰성장 모형에 관한 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul;Kim, Kyung-Soo
    • Convergence Security Journal / v.7 no.3 / pp.51-60 / 2007
  • Infinite failure NHPP models presented in the literature exhibit either constant, monotonically increasing, or monotonically decreasing failure occurrence rates per fault. In this paper, the exponential and Rayleigh distribution models are reviewed and a mixture reliability model is proposed as an efficient substitute for them in failure time situations. The parameters are estimated with the maximum likelihood estimator and the bisection method, and model selection is based on SSE and the Kolmogorov distance. Failure analysis using the S27 data set is employed to propose the shape parameter of the mixture distribution. An analysis of the failure data comparing the mixture distribution model with the existing models (using arithmetic and Laplace trend tests and bias tests) is presented.

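The bisection method paired with the likelihood equations above is a generic bracketing root-finder; a minimal sketch follows, applied to a stand-in equation rather than the paper's actual likelihood derivative:

```python
def bisect(f, lo, hi, tol=1e-10, max_iter=200):
    """Find a root of f on [lo, hi] by repeated interval halving.

    Requires f(lo) and f(hi) to have opposite signs so a root is bracketed.
    """
    flo = f(lo)
    if flo == 0:
        return lo
    for _ in range(max_iter):
        mid = (lo + hi) / 2
        fmid = f(mid)
        if fmid == 0 or (hi - lo) / 2 < tol:
            return mid
        if (flo < 0) != (fmid < 0):
            hi = mid            # root lies in the lower half
        else:
            lo, flo = mid, fmid  # root lies in the upper half
    return (lo + hi) / 2

# Stand-in for a likelihood equation: solve x**2 - 2 = 0 on [0, 2].
root = bisect(lambda x: x * x - 2, 0.0, 2.0)
```

Bisection is slow (one bit of accuracy per iteration) but never diverges, which makes it a robust choice when a likelihood derivative is awkward to differentiate again for Newton's method.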

A Study on Estimation Method for Optimal Composition Rate of Hybrid ESS Using Lead-acid and Lithium-ion Batteries (연축전지와 리튬이온전지용 하이브리드 ESS의 최적구성방안에 관한 연구)

  • Park, Soo-Young;Ryu, Sang-Won;Park, Jae-Bum;Kim, Byung-Ki;Kim, Mi-Young;Rho, Dae-Seok
    • The Transactions of The Korean Institute of Electrical Engineers / v.65 no.6 / pp.962-968 / 2016
  • Large-scale lead-acid batteries are widely used for the efficient operation of photovoltaic systems on many islands. However, lithium-ion batteries are now being introduced to mitigate the fluctuation of wind power and to replace lead-acid batteries. A hybrid ESS (Energy Storage System) that combines lithium-ion and lead-acid batteries is therefore required, because lithium-ion batteries are still costly at the present stage. Under these circumstances, this paper presents an optimal algorithm for determining the composition rate of a hybrid ESS by considering fixed and variable costs, in order to maximize the advantages of each battery. By minimizing the total cost, including fixed and variable costs, the optimal composition rate can be calculated for various scenarios such as load variation, life cycle, and cost trend. The simulation results confirm that the proposed algorithms are an effective tool for producing an optimal composition rate.
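The cost-minimizing composition rate can be illustrated with a toy scan over the lithium-ion share. The cost structure and all coefficients below are invented for illustration (a quadratic wear term stands in for lead-acid cycle-life degradation); the paper's actual cost model and scenarios are richer:

```python
def optimal_li_share(fixed_li, fixed_pb, wear_pb, step=0.01):
    """Scan the lithium-ion share r in [0, 1] and return the (share, cost)
    pair minimizing total cost.

    Illustrative model: cost(r) = fixed_li*r + fixed_pb*(1-r)
                                  + wear_pb*(1-r)**2,
    where the quadratic term mimics faster lead-acid wear under cycling.
    """
    best_r, best_cost = 0.0, float("inf")
    steps = round(1 / step)
    for i in range(steps + 1):
        r = i * step
        cost = fixed_li * r + fixed_pb * (1 - r) + wear_pb * (1 - r) ** 2
        if cost < best_cost:
            best_r, best_cost = r, cost
    return best_r, best_cost

# With these made-up costs the interior optimum lands at a 75% Li share.
r, cost = optimal_li_share(fixed_li=500, fixed_pb=200, wear_pb=600)
```

The point of the sketch is only that a nonlinear variable-cost term can push the optimum away from an all-of-one-battery solution, which is why a hybrid composition rate is worth optimizing at all.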

Prediction of Future Milk Yield with Random Regression Model Using Test-day Records in Holstein Cows

  • Park, Byoungho;Lee, Deukhwan
    • Asian-Australasian Journal of Animal Sciences / v.19 no.7 / pp.915-921 / 2006
  • Various random regression models with different orders of Legendre polynomials for permanent environmental and genetic effects were constructed to predict the future milk yield of Holstein cows in Korea. A total of 257,908 test-day (TD) milk yield records from 28,135 cows belonging to 1,090 herds were used to estimate the (co)variances of the random covariate coefficients with an expectation-maximization REML algorithm in an animal mixed model. The variances did not change much between models with different orders of Legendre polynomial, but a decreasing trend was observed as the order increased. The R-squared value of the model increased and the residual variance decreased with increasing order. Therefore, a model with a $5^{th}$-order Legendre polynomial was chosen for predicting future milk yield. For prediction, 132,771 TD records from the 28,135 cows were randomly selected from the above data, retaining the preceding partial TD records of each cow, and future milk yields were then estimated from these incomplete records. The results suggest that the next four months of milk yield can be predicted with an error deviation of 4 kg. A correlation of more than 70% between predicted and observed values was estimated for the next four months of milk yield. Even using only 3 TD records for some cows, the average milk yield of Korean Holstein cows can be predicted with high accuracy compared with observed milk yield. The persistency of each cow was also estimated, which may be useful for selecting cows with higher persistency. The results of the present study support the use of a $5^{th}$-order Legendre polynomial to predict the future milk yield of each cow.
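The Legendre covariates used in such random regression models are easy to generate from the standard three-term recurrence. This sketch only builds the covariate vector for one standardized test day; the mixed-model fitting itself is outside its scope, and the test-day value is arbitrary:

```python
def legendre(n, x):
    """Evaluate the Legendre polynomial P_n(x) via the recurrence
    (k+1) P_{k+1}(x) = (2k+1) x P_k(x) - k P_{k-1}(x)."""
    if n == 0:
        return 1.0
    p_prev, p = 1.0, x  # P_0 and P_1
    for k in range(1, n):
        p_prev, p = p, ((2 * k + 1) * x * p - k * p_prev) / (k + 1)
    return p

# Days in milk are first mapped onto [-1, 1]; a 5th-order model then uses
# the six covariates P_0(t), ..., P_5(t) for each test day t.
t = 0.3  # an arbitrary standardized test day
covariates = [legendre(k, t) for k in range(6)]
```

Each cow's random genetic and permanent environmental effects are then coefficient vectors on these six covariates, which is what lets a handful of partial TD records extrapolate the rest of the lactation curve.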