• Title/Summary/Keyword: coverage function

Effects on Regression Estimates under Misspecified Generalized Linear Mixed Models for Counts Data

  • Jeong, Kwang Mo
    • The Korean Journal of Applied Statistics / v.25 no.6 / pp.1037-1047 / 2012
  • The generalized linear mixed model (GLMM) is widely used in fitting categorical responses of clustered data. In the numerical approximation of the likelihood function, normality is assumed for the random effects distribution; accordingly, commercial statistical packages also routinely fit GLMMs under this normality assumption. We may also encounter departures from the distributional assumption on the response variable. It would be interesting to investigate the impact on the parameter estimates under misspecification of distributions; however, there has been limited research on these topics. We study the sensitivity, or robustness, of the maximum likelihood estimators (MLEs) of the GLMM for count data when the true underlying distribution is normal, gamma, exponential, or a mixture of two normal distributions. We also consider the effects on the MLEs when we fit a Poisson-normal GLMM while the outcomes are generated from the negative binomial distribution with overdispersion. Through a small-scale Monte Carlo study, we check the empirical coverage probabilities of parameters and the biases of the MLEs of the GLMM.
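The empirical coverage check mentioned above can be illustrated with a minimal Monte Carlo sketch. This is not the paper's GLMM setting; it only shows the general recipe (simulate, estimate, build an interval, count how often it covers the truth) for a plain normal mean, with every name and number below chosen purely for illustration.

```python
import random
import statistics

# Minimal sketch of an empirical coverage probability check by Monte Carlo.
# The paper's actual setting (a Poisson-normal GLMM fit to misspecified
# count data) is far more involved; this shows only the general recipe.
random.seed(1)

TRUE_MEAN = 2.0
N, REPS, Z = 50, 2000, 1.96  # sample size, replications, 95% normal quantile

covered = 0
for _ in range(REPS):
    sample = [random.gauss(TRUE_MEAN, 1.0) for _ in range(N)]
    m = statistics.mean(sample)
    se = statistics.stdev(sample) / N ** 0.5
    # Count the replications whose 95% interval contains the true mean.
    if m - Z * se <= TRUE_MEAN <= m + Z * se:
        covered += 1

coverage = covered / REPS  # close to 0.95 when the model is correctly specified
```

Under misspecification, the same counting procedure typically reveals coverage drifting away from the nominal 95%, which is exactly what the paper's simulation study measures for GLMM parameters.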

Investigation of adsorption structure for methionine on Ge(100)

  • Yang, Se-Na;Yun, Yeong-Sang;Park, Seon-Min;Hwang, Han-Na;Hwang, Chan-Guk;Kim, Se-Hun;Lee, Han-Gil
    • Proceedings of the Korean Vacuum Society Conference / 2010.02a / pp.369-369 / 2010
  • Adsorption and ordering of methionine molecules on the Ge(100) surface have been studied using high-resolution photoemission spectroscopy and low-energy electron diffraction (LEED) to investigate the adsorption structure as a function of coverage. Analysis of the C 1s, S 2p, N 1s, and O 1s core levels reveals quite different behavior depending on the methionine coverage. We found that the relative population of the two types of thiolates induces a structural change in the ordering from 2 × 1 to 1 × 1. Such an unusual evolution of the methionine adsorption on the Ge(100) surface is discussed in relation to chemical reactions and possible molecular rearrangement on the surface.

Sensory Function Recovery by Free Tissue Transfer in the Extremities (사지에서 유리 조직 이식술에 의한 감각 기능 회복)

  • Lee, Jun-Mo;Kim, Kwon-Il;Hwang, Byung-Yun
    • Archives of Reconstructive Microsurgery / v.14 no.1 / pp.14-17 / 2005
  • Purpose: The authors performed free tissue transplantation in the upper and lower extremities with sensory flaps and evaluated the recovery of sensory function. Materials and methods: Articles on sensory free flaps published between 1992 and 2004 in the journal of the Korean Microsurgical Society were reviewed, and recovery of sensory function was assessed by the static two-point discrimination test. Results: The static two-point discrimination test showed an average of 6.7 mm in the thumb, an average of 12 mm in the hand, and, in the foot, 7 cm for the dorsalis pedis flap, 20.5 mm for the lateral arm flap, and over 8 cm for the forearm flap. Conclusion: Sensory flaps provide protective and useful coverage in the upper and lower extremities and benefit activities of daily living in patients who receive free tissue transfer.

A Feasibility Study of the IMRT Optimization with Pseudo-Biologic Objective Function (유사생물학적 대상 함수를 이용한 IMRT 최적화 알고리즘 가능성에 관한 연구)

  • Yi, Byong-Yong;Cho, Sam-Ju;Ahn, Seung-Do;Kim, Jong-Hoon;Choi, Eun-Kyung;Chang, Hye-Sook;Kwon, Soo-Il
    • Journal of Radiation Protection and Research / v.26 no.4 / pp.417-424 / 2001
  • A pseudo-biologic objective function has been designed for IMRT optimization. The RTP Tool Box (RTB) was used for this study. The pseudo-biologic function is similar to the biological objective function in mathematical shape but uses physical parameters. The concepts of the TCI (Target Coverage Index) and the OSI (Organ Score Index) have been introduced for the target and the normal organs, respectively. The pseudo-biologic objective function has been defined using these TCIs and OSIs. The OSIs from the pseudo-biologic function showed better results than those from the physical functions, while the TCIs showed a similar tendency. These results demonstrate the feasibility of the pseudo-biologic function as an IMRT objective function.
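The idea of combining a target index with per-organ indices into one scalar objective can be sketched as follows. The exact definitions of TCI and OSI are given in the paper; the forms below (dose-threshold fractions, combined by a product) are hypothetical stand-ins chosen only to show the shape of such an objective.

```python
# Hypothetical sketch of a pseudo-biologic-style objective: a target coverage
# index (TCI) and organ score indices (OSI) combined into a single score.
# The threshold-fraction forms and the product combination are assumptions,
# not the paper's actual definitions.

def tci(target_doses, prescription):
    """Fraction of target voxels receiving at least the prescription dose."""
    return sum(d >= prescription for d in target_doses) / len(target_doses)

def osi(organ_doses, tolerance):
    """Fraction of organ-at-risk voxels kept below the tolerance dose."""
    return sum(d < tolerance for d in organ_doses) / len(organ_doses)

def objective(target_doses, prescription, organs):
    """Product of TCI and all OSIs: larger is better, with a maximum of 1.0."""
    score = tci(target_doses, prescription)
    for doses, tol in organs:
        score *= osi(doses, tol)
    return score

# Toy plan: target mostly covered, one organ-at-risk mostly spared.
score = objective([60, 61, 59, 62], prescription=60,
                  organs=[([10, 15, 35], 30)])
```

An optimizer would adjust beam weights to push this score toward 1.0, trading target coverage against organ sparing in one number, which is the appeal of a single scalar objective.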

Multi-Hop Clock Synchronization Based on Robust Reference Node Selection for Ship Ad-Hoc Network

  • Su, Xin;Hui, Bing;Chang, KyungHi
    • Journal of Communications and Networks / v.18 no.1 / pp.65-74 / 2016
  • The ship ad-hoc network (SANET) extends the coverage of maritime communication among ships at reduced cost. To fulfill the growing demands of real-time services, the SANET requires an efficient clock synchronization algorithm, which has not been carefully investigated for the ad-hoc maritime environment. This is mainly because conventional algorithms only try to decrease the beacon collision probability, which diminishes the clock drift among the units. However, the SANET is a very large-scale network in geographic terms, e.g., with 100 km coverage. The key factor affecting synchronization performance is the signal propagation delay, which has not been carefully considered in existing algorithms. A robust multi-hop synchronization algorithm is therefore required to support communication among hundreds of ships in the maritime environment. The proposed algorithm has to face and overcome several challenges: an unavailable physical clock, e.g., coordinated universal time (UTC)/global positioning system (GPS) lost due to atrocious weather; network link stability; and the large propagation delay in the SANET. In this paper, we propose a logical clock synchronization algorithm with a multi-hop function for the SANET, namely multi-hop clock synchronization for SANET (MCSS). It works in an ad-hoc manner when no UTC/GPS is available, and the multi-hop function ensures the link stability of the network. In the proposed MCSS, the synchronization time reference nodes (STRNs) are efficiently selected by considering the propagation delay, and beacon collisions can be decreased by combining the adaptive timing synchronization procedure (ATSP) with the proposed STRN selection procedure. Based on the simulation results, we finalize the multi-hop frame structure of the SANET by considering clock synchronization, where the physical layer parameters are contrived to meet the requirements of the target applications.
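Why propagation delay matters at 100 km scale can be shown with a one-line correction. This sketch illustrates only the general principle of delay-compensated offset estimation, not the MCSS algorithm itself; the function name and the numbers are hypothetical.

```python
# Sketch: over a ~100 km maritime link, radio propagation delay is hundreds
# of microseconds, so ignoring it biases any beacon-based clock comparison.
# This illustrates the general principle only, not the MCSS algorithm.

SPEED_OF_LIGHT = 3.0e8  # m/s

def clock_offset(t_sent, t_received, distance_m):
    """Receiver clock offset after subtracting the propagation delay."""
    propagation_delay = distance_m / SPEED_OF_LIGHT
    return t_received - t_sent - propagation_delay

# A beacon timestamped t = 1.000000 s arrives at a ship 90 km away whose
# local clock reads 1.000800 s. The raw difference (800 us) overstates the
# offset: 300 us of it is pure propagation delay.
offset = clock_offset(1.000000, 1.000800, 90_000)
```

In an algorithm like MCSS, an estimate of the ship-to-reference distance (e.g., from navigation data) lets each hop remove this delay term before adjusting its logical clock, which is why reference-node selection and delay handling are coupled.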

A Backup Node Based Fault-tolerance Scheme for Coverage Preserving in Wireless Sensor Networks (무선 센서 네트워크에서의 감지범위 보존을 위한 백업 노드 기반 결함 허용 기법)

  • Hahn, Joo-Sun;Ha, Rhan
    • Journal of KIISE:Information Networking / v.36 no.4 / pp.339-350 / 2009
  • In wireless sensor networks, the limited battery resources of sensor nodes have a direct impact on network lifetime. To reduce unnecessary power consumption, it is often the case that only a minimum number of sensor nodes operate in active mode while the others are kept in sleep mode. In such a case, however, the network service can easily become unreliable if any active node is unable to perform its sensing or communication function because of an unexpected failure. Thus, to achieve reliable sensing, it is important to maintain the sensing level even when some sensor nodes fail. In this paper, we propose a new fault-tolerance scheme, called FCP (Fault-tolerant Coverage Preserving), that gives an efficient way to handle the degradation of the sensing level caused by sensor node failures. In the proposed FCP scheme, a set of backup nodes is pre-designated for each active node, to be used to replace the active node in case of its failure. Experimental results show that the FCP scheme provides enhanced performance with reduced overhead in terms of sensing coverage preservation, the number of backup nodes, and the amount of control messages. On average, the percentage of coverage preserved is improved by 87.2%, while the additional number of backup nodes and the additional amount of control messages are reduced by 57.6% and 99.5%, respectively, compared with previous fault-tolerance schemes.
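The pre-designation step can be sketched as a simple geometric selection. The criterion below (nearest sleeping node within sensing range, as a proxy for maximum coverage overlap) is an assumed simplification for illustration, not the FCP scheme's actual selection rule.

```python
import math

# Illustrative sketch of designating a backup for an active sensor node:
# among sleeping nodes, pick the nearest one within sensing range, since a
# closer sleeper's sensing disk overlaps the failed node's disk the most.
# This proxy criterion is an assumption, not FCP's exact rule.

SENSING_RANGE = 10.0  # hypothetical sensing radius, same units as positions

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def pick_backup(active_pos, sleeping):
    """Closest sleeping node within sensing range of the active node."""
    candidates = [s for s in sleeping if dist(active_pos, s) <= SENSING_RANGE]
    return min(candidates, key=lambda s: dist(active_pos, s), default=None)

active = (0.0, 0.0)
sleeping = [(3.0, 4.0), (6.0, 8.0), (20.0, 0.0)]  # last one is out of range
backup = pick_backup(active, sleeping)
```

Pre-computing such assignments once, rather than reorganizing the whole topology on every failure, is what keeps the control-message overhead low in schemes of this kind.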

Estimation of the exponentiated half-logistic distribution based on multiply Type-I hybrid censoring

  • Jeon, Young Eun;Kang, Suk-Bok
    • Communications for Statistical Applications and Methods / v.27 no.1 / pp.47-64 / 2020
  • In this paper, we derive estimators of the scale parameter of the exponentiated half-logistic distribution based on the multiply Type-I hybrid censoring scheme. We assume that the shape parameter λ is known. We obtain the maximum likelihood estimator of the scale parameter σ. The scale parameter is estimated by approximating the given likelihood function using two different Taylor series expansions, since the likelihood equation cannot be solved explicitly. We also obtain Bayes estimators using a prior distribution. To obtain the Bayes estimators, we use the squared error loss function and the general entropy loss function (shape parameter q = -0.5, 1.0). We also derive interval estimates such as the asymptotic confidence interval, the credible interval, and the highest posterior density interval. Finally, we compare the proposed estimators in terms of mean squared error through a Monte Carlo simulation. The average length of the 95% intervals and the corresponding coverage probability are also obtained.
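The likelihood setup can be made concrete with a complete-sample sketch. The paper works with multiply Type-I hybrid censored data and Taylor approximations of the likelihood equation; the sketch below instead uses an uncensored sample and a plain grid search over the log-likelihood, using the exponentiated half-logistic CDF F(x) = [(1 − e^(−x/σ)) / (1 + e^(−x/σ))]^λ with λ known.

```python
import math
import random

# Minimal sketch: ML estimation of the scale sigma of the exponentiated
# half-logistic distribution with known shape lam, on a complete sample via
# grid search. (The paper handles censored data and approximates the
# likelihood equation with Taylor expansions instead.)
random.seed(7)
LAM, TRUE_SIGMA, N = 1.0, 2.0, 500

def draw(sigma, lam):
    """Inverse-CDF draw: F(x) = [(1 - e^{-x/s}) / (1 + e^{-x/s})]^lam."""
    v = random.random() ** (1.0 / lam)
    return sigma * math.log((1.0 + v) / (1.0 - v))

def loglik(sigma, lam, data):
    """Complete-sample log-likelihood from the density of the distribution."""
    total = 0.0
    for x in data:
        e = math.exp(-x / sigma)
        total += (math.log(2.0 * lam / sigma) - x / sigma
                  + (lam - 1.0) * math.log(1.0 - e)
                  - (lam + 1.0) * math.log(1.0 + e))
    return total

data = [draw(TRUE_SIGMA, LAM) for _ in range(N)]
grid = [0.5 + 0.01 * i for i in range(400)]  # candidate sigma values
sigma_hat = max(grid, key=lambda s: loglik(s, LAM, data))  # near TRUE_SIGMA
```

Replacing the grid search with a root-finder on the score equation fails in the censored case precisely because that equation has no closed form there, which motivates the paper's Taylor-series approximations.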

A Case Study on Reliability Test of Embedded Software in the Multi-Function Radar (다기능레이더 소프트웨어 신뢰성시험 적용사례 및 결과)

  • Kim, Jong-Woo
    • Journal of IKEEE / v.19 no.3 / pp.431-439 / 2015
  • This paper introduces an analysis technique and a test procedure for verifying the reliability of multi-function radar software. The software development process and a reliability test method for reducing the development period are also described. Test results show that the verified software has fewer errors and improved reliability compared to the unverified software.

A Test Algorithm for Data Processing Function of MC68000 μP (MC68000 μP의 데이터 처리기능에 관한 시험 알고리즘)

  • Kim, Jong Hoon;Ahn, Gwang Seon
    • Journal of the Korean Institute of Telematics and Electronics / v.23 no.2 / pp.197-205 / 1986
  • In this paper, we present an efficient test algorithm for the data processing function of the MC68000 μP. The test vectors for functional testing are determined by stuck-at, coupling, and transition faults for data storage and transfer, while for data manipulation they are determined by a Boolean function of the micro-operations. The test algorithm is composed of three parts: choosing optimum test instructions to maximize fault coverage and minimize test processing time, deciding the test order to minimize test ambiguity, and performing the actual test.

Optimal Selection of Classifier Ensemble Using Genetic Algorithms (유전자 알고리즘을 이용한 분류자 앙상블의 최적 선택)

  • Kim, Myung-Jong
    • Journal of Intelligence and Information Systems / v.16 no.4 / pp.99-112 / 2010
  • Ensemble learning is a method for improving the performance of classification and prediction algorithms. It finds a highly accurate classifier on the training set by constructing and combining an ensemble of weak classifiers, each of which needs only to be moderately accurate on the training set. Ensemble learning has received considerable attention in the machine learning and artificial intelligence fields because of its remarkable performance improvement and its flexible integration with traditional learning algorithms such as decision trees (DT), neural networks (NN), and SVMs. In that literature, DT ensemble studies have consistently demonstrated impressive improvements in the generalization behavior of DT, while NN and SVM ensemble studies have not shown improvements as remarkable as those of DT ensembles. Recently, several works have reported that the performance of an ensemble can be degraded when its classifiers are highly correlated with one another, resulting in a multicollinearity problem that leads to performance degradation of the ensemble; they have also proposed differentiated learning strategies to cope with this degradation. Hansen and Salamon (1990) insisted that it is necessary and sufficient for the performance enhancement of an ensemble that the ensemble contain diverse classifiers. Breiman (1996) found that ensemble learning can increase the performance of unstable learning algorithms but does not show remarkable improvement on stable learning algorithms. Unstable learning algorithms such as decision tree learners are sensitive to changes in the training data, so small changes in the training data can yield large changes in the generated classifiers. Therefore, an ensemble of unstable learners can guarantee some diversity among the classifiers.

To the contrary, stable learning algorithms such as NN and SVM generate similar classifiers in spite of small changes in the training data, so the correlation among the resulting classifiers is very high. This high correlation results in a multicollinearity problem, which leads to performance degradation of the ensemble. Kim's work (2009) compared the bankruptcy prediction performance of traditional algorithms such as NN, DT, and SVM on Korean firms. It reports that stable learning algorithms such as NN and SVM have higher predictability than the unstable DT, whereas, with respect to ensemble learning, the DT ensemble shows more improvement than the NN and SVM ensembles. Further analysis with variance inflation factors (VIF) empirically shows that the performance degradation of the ensemble is due to the multicollinearity problem, and it proposes that the ensemble be optimized to cope with this problem. This paper proposes a hybrid system for coverage optimization of NN ensembles (CO-NN) in order to improve the performance of NN ensembles. Coverage optimization is a technique of choosing a sub-ensemble from an original ensemble to guarantee the diversity of the classifiers. CO-NN uses a GA, which has been widely applied to various optimization problems, to deal with the coverage optimization problem. The GA chromosomes for coverage optimization are encoded as binary strings, each bit of which indicates an individual classifier. The fitness function is defined as the maximization of error reduction, and a constraint on the variance inflation factor (VIF), one of the generally used measures of multicollinearity, is added to ensure the diversity of the classifiers by removing high correlation among them. We use Microsoft Excel and the GA software package called Evolver.

Experiments on company failure prediction have shown that CO-NN is effective for the stable performance enhancement of NN ensembles through a choice of classifiers that considers the correlations within the ensemble. Classifiers with a potential multicollinearity problem are removed by the coverage optimization process, and thereby CO-NN has shown higher performance than a single NN classifier and the NN ensemble at the 1% significance level, and than the DT ensemble at the 5% significance level. However, further research issues remain. First, a decision optimization process to find the optimal combination function should be considered. Second, various learning strategies to deal with data noise should be introduced in more advanced future research.
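The binary-chromosome encoding described above can be sketched with a small stdlib-only GA. The toy fitness below rewards average accuracy and penalizes pairwise correlation among the chosen classifiers; the paper's actual fitness uses error reduction with a VIF constraint, so the accuracy and correlation tables and the penalty weight here are all assumed for illustration.

```python
import random

# Hypothetical sketch of GA-based sub-ensemble selection: a binary chromosome
# marks which of the POOL classifiers stay in the sub-ensemble. The toy
# fitness (mean accuracy minus a correlation penalty) stands in for the
# paper's error-reduction fitness with a VIF constraint.
random.seed(3)
POOL = 10
ACC = [0.70 + 0.02 * i for i in range(POOL)]           # toy accuracies
CORR = [[0.9 if abs(i - j) == 1 else 0.1 for j in range(POOL)]
        for i in range(POOL)]                          # toy correlations

def fitness(chrom):
    chosen = [i for i, bit in enumerate(chrom) if bit]
    if not chosen:
        return 0.0
    acc = sum(ACC[i] for i in chosen) / len(chosen)
    corr = sum(CORR[i][j] for i in chosen for j in chosen if i < j)
    return acc - 0.05 * corr   # penalize correlated sub-ensembles

def evolve(pop_size=30, generations=40):
    pop = [[random.randint(0, 1) for _ in range(POOL)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                 # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, POOL)            # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:                  # bit-flip mutation
                child[random.randrange(POOL)] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Because the penalty rises with every correlated pair kept, the GA tends toward small, mutually uncorrelated sub-ensembles, which is the diversity effect the VIF constraint enforces in CO-NN.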