• Title/Summary/Keyword: Poisson problem

202 search results

Quantitative Evaluation for Effectiveness of Consolidation Treatment by using the Ethylsilicate for the Namsan Granite in Gyeongju (경주 남산 화강암을 대상으로 에틸실리케이트를 이용한 강화 처리에 대한 정량적 평가)

  • Han, Min-Su;Lee, Jang-Jon;Jun, Byung-Kyu;Song, Chi-Young;Kim, Sa-Dug
    • Journal of the Mineralogical Society of Korea
    • /
    • v.21 no.2
    • /
    • pp.183-192
    • /
    • 2008
  • Most stone cultural heritage objects in Korea stand outdoors without any notable protection and therefore suffer severe chemical and biological weathering, which in turn causes deformation and structural damage. To counteract this problem and increase durability, various conservation materials are used in conservation and restoration treatments; however, few practical and technological experiments have been done on this subject. This paper attempts a quantitative evaluation of the effectiveness of an ethylsilicate-based resin for the Namsan granite in Gyeongju. When two materials with different ethylsilicate concentrations were compared, the results showed decreased absorption and porosity together with increased ultrasonic velocity, uniaxial compressive strength, elastic constant, tensile strength, and Poisson's ratio. Comparison of the physical characteristics of the conservation materials favored the one with the higher ethylsilicate concentration, owing to ethylsilicate's tendency to fill the internal pores of the stone. Discolouration of the stone surface occurred after treatment and was more prominent with the product of higher ethylsilicate concentration.
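The abstract's link between ultrasonic velocity and elastic properties follows the standard dynamic-elasticity relation (stated here for context; the paper's own formulas are not reproduced):

```latex
E_d \;=\; \rho\, v_p^{2}\,\frac{(1+\nu)(1-2\nu)}{1-\nu}
```

where $\rho$ is density, $v_p$ the compressional ultrasonic velocity, and $\nu$ Poisson's ratio; a higher velocity after consolidation therefore implies a higher dynamic elastic modulus $E_d$.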

The Economic and Social Implication of Count Regression Models for Married Women's Completed Fertility in Korea (우리나라 가구의 자녀수 결정요인에 관한 Count 모형 분석 및 경제적 함의)

  • Kim, Hyun-Sook
    • Korea journal of population studies
    • /
    • v.30 no.3
    • /
    • pp.107-135
    • /
    • 2007
  • This paper applies a static Gamma count model, a traditional hurdle model, and an endogenous switching Poisson model to determine married women's completed fertility in Korea. It analyzes the impact of household income, women's wages and education, and women's labor-market participation on the number of children of married women aged 40 and above, and on the expected number of children of women aged below 40. The paper shows that household income significantly increases the number of children at least for women aged 40 and above, but that this income effect disappears for the younger generation. The empirical model suggests that women who hold a job tend to have fewer children in the group aged 39 and below, and it also finds an endogeneity problem between childbirth and labor-force participation. The education level of married women has a positive effect on giving birth itself, but a negative impact on the number of children. Based on the empirical results, the paper concludes that Becker's quantity-quality theory holds for Korea as well.
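A minimal sketch of the hurdle structure mentioned above (hypothetical parameter values; the paper's estimated coefficients are not reproduced): the hurdle part gives the probability of remaining childless, and positive counts follow a zero-truncated Poisson.

```python
import math

def hurdle_poisson_pmf(k, pi0, lam):
    """P(Y = k) under a simple hurdle model: with probability pi0 the
    count stays at zero (the 'hurdle' is not crossed); positive counts
    follow a zero-truncated Poisson(lam)."""
    if k == 0:
        return pi0
    poisson_k = math.exp(-lam) * lam ** k / math.factorial(k)
    return (1 - pi0) * poisson_k / (1 - math.exp(-lam))

def hurdle_mean(pi0, lam, kmax=50):
    """Expected number of children implied by the hurdle model."""
    return sum(k * hurdle_poisson_pmf(k, pi0, lam) for k in range(kmax))
```

The two parts can be driven by different covariates, which is what lets education raise the probability of a first birth while lowering the total count, as the abstract reports.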

Development of Traffic Accident Prediction Models Considering Variations of the Future Volume in Urban Areas (신설 도시부 도로의 장래 교통량 변화를 반영한 교통사고 예측모형 개발)

  • Lee, Soo-Beom;Hong, Da-Hee
    • Journal of Korean Society of Transportation
    • /
    • v.23 no.3 s.81
    • /
    • pp.125-136
    • /
    • 2005
  • The current traffic-accident reduction procedure used in economic feasibility studies does not consider road characteristics or the V/C ratio. To address this problem, this paper proposes methods for evaluating the safety of individual roads during construction and improvement by developing accident prediction models that reflect the V/C ratio for each road type and its traffic characteristics. As a first step, models are built for urban roads. Factors affecting accidents by road type are selected from data obtained in the road-planning process: traffic volume, presence or absence of a median barrier, and the numbers of crossing points, connecting roads, and traffic signals. Analysis of the relationship between each factor and accidents shows that all are statistically significant. The models are divided into four classes according to road type and V/C ratio, and each accident prediction model is derived through Poisson regression and verified against real-world data. The verification results show relatively satisfactory estimates compared with real traffic data. The models developed here make it possible to predict traffic accidents from a road's physical characteristics, and the results can be used to estimate accident costs when roads are constructed or improved. Because the data used in this paper are limited to Jeollabuk-Do province, the models cannot claim to represent all regions of the nation.
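The Poisson-regression form used for accident prediction can be sketched as follows; the coefficients below are illustrative placeholders, not the paper's estimates:

```python
import math

def predicted_accidents(aadt, vc_ratio, has_median, n_signals,
                        beta=(-3.0, 0.45, 1.2, -0.3, 0.08)):
    """Poisson regression mean: mu = exp(X . beta).  The expected
    accident count rises with traffic volume and V/C ratio and falls
    when a median barrier is present (hypothetical coefficients)."""
    b0, b1, b2, b3, b4 = beta
    eta = (b0 + b1 * math.log(aadt) + b2 * vc_ratio
           + b3 * has_median + b4 * n_signals)
    return math.exp(eta)
```

Because the link is exponential, the predicted mean is always positive and each coefficient acts multiplicatively; under these placeholder values a median barrier scales the expected count by exp(-0.3).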

Text Filtering using Iterative Boosting Algorithms (반복적 부스팅 학습을 이용한 문서 여과)

  • Hahn, Sang-Youn;Zang, Byoung-Tak
    • Journal of KIISE:Software and Applications
    • /
    • v.29 no.4
    • /
    • pp.270-277
    • /
    • 2002
  • Text filtering is the task of deciding whether a document is relevant to a specified topic. As the Internet and the Web become widespread and the number of documents delivered by e-mail grows explosively, the importance of text filtering increases as well. The aim of this paper is to improve the accuracy of text filtering systems by using machine learning techniques. We apply AdaBoost algorithms to the filtering task. An AdaBoost algorithm generates and combines a series of simple hypotheses, each of which decides the relevance of a document to a topic on the basis of whether or not the document includes a certain word. We begin with an existing AdaBoost algorithm that uses weak hypotheses with outputs of 1 or -1. We then extend the algorithm to use weak hypotheses with real-valued outputs, recently proposed to improve error reduction rates and final filtering performance. Next, we attempt further improvement by first setting weights randomly according to the continuous Poisson distribution, executing AdaBoost, repeating these steps several times, and then combining all the hypotheses learned. This mitigates the overfitting problem that may occur when learning from a small amount of data. Experiments were performed on the real document collections used in TREC-8, a well-established text retrieval contest; the dataset includes Financial Times articles from 1992 to 1994. The experimental results show that AdaBoost with real-valued hypotheses outperforms AdaBoost with binary-valued hypotheses, and that AdaBoost iterated with random weights further improves filtering accuracy. Comparison results for all participants of the TREC-8 filtering task are also provided.
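One boosting round of the word-presence scheme described above can be sketched as follows (a simplified binary-output version; the paper's real-valued variant and Poisson-based weight initialization are not reproduced):

```python
import math

def adaboost_round(docs, labels, weights, vocab):
    """One AdaBoost round with word-presence stumps: a weak hypothesis
    predicts +1 (relevant) if its word occurs in the document, else -1.
    Picks the word with the lowest weighted error, computes its vote
    weight alpha, and re-weights the documents."""
    def err_for(word):
        return sum(wt for d, y, wt in zip(docs, labels, weights)
                   if (1 if word in d else -1) != y)
    word = min(vocab, key=err_for)
    err = min(max(err_for(word), 1e-9), 1 - 1e-9)  # clip away exact 0/1
    alpha = 0.5 * math.log((1 - err) / err)
    new_w = [wt * math.exp(-alpha * y * (1 if word in d else -1))
             for d, y, wt in zip(docs, labels, weights)]
    z = sum(new_w)
    return word, alpha, [wt / z for wt in new_w]
```

Iterating this round, then restarting with randomly drawn initial weights and pooling all learned stumps, gives the repetition scheme the abstract describes.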

Vibration Analysis of Thick Hyperboloidal Shells of Revolution from a Three-Dimensional Analysis (두꺼운 축대칭 쌍곡형 쉘의 3차원 진동해석)

  • 심현주;강재훈
    • Journal of the Computational Structural Engineering Institute of Korea
    • /
    • v.16 no.4
    • /
    • pp.419-429
    • /
    • 2003
  • A three-dimensional (3-D) method of analysis is presented for determining the free vibration frequencies of thick hyperboloidal shells of revolution. Unlike conventional shell theories, which are mathematically two-dimensional (2-D), the present method is based upon the 3-D dynamic equations of elasticity. Displacement components u_r, u_θ, and u_z in the radial, circumferential, and axial directions, respectively, are taken to be sinusoidal in time, periodic in θ, and algebraic polynomials in the r and z directions. Potential (strain) and kinetic energies of the hyperboloidal shells are formulated, and the Ritz method is used to solve the eigenvalue problem, yielding upper-bound values of the frequencies by minimization. As the degree of the polynomials is increased, the frequencies converge to the exact values. Convergence to four-digit exactitude is demonstrated for the first five frequencies of the hyperboloidal shells of revolution. Numerical results are tabulated for eighteen configurations of completely free hyperboloidal shells of revolution having two different shell thickness ratios, three axis ratios, and three shell height ratios. Poisson's ratio (ν) is fixed at 0.3. Comparisons are made between the frequencies of these hyperboloidal shells and those of shells that are cylindrical or nearly cylindrical (small meridional curvature). The method is applicable to thin hyperboloidal shells as well as to thick and very thick ones.
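In matrix form, the Ritz procedure described here reduces to a generalized eigenvalue problem (standard notation, not necessarily the paper's symbols): assembling a stiffness matrix K from the strain energy and a mass matrix M from the kinetic energy of the polynomial trial functions gives

```latex
\left(\mathbf{K} - \omega^{2}\mathbf{M}\right)\mathbf{a} = \mathbf{0},
\qquad
\omega^{2} = \min_{\mathbf{a} \neq \mathbf{0}}
\frac{\mathbf{a}^{\mathsf{T}}\mathbf{K}\,\mathbf{a}}
     {\mathbf{a}^{\mathsf{T}}\mathbf{M}\,\mathbf{a}},
```

so each computed ω is an upper bound on the true frequency, and enlarging the polynomial trial space can only lower it, which is why the frequencies converge monotonically from above.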

Optimal Release Problems based on a Stochastic Differential Equation Model Under the Distributed Software Development Environments (분산 소프트웨어 개발환경에 대한 확률 미분 방정식 모델을 이용한 최적 배포 문제)

  • Lee Jae-Ki;Nam Sang-Sik
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.31 no.7A
    • /
    • pp.649-658
    • /
    • 2006
  • Recently, software development has adopted new approaches in various forms: client-server systems, web programming, object-oriented concepts, and distributed development over network environments. Growing interest in distributed development technology and object-oriented methodology has improved software quality and productivity and reduced development effort. Here we consider distributed software development across many workstations. In this paper, we discuss the optimal release problem based on a stochastic differential equation (SDE) model for distributed software development environments. In the past, software reliability was judged roughly from the software development process and approached by estimating reliability as testing progressed. In this paper, we determine optimal release times by two methods: first, an SRGM with an error-counting model in the fault-detection phase based on an NHPP; second, by treating fault detection as a continuous random variable via an SDE. We determine the optimal release time as the minimum-cost point given the failure data detected and the faults debugged during the system-test and operational phases. In particular, we discuss reliability limits in terms of the probability distribution of total software cost.
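The minimum-cost release-time calculation can be illustrated with the classic Goel-Okumoto NHPP mean value function m(t) = a(1 - e^(-bt)); the cost coefficients below are illustrative assumptions, not the paper's values:

```python
import math

def expected_cost(T, a, b, c1, c2, c3):
    """Total software cost when releasing at time T:
    c1 per fault fixed during testing, c2 (> c1) per fault fixed after
    release, c3 per unit of testing time; m(t) = a*(1 - exp(-b*t))."""
    m_T = a * (1 - math.exp(-b * T))
    return c1 * m_T + c2 * (a - m_T) + c3 * T

def optimal_release(a, b, c1, c2, c3):
    """Setting dC/dT = 0 gives  a*b*exp(-b*T)*(c2 - c1) = c3,
    hence T* = ln(a*b*(c2 - c1)/c3) / b  (valid when the log argument
    exceeds 1; otherwise release immediately)."""
    return math.log(a * b * (c2 - c1) / c3) / b
```

Intuitively, testing continues exactly until the marginal cost of one more unit of testing time equals the saving from fixing the remaining faults before, rather than after, release.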

FINITE ELEMENT STRESS ANALYSIS OF CLASS V COMPOSITE RESIN RESTORATION SUBJECTED TO CAVITY FORMS AND PLACEMENT METHODS (와동 형태와 충전 방법에 따른 Class V 복합 레진 수복치의 유한요소법적 응력 분석)

  • Son, Yoon-Hee;Cho, Byeong-Hoon;Um, Chung-Moon
    • Restorative Dentistry and Endodontics
    • /
    • v.25 no.1
    • /
    • pp.91-108
    • /
    • 2000
  • Most cervical abrasion and erosion lesions have gingival margins whose cavosurface angle lies on cementum or dentin. Composite resin restorations of cervical lesions shrink toward the enamel margin due to polymerization contraction, causing clinical problems such as microleakage and secondary caries. Several methods to diminish the contraction stress of composite resin restorations have been attempted, such as modifying the cavity form and building up the restoration in several increments. The purpose of this study was to compare the polymerization contraction stress of composite resin in Class V cavities with respect to cavity form and placement method. A finite element model of five types of Class V cavity was developed from a computed tomogram of a maxillary central incisor: 1) box cavity, 2) box cavity with incisal bevel, 3) V-shape cavity, 4) V-shape cavity with incisal bevel, and 5) saucer-shape cavity. The placement methods were 1) incisal-first oblique incremental curing and 2) bulk curing. Because an FEM-based program for light-activated polymerization is not available, curing dynamics were simulated by time-dependent transient thermal conduction analysis for each cavity and placement method, and polymerization shrinkage was simulated by thermal stress analysis. The time- and temperature-dependent volume shrinkage rate, elastic modulus, and Poisson's ratio were determined from the thermal conduction data. The results were as follows: 1. For all five Class V cavities, the highest von Mises stress at the composite-tooth interface occurred at the gingival margin. 2. Among the box, V-shape, and saucer cavities, the von Mises stress at the gingival margin of the V-shape cavity was the lowest, and that of the box cavity was lower than that of the saucer cavity. 3. Preparing a bevel at the incisal cavosurface margin decreased the rate of stress development in the early polymerization stage. 4. Preparing a bevel at the incisal cavosurface margin of the V-shape cavity increased the von Mises stress at the gingival margin but decreased it at the incisal margin. 5. At the incisal margin, stress development by the bulk curing method was rapid at the early stage. Stress development by the first increment of the incremental curing method was also rapid but lower than that of bulk curing; after the second increment was cured, the final stress was the same for the two placement methods. 6. At the gingival margin, stress development by the incremental curing method became suddenly rapid at the early stage of the second increment curing, but the final stress was the same for the two placement methods.
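The thermal analogy used for the shrinkage simulation can be written as follows (a standard correspondence; the symbols here are generic, not the paper's):

```latex
\varepsilon_{\text{shrink}}(t) \;=\; \alpha(t)\,\Delta T(t)
\;\approx\; \frac{1}{3}\,\frac{\Delta V}{V}(t),
```

i.e. the linear shrinkage strain imposed through a fictitious temperature change equals roughly one third of the volumetric shrinkage, with the elastic modulus E(t) and Poisson's ratio ν(t) updated as curing proceeds.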


Developing an Accident Model for Rural Signalized Intersections Using a Random Parameter Negative Binomial Method (RPNB모형을 이용한 지방부 신호교차로 교통사고 모형개발)

  • PARK, Min Ho;LEE, Dongmin
    • Journal of Korean Society of Transportation
    • /
    • v.33 no.6
    • /
    • pp.554-563
    • /
    • 2015
  • This study developed an accident model for rural signalized intersections using the random parameter negative binomial method. The limitation of previous count models (especially Poisson/negative binomial models) is that they cannot capture variation over time or the distinctive characteristics of a specific point/segment. This drawback leads to underestimation of the standard errors (t-value inflation) of the derived coefficients and ultimately lowers the reliability of the whole model. To solve this problem, this study improves on traditional count models by using random parameters that account for the heterogeneity of each point/segment. The analyses found that increases in traffic flow and in pedestrian facilities on minor streets were associated with more traffic accidents, while left-turn lanes and medians on major streets reduced the number of accidents. The results show that random parameter modeling is an effective method for investigating the influence of road geometry on traffic accidents. However, this study could not analyze the effects of sequential changes in driving conditions, including geometry and safety facilities.
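The negative binomial mass function underlying the model, and the random-parameter idea of letting a coefficient vary across intersections, can be sketched as follows (illustrative parameter values only):

```python
import math
import random

def negbin_pmf(y, mu, alpha):
    """NB2 probability mass: a Poisson mean mixed with Gamma
    heterogeneity, giving variance mu + alpha*mu**2 (overdispersion
    that a plain Poisson model cannot represent)."""
    r = 1.0 / alpha
    log_p = (math.lgamma(y + r) - math.lgamma(r) - math.lgamma(y + 1)
             + r * math.log(r / (r + mu)) + y * math.log(mu / (r + mu)))
    return math.exp(log_p)

def site_specific_mean(x, beta_mean, beta_sd, rng):
    """Random-parameter link: the coefficient is drawn per site,
    beta_i ~ N(beta_mean, beta_sd), so each intersection carries its
    own sensitivity to the covariate x."""
    beta_i = rng.gauss(beta_mean, beta_sd)
    return math.exp(beta_i * x)
```

Allowing beta to vary across sites is what absorbs the unobserved heterogeneity that would otherwise inflate t-values, as the abstract notes.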

Efficient CT Image Denoising Using Deformable Convolutional AutoEncoder Model

  • Eon Seung, Seong;Seong Hyun, Han;Ji Hye, Heo;Dong Hoon, Lim
    • Journal of the Korea Society of Computer and Information
    • /
    • v.28 no.3
    • /
    • pp.25-33
    • /
    • 2023
  • Noise generated during the acquisition and transmission of CT images degrades image quality, so noise removal is an important preprocessing step in image processing. In this paper, we remove noise using a deformable convolutional autoencoder (DeCAE) model, in which deformable convolution operations replace the ordinary convolutions of the conventional convolutional autoencoder (CAE). The deformable convolution operation can extract image features over a more flexible area than conventional convolution. The proposed DeCAE model has the same encoder-decoder structure as the existing CAE model, but for efficient noise removal the encoder is composed of deformable convolutional layers while the decoder uses conventional convolutional layers. To evaluate the performance of the proposed DeCAE model, experiments were conducted on CT images corrupted by various noises: Gaussian noise, impulse noise, and Poisson noise. The DeCAE model outperformed traditional filters (the mean filter, median filter, bilateral filter, and NL-means method) as well as existing CAE models, both qualitatively and quantitatively in terms of MAE (Mean Absolute Error), PSNR (Peak Signal-to-Noise Ratio), and SSIM (Structural Similarity Index Measure).
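The key ingredient of a deformable convolution is reading the input at learned, fractional offsets rather than at a fixed integer grid, which requires bilinear sampling. A pure-Python illustration of that sampling step (not the paper's implementation):

```python
import math

def bilinear_sample(img, y, x):
    """Sample a 2-D image (list of rows) at a fractional location.
    A deformable convolution adds learned offsets to each kernel tap
    and uses exactly this interpolation to read non-integer positions."""
    h, w = len(img), len(img[0])
    y0, x0 = int(math.floor(y)), int(math.floor(x))
    y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
    y0, x0 = max(y0, 0), max(x0, 0)
    dy, dx = y - y0, x - x0
    top = img[y0][x0] * (1 - dx) + img[y0][x1] * dx
    bot = img[y1][x0] * (1 - dx) + img[y1][x1] * dx
    return top * (1 - dy) + bot * dy
```

Because the interpolation is differentiable in y and x, the offset-predicting layers can be trained end-to-end with the rest of the autoencoder.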

What's Different about Fake Review? (조작된 리뷰(Fake Review)는 무엇이 다른가?)

  • Jung Won Lee;Cheol Park
    • Information Systems Review
    • /
    • v.23 no.1
    • /
    • pp.45-68
    • /
    • 2021
  • As the influence of online reviews on consumer decision-making increases, concerns about review manipulation are also increasing. Fake reviews, or review manipulation, are emerging as an important problem: untrue reviews posted to increase sales volume lead consumers to make the wrong choices and impose high costs on society as a whole. Most related prior studies have focused on predicting review manipulation through data mining methods, and research from the consumer's perspective is insufficient. However, since the manipulation that consumers perceive in a review can affect its usefulness, such perception provides important implications for online word-of-mouth management regardless of whether the review is actually false. Therefore, this study analyzed whether reviews that consumers judged to be manipulated differ from ordinary reviews, and verified whether perceived manipulation negatively affects review usefulness. For the empirical analysis, 34,711 online book reviews on the LibraryThing website were analyzed using multilevel logistic regression and Poisson regression. The analysis found differences in product-level, reviewer-level, and review-level factors between reviews that consumers perceived as manipulated and those they did not. In addition, manipulated reviews were shown to negatively affect review usefulness.