• Title/Summary/Keyword: data cleansing

Search Results: 73

Effect of cleansing methods on the bone resorption due to repeated dis/re-connection of implant abutment (지대주 풀림과 조임시 지대주 세척방법에 따른 임플란트 주변 골소실의 양에 대한 평가)

  • Yang, Seung-Min;Shin, Seung-Yun;Kye, Seung-Beom
    • Journal of Periodontal and Implant Science
    • /
    • v.37 no.3
    • /
    • pp.535-542
    • /
    • 2007
  • Background: Repeated dis/re-connection of an implant abutment causes bone loss around implant fixtures due to the new formation of the biologic width of the mucosal-implant barrier. The aim of this clinical study was to evaluate whether repeated dis/re-connection of the implant abutment causes bone loss clinically, and the effect of cleansing methods on bone loss during the early healing period. Methods: A total of 50 implants were installed in 20 patients, and repeated dis/re-connection of the abutment was performed at the time of surgery and once per week for 12 weeks. Abutments were cleaned with 0.9% normal saline solution in group 1 and 0.1% chlorhexidine solution in group 2. All patients had radiographs taken at implant placement and at 4, 8, and 12 weeks postoperatively. The data for bone loss around the implants were analyzed. Results: The marginal bone loss at 12 weeks was 1.28 ± 0.51 mm and 1.32 ± 0.57 mm on the mesial and distal sides in group 1, and 1.94 ± 0.75 mm and 1.81 ± 0.84 mm in group 2, respectively. There was no statistically significant difference in marginal bone loss between the groups. Conclusions: Repeated dis/re-connection of the implant abutment may not cause marginal bone loss around the implant fixture, although the sample was limited and the observation period short. Despite the greater bone loss in group 2, there was no statistically significant difference between the groups. In the context of these results, the clinical significance of repeated dis/re-connection of the implant abutment and of the abutment cleansing method is debatable with regard to marginal bone loss during the early healing period.

Efficacy of various cleansing techniques on dentin wettability and its influence on shear bond strength of a resin luting agent

  • Munirathinam, Dilipkumar;Mohanaj, Dhivya;Beganam, Mohammed
    • The Journal of Advanced Prosthodontics
    • /
    • v.4 no.3
    • /
    • pp.139-145
    • /
    • 2012
  • PURPOSE. To evaluate the shear bond strength of a resin luting agent to dentin surfaces cleansed with different agents (pumice, an ultrasonic scaler with chlorhexidine gluconate, and EDTA), and the influence of these cleansing methods on the wetting properties of the dentin, measured by the Axisymmetric Drop Shape Analysis - Contact Diameter (ADSA-CD) technique. MATERIALS AND METHODS. Forty coronal portions of human third molars were prepared until dentin was exposed. Specimens were divided into two groups, A and B. Provisional restorations made of autopolymerizing resin were luted to the dentin surface with zinc oxide eugenol in group A and with Freegenol cement in group B. All specimens were stored in distilled water at room temperature for 24 hours; the provisional cements were then mechanically removed with an explorer, rinsed with water, and cleansed using one of several methods (control: air-water spray; pumice prophylaxis; ultrasonic scaler with 0.2% chlorhexidine gluconate; 17% EDTA). Contact angle measurements were performed with the ADSA-CD technique to assess the wettability produced by the various cleansing agents. The bond strength of a resin luting agent bonded to the cleansed surface was assessed with an Instron testing machine, and the mode of failure was noted. SEM was used to assess surface cleanliness. Data were statistically analyzed by one-way analysis of variance with Tukey HSD tests (α=.05). RESULTS. Specimens treated with EDTA showed the highest shear bond strength and the lowest contact angle in both groups. SEM showed that EDTA was the most effective solution for removing the smear layer. The mode of failure was predominantly cohesive for both EDTA and pumice prophylaxis. CONCLUSION. EDTA was the most effective dentin cleansing agent among the compared groups.

Effects of Walking on Discomfort and Colon Cleansing during Colon Lavage before Colonoscopy (대장내시경 검사 전 걷기 운동이 장세척액 복용 시 불편감과 대장 정결도에 미치는 효과)

  • Lee, You Joung;Hong, Eun Jung;Kim, Soon Ok;Kim, Hye Soon;Yang, In Soon;Cha, Kyung Hee;Kim, Choon Suk
    • Journal of Korean Clinical Nursing Research
    • /
    • v.16 no.1
    • /
    • pp.39-49
    • /
    • 2010
  • Purpose: The purpose of this study was to identify the effects of walking on discomfort and colon cleansing in patients taking a colon lavage solution before colonoscopy. Methods: This study used a nonequivalent control group posttest design. The participants were 89 patients (experimental group: 47, control group: 42) who underwent colonoscopy at G hospital in Incheon. Both groups wore a step counter for an hour while taking the colon lavage solution. The experimental group comprised patients who walked more than 3,000 steps, and the control group those who walked fewer than 3,000 steps. Discomfort was measured using a VAS, and colon cleansing was assessed by a specialist. Collected data were analyzed using the χ²-test and t-test with SPSS/PC+ Windows version 15.0. Results: Walking while taking the colon lavage solution decreased abdominal pain before colonoscopy. One hour after taking the colon lavage solution, decreased nausea, abdominal pain, and discomfort were also found in the group that walked more than 3,000 steps. Conclusion: Based on the above findings, adequate walking can be used as a nursing intervention to increase comfort in patients undergoing colonoscopy.

Data Cleansing Algorithm for reducing Outlier (데이터 오·결측 저감 정제 알고리즘)

  • Lee, Jongwon;Kim, Hosung;Hwang, Chulhyun;Kang, Inshik;Jung, Hoekyung
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2018.10a
    • /
    • pp.342-344
    • /
    • 2018
  • This paper shows the possibility of substituting the proposed algorithm for statistical methods such as mean imputation, correlation coefficient analysis, and graph correlation analysis, and of replacing the statistician in processing the various abnormal data measured in the water treatment process. In addition, this study aims to model a data-filtering system based on a recent fractile pattern and a deep-learning LSTM algorithm, in order to improve the reliability and validity of the algorithm, using open-source libraries such as Keras, Theano, and TensorFlow.

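The abstract above names mean imputation and fractile (quantile) analysis as the statistical baseline the proposed algorithm replaces. The paper's actual algorithm is not given here; as a minimal sketch of that baseline only, combining a quantile filter with mean imputation over a sensor series (all names and thresholds are illustrative):

```python
def clean_series(values, lo_q=0.05, hi_q=0.95):
    """Replace missing (None) and out-of-quantile readings with the mean
    of the remaining valid readings. A baseline sketch only; the paper's
    fractile-pattern/LSTM method is more sophisticated."""
    valid = sorted(v for v in values if v is not None)
    lo = valid[int(lo_q * (len(valid) - 1))]
    hi = valid[int(hi_q * (len(valid) - 1))]
    kept = [v for v in valid if lo <= v <= hi]
    mean = sum(kept) / len(kept)
    # Keep in-range readings; impute the mean for missing/outlier points.
    return [v if v is not None and lo <= v <= hi else mean for v in values]
```

For example, `clean_series([1.0, 2.0, None, 100.0, 2.0, 1.0, 2.0, 1.0, 2.0, 1.0])` imputes 1.5 (the mean of the in-range readings) for both the missing value and the 100.0 outlier.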

A Data Cleansing Strategy for Improving Data Quality of National R&D Information - Case Study of NTIS (데이터 품질을 고려한 국가R&D정보 데이터베이스의 통합 사례 연구 - NTIS 데이터베이스 통합 사례)

  • Shin, Sung-Ho;Yoon, Young-Jun;Yang, Myung-Suk;Kim, Jin-Man;Shon, Kang-Ryul
    • Journal of the Korea Society of Computer and Information
    • /
    • v.16 no.6
    • /
    • pp.119-130
    • /
    • 2011
  • From the standpoint of data quality management, data quality is influenced by quality policy, quality organization, business processes, and business rules. Business rules, which guide data manipulation, affect data quality directly. When building an integrated database from distributed databases, defining business rules is even more important, because data integration must account for heterogeneous structures, codes, and data standardization. Data values also take various forms depending on data type, unit, and transcription. Ultimately, problems with both database structure and data values must be solved to improve data quality. Handling them requires drawing up a database integration model and cleansing the data in the integrated database. NTIS (National science and Technology Information Service) aims to serve users who need all information about national R&D via the internet, and to that end it maintains an integrated database built from several database sources. We show through the NTIS case study that a database integration model and data cleansing are needed to build a successful integrated database.
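The abstract above describes business rules that standardize heterogeneous codes, units, and transcriptions during integration. As a hedged sketch of what one such rule might look like (the code values, field names, and unit conventions below are invented for illustration and are not NTIS's actual schema):

```python
# Hypothetical code-unification table: several source transcriptions
# map to one canonical value in the integrated database.
CODE_MAP = {"KR-01": "SEOUL", "Seoul": "SEOUL", "SEOUL": "SEOUL"}

def standardize_record(rec):
    """Apply two illustrative business rules to one source record:
    unify region codes, and convert budgets quoted in thousand won
    (KRW1000) into won (KRW) so all sources share one unit."""
    out = dict(rec)
    out["region"] = CODE_MAP.get(rec["region"], "UNKNOWN")
    if rec.get("budget_unit") == "KRW1000":
        out["budget"] = rec["budget"] * 1000
        out["budget_unit"] = "KRW"
    return out
```

Rules like these are exactly the kind of thing the abstract says affects data quality directly: applied consistently at integration time, they remove structure- and value-level mismatches before the data reaches users.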

Prediction of lightweight concrete strength by categorized regression, MLR and ANN

  • Tavakkol, S.;Alapour, F.;Kazemian, A.;Hasaninejad, A.;Ghanbari, A.;Ramezanianpour, A.A.
    • Computers and Concrete
    • /
    • v.12 no.2
    • /
    • pp.151-167
    • /
    • 2013
  • Prediction of concrete properties is an important issue for structural engineers, and different methods have been developed for this purpose. Most of these methods are based on experimental data and use measured data for parameter estimation. Three typical output-estimation methods are Categorized Linear Regression (CLR), Multiple Linear Regression (MLR), and Artificial Neural Networks (ANN). In this paper a statistical cleansing method based on CLR is introduced. MLR and ANN approaches are then employed to predict the compressive strength of structural lightweight aggregate concrete. The valid input domain is briefly discussed. Finally, the results of the three prediction methods are compared to determine the most efficient one. The results indicate that despite the higher accuracy of ANN, the method has some limitations, including high sensitivity to its valid input domain and to the selection criteria for determining the most efficient network.
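The abstract above mentions a statistical cleansing step based on categorized regression but does not detail it. One plausible reading, offered here only as an assumption, is a residual-based cleanse: fit a linear regression to the data in one category and drop points whose residual exceeds a multiple of the residual standard deviation.

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for one category."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return b, my - b * mx

def cleanse(xs, ys, k=2.0):
    """Drop points whose residual from the fitted line exceeds k residual
    standard deviations -- one plausible regression-based cleanse, not
    necessarily the paper's exact procedure."""
    b, a = fit_line(xs, ys)
    res = [y - (a + b * x) for x, y in zip(xs, ys)]
    sd = (sum(r * r for r in res) / len(res)) ** 0.5
    return [(x, y) for x, y, r in zip(xs, ys, res) if abs(r) <= k * sd]
```

On `xs = [1..5]`, `ys = [2, 4, 100, 8, 10]` with `k=1.5`, the mid-series spike `(3, 100)` is the only point rejected.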

A Study on the cleansing of water data using LSTM algorithm (LSTM 알고리즘을 이용한 수도데이터 정제기법)

  • Yoo, Gi Hyun;Kim, Jong Rib;Shin, Gang Wook
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2017.10a
    • /
    • pp.501-503
    • /
    • 2017
  • In the water sector, various data such as flow rate, pressure, water quality, and water level are collected throughout the water purification plant and piping system. The collected data are stored in each water treatment plant's database, combined in a regional database, and finally stored on the database server at the head office of the Korea Water Resources Corporation. Various abnormal data can be generated when an instrument takes a measurement or when data are communicated across the various processes; these can be classified as missing data and wrong data. Since the causes of each differ, the methods for detecting wrong data and missing data also differ, but the method of cleansing the data is the same. In this study, a program that can automatically cleanse missing or wrong data by applying the deep-learning LSTM (Long Short-Term Memory) algorithm is studied.

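The abstract above distinguishes detection (different for missing vs. wrong data) from cleansing (the same for both). A minimal sketch of that split, with simple linear interpolation standing in for the paper's LSTM predictor and the range limits invented for illustration; boundary readings are assumed valid:

```python
def cleanse_flow(series, lo=0.0, hi=500.0):
    """Detect missing (None) and wrong (out-of-physical-range) readings,
    then fill both kinds the same way: linear interpolation between the
    nearest valid neighbors. Interpolation is a stand-in here for the
    paper's LSTM-based predictor."""
    # Detection: the two abnormal classes are flagged by different tests.
    bad = [i for i, v in enumerate(series)
           if v is None or not (lo <= v <= hi)]
    # Cleansing: one shared fill method for every flagged index.
    out = list(series)
    for i in bad:
        l = i - 1
        while l in bad:
            l -= 1
        r = i + 1
        while r in bad:
            r += 1
        out[i] = out[l] + (out[r] - out[l]) * (i - l) / (r - l)
    return out
```

For example, `cleanse_flow([10.0, None, 30.0, 999.0, 50.0])` flags index 1 as missing and index 3 as wrong (999 exceeds the assumed physical range) and fills both by interpolation, yielding `[10.0, 20.0, 30.0, 40.0, 50.0]`.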

Measurement of competency through self study in basic nursing lab. practice focused on cleansing enema (기본간호학 실습에 있어 자가학습을 통한 능숙도 측정 - 배변관장을 중심으로 -)

  • Ko Il-Sun
    • Journal of Korean Academy of Fundamentals of Nursing
    • /
    • v.6 no.3
    • /
    • pp.532-543
    • /
    • 1999
  • This study was conducted to provide basic data for improving the teaching method for basic nursing practice, as well as the effectiveness of the practice, by examining students' competency in cleansing enema after self study instead of traditional instruction. The study is a one-group pretest-posttest design in which subjects practiced the enema through self study. The subjects were 89 sophomore students at Y University College of Nursing. In the basic nursing lab practice class, a cleansing enema self study module, developed by the researcher on the basis of the literature review, was given to the students, who were asked to finish the pre-study and check the self study evaluation criteria after reading the goals, learning activities, and theoretical guideline. After watching a videotape, the students practiced the process in the module by themselves. To build competency in cleansing enema, repeated autonomous practice was done during the open lab outside the regular class. Whenever practice was done, the frequency and time were measured and documented. When a student felt confident through repeated practice, competency was evaluated by the researcher and two assistants based on the evaluation criteria, and the process was repeated until the student could perform all the items on the evaluation criteria completely. The data were collected over 42 days, from Oct. 15 to Nov. 26, 1996, and analyzed by frequency, percentage, Pearson correlation coefficient, and analysis of variance. The results are summarized as follows: 1. 43.2% of the students were favorable to nursing; 63.6% liked lecture, but 71.6% liked practice, so they were more interested in practice than in lecture. 2. 62.3% of the students scored high on the written test and 97.8% on the practice, so the practice scores were better. 3. The frequency of repeated practice needed to pass the test ranged from 1 to 4, with an average of 2.2. 4. The average time needed for preparation and performance was nearly the same regardless of frequency: those who passed after one practice took 5 to 38 minutes (average 16 minutes); those who practiced twice took 5 to 60 minutes (average 21 minutes); those who passed after three practices took 8 to 30 minutes (average 15 minutes), similar to those who passed on the first trial; only one student passed after four practices, taking 10 minutes. 5. 64% of the students agreed that the context and content of the module were appropriate for self study, and 68.2% were satisfied; 71.9% said the module helped them practice the enema through self study. 6. Though only 42% of the students were satisfied with the video, 50.6% said it was helpful for self study. 7. 52.3% of the students were satisfied with the self study method, and 86.6% gained self-confidence in performing the enema. 8. The lower a student's practice score, the more practices were needed to pass the test (r=-.213, P<.05). In conclusion, two or more practice opportunities should be given for competent performance of the enema, and less complex nursing skills can be acquired through self study when sufficient learning resources and assistance, such as learning guides or videotapes, are provided. Based on this study, two suggestions are made: first, college policy should support the new method in place of the traditional learning method so that students can attain proficiency in basic nursing skills; second, assistant materials should be developed as soon as possible to promote self study of basic nursing skills.


Black Ice Formation Prediction Model Based on Public Data in Land, Infrastructure and Transport Domain (국토 교통 공공데이터 기반 블랙아이스 발생 구간 예측 모델)

  • Na, Jeong Ho;Yoon, Sung-Ho;Oh, Hyo-Jung
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.10 no.7
    • /
    • pp.257-262
    • /
    • 2021
  • Accidents caused by black ice occur frequently every winter, and the fatality rate is very high compared to other traffic accidents. A systematic method is therefore needed to predict black ice formation before accidents occur. In this paper, we propose a black ice prediction model based on heterogeneous, multi-type data. To this end, 12,574,630 cases of 46 types of land, infrastructure, and transport public data and meteorological public data were collected. A data cleansing process, including missing value detection and normalization, was then applied to establish approximately 600,000 refined datasets. We analyzed the correlations of the 42 collected factors for predicting the occurrence of black ice, selecting only the 21 factors that have a valid effect on black ice prediction. The prediction model developed through this work will ultimately be used to derive a route-specific black ice risk index, which will serve as a preliminary study for black ice warning alert services.
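The abstract above outlines a two-step preparation pipeline: cleansing (missing value detection, normalization) followed by correlation-based factor selection. A minimal sketch of those two steps under stated assumptions (min-max normalization and a Pearson-correlation threshold are one common choice; the factor names and threshold below are invented):

```python
def normalize(col):
    """Min-max scale one factor's values into [0, 1]."""
    lo, hi = min(col), max(col)
    return [(v - lo) / (hi - lo) for v in col]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length columns."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def select_factors(factors, target, threshold=0.3):
    """Keep only factors whose |correlation| with the target passes the
    threshold -- a sketch of how 42 candidate factors could be narrowed
    to the 21 with a valid effect, as the abstract describes."""
    return {name: col for name, col in factors.items()
            if abs(pearson(col, target)) >= threshold}
```

For instance, with a target of `[1, 2, 3, 4]`, a perfectly correlated hypothetical factor `temp` survives a 0.5 threshold while an alternating `noise` column does not.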

Extraction Transformation Transportation (ETT) system Design and implementation for extracting heterogeneous Data on Data Warehouse (데이터웨어하우스에서 이질적 형태를 가진 데이터의 추출을 위한 Extraction Transformation Transportation(ETT) 시스템 설계 및 구현)

  • 여성주;왕지남
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.24 no.67
    • /
    • pp.49-60
    • /
    • 2001
  • A data warehouse (DW) manages all the information in an enterprise and offers specific information to users. However, it can be difficult to develop an effective DW system because of the variety of computing facilities, databases, and operating systems involved. Heterogeneous system environments make it harder to extract data and to provide the proper information to users in real time. Data inconsistency in non-integrated legacy systems is also common, which requires effective and efficient data extraction flow control as well as data cleansing. We design an integrated, automatic ETT (Extraction Transformation Transportation) system to control data extraction flow and suggest an implementation methodology. Detailed analysis and design are given to specify the proposed ETT approach with a real implementation.

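The abstract above couples extraction flow control with cleansing of inconsistent legacy data. One generic way to express that coupling, offered purely as a sketch (the paper's actual ETT design is not described here), is a transform loop with a rejects channel, so inconsistent rows are routed aside for cleansing instead of being loaded:

```python
def etl(records, transform, validate):
    """Minimal extract-transform-transport loop with a rejects channel.
    Rows that fail validation after transformation are set aside for
    cleansing rather than loaded -- one generic way to handle the legacy
    data inconsistency the abstract mentions."""
    loaded, rejected = [], []
    for rec in records:
        row = transform(rec)
        # Route each transformed row to the load or the cleansing queue.
        (loaded if validate(row) else rejected).append(row)
    return loaded, rejected
```

For example, transforming string amounts to integers and validating that amounts are positive sends `{"amt": "-3"}` to the rejects channel while `{"amt": "10"}` is loaded.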