• Title/Summary/Keyword: composite data points


Impact of combined at-home bleaching and whitening toothpaste use on the surface and color of a composite resin

  • Carolina Meneghin Barbosa;Renata Siqueira Scatolin;Waldemir Francisco Vieira-Junior;Marcia Hiromi Tanaka;Laura Nobre Ferraz
    • Restorative Dentistry and Endodontics / v.48 no.3 / pp.26.1-26.12 / 2023
  • Objective: This in vitro study aimed to evaluate the effects of different whitening toothpastes on a composite resin during at-home bleaching with 10% carbamide peroxide. Materials and Methods: Sixty samples (7 mm × 2 mm) were used for color and roughness analyses, while another 60 samples (3 mm × 2 mm) were utilized to assess microhardness. The factors analyzed included toothpaste, for which 5 options with varying active agents were tested (distilled water; conventional toothpaste; whitening toothpaste with abrasive agents; whitening toothpaste with abrasive and chemical agents; and whitening toothpaste with abrasive, chemical, and bleaching agents). Brushing and application of whitening gel were performed for 14 days. Surface microhardness (SMH), surface roughness (Ra), and color (ΔL*, Δa*, Δb*, ΔE*ab, and ΔE00) were analyzed. The Ra and SMH data were analyzed using mixed generalized linear models for repeated measures, while the color results were assessed using the Kruskal-Wallis and Dunn tests. Results: Between the initial and final time points, all groups demonstrated significant increases in Ra and reductions in SMH. No significant differences were found between groups for SMH. For Ra at the final time point, all groups differed from the distilled water group: conventional toothpaste exhibited the lowest Ra, while the whitening toothpaste with abrasive agents had the highest value. No significant differences were observed in ΔL*, Δa*, and Δb*. Conclusions: While toothpaste composition did not affect the color stability and microhardness of the resin composite, combining toothbrushing with whitening toothpaste and at-home bleaching enhanced the change in Ra.
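
For reference, the CIE color-difference metrics named in the abstract are standard quantities computed from the changes in the L*a*b* coordinates, not values specific to this study. The CIE76 metric is

$$\Delta E^*_{ab} = \sqrt{(\Delta L^*)^2 + (\Delta a^*)^2 + (\Delta b^*)^2},$$

while CIEDE2000 weights the lightness, chroma, and hue differences,

$$\Delta E_{00} = \sqrt{\left(\frac{\Delta L'}{k_L S_L}\right)^2 + \left(\frac{\Delta C'}{k_C S_C}\right)^2 + \left(\frac{\Delta H'}{k_H S_H}\right)^2 + R_T\,\frac{\Delta C'}{k_C S_C}\,\frac{\Delta H'}{k_H S_H}},$$

with the weighting functions $S_L$, $S_C$, $S_H$, the parametric factors $k_L$, $k_C$, $k_H$, and the rotation term $R_T$ defined by the CIEDE2000 standard.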

Merging of SPOT P-mode and XS-mode Images using Color Transformation and Image Enhancement (색변환과 영상개선기법을 이용한 SPOT P-mode와 XS-mode 영상합성)

  • 손덕재;이종훈
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.9 no.2 / pp.103-113 / 1991
  • The accuracy of the input coordinates of ground control points and check points greatly influences the results of ground coordinate computation when using SPOT digital image data. The original SPOT images displayed on a CRT are usually not adequate for identifying object features and determining point positions. Hence, appropriate image processing techniques such as contrast enhancement, subpixel interpolation, edge enhancement, and spatial filtering are needed. In this study, the principles of digital image processing needed for accurate three-dimensional positioning and spectral characteristic analysis are investigated, and the algorithms for the actual applications are developed and programmed. Using the developed image processing software, SPOT P-mode and XS-mode images are merged into SPOT P+XS, a high-resolution color composite image.
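
The abstract does not state which color transformation was used for the P+XS merge; a common family of approaches is intensity substitution (IHS- or Brovey-style pan-sharpening). The sketch below is a minimal, hypothetical NumPy illustration of that idea under those assumptions; it is not the authors' implementation.

```python
import numpy as np

def intensity_substitution_merge(xs_rgb, pan):
    """Merge a multispectral XS image (H x W x 3, already resampled to the
    panchromatic grid) with a high-resolution P-mode band (H x W) by
    replacing the XS intensity with a histogram-matched panchromatic band.
    This is a generic Brovey/IHS-style sketch, not the paper's algorithm."""
    xs = xs_rgb.astype(np.float64)
    pan = pan.astype(np.float64)

    # Intensity component of the XS image (simple band average).
    intensity = xs.mean(axis=2)

    # Match the panchromatic band's mean and spread to the XS intensity so
    # substitution does not shift overall brightness (a simple contrast match).
    pan_matched = (pan - pan.mean()) / (pan.std() + 1e-12) * intensity.std() + intensity.mean()

    # Scale each band by the intensity ratio, preserving the spectral ratios.
    ratio = pan_matched / (intensity + 1e-12)
    fused = xs * ratio[..., None]
    return np.clip(fused, 0, 255).astype(np.uint8)
```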


Low-speed Impact Localization on a Stiffened Composite Structure Using Reference Data Method (기준신호 데이터를 이용한 보강된 복합재 구조물에서의 저속 충격위치 탐색)

  • Kim, Yoon-Young;Kim, Jin-Hyuk;Park, Yurim;Shrestha, Pratik;Kwon, Hee-Jung;Kim, Chun-Gon
    • Composites Research / v.29 no.1 / pp.1-6 / 2016
  • Low-speed impacts were localized on a stiffened composite structure using 4 FBG sensors, a 100 kHz sampling-rate interrogator, and a devised localization algorithm. The composite specimen consists of a main spar and several stringers, and the overall size of the specimen's surface is about $0.8{\times}1.2m$. Pre-stored reference data for 247 grid locations and 36 stiffener locations were gathered and used as comparison targets for a random impact signal. The proposed algorithm uses the normalized cross-correlation method to compare the similarities of the two signals; the correlation results for each sensor's signal are multiplied by the others, enabling mutual compensation. Twenty verification points were successfully localized with a maximum error of 43.4 mm and an average error of 17.0 mm. For the same experimental setup, the performance of the proposed method was evaluated while reducing the number of sensors. It is revealed that the mutual compensation between the sensors is most effective for a two-sensor combination. For the sensor combination of FBG #1 and #2, the maximum localization error was 42.5 mm, with an average error of 17.4 mm.
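
A minimal sketch of the reference-data matching idea described above, assuming the reference database and sensor signals are available as NumPy arrays (the data layout, function names, and scoring details are illustrative assumptions, not the paper's code):

```python
import numpy as np

def ncc_peak(a, b):
    """Peak value of the normalized cross-correlation of two 1-D signals."""
    a = (a - a.mean()) / (a.std() * len(a) + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return np.max(np.correlate(a, b, mode="full"))

def localize_impact(impact_signals, reference_db):
    """Pick the pre-stored grid point whose reference signals are most
    similar to the measured impact; per-sensor similarities are multiplied
    so that the sensors mutually compensate, as described in the abstract.

    impact_signals : dict  sensor_id -> 1-D array of the measured impact
    reference_db   : dict  grid_point -> {sensor_id: 1-D reference array}
    """
    best_point, best_score = None, -np.inf
    for point, refs in reference_db.items():
        score = 1.0
        for sensor_id, measured in impact_signals.items():
            score *= ncc_peak(measured, refs[sensor_id])
        if score > best_score:
            best_point, best_score = point, score
    return best_point, best_score
```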

Comparison of Sampling and Estimation Methods for Economic Optimization of Cumene Production Process (쿠멘 생산 공정의 경제성 최적화를 위한 샘플링 및 추정법의 비교)

  • Baek, Jong-Bae;Lee, Gibaek
    • Korean Chemical Engineering Research / v.52 no.5 / pp.564-573 / 2014
  • Economic optimization of a cumene manufacturing process producing cumene from benzene and propylene was studied. The chosen objective function was the operational profit per year, which subtracts the capital cost, utility cost, and reactant cost from the product revenue and other benefits. The optimization has 6 design variables. Matlab connected to and controlled Unisim Design to calculate the operational profit for the given design variables. As the first step of the optimization, design variable points were sampled and the operational profit was calculated using Unisim Design. Using the sampled data, an estimation model of the operational profit was constructed, and the optimization was performed on the estimation model. This study compared a second-order polynomial and support vector regression as the estimation method, and central composite design and Hammersley sequence sampling as the sampling method. The optimization results showed that support vector regression and Hammersley sequence sampling were superior to the second-order polynomial and central composite design, respectively. The optimized operational profit was 17.96 MM$ per year, which was 12% higher than the base case value of 16.04 MM$.
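
As a rough illustration of the surrogate-based workflow described above (sample the design space, fit an estimation model, then optimize on the model), the sketch below fits a support vector regression surrogate to sampled profit evaluations and maximizes it with a gradient-based optimizer. The toy profit function, bounds, sample size, and the plain uniform sampler are placeholders standing in for the Unisim Design flowsheet and the Hammersley sequence used in the paper.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def simulated_profit(x):
    """Placeholder for the Unisim Design evaluation of operational profit."""
    return 18.0 - np.sum((x - 0.3) ** 2)

# 1) Sample the 6-dimensional design space (Hammersley sampling in the paper;
#    plain uniform sampling keeps this sketch short).
X = rng.uniform(0.0, 1.0, size=(60, 6))
y = np.array([simulated_profit(x) for x in X])

# 2) Fit an SVR surrogate of the operational profit.
surrogate = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=0.01))
surrogate.fit(X, y)

# 3) Optimize on the surrogate (maximize profit = minimize its negative).
res = minimize(lambda x: -surrogate.predict(x.reshape(1, -1))[0],
               x0=np.full(6, 0.5), bounds=[(0.0, 1.0)] * 6, method="L-BFGS-B")
print("estimated optimal design variables:", res.x)
print("surrogate-predicted profit:", -res.fun)
```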

Carbonation depth in 57 years old concrete structures

  • Medeiros-Junior, Ronaldo A.;Lima, Maryangela G.;Yazigi, Ricardo;Medeiros, Marcelo H.F.
    • Steel and Composite Structures / v.19 no.4 / pp.953-966 / 2015
  • Carbonation depth was measured at 40 points on two 57-year-old concrete viaducts. Field testing (phenolphthalein spraying) was performed on the structures. The data obtained were statistically analyzed with the Kolmogorov-Smirnov test, one-way analysis of variance (ANOVA), and Fisher's method. The results revealed significant differences between the maximum carbonation depths of different elements of the same concrete structure. Significant differences were also found in the carbonation of different concrete structures located in the same macroclimate. Microclimatic factors such as temperature and local humidity, sunshine, wind, and wetting and drying cycles, among others, may have been responsible for the behavior of carbonation in the concrete.
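
A minimal sketch of the kind of statistical comparison the abstract describes (a Kolmogorov-Smirnov normality check followed by a one-way ANOVA across elements), using SciPy; the carbonation-depth arrays are invented placeholders, not the paper's field measurements.

```python
import numpy as np
from scipy import stats

# Hypothetical carbonation depths (mm) on three elements of one structure.
element_a = np.array([18.0, 21.5, 19.2, 24.8, 20.1])
element_b = np.array([31.4, 28.9, 35.2, 30.7, 33.1])
element_c = np.array([22.3, 25.1, 23.8, 26.4, 24.0])

# Kolmogorov-Smirnov test of each sample against a fitted normal distribution.
for name, depths in [("A", element_a), ("B", element_b), ("C", element_c)]:
    d, p = stats.kstest(depths, "norm", args=(depths.mean(), depths.std(ddof=1)))
    print(f"element {name}: KS statistic = {d:.3f}, p = {p:.3f}")

# One-way ANOVA: do mean carbonation depths differ between the elements?
f_stat, p_value = stats.f_oneway(element_a, element_b, element_c)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")
```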

A Study on the Characteristics of Parameters in Groundwater Table Fluctuation Model (지하수위 변동 해석모델의 매개변수 특성 연구)

  • Kim, Nam-Won;Kim, Youn-Jung;Chung, Il-Moon
    • Journal of Environmental Science International / v.23 no.4 / pp.615-623 / 2014
  • The groundwater level varies according to the characteristics and composite materials of the aquifer. In this study, the specific yield and the reaction factor, the two major hydrogeological parameters in the WTF (Water Table Fluctuation) method, were estimated and their spatial characteristics were analyzed. Eight groundwater level stations in the Hancheon watershed, which have a sufficient measurement period and a high correlation with rainfall, were used. The results showed that the specific yield was randomly distributed, while the reaction factor showed an inverse trend with altitude. If enough data are collected, the reaction factor at ungauged points could be estimated from altitude by using these parameter characteristics.
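
The abstract does not give the governing equation; one common water-table fluctuation formulation that uses a specific yield $S_y$ and a reaction (recession) factor $\alpha$ is the linear-reservoir balance

$$S_y \frac{dh}{dt} = R(t) - \alpha\, S_y\, h(t),$$

where $h$ is the water-table height above the discharge datum and $R(t)$ is the recharge; in its simplest episodic form, recharge over a rise $\Delta h$ is estimated as $R = S_y\,\Delta h/\Delta t$. This is offered only as background on how the two parameters typically enter such a model, not as the specific model used in the paper.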

How Phenolic Composites were chosen - In Case of England (6) (페놀 컴포지트 실용화의 길 - 영국의 경우 (CASE STUDY 6))

  • Nomaguchi, Kanemasa;Forsdyke, Ken L
    • Composites Research / v.17 no.6 / pp.58-66 / 2004
  • As the first modern industrialized country in the world, England (UK) was also the first country to be made safe from the "SMOKE FIRE" danger in the London Underground. The quick decision makers of Greater London Metropolitan had to be serious, while the technology developers were equally eager to rebuild safer composite systems for public safety, accumulating basic data in their laboratories. Mr. Ken L. Forsdyke, one of the co-authors of this paper, who was the project leader at BP Chemicals International Company at that time, now relates some key points in addition to the stories he presented earlier in this series, "How Phenolic Composites were chosen". With another article on the basic data, our tales of the "Case of England" will be closed. May God save people from the "Horror SMOKE FIRE".

Application of Response Surface Method as an Experimental Design to Optimize Coagulation Tests

  • Trinh, Thuy Khanh;Kang, Lim-Seok
    • Environmental Engineering Research / v.15 no.2 / pp.63-70 / 2010
  • In this study, the response surface method and experimental design were applied as an alternative to conventional methods for the optimization of coagulation tests. A central composite design, with 4 axial points, 4 factorial points, and 5 replicates at the center point, was used to build a model for predicting and optimizing the coagulation process. Mathematical model equations were derived by computer simulation programming with a least squares method using the Minitab 15 software. In these equations, the removal efficiencies of turbidity and total organic carbon (TOC) were expressed as second-order functions of the two factors, alum dose and coagulation pH. Statistical checks (ANOVA table, $R^2$ and $R^2_{adj}$ values, model lack-of-fit test, and p value) indicated that the model was adequate for representing the experimental data. The p values showed that the quadratic effects of alum dose and coagulation pH were highly significant. In other words, these two factors had an important impact on the turbidity and TOC of the treated water. To gain a better understanding of the two variables for optimal coagulation performance, the model was presented as both 3-D response surface and 2-D contour graphs. As a compromise for simultaneously removing the maximum amounts of turbidity (92.5%) and TOC (39.5%), the optimum conditions were found to be 44 mg/L alum at pH 7.6. The predicted response from the model showed close agreement with the experimental data ($R^2$ values of 90.63% and 91.43% for turbidity removal and TOC removal, respectively), which demonstrates the effectiveness of this approach in achieving good predictions while minimizing the number of experiments required.
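
For context, the second-order (quadratic) response surface model referred to above has, for the two factors $x_1$ (alum dose) and $x_2$ (coagulation pH), the general form

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \beta_{11} x_1^2 + \beta_{22} x_2^2 + \beta_{12} x_1 x_2 + \varepsilon,$$

where $y$ is the removal efficiency (turbidity or TOC) and the $\beta$ coefficients are estimated by least squares; the fitted coefficient values are reported in the paper, not here.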

Lineament analysis in the euiseong area using automatic lineament extraction algorithm (자동 선구조 추출 알고리즘을 이용한 경북 의성지역의 선구조 분석)

  • 김상완
    • Economic and Environmental Geology / v.32 no.1 / pp.19-31 / 1999
  • In this study, we estimated lineaments in the Euiseong area, Kyungbuk Province, from Landsat TM imagery by applying the algorithm developed by Kim and Won et al., which can effectively reduce the look-direction bias associated with the Sun's azimuth angle. Fractures over the study area were also mapped in the field at 57 selected sites to compare them with the results from the satellite image. The trends of the lineaments estimated from the Landsat TM images are characterized as $N50^{\circ}{\sim}70^{\circ}W$, $NS{\sim}N10^{\circ}W$, and $N10^{\circ}{\sim}60^{\circ}E$. The spatial distribution of lineaments was also studied using a circular grid, and the results show that the area can be divided into two domains: domain A, in which the $NS{\sim}N20^{\circ}E$ direction is dominant, and domain B, in which the west-north-west direction is prominent. The trends of the lineaments can also be classified into seven groups. Among them, only the C, D, and G trends are found to be dominant based upon Donnelly's nearest neighbor analysis and correlations of lineament densities. In the color composite image produced by overlaying the lineament density maps of the C-, D-, and G-trends, the G-trend is developed over the whole study area, while the eastern part of the area is dominated by the D-trend; the C-trend develops extensively over the whole area except the southeastern part. The orientations of fractures measured at 35 points in the field show major trends of $NS{\sim}N30^{\circ}E$, $N50^{\circ}{\sim}80^{\circ}W$, and $N80^{\circ}E{\sim}EW$, which agree relatively well with the lineaments estimated from the satellite image. The rose diagram analysis of the field data shows that WNW-ESE trending discontinuities are developed over the whole area, while discontinuities of $NS{\sim}N20^{\circ}E$ are developed only in the eastern part, which also coincides with the result from the satellite image. The combined results of the lineaments from the satellite image and the fracture orientations measured in the field at 22 points, including 18 minor faults in the Sindong Group, imply that the WNW-ESE trend is so prominent that the Gumchun and Gaum faults possibly extend up to the lower Sindong Group in the study area.
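
As a small illustration of how lineament or fracture orientations such as those above are summarized into the rose diagrams mentioned in the abstract, the sketch below bins strike azimuths into 10-degree classes with NumPy; the azimuth values are invented placeholders, not the study's measurements.

```python
import numpy as np

# Hypothetical lineament strike azimuths in degrees east of north.
azimuths = np.array([5, 12, 18, 25, 40, 55, 63, 110, 118, 122, 160, 168, 175])

# Fold into the 0-180 degree range (a strike is an undirected line).
strikes = azimuths % 180

# Bin into 10-degree classes, as for a rose diagram.
bin_edges = np.arange(0, 190, 10)
counts, _ = np.histogram(strikes, bins=bin_edges)

# Print a simple text rose: one '#' per lineament in each class.
for lo, hi, n in zip(bin_edges[:-1], bin_edges[1:], counts):
    print(f"N{lo:3d}-{hi:3d}E : {'#' * n}")
```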


Investigation of Minimum Number of Drop Levels and Test Points for FWD Network-Level Testing Protocol in Iowa Department of Transportation (아이오와 주 교통국의 FWD 네트워크 레벨 조사 프로토콜을 위한 최소 하중 재하 수와 조사지점 수의 결정)

  • Kim, Yong-Joo;Lee, Ho-Sin(David);Omundson, Jason S.
    • International Journal of Highway Engineering / v.12 no.4 / pp.39-46 / 2010
  • In 2007, the Iowa Department of Transportation (DOT) initiated falling weight deflectometer (FWD) network-level testing along Iowa highway and road systems to build a comprehensive database of deflection data and subsequent structural analyses, which are used for detecting pavement structural failure, estimating the expected life, and calculating overlay requirements over a desired design life. Iowa's current FWD network-level testing protocol requires that pavements be tested at three drop levels, with 8 deflection basins collected at each drop level; the number of test points is determined by the length of the tested pavement section. However, the current FWD network-level program can cover only about 20% of Iowa's highway and road systems annually. Therefore, the current FWD network-level test protocol should be simplified so that more than 20% of Iowa's highway and road systems can be tested annually. The main objective of this research is to investigate whether the minimum number of drop levels and test points can be reduced to increase the testing production rate and reduce the cost of testing and traffic control without sacrificing the quality of the FWD data. Based upon the limited FWD network-level test data from eighty-three composite pavement sections, there was no significant difference between the mean values of three different response parameters when the number of drop levels and test points was reduced from the current FWD network-level testing protocol. As a result, the production rate of FWD tests would increase and the cost of testing and traffic control would decrease without sacrificing the quality of the FWD data.
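
A rough sketch of the kind of check described above: compute a section-wise response parameter under the full protocol and under a reduced protocol, then test whether the means differ. The deflection data, the choice of parameter (a simple mean of center deflections), and the paired t-test are illustrative assumptions only, not the study's actual analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical center deflections (mils) for 83 composite pavement sections:
# 9 readings per section under the full protocol; the reduced protocol keeps
# only every third reading (fewer drops / test points).
n_sections = 83
full_protocol = rng.normal(loc=10.0, scale=2.0, size=(n_sections, 9))
reduced_protocol = full_protocol[:, ::3]

# Section-wise mean response parameter under each protocol.
mean_full = full_protocol.mean(axis=1)
mean_reduced = reduced_protocol.mean(axis=1)

# Paired t-test: does reducing the protocol change the section means?
t_stat, p_value = stats.ttest_rel(mean_full, mean_reduced)
print(f"paired t-test: t = {t_stat:.3f}, p = {p_value:.3f}")
```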