• Title/Summary/Keyword: Regularization


Bond strength prediction of spliced GFRP bars in concrete beams using soft computing methods

  • Shahri, Saeed Farahi;Mousavi, Seyed Roohollah
    • Computers and Concrete
    • /
    • v.27 no.4
    • /
    • pp.305-317
    • /
    • 2021
  • The bond between concrete and bar is a main factor affecting the performance of reinforced concrete (RC) members, and since steel corrosion reduces the bond strength, studying the bond behavior of concrete and GFRP bars is quite necessary. In this research, a database of 112 concrete beam test specimens reinforced with spliced GFRP bars that failed in the splitting mode has been collected and used to estimate the concrete-GFRP bar bond strength. This paper aims to accurately estimate the bond strength of spliced GFRP bars in concrete beams by applying three soft computing models: multivariate adaptive regression splines (MARS), Kriging, and the M5 model tree. Since the selection of regularization parameters greatly affects the fit of the MARS, Kriging, and M5 models, the regularization parameters have been optimized to maximize the convergence coefficient on the training data. Three hybrid models coupling the soft computing methods with a genetic algorithm are proposed to automate the trial-and-error search for appropriate regularization parameters. Results show that the proposed models significantly increase prediction accuracy compared with previous models: the proposed MARS, Kriging, and M5 models improve the convergence coefficient by about 65%, 63%, and 49%, respectively, over the best previous model.
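The abstract's core idea, a genetic algorithm searching for the regularization parameter that maximizes the training convergence (correlation) coefficient, can be sketched in a few lines. This is an illustrative stand-in only: it uses ridge regression in place of MARS/Kriging/M5, and the data, population size, and mutation scale are all synthetic assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data (hypothetical, for illustration only)
X = rng.normal(size=(112, 5))
y = X @ np.array([1.5, -2.0, 0.5, 0.0, 1.0]) + rng.normal(scale=0.5, size=112)

def fit_ridge(X, y, lam):
    """Closed-form ridge fit; lam is the regularization parameter."""
    n_feat = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_feat), X.T @ y)

def convergence_coefficient(X, y, lam):
    """Correlation between predictions and targets (the GA fitness)."""
    w = fit_ridge(X, y, lam)
    return np.corrcoef(X @ w, y)[0, 1]

# Minimal genetic algorithm over log10(lam): select the fittest half,
# then produce mutated offspring around the surviving parents
pop = rng.uniform(-4, 2, size=20)
for _ in range(30):
    fitness = np.array([convergence_coefficient(X, y, 10.0 ** p) for p in pop])
    parents = pop[np.argsort(fitness)[-10:]]
    children = rng.choice(parents, size=10) + rng.normal(scale=0.1, size=10)
    pop = np.concatenate([parents, children])

best = pop[np.argmax([convergence_coefficient(X, y, 10.0 ** p) for p in pop])]
```

The same loop applies to any model with a scalar regularization knob; only `fit_ridge` would be swapped for the MARS, Kriging, or M5 fitting routine.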

Large-scaled truss topology optimization with filter and iterative parameter control algorithm of Tikhonov regularization

  • Nguyen, Vi T.;Lee, Dongkyu
    • Steel and Composite Structures
    • /
    • v.39 no.5
    • /
    • pp.511-528
    • /
    • 2021
  • There have recently been advances in numerically solving topology optimization problems for large-scale trusses based on the ground structure approach. A disadvantage of this approach is that the final design usually includes many bars, which is difficult to produce in practice. An efficient tool for reducing this difficulty and determining a few distinct bars is the so-called filter scheme for the ground structure. This technique is valuable for practical use because unnecessary bars are filtered out of the ground structure during the topology optimization process to obtain a well-defined structure, while the global equilibrium condition is still guaranteed. This process, however, leads to a singular system of equilibrium equations, so least-squares minimization with Tikhonov regularization is adopted. In this paper, a proposed algorithm for controlling the optimal Tikhonov parameter is combined with the filter scheme, since the parameter plays a crucial role in removing the numerical singularity, with sparse matrices used to save computational time; the discrete optimal topology solutions thus depend on choosing the Tikhonov parameter efficiently. Several numerical examples are investigated to demonstrate the efficiency of the filter and parameter control algorithm for large-scale optimal topology designs.
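The key numerical step the abstract describes, solving a singular equilibrium system by Tikhonov-regularized least squares, is standard and can be shown on a toy matrix. The matrix below is a hypothetical rank-deficient stand-in for the filtered equilibrium system, not the paper's actual model.

```python
import numpy as np

# A rank-deficient "equilibrium" matrix: row 2 is twice row 1, so A is
# singular and A x = b has no unique solution
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([1.0, 2.0])

def tikhonov(A, b, alpha):
    """Minimize ||A x - b||^2 + alpha ||x||^2 (standard Tikhonov form).

    The normal equations (A^T A + alpha I) x = A^T b are nonsingular for
    any alpha > 0, which removes the numerical singularity.
    """
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

x = tikhonov(A, b, 1e-6)
# As alpha -> 0 the solution approaches the minimum-norm least-squares
# solution pinv(A) @ b, so the parameter controls the bias/stability trade-off
```

Choosing `alpha` well is exactly the parameter-control problem the paper addresses: too small and the system is nearly singular again, too large and the solution is over-damped.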

Comparison of covariance thresholding methods in gene set analysis

  • Park, Sora;Kim, Kipoong;Sun, Hokeun
    • Communications for Statistical Applications and Methods
    • /
    • v.29 no.5
    • /
    • pp.591-601
    • /
    • 2022
  • In gene set analysis with microarray expression data, a group of genes such as a gene regulatory or signaling pathway is often tested for differentially expressed (DE) or differentially co-expressed (DC) genes between two biological conditions. Recently, a statistical test based on covariance estimation has been proposed to identify DC genes. In particular, covariance regularization by hard thresholding improved the power of the test when the proportion of DC genes within a biological pathway was relatively small. In this article, we compare covariance thresholding methods using four different regularization penalties: lasso, hard, smoothly clipped absolute deviation (SCAD), and minimax concave penalty (MCP). In extensive simulation studies, we found that both SCAD and MCP thresholding can outperform hard thresholding when the proportion of DC genes is extremely small and the number of genes in a pathway is much greater than the sample size. We also applied the four thresholding methods to three microarray gene expression data sets, related to mutant p53 transcriptional activity and to epithelium and stroma breast cancer, to compare the genetic pathways identified by each method.

Assessment of DVC measurement uncertainty on GFRPs with various fiber architectures

  • Bartulovic, Ante;Tomicevic, Zvonimir;Bubalo, Ante;Hild, Francois
    • Coupled systems mechanics
    • /
    • v.11 no.1
    • /
    • pp.15-32
    • /
    • 2022
  • A comprehensive understanding of fiber reinforced polymer behavior requires advanced non-destructive testing methods because of the material's heterogeneous microstructure and anisotropic mechanical properties. In addition, the material response under load is strongly associated with manufacturing defects (e.g., voids, inclusions, fiber misalignment, debonds, improper cure, and delamination). Such imperfections and microstructures induce damage mechanisms arising at different scales before macrocracks form. The origin of these damage phenomena can only be fully understood with access to the underlying microstructural features, which makes X-ray Computed Tomography an appropriate imaging tool for capturing changes in the bulk of fibrous materials. Moreover, Digital Volume Correlation (DVC) can be used to measure kinematic fields induced by various loading histories. The correlation technique relies on image contrast induced by the microstructure, and composites reinforced with different fiber architectures may exhibit poor natural contrast. Hence, a priori analyses need to be performed to assess the corresponding DVC measurement uncertainties. This study evaluated the measurement resolutions of global and regularized DVC for glass fiber reinforced polymers with different fiber architectures; the measurement uncertainties were evaluated with respect to element size and regularization length. Even though FE-based DVC could not reach the recommended displacement uncertainty at low spatial resolution, regularized DVC enabled the use of fine meshes when appropriate regularization was applied.

A New Bias Scheduling Method for Improving Both Classification Performance and Precision on the Classification and Regression Problems (분류 및 회귀문제에서의 분류 성능과 정확도를 동시에 향상시키기 위한 새로운 바이어스 스케줄링 방법)

  • Kim Eun-Mi;Park Seong-Mi;Kim Kwang-Hee;Lee Bae-Ho
    • Journal of KIISE:Software and Applications
    • /
    • v.32 no.11
    • /
    • pp.1021-1028
    • /
    • 2005
  • A general solution for classification and regression problems can be found by matching and modifying matrices with information from the real world, and these matrices are then learned in neural networks. This paper treats the primary space as the real world and the dual space as the space onto which the primary space is mapped via a kernel. In practice there are two kinds of problems: complete systems, which can be solved using the inverse matrix, and ill-posed or singular systems, which cannot be solved directly from the inverse of the given matrix. Moreover, problems are often of the latter kind; it is therefore necessary to find a regularization parameter that turns an ill-posed or singular problem into a complete system. This paper compares performance on both classification and regression problems among GCV and the L-curve, which are well-known methods for choosing the regularization parameter, and kernel methods. Both GCV and the L-curve perform excellently at choosing regularization parameters, and their performance is similar, although they give slightly different results under different problem conditions. However, these methods are two-step solutions: the regularization parameter must first be computed, and the problem is then handed to another solution method. Compared with GCV and the L-curve, kernel methods are a one-step solution that learns the regularization parameter simultaneously within the learning of the pattern weights. This paper also suggests a dynamic momentum, learned under a condition proportional to the learning epoch and the performance on the given problem, to increase the performance and precision of the regularization.
Finally, experiments show that the suggested solution achieves better or equivalent results compared with GCV and the L-curve, using the Iris data (a standard classification benchmark), Gaussian data (typical of singular systems), and the Shaw data (a one-dimensional image restoration problem).
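The baseline the paper compares against, choosing a regularization parameter by generalized cross-validation (GCV), follows a standard formula: minimize n‖(I − H(λ))y‖² / (n − tr H(λ))², where H(λ) is the ridge hat matrix. The sketch below is that textbook procedure on synthetic data; none of the values are from the paper.

```python
import numpy as np

def gcv_score(X, y, lam):
    """Generalized cross-validation score for the Tikhonov/ridge parameter."""
    n = len(y)
    # Hat matrix H(lam) = X (X^T X + lam I)^{-1} X^T
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T)
    resid = y - H @ y
    return n * (resid @ resid) / (n - np.trace(H)) ** 2

rng = np.random.default_rng(2)
X = rng.normal(size=(40, 6))
y = X @ rng.normal(size=6) + rng.normal(scale=0.3, size=40)

# Step 1 of the "two-step" approach: grid-search lam by minimizing GCV;
# step 2 would then solve the problem with the chosen lam
lams = np.logspace(-4, 2, 50)
best_lam = min(lams, key=lambda lam: gcv_score(X, y, lam))
```

This makes the paper's "two-step" criticism concrete: the entire GCV search happens before any actual training, whereas the kernel approach it advocates folds the parameter choice into the learning of the weights themselves.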

Development of axial tomography technique for the study of steam explosion (증기폭발 적용 축방향 토모그라피 기술 개발)

  • Seo, Si-Won;Ha, Kwang-Soon;Hong, Seong-Wan;Song, Jin-Ho;Lee, Jae-Young
    • Proceedings of the KSME Conference
    • /
    • 2007.05b
    • /
    • pp.3027-3032
    • /
    • 2007
  • To understand the complex phenomena involved in steam explosions, fast and global measurement of the steam distribution is imperative for this extremely rapid transient, in which bubbles break up and coalesce due to turbulent eddies and shock waves. TROI, the experimental facility, requires a more robust sensor system to meet this requirement. In Europe, researchers prefer an X-ray method, but it is very expensive and has a limited measurement range. Electrical capacitance tomography (ECT) is an alternative; because of TROI's geometry, however, an axial tomography method is needed. This paper reviews image reconstruction algorithms for axial tomography, including Tikhonov regularization and iterative Tikhonov regularization. The axial tomography method is examined by simulation and experiment for typical permittivity distributions, and future work on axial tomography technology is discussed.
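The iterated Tikhonov scheme the abstract mentions repeatedly applies a Tikhonov step to the residual of the previous estimate, which recovers more of the solution than a single regularized solve. The sketch below applies it to a hypothetical linearized ECT-style problem; the sensitivity matrix, sizes, and permittivity profile are illustrative assumptions, not TROI data.

```python
import numpy as np

# Hypothetical linearized sensitivity matrix S mapping a discretized
# permittivity distribution to capacitance measurements (underdetermined,
# as in ECT: far fewer measurements than pixels)
rng = np.random.default_rng(3)
S = rng.normal(size=(12, 30))
g_true = np.zeros(30)
g_true[10:15] = 1.0              # a simple synthetic "void" region
c = S @ g_true                   # simulated capacitance measurements

def iterative_tikhonov(S, c, lam, n_iter=20):
    """Iterated Tikhonov: each step applies a Tikhonov solve to the residual."""
    M = np.linalg.inv(S.T @ S + lam * np.eye(S.shape[1]))
    g = np.zeros(S.shape[1])
    for _ in range(n_iter):
        g = g + M @ (S.T @ (c - S @ g))   # refine using the current residual
    return g

g_hat = iterative_tikhonov(S, c, lam=0.1)
```

Compared with a single Tikhonov solve, iterating reduces the bias introduced by the penalty while `lam` still stabilizes the underdetermined inversion, which is why the paper reviews both variants for the axial geometry.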
