• Title/Summary/Keyword: structure sensitivity study


Renewable Energy Generation Prediction Model using Meteorological Big Data (기상 빅데이터를 활용한 신재생 에너지 발전량 예측 모형 연구)

  • Mi-Young Kang
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.18 no.1
    • /
    • pp.39-44
    • /
    • 2023
  • Renewable energy sources such as solar and wind power are sensitive to weather conditions and environmental changes. Because the amount of power generated by a facility can vary with its installation location and structure, it is important to predict power generation accurately. Using meteorological data, a preprocessing step based on principal component analysis was conducted to examine the relationships among the features that affect energy-production prediction. In addition, the dataset was reconstructed according to feature sensitivity and applied to a machine learning model to test the prediction. With the proposed model, the performance of energy-production prediction using random forest regression was confirmed by predicting renewable energy production from the meteorological environment and comparing it with the actual production values.
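
The abstract above describes a PCA-based preprocessing step followed by random forest regression. The following is a minimal sketch of that kind of pipeline, assuming a hypothetical meteorological feature matrix and production target; the variable names, data, and hyperparameters are illustrative, not the paper's.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Hypothetical meteorological feature matrix (e.g., irradiance, temperature,
# wind speed, humidity) and measured energy production.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))                                   # placeholder weather observations
y = 3.0 * X[:, 0] + X[:, 2] + rng.normal(scale=0.5, size=1000)   # placeholder production target

# PCA-based preprocessing: keep the components explaining most of the variance,
# then rebuild the training set from the retained components.
pca = PCA(n_components=0.95)
X_pca = pca.fit_transform(X)

X_train, X_test, y_train, y_test = train_test_split(X_pca, y, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```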

Modelling headed stud shear connectors of steel-concrete pushout tests with PCHCS and concrete topping

  • Lucas Mognon Santiago Prates;Felipe Piana Vendramell Ferreira;Alexandre Rossi;Carlos Humberto Martins
    • Steel and Composite Structures
    • /
    • v.46 no.4
    • /
    • pp.451-469
    • /
    • 2023
  • The use of precast hollow-core slabs (PCHCS) in civil construction has been increasing due to the speed of execution and the reduction in the weight of flooring systems. However, the literature contains no studies that present a finite element model (FEM) to predict the load-slip behavior of pushout tests considering headed stud shear connectors and PCHCS placed at the upper flange of a downstand steel profile. Thus, the present paper aims to develop a FEM, calibrated against tests, to fill this gap. For this task, geometrically nonlinear analyses are carried out in the ABAQUS software. The FEM is calibrated by sensitivity analyses considering different types of analysis, the friction coefficient at the steel-concrete interface, and the constitutive model of the headed stud shear connector. Subsequently, a parametric study is performed to assess the influence of the number of connector lines, the type of filling, and the height of the PCHCS. The results are compared with analytical models that predict the headed stud resistance. In total, 158 finite element models are processed. It was concluded that the dynamic implicit (quasi-static) analysis showed better convergence of the equilibrium trajectory than static analyses such as the arc-length method. A friction coefficient of 0.5 was indicated to predict the load-slip behavior of all models investigated. Rupture of the headed stud shear connector was verified for the constitutive model capable of representing fracture in the stress-strain relationship. Regarding the number of connector lines, there was an average increase of 108% in the resistance of the structure for models with two lines of connectors compared with only one. The type of filling of the hollow-core slab that presented the best results was partial filling. Finally, the greater the height of the PCHCS, the greater the resistance of the headed stud.
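
The abstract compares finite element results with analytical models for headed stud resistance. One widely used analytical model of this type is the Eurocode 4 (EN 1994-1-1) pair of expressions for stud shank failure and concrete crushing; the sketch below implements those expressions under assumed material and geometry values and is not taken from the paper itself.

```python
import math

def ec4_stud_resistance(d, h_sc, f_u, f_ck, e_cm, gamma_v=1.25):
    """Design shear resistance of a headed stud per EN 1994-1-1 type
    expressions (units: mm and MPa; result returned in kN)."""
    # Steel failure of the stud shank
    p_steel = 0.8 * f_u * math.pi * d**2 / 4 / gamma_v
    # Concrete crushing around the stud (alpha valid for h_sc/d >= 3)
    ratio = h_sc / d
    alpha = 1.0 if ratio > 4 else 0.2 * (ratio + 1)
    p_concrete = 0.29 * alpha * d**2 * math.sqrt(f_ck * e_cm) / gamma_v
    return min(p_steel, p_concrete) / 1000.0  # N -> kN

# Example: 19 mm stud, 100 mm height, f_u = 450 MPa, C30/37 concrete
print(ec4_stud_resistance(d=19, h_sc=100, f_u=450, f_ck=30, e_cm=33000))
```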

Psychometric Properties of the Korean Version of 12-Item Obsessive-Compulsive Inventory in Accordance With Obsessive-Compulsive Symptom Dimensions in Individuals With Obsessive-Compulsive Disorder (강박장애 환자에서 강박증상차원에 부합하는 12문항 강박증상목록의 심리측정적 특성)

  • Ho Seok Seo;Mina Choi;Seung Jae Lee
    • Anxiety and mood
    • /
    • v.19 no.1
    • /
    • pp.10-18
    • /
    • 2023
  • Objective : The 18-item Obsessive-Compulsive Inventory-Revised (OCI-R) is widely employed to assess symptoms of obsessive-compulsive disorder (OCD). However, this instrument's factor structure does not align with contemporary dimensional models of OCD. Therefore, the objective of this study was to examine the psychometric properties of the 12-item Korean OCI (OCI-12), built on four obsessive-compulsive symptom dimensions, in patients with OCD. Methods : A total of 157 patients with OCD and 51 healthy controls completed psychological measures, including the OCI-R, the Dimensional Obsessive-Compulsive Scale (DOCS), and scales evaluating anxiety and depressive symptoms. Psychometric characteristics of the OCI-12, obtained by removing the three neutralizing and three hoarding items from the OCI-R, were analyzed. Results : The OCI-12 items showed excellent internal consistency (0.83). Confirmatory factor analysis revealed strong associations between individual items and their proposed latent factors (i.e., subscales). Convergent validity was appropriate; a particularly high correlation was observed with the DOCS score (r=0.71, p<0.001). Moreover, the OCI-12 was as sensitive as the OCI-R for detecting the effects of empirically supported treatment for OCD. Conclusion : The OCI-12 is a 12-item measure that adheres to the prevailing 4-factor model of OCD dimensions. Like the OCI-R, it possesses good to excellent psychometric properties, including reliability, validity, and sensitivity to treatment.
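
Internal consistency of the kind reported above is commonly quantified with Cronbach's alpha. The sketch below computes it from an item-score matrix; the simulated responses are hypothetical and only illustrate the calculation, not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 0-4 responses to 12 items from 157 respondents, driven by a
# shared latent trait so the items are correlated.
rng = np.random.default_rng(1)
latent = rng.normal(size=(157, 1))
scores = np.clip(np.round(2 + latent + rng.normal(scale=0.8, size=(157, 12))), 0, 4)
print(round(cronbach_alpha(scores), 2))
```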

Exchange Rate Volatility and Bilateral Trade Flow: Evidence from China (환율 변동성과 양자 무역 흐름: 중국을 중심으로)

  • Li Qing;Sang-Whi Lee
    • Korea Trade Review
    • /
    • v.48 no.4
    • /
    • pp.47-66
    • /
    • 2023
  • Our study aims to explore the impact of real exchange rate movements on China's foreign trade and to provide specific references for the formulation of exchange rate and trade-related strategies. Our results indicate that China's bilateral trade is significantly influenced by movements in the Real Effective Exchange Rate (RER). When analyzing the relationship between aggregated trade flow and exchange rate movements, this paper finds that depreciation of the real exchange rate leads to an increase in China's export volume and a slight decrease in its import volume. Moreover, China's export volume exhibits higher sensitivity to exchange rate volatility than to the exchange rate level. Furthermore, the empirical findings for disaggregated trade flow suggest that different goods are affected differently by exchange rate movements. Capital goods and consumer goods, which are at different stages of processing, show no negative impact of exchange rate depreciation on their imports and exports. Consequently, we recommend deepening industrial reform by improving production efficiency and transitioning the industrial structure to a higher processing stage; this approach can effectively reduce the negative impact of exchange rate depreciation.
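
As a rough illustration of the kind of estimation described above, the sketch below regresses simulated log export volumes on the log exchange rate level and a volatility measure with OLS; all variable names and data are hypothetical, and the paper's actual econometric specification may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical sample of bilateral exports, real exchange rate (RER) level and
# a volatility proxy (e.g., rolling standard deviation of RER changes).
rng = np.random.default_rng(2)
n = 200
rer = np.exp(0.1 * rng.normal(size=n))
vol = np.abs(rng.normal(scale=0.05, size=n))
log_exports = 5.0 - 1.2 * np.log(rer) - 2.0 * vol + rng.normal(scale=0.3, size=n)

X = sm.add_constant(pd.DataFrame({"log_rer": np.log(rer), "volatility": vol}))
res = sm.OLS(log_exports, X).fit()
# With RER defined so that a fall means depreciation, a negative log_rer
# coefficient is consistent with depreciation raising exports.
print(res.params)
```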

Estimation for warfarin in pharmaceutical preparation using monolithic column

  • Zahraa Hadi Shareef;Ahmed Ali Alkarimi
    • Analytical Science and Technology
    • /
    • v.37 no.4
    • /
    • pp.220-229
    • /
    • 2024
  • This study aims to develop a method for estimating pharmaceutical compounds on a monolithic column using high-performance liquid chromatography (HPLC). The monolithic column was prepared by copolymerization of glycidyl methacrylate, ethylene dimethacrylate, and acrylic acid inside a 60 mm long borosilicate tube with inner and outer diameters of 1.5 mm and 3.5 mm, respectively. An ultraviolet (UV) source with a wavelength of 365 nm was used, and the polymerization mixture consisted of glycidyl methacrylate, acrylic acid, ethylene dimethacrylate as a crosslinker, and 2,2-dimethoxy-2-phenylacetophenone as an initiator in a porogenic solvent of ethanol and 1-hexanol. Polymerization formed the monolithic column after 4 minutes; subsequently, the epoxy groups were converted to diol groups using 0.2 M hydrochloric acid (HCl), which was pumped through the column for 3 hours at a flow rate of 10 µL·min⁻¹. Various techniques, including scanning electron microscopy (SEM), Brunauer-Emmett-Teller (BET) analysis, Fourier-transform infrared spectroscopy (FT-IR), and ¹H NMR, were used to characterize and confirm the structure of the monolith. The prepared monolith was employed to estimate and identify the pharmaceutical compound warfarin by HPLC. The analytical curve of warfarin was linear in the range of 3 to 100 µg·mL⁻¹ with an r² value of 0.999. The detection and quantification limits were 0.932 and 2.788 µg·mL⁻¹, respectively. The molar absorptivity and Sandell's sensitivity were 2.99138 × 10⁶ L·mol⁻¹·cm⁻¹ and 103.1 × 10⁻³ µg·cm⁻², respectively.
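
The abstract reports a calibration curve, r², and detection/quantification limits. A common (ICH-style) way to obtain such limits is from the calibration slope and the residual standard deviation, as in the sketch below; the calibration points are made up for illustration and are not the paper's data.

```python
import numpy as np

# Hypothetical calibration data within the reported 3-100 ug/mL linear range.
conc = np.array([3, 10, 25, 50, 75, 100], dtype=float)        # ug/mL
area = np.array([41, 138, 342, 689, 1032, 1371], dtype=float)  # peak area (arbitrary units)

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)          # residual standard deviation of the fit

r2 = 1 - (residuals**2).sum() / ((area - area.mean())**2).sum()
lod = 3.3 * sigma / slope              # ICH-style limit of detection
loq = 10 * sigma / slope               # ICH-style limit of quantification
print(f"r^2={r2:.4f}  LOD={lod:.3f} ug/mL  LOQ={loq:.3f} ug/mL")
```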

Feasibility Study on the Fault Tree Analysis Approach for the Management of the Faults in Running PCR Analysis (PCR 과정의 오류 관리를 위한 Fault Tree Analysis 적용에 관한 시범적 연구)

  • Lim, Ji-Su;Park, Ae-Ri;Lee, Seung-Ju;Hong, Kwang-Won
    • Applied Biological Chemistry
    • /
    • v.50 no.4
    • /
    • pp.245-252
    • /
    • 2007
  • FTA (fault tree analysis), an analytical method for system failure management, was employed to manage faults in running PCR analysis. PCR is executed through several processes, of which the PCR machine operation step was selected for the FTA. The simplest process in the PCR analysis was chosen as a first trial to test the feasibility of the FTA approach. First, fault events (the top event, intermediate events, and basic events) were identified through a survey of expert knowledge of PCR. These events were then related deductively to build a hierarchical fault tree. The fault tree was evaluated qualitatively and quantitatively, yielding minimal cut sets, structural importance, common-cause vulnerability, a simulated probability of occurrence of the top event, cut set importance, item importance, and sensitivity. The top event was 'errors in the step of PCR machine operation in running PCR analysis'. The major intermediate events were 'failures in instrument' and 'errors in actions in experiment'. The basic events comprised four events based on human error, one based on instrument failure, and one based on energy-source failure. These events were combined with Boolean logic gates (AND, OR) to construct the fault tree. In the qualitative evaluation, the basic events 'errors in preparing the reaction mixture', 'errors in setting temperature and time of the PCR machine', 'failure of electrical power while running the PCR machine', and 'errors in selecting an adequate PCR machine' proved the most critical to the occurrence of the top event. In the quantitative evaluation, the list of critical events differed from the qualitative result, because the probability of PCR machine failure (not on the list above) increases with operating time, and the probabilities of the electricity-failure and defective-PCR-machine events were set to zero owing to the general rarity of those events. It was concluded that this feasibility study is a worthwhile means of introducing the novel technique, FTA, to the management of faults in running PCR analysis.
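
To make the gate logic concrete, the sketch below evaluates a tiny fault tree with OR/AND gates under the usual assumption of independent basic events; the event names echo the abstract, but the probabilities and tree layout are hypothetical, not the study's.

```python
# Minimal fault-tree evaluation assuming independent basic events.
# Event names and probabilities below are hypothetical, not the paper's values.

def or_gate(*probs: float) -> float:
    """P(at least one input event occurs)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(*probs: float) -> float:
    """P(all input events occur)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical basic-event probabilities
p_mix_error   = 0.02   # errors in preparing the reaction mixture
p_setting     = 0.01   # errors in setting temperature/time
p_machine_sel = 0.005  # selecting an inadequate PCR machine
p_power_fail  = 0.001  # electrical power failure

human_error = or_gate(p_mix_error, p_setting, p_machine_sel)  # intermediate event
instrument  = or_gate(p_power_fail)                           # intermediate event
top_event   = or_gate(human_error, instrument)                # top event
print(f"P(top event) = {top_event:.4f}")
```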

Anomaly Detection for User Action with Generative Adversarial Networks (적대적 생성 모델을 활용한 사용자 행위 이상 탐지 방법)

  • Choi, Nam woong;Kim, Wooju
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.3
    • /
    • pp.43-62
    • /
    • 2019
  • At one time, the anomaly detection field relied on determining whether an observation was abnormal on the basis of statistics derived from specific data. This methodology was workable because data were low-dimensional in the past, so classical statistical methods could work effectively. However, as data have become more complex in the era of big data, it has become more difficult to accurately analyze and predict the data generated throughout industry in the conventional way, and supervised learning algorithms such as SVM and decision trees came into use. However, a supervised model can predict test data accurately only when the class distribution is balanced, whereas most data generated in industry have imbalanced classes, so the predictions of a supervised model are not always valid. To overcome these drawbacks, many studies now use unsupervised models that are not influenced by the class distribution, such as autoencoders or generative adversarial networks. In this paper, we propose a method to detect anomalies using generative adversarial networks. AnoGAN, introduced in the study of Thomas et al. (2017), is a classification model built from convolutional neural networks that performs anomaly detection on medical images. By contrast, research on anomaly detection for sequence data using generative adversarial networks is scarce compared with image data. Li et al. (2018) proposed a model using LSTM, a type of recurrent neural network, to classify anomalies in numerical sequence data, but it has not been applied to categorical sequence data, nor has the feature matching method of Salimans et al. (2016). This suggests that anomaly classification of sequence data with generative adversarial networks remains to be explored. To learn the sequence data, the generative adversarial network is composed of LSTMs: the 2-stacked LSTM of the generator consists of 32-dim and 64-dim hidden unit layers, and the LSTM of the discriminator consists of a 64-dim hidden unit layer. Whereas an existing approach to anomaly detection for sequence data derives the anomaly score from the entropy of the probability assigned to the actual data, in this paper, as mentioned earlier, the anomaly score is derived using the feature matching technique. In addition, the latent-variable optimization process is designed with an LSTM to improve model performance. The modified generative adversarial model was more accurate than the autoencoder in all experiments in terms of precision and was approximately 7% higher in accuracy. In terms of robustness, the generative adversarial network also performed better than the autoencoder: because it can learn the data distribution from real categorical sequence data, it is not dominated by a single normal pattern, whereas the autoencoder is. In the robustness test, the accuracy of the autoencoder was 92% and that of the generative adversarial network was 96%; in terms of sensitivity, the autoencoder reached 40% and the generative adversarial network 51%. Experiments were also conducted to show how much performance changes with the structure used to optimize the latent variables; as a result, sensitivity improved by about 1%. These results offer a new perspective on optimizing latent variables, which had previously received relatively little attention.
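
A minimal sketch of the pieces described above, assuming PyTorch: an LSTM generator with 32- and 64-dimensional hidden layers, a 64-dimensional LSTM discriminator, and a feature-matching anomaly score. The vocabulary size, sequence length, embedding, and the training and latent-optimization loops are assumptions; the paper's actual architecture and training details may differ.

```python
import torch
import torch.nn as nn

VOCAB, SEQ_LEN, EMB = 50, 20, 16   # hypothetical action vocabulary and sequence length

class Generator(nn.Module):
    def __init__(self, z_dim=16):
        super().__init__()
        self.lstm1 = nn.LSTM(z_dim, 32, batch_first=True)   # 32-dim hidden layer
        self.lstm2 = nn.LSTM(32, 64, batch_first=True)       # 64-dim hidden layer
        self.out = nn.Linear(64, VOCAB)

    def forward(self, z):                      # z: (batch, SEQ_LEN, z_dim)
        h, _ = self.lstm1(z)
        h, _ = self.lstm2(h)
        return torch.softmax(self.out(h), dim=-1)  # per-step action distribution

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(VOCAB, EMB)
        self.lstm = nn.LSTM(EMB, 64, batch_first=True)        # 64-dim hidden layer
        self.out = nn.Linear(64, 1)

    def features(self, x):                     # x: (batch, SEQ_LEN, VOCAB)
        h, _ = self.lstm(self.embed(x))
        return h[:, -1, :]                     # last hidden state as feature vector

    def forward(self, x):
        return torch.sigmoid(self.out(self.features(x)))

def anomaly_score(disc, gen, x, z):
    """Feature-matching score: distance between discriminator features of the
    real sequence and of the sequence generated from a latent code z."""
    return torch.norm(disc.features(x) - disc.features(gen(z)), dim=1)

# Toy usage with random one-hot user-action sequences
gen, disc = Generator(), Discriminator()
x = torch.eye(VOCAB)[torch.randint(0, VOCAB, (4, SEQ_LEN))]
z = torch.randn(4, SEQ_LEN, 16)
print(anomaly_score(disc, gen, x, z))
```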

Relationships between the Effect Factors of Private Brand Images and Customer Trust and Loyalty (유통업자 브랜드 이미지의 영향요인과 신뢰 및 고객 애호도와의 관계에 관한 연구)

  • Kim, Yu-Kyung
    • Journal of Distribution Science
    • /
    • v.13 no.7
    • /
    • pp.73-83
    • /
    • 2015
  • Purpose - Recently, many large retailers have been actively introducing private brands. In this process, not only the quality of the products but also the brand image is of concern. Reflecting this trend, private brands have become an important issue in the retail business, along with increasing general interest. Thus, this study focuses on the factors affecting private brand image and clarifies the impact of the resulting factors. First, store image, familiarity, price sensitivity, and knowledge are presented as the effect factors for private brand image. Second, the study clarifies the effect of private brand image on customer trust and loyalty. Research design, data, and methodology - A total of 250 questionnaires were distributed to customers who had used large discount stores located in Busan to purchase private brands; 234 valid questionnaires were used in the final analysis. To verify the hypotheses, a structural equation model was estimated using AMOS 20.0. Prior to hypothesis testing, the reliability and validity of the items were examined; Cronbach's alpha exceeded 0.7, indicating adequate reliability, and a confirmatory factor analysis was conducted to verify validity. Results - Store image, familiarity, price sensitivity, and knowledge were confirmed as factors affecting private brand image. The hypothesis tests supported all of the effect factors presented in this study as important variables of private brand image. The hypotheses were drawn from existing studies on private brands, and the results were similar to those of previous studies; however, the effect factors presented here form a new study model based on the previous research. Second, private brand image was found to have positive effects on customer trust and loyalty. Trust and loyalty were chosen as the outcome factors because they are the most important factors in relation to customer behavior and can suggest marketing points for distribution businesses. Conclusions - This study focused on clarifying the factors that are important determinants of private brand image. All of the hypotheses were confirmed, a meaningful result that suggests many points for distribution businesses. Proper strategies should be developed on the basis that store image, customers' familiarity with the brand, and customers' price sensitivity and knowledge level have an important effect on the choice of private brands. The formation of such favorable images will have positive effects not only on customer trust in private brands but also on customer loyalty.
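
As a simplified stand-in for the structural model described above (private brand image → trust → loyalty), the sketch below estimates the two structural paths with ordinary least squares on simulated survey scores; a full SEM in AMOS (or a dedicated SEM library) would estimate the measurement and structural parts jointly.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical composite scores for 234 respondents, generated so that brand
# image drives trust and trust drives loyalty.
rng = np.random.default_rng(3)
n = 234
brand_image = rng.normal(size=n)
trust = 0.6 * brand_image + rng.normal(scale=0.5, size=n)
loyalty = 0.7 * trust + rng.normal(scale=0.5, size=n)

df = pd.DataFrame({"brand_image": brand_image, "trust": trust, "loyalty": loyalty})

path1 = sm.OLS(df["trust"], sm.add_constant(df[["brand_image"]])).fit()   # image -> trust
path2 = sm.OLS(df["loyalty"], sm.add_constant(df[["trust"]])).fit()       # trust -> loyalty
print(path1.params["brand_image"], path2.params["trust"])
```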

Effect of Temperature Condition on Nitrogen Mineralization of Organic Matter and Soil Microbial Community Structure in non-Volcanic Ash Soil (온도가 유기물의 질소무기화와 미생물 군집구조에 미치는 영향)

  • Joa, Jae-Ho;Moon, Kyung-Hwan;Kim, Seong-Cheol;Moon, Doo-Gyung;Koh, Sang-Wook
    • Korean Journal of Soil Science and Fertilizer
    • /
    • v.45 no.3
    • /
    • pp.377-384
    • /
    • 2012
  • This study was carried out to evaluate the effect of temperature on nitrogen mineralization of organic matter, on the distribution of microbial groups determined from PLFA profiles, and on the soil microbial community in non-volcanic ash soil. Thirty grams of dried soil were mixed thoroughly with 2 g each of pelleted organic fertilizer (OFPE), pig manure compost (PMC), or food waste compost (FWC), and then incubated at 10°C, 20°C, or 30°C. The nitrogen mineralization rate increased with increasing temperature, in the order FWC > OFPE > PMC. The distribution ratio of microbial groups from the PLFA profiles differed significantly with incubation temperature and the type of organic matter. As the incubation time passed, the density of the microbial groups decreased gradually. The Gram-negative/Gram-positive bacterial PLFA, fungal/bacterial PLFA, and unsaturated/saturated PLFA ratios decreased gradually with increasing temperature. Principal component analysis of the PLFA profiles showed that the microbial community structures were separated by temperature at both 75 days (10°C) and 270 days (30°C). In conclusion, the soil microbial community structure was relatively sensitive to temperature and organic matter type and showed corresponding seasonal changes.
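
The abstract describes a principal component analysis of PLFA profiles grouped by incubation temperature. The sketch below applies PCA to a made-up PLFA abundance matrix and reports group centroids by temperature; the data and dimensions are hypothetical.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical PLFA mole-percent matrix: rows are incubation samples
# (temperature x organic-matter type x time), columns are individual PLFAs.
rng = np.random.default_rng(4)
n_samples, n_plfa = 27, 20
temperature = np.repeat([10, 20, 30], n_samples // 3)
plfa = rng.normal(size=(n_samples, n_plfa)) + 0.05 * temperature[:, None]

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(plfa))
for t in (10, 20, 30):
    centroid = scores[temperature == t].mean(axis=0)
    print(f"{t} C centroid on PC1/PC2: {centroid.round(2)}")
```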

Development of Structure Analysis Program for Jointed Concrete Pavement Applying Load Discretization Algorithm (하중변환 알고리듬을 적용한 줄눈 콘크리트 포장해석 프로그램 개발)

  • Yun, Tae-Young;Kim, Ji-Won;Cho, Yoon-Ho
    • International Journal of Highway Engineering
    • /
    • v.5 no.4 s.18
    • /
    • pp.1-11
    • /
    • 2003
  • Recently, a new pavement design method considering the Korean environment and specifications for improving pavement performance are being developed in Korea. The Jointed Concrete Pavement program Applying Load Discretization Algorithm (HEART-JCP) is one of the results of the Korea Pavement Research Project. HEART-JCP is developed to analyze various loading conditions using the load discretization algorithm without mesh refinement. In addition, because of its modular structure, it can easily be extended into a multi-purpose concrete pavement analysis program. The program consists of a basic part and a load discretization part. In the basic part, displacements and stresses are computed in the concrete slab, sub-layers, and dowel bars, which are modeled with plate/shell, spring, and beam elements, respectively. In the load discretization part, the load discretization algorithm originally used for continuum solid elements is modified to handle models built from plate and shell elements. Using this algorithm, the program can analyze distributed, concentrated, thermal, and body loads. The verification and sensitivity study showed that, as expected, the loading position, the magnitude of the load, and the slab thickness were the major factors governing concrete pavement behavior. Since the results of the developed model are similar to those of the Westergaard solution and ILLISLAB, the program can reasonably be used to estimate the behavior of jointed concrete pavement.
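
To illustrate the kind of operation a load discretization step performs, the sketch below converts a uniform pressure over a 4-node bilinear plate element into equivalent nodal forces by integrating the shape functions; the element size, pressure value, and quadrature order are assumptions, and the paper's algorithm is more general.

```python
import numpy as np

def shape_functions(xi, eta):
    """Bilinear (Q4) shape functions on the reference square [-1,1]x[-1,1]."""
    return 0.25 * np.array([(1 - xi) * (1 - eta),
                            (1 + xi) * (1 - eta),
                            (1 + xi) * (1 + eta),
                            (1 - xi) * (1 + eta)])

def equivalent_nodal_forces(coords, pressure):
    """Integrate N^T * p over the element with 2x2 Gauss quadrature."""
    gp = np.array([-1.0, 1.0]) / np.sqrt(3.0)
    forces = np.zeros(4)
    for xi in gp:
        for eta in gp:
            n = shape_functions(xi, eta)
            # Jacobian of the isoparametric map for the in-plane geometry
            dn_dxi  = 0.25 * np.array([-(1 - eta),  (1 - eta), (1 + eta), -(1 + eta)])
            dn_deta = 0.25 * np.array([-(1 - xi), -(1 + xi),  (1 + xi),  (1 - xi)])
            jac = np.array([dn_dxi, dn_deta]) @ coords
            forces += n * pressure * np.linalg.det(jac)   # Gauss weights are 1.0
    return forces

# 0.3 m x 0.3 m element under a 700 kPa tire pressure
coords = np.array([[0, 0], [0.3, 0], [0.3, 0.3], [0, 0.3]], dtype=float)
print(equivalent_nodal_forces(coords, 700e3))   # N per node; the sum equals p * area
```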
