• Title/Summary/Keyword: data modelling

Application of GLIM to the Binary Categorical Data

  • Sok, Yong-U
    • Journal of the military operations research society of Korea
    • /
    • v.25 no.2
    • /
    • pp.158-169
    • /
    • 1999
  • This paper is concerned with the application of generalized linear interactive modelling (GLIM) to binary categorical data. To analyze categorical data given in a contingency table, a common approach is to find a well-fitting loglinear model. When the contingency table has a response variable, a logit model can be fitted to identify such a loglinear model. For a given $2^4$ contingency table with a binary response variable, we show the process of fitting a loglinear model by fitting a logit model using GLIM and SAS, and we then estimate the parameters to interpret the nature of the associations implied by the model.
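
The workflow described above, fitting a logit model to a $2^4$ contingency table, can be sketched outside GLIM/SAS as well. Below is a minimal, hypothetical Python/statsmodels sketch: the factor names A, B, C, the binary response Y, and all cell counts are invented for illustration only.

```python
# Hypothetical sketch: logit model for a 2x2x2x2 contingency table
# (three binary factors A, B, C and a binary response Y). Cell counts are made up.
import itertools

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One row per cell of the 2^4 table: factor levels plus the observed count.
levels = list(itertools.product((0, 1), repeat=4))            # (A, B, C, Y)
counts = [30, 10, 22, 18, 25, 15, 12, 28, 28, 12, 20, 20, 18, 22, 8, 32]
cells = pd.DataFrame(levels, columns=["A", "B", "C", "Y"])
cells["count"] = counts

# Logit model for Y given A, B, C; frequency weights carry the cell counts.
# This corresponds to a loglinear model containing the A*B*C term plus the
# Y main effect and the Y-by-factor associations.
logit = smf.glm("Y ~ A + B + C", data=cells,
                family=sm.families.Binomial(),
                freq_weights=cells["count"].to_numpy()).fit()
print(logit.summary())   # parameter estimates (log odds ratios) and fit statistics
```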

Design and Implementation of a Metadata System for Financial Information Data Modeling (금융정보 데이터 모델링을 위한 메타데이터 시스템의 설계 및 구현)

  • Cho, Sang-Hyuk
    • Journal of the Korea Society of Computer and Information
    • /
    • v.17 no.1
    • /
    • pp.81-85
    • /
    • 2012
  • As the business environment and complex working conditions change rapidly, large financial institutions are researching various ways to build systems that process the production and modification of financial information efficiently and accurately while minimizing data-processing errors. In this paper, we build a metadata system that provides stability, accuracy, and convenience in financial data modelling, analyse its effect, and, when a new model is adopted, have it supply mapping information from the existing model so that models and databases can be connected efficiently. If modelling and standard data are managed through this metadata system, model modifications can be handled in a unified way across data standardization and the database, and a consistent, high-quality data model can be maintained and managed whenever the data changes.
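
The paper does not publish its metadata schema, so the following is only a hypothetical sketch of the kind of structure such a system might hold: a record mapping each logical model attribute to a physical database column, from which old-to-new mapping information can be derived when a model changes. All names are illustrative.

```python
# Illustrative only: hypothetical metadata records for model-to-database mapping.
from dataclasses import dataclass

@dataclass(frozen=True)
class AttributeMapping:
    entity: str          # logical entity in the data model, e.g. "Account"
    attribute: str       # logical attribute, e.g. "opened_date"
    table: str           # physical table name
    column: str          # physical column name
    model_version: str   # data-model version this mapping belongs to

def migration_map(old: list[AttributeMapping], new: list[AttributeMapping]):
    """Pair old and new physical locations for attributes that survive a model change."""
    new_by_key = {(m.entity, m.attribute): m for m in new}
    return {
        (m.table, m.column): (new_by_key[(m.entity, m.attribute)].table,
                              new_by_key[(m.entity, m.attribute)].column)
        for m in old if (m.entity, m.attribute) in new_by_key
    }
```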

IDENTIFICATION OF FALSIFIED DRUGS USING NEAR-INFRARED SPECTROSCOPY

  • Scafi, Sergio H.F.;Pasquini, Celio
    • Proceedings of the Korean Society of Near Infrared Spectroscopy Conference
    • /
    • 2001.06a
    • /
    • pp.3112-3112
    • /
    • 2001
  • Near-infrared spectroscopy (NIRS) was investigated with the aim of identifying falsified drugs. The identification is based on comparing the NIR spectrum of a sample with typical spectra of the authentic drug using multivariate modelling and classification algorithms (PCA/SIMCA). Two spectrophotometers (Brimrose Luminar 2000 and 2030), based on acousto-optic tunable filter (AOTF) technology and sharing the same controlling computer, software (Brimrose Snap 2.03), and data-acquisition electronics, were employed. The Luminar 2000 scans the range 850-1800 nm and was used for transmittance/absorbance measurements of liquids with a transflectance optical bundle probe with a total optical path of 5 mm and a circular area of 0.5 $\textrm{cm}^2$. Model 2030 scans the range 1100-2400 nm and was used for reflectance measurements of solid drugs. For each sample, 300 spectra, acquired in about 20 s, were averaged. Chemometric treatment of the spectral data, modelling, and classification were performed with the Unscrambler 7.5 software (CAMO, Norway). This package provides the principal component analysis (PCA) and SIMCA algorithms, used for modelling and classification, respectively. Initially, NIRS was evaluated for spectrum acquisition of various drugs, selected to cover the diversity of physico-chemical characteristics found among commercial products. Parameters that could affect the spectra of a given drug (especially when presented as solid tablets) were investigated, and the results showed that the first derivative can minimize spectral changes associated with tablet geometry, physical differences between tablet faces, and position relative to the probe beam. The effects of ambient humidity and temperature were also investigated. The former needs to be controlled during model construction, because ambient humidity can cause spectral alterations that could lead to the misclassification of an authentic drug if this factor is not accounted for by the model.
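
As a rough illustration of the PCA/SIMCA classification step described above (the study itself used Unscrambler 7.5, not scikit-learn), the sketch below fits a PCA model to first-derivative spectra of the authentic drug and flags a sample whose residual distance exceeds a threshold. The spectra, component count, and threshold are all synthetic assumptions.

```python
# Minimal SIMCA-style sketch on fake data: one PCA model per authentic drug,
# classification by residual (Q) distance to that model.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
authentic = rng.normal(size=(40, 200))   # 40 reference spectra, 200 wavelengths (synthetic)
sample = rng.normal(size=(1, 200))       # spectrum of the questioned tablet (synthetic)

# First derivative along the wavelength axis, as suggested above, to reduce
# geometry and positioning effects.
d_auth = np.diff(authentic, axis=1)
d_samp = np.diff(sample, axis=1)

pca = PCA(n_components=3).fit(d_auth)    # class model of the authentic drug

def q_residual(model, X):
    """Sum of squared reconstruction residuals (Q statistic) per spectrum."""
    recon = model.inverse_transform(model.transform(X))
    return ((X - recon) ** 2).sum(axis=1)

# Acceptance threshold taken from the training residuals (95th percentile, arbitrary).
threshold = np.percentile(q_residual(pca, d_auth), 95)
print("suspect" if q_residual(pca, d_samp)[0] > threshold
      else "consistent with authentic drug")
```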

A Study on Uncertainty and Sensitivity of Operational and Modelling Parameters for Feedwater Line Break Analysis (급수관 파열사고 해석에 대한 운전변수와 모형변수의 불확실성 및 민감도 연구)

  • Lee, Seung-Hyuk;Kim, Jin-Soo;Chang, Soon-Heung
    • Nuclear Engineering and Technology
    • /
    • v.19 no.1
    • /
    • pp.10-21
    • /
    • 1987
  • An uncertainty analysis of the feedwater line break (FLB) accident is performed for KNU-1 using response surface methodology and Monte Carlo simulation. FLB analyses with RELAP4/Mod6 were run a number of times to generate the database for the uncertainty analysis, along with an EM calculation for comparison purposes. Two kinds of input sets are used in the response surface method to investigate and compare the effects of input-variable uncertainties on the RCS peak pressure following an FLB. The first set consists of six major plant operational parameters and the second of five major modelling parameters. The analysis shows that the uncertainties of the modelling parameters have more influence on the RCS peak pressure than those of the plant operational parameters, and that an extra margin of 9% in peak pressure is gained. In addition, one assumption of the EM calculation that is usually accepted as conservative is found to be erroneous: the initial core inlet temperature acts negatively on the RCS pressure following an FLB.
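
A schematic of the response-surface/Monte Carlo propagation described above might look as follows; the numbers stand in for RELAP4/Mod6 runs and are fabricated, and the quadratic surface is only an assumed form.

```python
# Sketch: fit a cheap quadratic response surface to a few "code runs", then
# propagate assumed input uncertainties through it by Monte Carlo sampling.
import numpy as np

rng = np.random.default_rng(1)

# Fabricated "code runs": two normalised input parameters and a peak-pressure response.
X = rng.uniform(-1, 1, size=(15, 2))
y = 17.0 + 0.4 * X[:, 0] - 0.2 * X[:, 1] + 0.1 * X[:, 0] * X[:, 1]   # made-up values

def design(X):
    """Quadratic response-surface basis: [1, x1, x2, x1^2, x2^2, x1*x2]."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

coef, *_ = np.linalg.lstsq(design(X), y, rcond=None)

# Monte Carlo propagation of the assumed input uncertainties through the surface.
samples = rng.normal(0.0, 0.3, size=(100_000, 2))
peak = design(samples) @ coef
print(f"95th-percentile peak pressure: {np.percentile(peak, 95):.2f} (arbitrary units)")
```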

A Study on Prediction Technique for Underwater Electric Field Signature Characteristic using Dipole Modelling Method (다이폴 모델링 기법을 이용한 수중 전기장 신호 특성 예측 기법 연구)

  • Yang, Chang-Seob;Chung, Hyun-Ju;Lee, Jong-Ju;Jeon, Jae-Jin
    • Journal of the Korean Magnetics Society
    • /
    • v.18 no.6
    • /
    • pp.221-224
    • /
    • 2008
  • This paper describes an equivalent dipole modelling method that applies a singular value decomposition technique to analysis data from the BEM-based FNREMUS Detailled Modeller software, which can predict the underwater electric field signal caused by galvanic corrosion on a naval vessel. The proposed dipole modelling method was verified to be in good agreement with the predicted BEM data at a depth of 30 m through a comparison of average differences. The proposed method can be used effectively to predict and analyse the static electric field signature distributions generated by a naval vessel at arbitrary depths.
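
The equivalent-dipole idea can be illustrated, under strong simplifications, as a linear inverse problem solved with an SVD-based pseudo-inverse. The kernel, geometry, and dipole positions below are invented stand-ins, not the paper's BEM model.

```python
# Sketch: recover equivalent dipole moments from field samples on a plane
# by SVD-based least squares (pseudo-inverse). Geometry and kernel are toy choices.
import numpy as np

def dipole_kernel(obs, src):
    """Field per unit moment of point sources at src observed at obs (scalar 1/r^3 simplification)."""
    r = obs[:, None, :] - src[None, :, :]
    d = np.linalg.norm(r, axis=-1)
    return 1.0 / (4 * np.pi * d**3)

rng = np.random.default_rng(2)
obs = np.column_stack([rng.uniform(-50, 50, 200),
                       rng.uniform(-50, 50, 200),
                       np.full(200, 30.0)])                 # samples on a 30 m depth plane
src = np.array([[0.0, 0.0, 0.0], [5.0, 0.0, 0.0], [-5.0, 0.0, 0.0]])   # candidate dipole positions

A = dipole_kernel(obs, src)                                  # 200 x 3 kernel matrix
true_m = np.array([1.0, -0.4, 0.7])
field = A @ true_m + rng.normal(0, 1e-6, 200)                # "measured"/BEM-predicted field + noise

# SVD-based least squares: pinv truncates tiny singular values for stability.
m_hat = np.linalg.pinv(A, rcond=1e-8) @ field
print("recovered dipole moments:", m_hat)
```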

Establishment and Application of Computer-Assisted Environmental Information System for Land Use Zoning and Environmental Analysis of Natural Park (자연공원의 환경분석 및 용도지역설정을 위한 전산환경정보체계의 수립과 적용)

  • Lee, Myung-Woo
    • Journal of Environmental Impact Assessment
    • /
    • v.2 no.1
    • /
    • pp.39-55
    • /
    • 1993
  • The importance of urban and regional natural parks is increasing because of the need to preserve natural resources and to provide recreation space in nature. Natural park management planning should be based on surveys of the various natural resources in the park. However, for lack of effective data-synthesizing methods and concepts, only a restricted set of factors is considered in zoning plans, even when a GIS capable of large, complex simulations is used. Therefore, in this study three ecological zoning models, the Basic Factor Model (BFM), the Visual Landscape Model (VLM), and the Comprehensive Ecological Model (CEM), are proposed and applied to Byounsan Peninsula Nature Park (BPNP) for comparison with the current natural park zoning. The BFM has three components: elevation, slope, and vegetation. The VLM is applied with additional components: elevation, slope, vegetation, road type, and visual distance. Finally, the CEM includes all of the BFM and VLM components, to which land-use type and natural and historic resource factors are added. The existing zoning concept of BPNP was based on "minimization" focused on specific factors, whereas the introduced modelling concept is "optimization" based on the total ecological environment. As a result, the modelling allocates larger areas to the preservation and development zones than the current zoning, whose ambiguous character allows environmental destruction. Future study issues are the determination of weighting factors, reconsideration of the components based on ground-truth data, and the zoning of agricultural and residential areas.
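
A weighted-overlay sketch of the kind of factor combination these models imply is shown below; the layers, weights, and zone thresholds are hypothetical and not taken from the paper.

```python
# Schematic weighted overlay: score each factor raster, combine with weights,
# and cut the composite score into zones. All numbers are invented.
import numpy as np

rng = np.random.default_rng(3)
shape = (100, 100)                                # toy raster grid

layers = {                                        # factor scores in [0, 1]; 1 = most worth preserving
    "elevation":  rng.random(shape),
    "slope":      rng.random(shape),
    "vegetation": rng.random(shape),
}
weights = {"elevation": 0.3, "slope": 0.3, "vegetation": 0.4}   # hypothetical weighting factors

score = sum(weights[name] * layer for name, layer in layers.items())

# Cut the composite score into three zones (thresholds are arbitrary here).
zones = np.digitize(score, bins=[0.4, 0.7])       # 0 = development, 1 = buffer, 2 = preservation
print("cells per zone:", np.bincount(zones.ravel()))
```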

Improvement in facies discrimination using multiple seismic attributes for permeability modelling of the Athabasca Oil Sands, Canada (캐나다 Athabasca 오일샌드의 투수도 모델링을 위한 다양한 탄성파 속성들을 이용한 상 구분 향상)

  • Kashihara, Koji;Tsuji, Takashi
    • Geophysics and Geophysical Exploration
    • /
    • v.13 no.1
    • /
    • pp.80-87
    • /
    • 2010
  • This study was conducted to develop a reservoir modelling workflow that reproduces the heterogeneous distribution of effective permeability, which governs the performance of SAGD (steam-assisted gravity drainage), the in-situ bitumen recovery technique used in the Athabasca Oil Sands. Lithologic facies distribution is the main cause of heterogeneity in the bitumen reservoirs of the study area. The target formation consists of sand with mudstone facies in a fluvial-to-estuary channel system, where the mudstone interrupts fluid flow and reduces effective permeability. In this study, the lithologic facies are classified into three classes with different effective-permeability characteristics, depending on the shapes of the mudstones. The reservoir modelling workflow consists of two main modules: facies modelling and permeability modelling. The facies modelling identifies, by a stochastic approach, the three lithologic facies that mainly control the effective permeability. The permeability modelling first populates the mudstone volume fraction and then transforms it into effective permeability. A series of flow simulations applied to mini-models of the lithologic facies yields the functions that transform mudstone volume fraction into effective permeability. Seismic data contribute to the facies modelling by providing prior probabilities of the facies, which are incorporated into the facies models by geostatistical techniques. In particular, this study employs a probabilistic neural network that uses multiple seismic attributes in facies prediction, which improves the prior probabilities of the facies. The result of using the improved prior probabilities in facies modelling is compared with the conventional method using a single seismic attribute to demonstrate the improvement in facies discrimination. Using P-wave velocity in combination with density among the multiple seismic attributes is the essence of the improved facies discrimination. The paper also discusses the sand matrix porosity that makes P-wave velocity differ between the facies in the study area; the sand matrix porosity is evaluated uniquely using log-derived porosity, P-wave velocity, and photographically predicted mudstone volume.
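
The probabilistic-neural-network step can be sketched as a Parzen-window classifier over multiple seismic attributes, as below; the attribute values, facies labels, and kernel width are synthetic and only illustrate the idea.

```python
# Sketch of a probabilistic neural network (Gaussian Parzen windows per class)
# producing facies probabilities from two seismic attributes. Data are synthetic.
import numpy as np

rng = np.random.default_rng(4)
# Training data: standardised P-wave velocity and density with facies labels 0/1/2.
X_train = rng.normal(size=(90, 2)) + np.repeat(np.array([[0, 0], [2, 1], [-1, 2]]), 30, axis=0)
y_train = np.repeat([0, 1, 2], 30)

def pnn_probabilities(X_train, y_train, x, sigma=0.5):
    """Class probabilities for one sample x from Gaussian Parzen windows per facies."""
    scores = []
    for facies in np.unique(y_train):
        d2 = ((X_train[y_train == facies] - x) ** 2).sum(axis=1)
        scores.append(np.exp(-d2 / (2 * sigma**2)).mean())
    scores = np.array(scores)
    return scores / scores.sum()

trace_sample = np.array([1.8, 0.9])   # attribute pair at one seismic sample (made up)
print("facies prior probabilities:", pnn_probabilities(X_train, y_train, trace_sample))
```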

Comparison of the fit of automatic milking system and test-day records with the use of lactation curves

  • Sitkowska, B.;Kolenda, M.;Piwczynski, D.
    • Asian-Australasian Journal of Animal Sciences
    • /
    • v.33 no.3
    • /
    • pp.408-415
    • /
    • 2020
  • Objective: The aim of this paper was to compare the fit of data derived from daily automatic milking systems (AMS) and monthly test-day records with the use of lactation curves; data were analysed separately for primiparas and multiparas. Methods: The study was carried out on three Polish Holstein-Friesian (PHF) dairy herds. The farms were equipped with an automatic milking system that provided information on milking performance throughout lactation. Once a month, the cows were also subjected to test-day milkings (method A4). Most studies described in the literature are based on test-day data; therefore, we aimed to compare models based on both test-day and AMS data to determine which mathematical model (Wood or Wilmink) would be the better fit. Results: The results show that lactation curves constructed from AMS data were better adjusted to the actual milk yield (MY) data regardless of lactation number and model. We also found that the Wilmink model may be a better fit for modelling the lactation curve of PHF cows milked by an AMS, as it had the lowest values of the Akaike information criterion, Bayesian information criterion, and mean square error, the highest coefficient of determination, and was more accurate in estimating MY than the Wood model. Although both models underestimated peak, mean, and total MY, the Wilmink model was closer to the real values. Conclusion: Models of lactation curves may have an economic impact and may be helpful in herd management and decision-making, as they assist in forecasting MY at any moment of lactation. Data obtained from modelling can also help with monitoring the milk performance of each cow, diet planning, and monitoring cow health.
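
For reference, the Wood and Wilmink curves mentioned above can be fitted and compared with AIC roughly as follows; the milk-yield data are fabricated, and fixing k = 0.05/day in the Wilmink model is a common convention that may differ from the paper's choice.

```python
# Sketch: fit Wood and Wilmink lactation curves to (synthetic) daily milk yields
# and compare the fits with AIC.
import numpy as np
from scipy.optimize import curve_fit

def wood(t, a, b, c):
    return a * t**b * np.exp(-c * t)              # Wood gamma-type curve

def wilmink(t, a, b, c, k=0.05):
    return a + b * np.exp(-k * t) + c * t         # Wilmink curve, k fixed by assumption

t = np.arange(5, 305, 5.0)                        # days in milk
rng = np.random.default_rng(5)
y = wood(t, 18, 0.25, 0.004) + rng.normal(0, 1.0, t.size)   # synthetic AMS yields (kg/day)

def aic(y, y_hat, n_params):
    rss = ((y - y_hat) ** 2).sum()
    n = y.size
    return n * np.log(rss / n) + 2 * n_params

pw, _ = curve_fit(wood, t, y, p0=[15, 0.2, 0.005])
pv, _ = curve_fit(wilmink, t, y, p0=[25, -10, -0.05])
print("AIC Wood   :", aic(y, wood(t, *pw), 3))
print("AIC Wilmink:", aic(y, wilmink(t, *pv), 3))
```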

Prediction of ocean surface current: Research status, challenges, and opportunities. A review

  • Ittaka Aldini;Adhistya E. Permanasari;Risanuri Hidayat;Andri Ramdhan
    • Ocean Systems Engineering
    • /
    • v.14 no.1
    • /
    • pp.85-99
    • /
    • 2024
  • Ocean surface currents play an essential role in the Earth's climate system and significantly affect marine ecosystems, weather patterns, and human activities. However, predicting ocean surface currents remains challenging because of the complexity and variability of the oceanic processes involved. This review article provides an overview of the current research status, challenges, and opportunities in the prediction of ocean surface currents. We discuss the various observational and modelling approaches used to study ocean surface currents, including satellite remote sensing, in-situ measurements, and numerical models. We also highlight the major challenges facing the prediction of ocean surface currents, such as data assimilation, model-observation integration, and the representation of sub-grid-scale processes. We suggest that future research should focus on developing advanced modelling techniques, such as machine learning, and on integrating multiple observational platforms to improve the accuracy and skill of ocean surface current predictions. We also emphasize the need to address the limitations of observing instruments, such as delays in receiving data, versioning errors, missing data, and undocumented data-processing techniques; improving data availability and quality will be essential for enhancing the accuracy of predictions. Future research should additionally focus on developing methods for effective bias correction, systematic data-preprocessing procedures, and the use of combined models and explainable AI (xAI) models to incorporate data from various sources. Advances in predicting ocean surface currents will benefit applications such as maritime operations, climate studies, and ecosystem management.

Network Modelling for Road Intersections (교차로 네트워크 모형화 방법)

  • Gang Maeng-Gyu
    • Journal of the military operations research society of Korea
    • /
    • v.11 no.2
    • /
    • pp.40-52
    • /
    • 1985
  • This paper presents an algorithm for developing network models of road intersections. These models represent microscopic traffic movements within intersections and can therefore be used to computerize road data for detailed analysis or simulation.
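
One simple way to realize such a network model (the paper's actual algorithm is not reproduced here) is to treat approach and exit lanes as nodes and permitted turning movements as directed links, as in the hypothetical sketch below.

```python
# Toy intersection network: entry/exit nodes per approach leg, directed links
# for permitted turning movements. Node names and rules are illustrative only.
from collections import defaultdict

links = defaultdict(list)   # adjacency list: entry node -> reachable exit nodes

def add_movement(graph, entry, exit_, allowed=True):
    """Add a turning movement (e.g. northbound left) as a directed link."""
    if allowed:
        graph[entry].append(exit_)

# Hypothetical four-leg intersection with N/S/E/W approaches and exits.
for approach in ("N", "S", "E", "W"):
    for exit_leg in ("N", "S", "E", "W"):
        if approach != exit_leg:                  # no U-turns in this toy example
            add_movement(links, f"in_{approach}", f"out_{exit_leg}")

print(dict(links))   # microscopic movements available to vehicles entering each approach
```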
