• Title/Summary/Keyword: model mapping approach

Search results: 170

A Meta-Model for the Storage of XML Schema using Model-Mapping Approach (모델 매핑 접근법을 이용한 XML 스키마 저장 메타모델에 대한 연구)

  • Lim, Hoon-Tae;Lim, Tae-Soo;Hong, Keun-Hee;Kang, Suk-Ho
    • IE interfaces / v.17 no.3 / pp.330-337 / 2004
  • Since XML (eXtensible Markup Language) emerged as an information interchange format, there has been increasing demand for integrating XML with databases. Most approaches target relational databases (RDB) because of legacy systems, but they depend on the particular database system, and much of the research focuses on DTD (Document Type Definition). XML Schema, however, is more comprehensive and efficient in many respects. We propose a meta-model for XML Schema that is independent of the database. The meta-model is built in three steps: DOM (Document Object Model) tree analysis, object modeling, and storing the objects in a fixed database schema using a model-mapping approach. We propose four mapping rules for object modeling that conform to the ODMG (Object Data Management Group) 3.0 standard. We expect the model to be especially useful for building XML-based e-business applications.
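
The three steps named in the abstract (DOM tree analysis, object modeling, storage in a fixed schema) suggest the general shape of a model-mapping store. Below is a minimal Python sketch of that idea only, not the authors' meta-model or their ODMG mapping rules: it walks a toy XML Schema with the standard DOM API and flattens every element into generic node and edge rows that fit one fixed table layout regardless of the particular schema. The table and column names are illustrative assumptions.

```python
# Minimal sketch of model-mapping storage: every XML Schema construct becomes a generic
# "node" or "edge" row, so a single fixed database schema can hold arbitrary schemas.
import sqlite3
from xml.dom import minidom

XSD = """<?xml version="1.0"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="order">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="item" type="xs:string" maxOccurs="unbounded"/>
        <xs:element name="total" type="xs:decimal"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

def walk(node, parent_id, rows, edges, counter):
    """DOM tree analysis: record each element node and its parent-child edge."""
    node_id = counter[0]; counter[0] += 1
    rows.append((node_id, node.tagName, node.getAttribute("name") or None))
    if parent_id is not None:
        edges.append((parent_id, node_id))
    for child in node.childNodes:
        if child.nodeType == child.ELEMENT_NODE:
            walk(child, node_id, rows, edges, counter)

doc = minidom.parseString(XSD)
rows, edges = [], []
walk(doc.documentElement, None, rows, edges, [0])

# Fixed, schema-independent storage: the same two tables hold any XML Schema.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE schema_node(id INTEGER, tag TEXT, name TEXT)")
con.execute("CREATE TABLE schema_edge(parent INTEGER, child INTEGER)")
con.executemany("INSERT INTO schema_node VALUES (?,?,?)", rows)
con.executemany("INSERT INTO schema_edge VALUES (?,?)", edges)
print(con.execute("SELECT tag, name FROM schema_node").fetchall())
```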

Design of a Feature-based Multi-viewpoint Design Automation System

  • Lee, Kwang-Hoon;McMahon, Chris A.;Lee, Kwan-H.
    • International Journal of CAD/CAM / v.3 no.1_2 / pp.67-75 / 2003
  • Viewpoint-dependent, feature-based modelling in computer-aided design is developed to support engineering design representation and automation. The approach combines multi-level modelling with two stages of mapping between models, implemented in a three-level architecture. The top level is a feature-based description for each viewpoint, comprising form features and other features such as loads and constraints for analysis. The middle level is an executable representation of the feature model. The bottom level is an evaluation of the feature-based CAD model obtained from the executable feature representations defined in the middle level. The mappings in the system are, first, between the top-level feature representations associated with different viewpoints, for example the geometric simplification and addition of boundary conditions involved in moving from a design model to an analysis model, and second, between the top level and the middle level, in which the feature model is transformed into the executable representation. Because an executable representation is used as the intermediate layer, the low-level evaluation can be active. The approach is demonstrated with an analysis model that is evaluated and whose results are output. This multi-level modelling approach is investigated within a framework aimed at design automation with a feature-based model.
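
As a rough illustration of the first mapping stage described in the abstract (a top-level mapping between viewpoint feature models), the sketch below converts a hypothetical design-viewpoint feature list into an analysis-viewpoint list by dropping small fillets (geometric simplification) and adding load and constraint features (boundary conditions). The Feature class, its fields, and the simplification rule are assumptions, not the paper's representation.

```python
# Illustrative sketch of a top-level viewpoint mapping (design model -> analysis model):
# simplify form features and add analysis-specific load/constraint features.
from dataclasses import dataclass, field

@dataclass
class Feature:
    kind: str                  # e.g. "block", "hole", "fillet", "load", "constraint"
    params: dict = field(default_factory=dict)

def design_to_analysis(design_features, load_spec):
    """Map a design-viewpoint feature list to an analysis-viewpoint feature list."""
    analysis = []
    for f in design_features:
        if f.kind == "fillet" and f.params.get("radius", 0) < 2.0:
            continue                      # geometric simplification: drop small fillets
        analysis.append(f)
    # addition of analysis-specific features (boundary conditions)
    analysis.append(Feature("constraint", {"face": load_spec["fixed_face"]}))
    analysis.append(Feature("load", {"face": load_spec["loaded_face"],
                                     "force_N": load_spec["force_N"]}))
    return analysis

design = [Feature("block", {"w": 100, "h": 20, "d": 10}),
          Feature("hole", {"diameter": 8}),
          Feature("fillet", {"radius": 1.0})]
print(design_to_analysis(design, {"fixed_face": "left", "loaded_face": "right", "force_N": 500}))
```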

Gaussian process approach for dose mapping in radiation fields

  • Khuwaileh, Bassam A.;Metwally, Walid A.
    • Nuclear Engineering and Technology / v.52 no.8 / pp.1807-1816 / 2020
  • In this work, a Gaussian Process (Kriging) approach is proposed to provide efficient dose mapping for complex radiation fields using a limited number of responses. Given a few response measurements (or simulation data points), the proposed approach can efficiently help the analyst complete a map of the radiation dose field with a 95% confidence interval. Two case studies are used to validate the approach. The first is based on experimental dose measurements and builds the dose map of a radiation field induced by a D-D neutron generator. The second is a simulation case study in which the approach mimics Monte Carlo dose predictions using a limited number of MCNP simulations. Given the low computational cost of constructing Gaussian Process (GP) models, the results indicate that the GP model can reasonably map the dose in the radiation field from a limited number of measurements. Both case studies were performed in the nuclear engineering radiation laboratories at the University of Sharjah.
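
A minimal sketch of the Kriging idea in the abstract, using scikit-learn's Gaussian process regressor on a handful of synthetic one-dimensional "measurements"; the kernel, the prediction grid, and the toy dose values are assumptions (the paper's data come from a D-D generator experiment and MCNP runs). The ±1.96σ band corresponds to the 95% confidence interval the abstract refers to.

```python
# Kriging-style dose mapping from a few measurements, with a 95% confidence band.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
x_meas = np.array([[0.5], [1.0], [2.0], [3.5], [5.0]])                 # detector positions (m)
dose_meas = 10.0 / (1.0 + x_meas.ravel())**2 + rng.normal(0, 0.05, 5)  # toy dose values

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(1e-3),
                              normalize_y=True)
gp.fit(x_meas, dose_meas)

x_grid = np.linspace(0.2, 6.0, 50).reshape(-1, 1)
mean, std = gp.predict(x_grid, return_std=True)
lower, upper = mean - 1.96 * std, mean + 1.96 * std                    # 95% confidence interval
print(f"predicted dose at 4.0 m: {mean[np.argmin(abs(x_grid.ravel() - 4.0))]:.3f}")
```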

An Approach to Persistent Naming and Naming Mapping Based on OSI and IGM for Parametric CAD Model Exchanges (파라메트릭 CAD모델 교환을 위한 OSI와 IGM기반의 고유 명칭 방법과 명칭 매핑 방법)

  • Mun D.H.;Han S.H.
    • Korean Journal of Computational Design and Engineering / v.9 no.3 / pp.226-237 / 2004
  • If the topology changes during the regeneration step of history-based, feature-based CAD systems, it is difficult to identify an entity in the old model and find the same entity in the new model. This is known as the persistent naming problem. To exchange parametric CAD models, both the persistent naming problem and the naming-mapping problem must be solved among different CAD systems, which use different naming schemes. For CAD model exchange, persistent naming has its own characteristics compared with persistent naming for CAD system development. This paper analyzes previous research and proposes a solution to the persistent naming problem for CAD model exchange and to the naming-mapping problem among different naming schemes.

Dynamic knowledge mapping guided by data mining: Application on Healthcare

  • Brahami, Menaouer;Atmani, Baghdad;Matta, Nada
    • Journal of Information Processing Systems / v.9 no.1 / pp.1-30 / 2013
  • Capitalizing on know-how, managing knowledge, and controlling the constantly growing mass of information have become strategic challenges for organizations that aim to capture their entire wealth of knowledge (tacit and explicit). Knowledge mapping is thus a means of (cognitive) navigation for accessing the strategic knowledge resources of an organization. In this paper, we present a new mapping approach based on Boolean modeling of critical domain knowledge and on the use of different data sources via data mining, in order to improve the process of making knowledge explicit. To evaluate the approach, we run a mapping process guided by machine learning in two stages: data mining and automatic mapping. Data mining is first run as an induction over explicit Boolean case studies; the induced mapping rules are then used to automatically improve the Boolean model of the critical-knowledge map.
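
A toy sketch of the two stages named in the abstract, data mining followed by automatic mapping: a small decision tree induces Boolean rules from made-up case data and then labels new knowledge areas as critical or not. The features, data, criticality labels, and use of scikit-learn are illustrative assumptions, not the authors' Boolean induction machinery.

```python
# "Data mining then automatic mapping": induce Boolean rules from case data,
# then apply them to label knowledge areas on the map.
from sklearn.tree import DecisionTreeClassifier, export_text

# Boolean case studies: [rare_expertise, frequently_used, documented]
X = [[1, 1, 0], [1, 0, 0], [0, 1, 1], [0, 0, 1], [1, 1, 1], [0, 1, 0]]
y = [1, 1, 0, 0, 1, 0]          # 1 = knowledge area judged critical

clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(clf, feature_names=["rare_expertise", "frequently_used", "documented"]))

# Automatic mapping step: apply the induced rules to new knowledge areas.
areas = {"triage protocol": [1, 1, 0], "billing codes": [0, 1, 1]}
for name, feats in areas.items():
    label = "CRITICAL" if clf.predict([feats])[0] == 1 else "non-critical"
    print(f"{name}: {label}")
```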

Identification of Nonlinear Mapping based on Fuzzy Integration of Local Affine Mappings (국부 유사사상의 퍼지통합에 기반한 비선형사상의 식별)

  • 최진영;최종호
    • Journal of the Korean Institute of Telematics and Electronics B / v.32B no.5 / pp.812-820 / 1995
  • This paper proposes an approach to identifying nonlinear mappings from input/output data. The approach is based on universal approximation by the fuzzy integration of local affine mappings. A connectionist model realizing the universal approximator is suggested, using a processing unit based on both the radial basis function and the weighted-sum scheme. In addition, a learning method with self-organizing capability is proposed for identifying nonlinear mapping relationships from the given input/output data. To show the effectiveness of the approach, the proposed model is applied to function approximation and to prediction of the Mackey-Glass chaotic time series, and its performance is compared with other approaches.
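
The model described in the abstract is essentially a normalized-RBF combination of local affine maps, y(x) = Σᵢ wᵢ(x)·(aᵢx + bᵢ). The sketch below fits such a model to a toy one-dimensional mapping with ordinary least squares; the centers, width, and fitting method are assumptions, since the paper instead proposes a self-organizing learning rule.

```python
# Approximate a nonlinear mapping by fuzzy (normalized-RBF) integration of local
# affine maps: output = sum_i w_i(x) * (a_i*x + b_i), with w_i normalized memberships.
import numpy as np

def nrbf_weights(x, centers, width):
    """Normalized Gaussian membership of each sample to each local unit."""
    d2 = (x[:, None] - centers[None, :])**2
    phi = np.exp(-d2 / (2 * width**2))
    return phi / phi.sum(axis=1, keepdims=True)

x = np.linspace(-3, 3, 200)
y = np.sin(x) + 0.1 * x**2                        # nonlinear target mapping

centers = np.linspace(-3, 3, 7)
W = nrbf_weights(x, centers, width=0.8)           # shape (N, units)

# Fit the affine parameters (a_i, b_i) of all local units jointly by least squares:
# the model output is linear in the stacked regressors [W*x, W].
design = np.hstack([W * x[:, None], W])
theta, *_ = np.linalg.lstsq(design, y, rcond=None)
y_hat = design @ theta
print("max abs error:", np.max(np.abs(y - y_hat)))
```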

Minimizing Leakage of Sequential Circuits through Flip-Flop Skewing and Technology Mapping

  • Heo, Se-Wan;Shin, Young-Soo
    • JSTS: Journal of Semiconductor Technology and Science / v.7 no.4 / pp.215-220 / 2007
  • Leakage current of CMOS circuits has become a major factor in VLSI design. Although many circuit-level techniques have been developed, most of them require a significant amount of designer effort and are not well aligned with the traditional VLSI design process. In this paper, we focus on technology mapping, the step of logic synthesis in which gates are selected from a particular library to implement a circuit. We take a radical approach to push the limit of technology mapping in its capability to suppress leakage current: we use probabilistic leakage (together with delay) as the cost function that drives the mapping; we consider pin reordering as one of the mapping options; we increase the library size by employing gates with larger gate length; and we employ a new flip-flop specifically designed for low leakage through selective increase of gate length. When all techniques are applied to several benchmark circuits, an average leakage saving of 46% is achieved with a 45-nm predictive model, compared with conventional technology mapping.
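
A toy illustration of the probabilistic leakage cost mentioned above: because CMOS gate leakage depends on the input state, the expected leakage weights each state's leakage by its signal probability, which lets a mapper compare functionally equivalent library cells. The leakage numbers, signal probabilities, and cell names are invented for illustration.

```python
# Expected (probabilistic) leakage of a gate = sum over input states of
# P(state) * leakage(state); invented numbers for two equivalent NAND2 cells.
from itertools import product

def expected_leakage(state_leakage, input_probs):
    """Sum over input states of P(state) * leakage(state)."""
    total = 0.0
    for state in product([0, 1], repeat=len(input_probs)):
        p = 1.0
        for bit, p1 in zip(state, input_probs):
            p *= p1 if bit else (1.0 - p1)
        total += p * state_leakage[state]
    return total

nand2_std  = {(0, 0): 5.0, (0, 1): 21.0, (1, 0): 18.0, (1, 1): 40.0}   # nominal L: leaky, fast
nand2_long = {(0, 0): 2.0, (0, 1):  8.0, (1, 0):  7.0, (1, 1): 15.0}   # longer L: low-leak, slow

probs = (0.3, 0.7)    # probability that each input is logic 1
for name, cell in [("std", nand2_std), ("long-L", nand2_long)]:
    print(name, round(expected_leakage(cell, probs), 2), "nA expected leakage")
# A mapper would prefer the long-L cell off the critical path and the std cell on it.
```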

The Case and Implications of Terminology Mapping for Development of Dankook University Hospital EHR-Based MOA CDM (단국대학교병원 EHR 기반 MOA CDM 구축을 위한 용어 매핑 사례와 시사점)

  • Yookyung Boo;Sihyun Song;Jihwan Park;Mi Jung Rho
    • Korea Journal of Hospital Management / v.29 no.1 / pp.1-18 / 2024
  • Purposes: The Common Data Model (CDM) is very important for multi-institutional research, and there are various domestic and international CDM construction cases. To construct a CDM, the differing terms used by each institution must be mapped to standard terms. We therefore identify the importance and major issues of terminology mapping in CDM construction and propose a solution. Methodology/Approach: This study performed terminology mapping between the Electronic Health Record (EHR) and the MOA CDM for constructing the Medical Record Observation & Assessment for Drug Safety (MOA) CDM at Dankook University Hospital in 2022. In the course of the mapping, a CDM standard terminology process and method were developed and applied. The CDM mapping terms were constructed in the order of diagnosis, drug, measurement, and treatment/procedure. Findings: We developed a mapping guideline for CDM construction and used it for the mapping. A total of 670,993 EHR records from Dankook University Hospital (January 1, 2013 to December 31, 2021) were mapped. For diagnosis terminology, 19,413 cases were completely mapped. Drug terminology was mapped for 92.1% of 2,795 cases, and measurement terminology for 94.5% of 7,254 cases. Treatment and procedure terminology was mapped for all 2,181 cases, the full number of mapping targets. Practical Implications: This study showed the importance of constructing the MOA CDM for drug side-effect monitoring and developed a terminology mapping guideline. Our results should be useful to future researchers conducting terminology mapping when constructing a CDM.
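
A tiny sketch of the mechanical core of local-term to standard-concept mapping: exact dictionary lookup with a fuzzy fallback and a manual-review queue for misses. The concept names, placeholder codes, and similarity cutoff are invented and are not the MOA CDM vocabularies or the paper's guideline.

```python
# Local EHR term -> standard concept, with exact match, fuzzy fallback, and review queue.
from difflib import SequenceMatcher

standard_concepts = {                              # placeholder standard-concept IDs
    "type 2 diabetes mellitus": "STD-0001",
    "essential hypertension": "STD-0002",
    "acetaminophen 500 mg oral tablet": "STD-0003",
}

def map_term(local_term, concepts, cutoff=0.85):
    key = local_term.strip().lower()
    if key in concepts:                            # exact match
        return concepts[key], 1.0
    best, score = None, 0.0
    for name, code in concepts.items():            # fuzzy fallback
        s = SequenceMatcher(None, key, name).ratio()
        if s > score:
            best, score = code, s
    return (best, score) if score >= cutoff else (None, score)

for term in ["Type 2 Diabetes Mellitus", "essential hypertention", "metformin 500mg"]:
    code, score = map_term(term, standard_concepts)
    print(term, "->", code if code else "MANUAL REVIEW", f"({score:.2f})")
```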

Bump mapping algorithm for polygonal model and its hardware implementation (다각형 모델에서 범프 맵핑을 수행하기 위한 알고리즘과 하드웨어 구현)

  • Choi, Seung-Hak;Mun, Byung-In;Eo, Kil-Su;Lee, Hong-Youl
    • Journal of the Korea Computer Graphics Society / v.2 no.1 / pp.15-23 / 1996
  • Bump mapping is an elegant rendering technique for simulating wrinkled surfaces such as bark, and it produces more realistic images than texture mapping alone. This paper presents a new algorithm for bump mapping along with a hardware architecture that runs the algorithm in real time. The proposed approach is more efficient than previous ones, and in particular, the hardware architecture is simpler to implement.
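
A NumPy sketch of the classic bump-mapping idea the abstract builds on: perturb the surface normal with the height-map gradient, then shade with a Lambertian term. This illustrates the general technique only, not the paper's polygon-model algorithm or its hardware pipeline; the procedural height map and light direction are arbitrary.

```python
# Classic bump mapping on a flat patch: normals perturbed by the height-map gradient,
# then Lambertian (N.L) shading per pixel.
import numpy as np

H, W = 64, 64
yy, xx = np.meshgrid(np.linspace(0, 4*np.pi, H), np.linspace(0, 4*np.pi, W), indexing="ij")
height = 0.2 * np.sin(xx) * np.sin(yy)            # procedural "wrinkle" height map

dh_dy, dh_dx = np.gradient(height)                # bump derivatives
normals = np.dstack([-dh_dx, -dh_dy, np.ones_like(height)])
normals /= np.linalg.norm(normals, axis=2, keepdims=True)

light = np.array([0.4, 0.3, 0.85])
light = light / np.linalg.norm(light)
shading = np.clip(normals @ light, 0.0, 1.0)      # Lambertian intensity per pixel
print("shaded intensity range:", shading.min().round(3), "-", shading.max().round(3))
```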

Patient-Specific Mapping between Myocardium and Coronary Arteries using Myocardial Thickness Variation

  • Dongjin Han
    • International journal of advanced smart convergence / v.13 no.2 / pp.187-194 / 2024
  • For precise cardiac diagnostics and treatment, we introduce a novel method for patient-specific mapping between myocardial and coronary anatomy, leveraging local variations in myocardial thickness. This complex system integrates and automates multiple sophisticated components, including left ventricle segmentation, myocardium segmentation, long-axis estimation, coronary artery tracking, and advanced geodesic Voronoi distance mapping. It meticulously accounts for variations in myocardial thickness and precisely delineates the boundaries between coronary territories according to the conventional 17-segment myocardial model. Each phase of the system provides a step-by-step approach to automate coronary artery mapping onto the myocardium. This innovative method promises to transform cardiac imaging by offering highly precise, automated, and patient-specific analyses, potentially enhancing the accuracy of diagnoses and the effectiveness of therapeutic interventions for various cardiac conditions.
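
A sketch of the geodesic Voronoi labeling step alone: multi-source shortest paths over a surface graph assign each myocardial vertex to the nearest coronary branch. The tiny edge list and the two branch seed sets are invented; a real run would operate on a patient-specific myocardial mesh and tracked artery centerlines.

```python
# Geodesic Voronoi assignment: each mesh vertex gets the label of the coronary branch
# whose seed vertices are closest along the surface graph (multi-source Dijkstra).
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import dijkstra

# Toy surface graph: 6 vertices, weighted edges approximating geodesic lengths
edges = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.2), (3, 4, 1.0), (4, 5, 1.1), (0, 5, 2.5)]
rows, cols, w = zip(*edges)
graph = coo_matrix((w, (rows, cols)), shape=(6, 6))

branch_seeds = {"LAD": [0], "RCA": [4, 5]}                # seed vertices per coronary branch
dist_to_branch = {}
for name, seeds in branch_seeds.items():
    d = dijkstra(graph, directed=False, indices=seeds)    # distances from each seed vertex
    dist_to_branch[name] = d.min(axis=0)                  # nearest seed of that branch

names = list(branch_seeds)
stacked = np.vstack([dist_to_branch[name] for name in names])
labels = [names[i] for i in stacked.argmin(axis=0)]       # Voronoi label per vertex
print(dict(enumerate(labels)))
```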