• Title/Summary/Keyword: Domain component


A Preliminary Study on the Exhumation Mechanism of the Paleozoic Gwangcheon Gneiss in the Southwestern Margin of the Gyeonggi Massif (경기육괴 남서 연변부에 발달하는 고생대 광천편마암의 노출기작에 대한 예비 연구)

  • Park, Seung-Ik
    • Economic and Environmental Geology / v.50 no.6 / pp.525-535 / 2017
  • The exhumation mechanism of migmatites in orogenic belts provides insights into the thermo-mechanical evolution of the lithosphere in association with orogeny. As a preliminary study on such an exhumation mechanism, this study deals with the kinematics of structures in and around the Gwangcheon Gneiss, the main constituent of a domal structure (viz., the Oseosan Dome) in the Hongseong area, on the southwestern margin of the Gyeonggi massif. Geological structures in the Gwangcheon Gneiss, which mainly comprises the southern and northwestern parts of the Oseosan Dome, generally have a kinematic component of top-outward shear. This feature likely represents diapiric dome-up movement. In addition, a high-strain zone, by which the tectonic domain involving the Gwangcheon Gneiss is bounded on the west, shows structural features with a normal sense of shear. Taking available (thermo)chronological data into account, we interpret that activation of the high-strain zone and exhumation of the Gwangcheon Gneiss occurred during the Late Triassic, when the Gyeonggi massif was widely affected by post-collisional processes. This means that the Gwangcheon Gneiss moved up diapirically and was exhumed in the footwall of an extensional high-strain zone in association with Triassic post-collisional processes.

Using the METHONTOLOGY Approach to a Graduation Screen Ontology Development: An Experiential Investigation of the METHONTOLOGY Framework

  • Park, Jin-Soo;Sung, Ki-Moon;Moon, Se-Won
    • Asia Pacific Journal of Information Systems / v.20 no.2 / pp.125-155 / 2010
  • Ontologies have been adopted in various business and scientific communities as a key component of the Semantic Web. Despite the increasing importance of ontologies, ontology developers still perceive construction tasks as a challenge. A clearly defined and well-structured methodology can reduce the time required to develop an ontology and increase the probability of success of a project. However, no reliable knowledge-engineering methodology for ontology development currently exists; every methodology has been tailored toward the development of a particular ontology. In this study, we developed a Graduation Screen Ontology (GSO). The graduation screen domain was chosen for several reasons. First, the graduation screen process is a complicated task requiring a complex reasoning process. Second, the GSO may be reused by other universities because the graduation screen process is similar at most universities. Finally, the GSO can be built within a given period because the size of the selected domain is reasonable. No standard ontology development methodology exists; thus, one of the existing ontology development methodologies had to be chosen. The most important considerations for selecting the ontology development methodology for the GSO included whether it can be applied to a new domain, whether it covers a broad set of development tasks, and whether it gives a sufficient explanation of each development task. We evaluated various ontology development methodologies based on the evaluation framework proposed by Gómez-Pérez et al. We concluded that METHONTOLOGY was the most applicable to the building of the GSO for this study. METHONTOLOGY was derived from the experience of developing a Chemical Ontology at the Polytechnic University of Madrid by Fernández-López et al. and is regarded as the most mature ontology development methodology.
METHONTOLOGY describes a very detailed approach for building an ontology in a centralized development environment at the conceptual level. This methodology consists of three broad processes, each containing specific sub-processes: management (scheduling, control, and quality assurance); development (specification, conceptualization, formalization, implementation, and maintenance); and support (knowledge acquisition, evaluation, documentation, configuration management, and integration). An ontology development language and tool for GSO construction also had to be selected. We adopted OWL-DL as the ontology development language; OWL was selected for its computational support for consistency checking and classification, which is crucial in developing coherent and useful ontological models for very complex domains. In addition, Protégé-OWL was chosen as the ontology development tool because it is supported by METHONTOLOGY and is widely used owing to its platform-independent characteristics. Based on the researchers' experience developing the GSO, several issues relating to METHONTOLOGY, OWL-DL, and Protégé-OWL were identified. We focus on presenting the drawbacks of METHONTOLOGY and discussing how each weakness could be addressed. First, METHONTOLOGY insists that domain experts without ontology construction experience can easily build ontologies. However, it is still difficult for such domain experts to develop a sophisticated ontology, especially if they have insufficient background knowledge related to the ontology. Second, METHONTOLOGY does not include a development stage called the "feasibility study." This pre-development stage helps developers ensure not only that a planned ontology is necessary and sufficiently valuable to begin an ontology building project, but also that the project is likely to be successful.
Third, METHONTOLOGY excludes an explanation of the use and integration of existing ontologies. If an additional stage for considering reuse were introduced, developers could share the benefits of reuse. Fourth, METHONTOLOGY fails to address the importance of collaboration. The methodology needs to explain how to allocate specific tasks to different developer groups, and how to combine those tasks once the given jobs are completed. Fifth, METHONTOLOGY does not sufficiently describe the methods and techniques to apply in the conceptualization stage. Introducing methods for extracting concepts from multiple informal sources, or for identifying relations, could enhance the quality of ontologies. Sixth, METHONTOLOGY does not provide an evaluation process to confirm whether WebODE correctly transforms a conceptual ontology into a formal ontology, nor does it guarantee that the outcomes of the conceptualization stage are completely reflected in the implementation stage. Seventh, METHONTOLOGY needs to add criteria for user evaluation of the actual use of the constructed ontology in user environments. Eighth, although METHONTOLOGY allows continual knowledge acquisition throughout the ontology development process, consistent updates can be difficult for developers. Ninth, METHONTOLOGY demands that developers complete various documents during the conceptualization stage; thus, it can be considered a heavy methodology. Adopting an agile methodology would reinforce active communication among developers and reduce the burden of documentation. Finally, this study concludes with contributions and practical implications. No previous research has addressed issues related to METHONTOLOGY from empirical experience; this study is an initial attempt. In addition, several lessons learned from the development experience are discussed.
This study also affords some insights for ontology methodology researchers who want to design a more advanced ontology development methodology.
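The classification capability that motivated the choice of OWL-DL above can be illustrated with a toy subclass taxonomy and a subsumption check. The concept names below are hypothetical stand-ins, not taken from the actual GSO, and the dictionary walk is a drastic simplification of DL reasoning.

```python
# Toy fragment of a graduation-screen-style taxonomy: child -> parent.
# All concept names are illustrative assumptions, not the paper's GSO.
SUBCLASS = {
    "RequiredCourse": "Course",
    "ElectiveCourse": "Course",
    "Course": "AcademicEntity",
    "Student": "AcademicEntity",
}

def is_a(concept, ancestor):
    """Walk the subclass chain upwards (a minimal stand-in for the
    class subsumption a DL reasoner would compute over OWL-DL axioms)."""
    while concept is not None:
        if concept == ancestor:
            return True
        concept = SUBCLASS.get(concept)
    return False
```

For example, `is_a("RequiredCourse", "AcademicEntity")` holds because the chain RequiredCourse → Course → AcademicEntity exists; a real reasoner would additionally handle property restrictions and inferred classes.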

Moire Reduction in Digital Still Camera by Using Inflection Point in Frequency Domain (주파수 도메인의 변곡점을 이용한 디지털 카메라의 moire 제거 방법)

  • Kim, Dae-Chul;Kyung, Wang-Jun;Lee, Cheol-Hee;Ha, Yeong-Ho
    • Journal of the Institute of Electronics and Information Engineers / v.51 no.1 / pp.152-157 / 2014
  • Digital still cameras generally use an optical low-pass filter (OLPF) to enhance image quality, because it removes the high spatial frequencies that cause aliasing. However, the use of an OLPF causes some loss of detail. On the other hand, when images are captured without an OLPF, moiré generally appears in the high-spatial-frequency regions of an image. Therefore, in this paper, a moiré reduction method for cameras without an OLPF is suggested. To detect the moiré, the spatial frequency response (SFR) of the camera was first analyzed using the ISO 12233 resolution chart. Then, the moiré region is detected using patterns related to the SFR of the camera, and this region is analyzed in the frequency domain. The moiré is reduced by removing its frequency component, which corresponds to an inflection point between the high-frequency and DC components. The experimental results show that the proposed method reduces moiré while preserving detail.
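The general frequency-domain idea described above — locating a spectral component away from DC and removing it — can be sketched as follows. This is a simplified illustration with assumed parameters (`peak_radius`, `dc_guard`); it does not reproduce the authors' SFR-based region detection or inflection-point criterion.

```python
import numpy as np

def reduce_moire(image, peak_radius=2, dc_guard=8):
    """Suppress the strongest off-DC spectral peak (and its conjugate twin),
    an illustrative simplification of frequency-domain moire removal."""
    F = np.fft.fftshift(np.fft.fft2(image))
    h, w = F.shape
    cy, cx = h // 2, w // 2
    # Mask out the DC neighbourhood so argmax finds an off-centre peak only.
    search = np.abs(F).copy()
    search[cy - dc_guard:cy + dc_guard + 1, cx - dc_guard:cx + dc_guard + 1] = 0
    py, px = np.unravel_index(np.argmax(search), search.shape)
    # Notch out the peak and its conjugate-symmetric twin.
    yy, xx = np.ogrid[:h, :w]
    for qy, qx in ((py, px), (2 * cy - py, 2 * cx - px)):
        F[(yy - qy) ** 2 + (xx - qx) ** 2 <= peak_radius ** 2] = 0
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))
```

On a flat image contaminated by a single sinusoidal pattern, the notch removes nearly all of the pattern energy while leaving the mean intensity intact; real moiré is broader-band, which is why the paper's method reasons about the SFR rather than a single peak.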

A 2-Dimensional Approach for Analyzing Variability of Domain Core Assets (도메인 핵심자산의 가변성 분석을 위한 2차원적 접근방법)

  • Moon Mi-Kyeong;Chae Heung-Seok;Yeom Keun-Hyuk
    • Journal of KIISE: Software and Applications / v.33 no.6 / pp.550-563 / 2006
  • Software product line engineering is a method that prepares for future reuse and supports seamless reuse in the application development process. Commonality and variability (C&V) play central roles in all product line development processes, and reusable assets become core assets by explicitly representing C&V. Indeed, the variabilities identified at each phase of core asset development have different levels of abstraction. In the past, these variabilities have been handled implicitly and without distinguishing the characteristics of each core asset. In addition, previous approaches have depended on the experience and intuition of a domain expert to recognize commonality and variability. In this paper, we suggest a 2-dimensional method for analyzing the variabilities of core assets in a software product line. In the horizontal analysis process, the variation types are analyzed in the requirements, architecture, and components produced at each phase of the development process. In the vertical analysis process, variations are analyzed at different abstraction levels, in which the region of commonality is identified and the variation points are refined. With this method, variations become traceable between core assets, and core assets can be reused seamlessly.
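One minimal way to picture the two axes of this analysis — development artifact horizontally, abstraction level vertically — is a small data structure in which each variation point records its position on both axes and a refinement link for traceability. All names and fields below are illustrative assumptions, not the paper's notation.

```python
from dataclasses import dataclass, field

ARTIFACTS = ("requirements", "architecture", "component")  # horizontal axis
LEVELS = ("coarse", "refined")                             # vertical axis

@dataclass
class VariationPoint:
    name: str
    artifact: str              # where in the development process it appears
    level: str                 # abstraction level at which it is described
    variants: list = field(default_factory=list)
    trace_to: str = ""         # name of the refining variation point, if any

def trace(points, name):
    """Follow refinement links from a variation point down the chain,
    yielding the traceability path across core assets."""
    by_name = {p.name: p for p in points}
    chain, cur = [], name
    while cur:
        chain.append(cur)
        cur = by_name[cur].trace_to
    return chain
```

For instance, a hypothetical coarse requirements-level variation point "payment-variability" could trace to a refined architecture-level point "payment-module", making the vertical refinement explicit rather than implicit.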

The Study of Driving Fatigue using HRV Analysis (HRV 분석을 이용한 운전피로도에 관한 연구)

  • 성홍모;차동익;김선웅;박세진;김철중;윤영로
    • Journal of Biomedical Engineering Research / v.24 no.1 / pp.1-8 / 2003
  • Long-distance driving is fatiguing and requires long periods of alertness and attention, which make considerable demands on the driver. Driving fatigue contributes to driver-related accidents and fatalities. In this study, we investigated the relationship between the number of hours of driving and driving fatigue using the heart rate variability (HRV) signal, together with more traditional measures of overall variability (standard deviation, mean, and spectral values of heart rate). Nonlinear characteristics of the HRV signal were analyzed using approximate entropy (ApEn) and the Poincaré plot. Five subjects each drove four passenger vehicles twice, for a total of 40 experiments. The test route was a continuous highway circuit of about 300 km, and the driving time was about 3 hours. During driving, electrocardiogram (ECG) measurements were performed at 30-minute intervals. The HRV signal, derived from the ECG, was analyzed using time-domain and frequency-domain parameters and nonlinear characteristics. The significance of differences in the response to driving fatigue was determined by Student's t-test; differences were considered significant when p < 0.05. In the results, mean heart rate (HRmean) decreased consistently with driving time, while the standard deviation of the RR intervals (SDRR) and the standard deviation of the successive differences of the RR intervals (SDSD) increased until 90 minutes and thereafter remained almost unchanged until the end of the test. The normalized low-frequency component (LFnorm) and the ratio of low- to high-frequency components (LF/HF) increased. The nonlinear characteristics of the HRV signal, described using ApEn and the Poincaré plot, decreased with driving time. Statistical significance appeared after 60 minutes in all parameters.
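The time-domain and Poincaré descriptors mentioned above can be computed from an RR-interval series with the standard definitions (SD1 = SDSD/√2, SD2² = 2·SDRR² − SD1²). This is a generic sketch of those definitions, not the authors' analysis code, and it omits the spectral (LF/HF) and ApEn computations.

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Basic time-domain and Poincare descriptors of an RR-interval
    series given in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    hr_mean = 60000.0 / rr.mean()            # mean heart rate (bpm)
    sdrr = rr.std(ddof=1)                    # SDRR: overall variability
    sdsd = diff.std(ddof=1)                  # SDSD: beat-to-beat variability
    # Poincare plot descriptors: SD1 (short-term), SD2 (long-term)
    sd1 = np.sqrt(np.var(diff, ddof=1) / 2.0)
    sd2 = np.sqrt(2.0 * np.var(rr, ddof=1) - np.var(diff, ddof=1) / 2.0)
    return {"HRmean": hr_mean, "SDRR": sdrr, "SDSD": sdsd,
            "SD1": sd1, "SD2": sd2}
```

For example, a perfectly regular series of 800 ms intervals yields HRmean = 75 bpm with all variability measures equal to zero; fatigue-related changes would appear as drifts in these values across successive 30-minute segments.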

Business Process Design to Apply ebXML Framework to the Port and Logistics Distribution Industry (ebXML 적용을 위한 항만물류산업 비즈니스 프로세스 설계)

  • Choi, Hyung-Rim;Park, Nam-Kyu;Lim, Ho-Seob;Lee, Hyun-Chul;Lee, Chang-Sup
    • Information Systems Review / v.4 no.2 / pp.209-222 / 2002
  • EDI (Electronic Data Interchange) has been widely utilized to support business activities because it offers such advantages as fast transfer of information, less documentation work, and efficient information exchange. Recently, the e-business environment has urged the traditional EDI system to migrate to the ebXML framework. To apply the ebXML framework to a given industry, it is necessary to implement the Business Process (BP), Core Components (CC), Collaboration Protocol Profile (CPP), Collaboration Protocol Agreement (CPA), messaging system, etc. We selected the port and logistics industry as the target domain for applying the ebXML framework, since its EDI usage ratio is relatively higher than that of other industries. In this paper, we analyze the current status of the EDI system and transaction processes in the port and logistics industry. We define the business processes to be registered in the registry/repository, the main component of the ebXML framework, using the UN/CEFACT modeling methodology. Business collaborations, business transactions, business document flows, choreography, patterns, etc. are represented in UML according to the UN/CEFACT modeling methodology, in order to apply the ebXML framework to the port and logistics distribution industry. We also suggest a meta-methodology for applying the ebXML framework to other industries.

An Exploratory Study of Software Development Environment in Korean Shipbuilding and Marine Industry (조선해양산업 소프트웨어 개발환경 현황 연구)

  • Yu, Misun;Jeong, Yang-Jae;Chun, In-Geol;Kim, Byoung-Chul;Na, Gapjoo
    • KIPS Transactions on Software and Data Engineering / v.7 no.6 / pp.221-228 / 2018
  • With the increasing demand for high added value in the shipbuilding and marine industry based on information and communications technology (ICT), software technology has become more important than ever in the industry. In this paper, we present the results of our preliminary investigation of the current software development environment in the shipbuilding and marine industry, conducted in order to develop reusable software components that can enhance the competitiveness of software development. The investigation is based on survey answers from 34 developers working at different shipbuilding and marine companies. The questionnaire is composed of items gathering information about each company, such as the number of employees and the product domain, and about the actual software development environment, such as operating system, programming languages, deployment format, obstacles to developing components, and the adoption of software development methods and tools. According to the survey results, the most important consideration in selecting a development platform was the number of available utilities and the technical support, followed by performance, price, and security. In addition, the requirement to support various platforms with high reliability, together with the constraints of low development cost and limited manpower, made it difficult for the respondents to develop reusable software components. Finally, the survey revealed that only 15% of the developers used software development processes and managed quality in order to develop their software products systematically; therefore, shipbuilding and marine companies need more technical and institutional support to improve their ability to develop high-quality software.

Sensitivity Analysis and Estimation of the Depth of Investigation in Small-Loop EM Surveys (소형루프 전자탐사의 감도분석 및 가탐심도 추정)

  • Song Yoonho;Chung Seung-Hwan
    • Geophysics and Geophysical Exploration / v.5 no.4 / pp.299-308 / 2002
  • We have derived an analytical expression for the sensitivity of frequency-domain small-loop electromagnetic (EM) surveys over a two-layer earth, in order to estimate the depth of investigation with an instrument having a source-receiver separation of about 2 m. We analyzed the sensitivities to the lower layer, normalized by those to the upper half-space, and estimated the depth of investigation from the sensitivity analyses and the mutual impedance ratio. The computational results show that the in-phase components of the sensitivity to the lower layer dominate those to the upper layer when the thickness of the upper layer is less than 20 m, while the quadrature components are not sensitive to the lower layer over the entire frequency range. Hence, we confirmed that accurate measurement of the in-phase component is essential to increase the depth of investigation in a multi-frequency small-loop EM survey. When a conductive basement of 10 ohm-m underlies an upper layer of 100 ohm-m, accurate measurement of the in-phase components ensures a depth of investigation of more than 10 m even accounting for noise, from which we conclude that the small-loop EM survey is quite effective in imaging a conductive plume down to a considerable depth. On the other hand, in the presence of a resistive basement of 1,000 ohm-m, the depth of investigation may not exceed 5 m given the instrumental accuracy, which implies that the small-loop EM survey is not recommended in resistive environments except for detecting buried conductors.
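As a rough, generic complement to the layered-earth sensitivity analysis above: the plane-wave skin depth δ = √(2ρ/(ωμ₀)) gives a frequency- and resistivity-dependent bound on EM penetration. This is a textbook relation, not the paper's computation; the actual depth of investigation of a small-loop system also depends on the coil separation and instrument accuracy discussed in the abstract.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum magnetic permeability (H/m)

def skin_depth(resistivity_ohm_m, freq_hz):
    """Plane-wave skin depth in metres: delta = sqrt(2*rho / (omega*mu0)).
    Roughly 503 * sqrt(rho / f)."""
    omega = 2.0 * np.pi * freq_hz
    return np.sqrt(2.0 * resistivity_ohm_m / (omega * MU0))
```

For a 100 ohm-m half-space at 10 kHz this gives about 50 m, far deeper than the few-metre investigation depths above, which underlines that the small-loop limit is set by geometry and measurement accuracy rather than by skin-depth attenuation.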

Identification and molecular characterization of the chitinase gene, EaChi, from the midgut of the earthworm, Eisenia andrei (붉은줄지렁이 (Eisenia andrei) 중장에서 발현되는 chitinase 유전자, EaChi의 동정 및 분자생물학적 특성에 관한 연구)

  • Tak, Eun Sik;Kim, Dae hwan;Lee, Myung Sik;Ahn, Chi Hyun;Park, Soon Cheol
    • Journal of the Korea Organic Resources Recycling Association / v.18 no.3 / pp.31-37 / 2010
  • Chitinases (EC 3.2.1.14) hydrolyze the β-1,4-linkages in chitin, the second most abundant polymer of N-acetyl-β-D-glucosamine, which is a structural component of protective biological matrices such as fungal cell walls and insect exoskeletons. The glycosyl hydrolase 18 family, which includes the chitinases, is an ancient gene family widely expressed in archaea, prokaryotes, and eukaryotes. Since earthworms live in soil with abundant microbial activity and fungi are thought to be a major component of the earthworm diet, it has been suggested that earthworms possess an immune system suited to protecting themselves from microbial attack. In this study, a novel chitinase, EaChi, from the midgut of the earthworm Eisenia andrei was identified and characterized. To obtain the full-length cDNA sequence of the chitinase, RT-PCR and RACE-PCR analyses were carried out using a previously identified EST sequence from a cDNA library established from the midgut of E. andrei. EaChi, a partial chitinase gene, is composed of 927 nucleotides encoding 309 amino acids. Multiple sequence alignment of the amino acids with those of other species revealed that EaChi is a member of the glycosyl hydrolase 18 family, which has two highly conserved domains: a substrate-binding domain and a catalytic domain.

A Modified REDP Aggregate Marker for improving TCP Fairness of Assured Services

  • Hur Kyeong;Eom Doo-Seop;Tchah Kyun-Hyon
    • The Journal of Korean Institute of Communications and Information Sciences / v.29 no.1B / pp.86-100 / 2004
  • To provide end-to-end service differentiation for assured services, the random early demotion and promotion (REDP) marker in the edge router at each domain boundary monitors the aggregate flow of incoming in-profile packets and demotes in-profile packets, or promotes previously demoted in-profile packets, at the aggregate flow level according to the negotiated interdomain service level agreement (SLA). The REDP marker achieves UDP fairness in demoting and promoting packets through random and early marking decisions on packets. However, the TCP fairness of the REDP marker is not as clear as it is for UDP sources. In this paper, to improve the TCP fairness of the REDP marker, we propose a modified REDP marker that combines a dropper, meters, and a token-filling-rate configuration component with the REDP marker. To make the packet transmission rates of TCP flows more fair, at the aggregate flow level the combined dropper randomly drops excessive incoming in-profile packets with a constant probability when the token level in the leaky bucket stays in the demotion region without incoming demoted in-profile packets. Considering the case where the token level cannot stay in the demotion region without prior demotion, we propose a token-filling-rate configuration method using traffic meters. With this method, the modified REDP marker configures a new token filling rate that is less than the negotiated rate determined by the interdomain SLA and larger than the current aggregate input rate of in-profile traffic. Then, with the newly configured token filling rate, the token level in the modified REDP marker can stay in the demotion region appropriately for the operation of the dropper to improve TCP fairness. We evaluate the modified REDP marker using the ns-2 simulator with TCP sources in the general case where the token level cannot stay in the demotion region without prior demotion at the negotiated rate, set as the bottleneck link bandwidth. The simulation results demonstrate that, through the combined dropper with the newly configured token filling rate, the modified REDP marker increases both aggregate in-profile throughput and link utilization, in addition to improving TCP fairness, compared with the original REDP marker.
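The token-bucket-plus-dropper mechanism described above can be sketched generically as follows. The demotion-region semantics, thresholds, and return values here are simplifying assumptions for illustration; they do not reproduce the paper's exact REDP marking or rate-configuration algorithm.

```python
import random

class TokenBucketDropper:
    """Illustrative token-bucket meter with a probabilistic dropper,
    loosely modeled on the demotion-region idea: when the token level
    falls into a low 'demotion region', excess in-profile packets are
    dropped with a constant probability."""

    def __init__(self, rate_pps, depth, demotion_level, drop_prob, seed=None):
        self.rate = rate_pps                  # token filling rate (tokens/s)
        self.depth = depth                    # bucket capacity
        self.tokens = depth                   # start with a full bucket
        self.demotion_level = demotion_level  # below this -> demotion region
        self.drop_prob = drop_prob            # constant early-drop probability
        self.rng = random.Random(seed)
        self.t = 0.0

    def arrive(self, t, size=1):
        """Classify a packet arriving at time t: 'in', 'drop', or 'demote'."""
        # Refill tokens for the elapsed interval, capped at bucket depth.
        self.tokens = min(self.depth, self.tokens + (t - self.t) * self.rate)
        self.t = t
        if self.tokens >= size:
            self.tokens -= size  # token is consumed even if we then drop
            if self.tokens < self.demotion_level and self.rng.random() < self.drop_prob:
                return "drop"    # random early drop inside the demotion region
            return "in"
        return "demote"          # bucket empty: demote the packet
```

With `drop_prob` set to zero this degenerates to a plain two-color token-bucket meter; the early random drop in the low-token region is the fairness lever the modified marker adds for TCP flows, since TCP reacts to isolated drops by backing off before wholesale demotion occurs.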