• Title/Summary/Keyword: Term Mapping


Development of Rainfall Forecasting Model Using a Neural Network (신경망이론을 이용한 강우예측모형의 개발)

  • 오남선
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 1996.10a
    • /
    • pp.253-256
    • /
    • 1996
  • Rainfall is one of the major and most complex elements of the hydrologic system, and its accurate prediction is very important for mitigating storm damage. The neural network is a good model for classification problems, large combinatorial optimization, and nonlinear mapping. In this study, rainfall prediction based on neural network theory is presented. A multi-layer neural network was constructed, trained on continuous-valued input and output data, and used to predict rainfall. The developed model makes online, multivariate, short-term rainfall prediction possible. A multidimensional rainfall generation model is applied to the Seoul metropolitan area to generate 10-minute rainfall, and the neural network applied to the generated rainfall shows good prediction. Applying the neural network to 1-hour real data from the Seoul metropolitan area also yields reasonably good predictions.

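A brief illustration may help here: the rainfall entry above describes a multi-layer neural network trained on continuous-valued inputs and outputs for short-term (10-minute) rainfall prediction, but gives no architecture. The following Python/NumPy sketch is purely illustrative; the window length, hidden-layer size, learning rate, and synthetic data are assumptions, not values from the paper.

```python
# Minimal sketch of a multi-layer network for short-term rainfall prediction.
# Window length, hidden size, and learning rate are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def make_windows(series, window):
    """Turn a rainfall series into (past-window, next-value) training pairs."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y

# Synthetic stand-in for a 10-minute rainfall series (mm per interval).
series = np.abs(np.sin(np.linspace(0, 20, 500)) + 0.3 * rng.standard_normal(500))
X, y = make_windows(series, window=6)

# One hidden layer with tanh units and a linear output, trained by gradient descent.
W1 = rng.standard_normal((6, 8)) * 0.1
b1 = np.zeros(8)
W2 = rng.standard_normal(8) * 0.1
b2 = 0.0
lr = 0.01

for epoch in range(200):
    h = np.tanh(X @ W1 + b1)      # hidden activations
    pred = h @ W2 + b2            # predicted next-interval rainfall
    err = pred - y
    # Backpropagate the mean squared error.
    gW2 = h.T @ err / len(y)
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)
    gW1 = X.T @ dh / len(y)
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

print("training MSE:", float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)))
```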

Formal Analysis of Distributed Shared Memory Algorithms

  • Muhammad Atif;Muhammad Adnan Hashmi;Mudassar Naseer;Ahmad Salman Khan
    • International Journal of Computer Science & Network Security
    • /
    • v.24 no.4
    • /
    • pp.192-196
    • /
    • 2024
  • The memory coherence problem occurs when mapping shared virtual memory onto a loosely coupled multiprocessor setup. Memory is considered coherent if a read operation returns the same data that was written by the last write operation. The problem has been addressed in the literature with different algorithms, and the big question is the correctness of such a distributed algorithm. Formal verification is the umbrella term for a group of techniques that use mathematically based analysis to establish the correctness of hardware or software behavior, in contrast to dynamic verification techniques. This paper uses the UPPAAL model checker to model the dynamic distributed algorithm for shared virtual memory given by K. Li and P. Hudak. We analyse how the algorithm keeps memory coherent on every read and write operation. Our results show that the dynamic distributed algorithm for shared virtual memory partially fulfils its functional requirements.
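
The coherence property at the heart of the entry above, that a read must return the value of the most recent write to the same page, can be stated without any particular model checker. The sketch below is not the Li-Hudak algorithm or a UPPAAL model; it is only an assumed Python check of that property over an execution trace of read/write events.

```python
# Hedged sketch: checking the memory-coherence property over an execution trace.
# A trace is a sequence of (operation, page, value) events in the order they took
# effect; this is an illustrative check, not the Li-Hudak protocol itself.
from typing import Iterable, Tuple

Event = Tuple[str, int, int]  # ("write" | "read", page, value)

def is_coherent(trace: Iterable[Event]) -> bool:
    """Return True if every read observes the value of the last write to its page."""
    last_written = {}
    for op, page, value in trace:
        if op == "write":
            last_written[page] = value
        elif op == "read":
            if page in last_written and last_written[page] != value:
                return False
    return True

# A coherent trace, and one that violates coherence on page 0.
ok = [("write", 0, 1), ("read", 0, 1), ("write", 0, 2), ("read", 0, 2)]
bad = [("write", 0, 1), ("read", 0, 3)]
print(is_coherent(ok), is_coherent(bad))  # True False
```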

Area-to-Area Poisson Kriging and Spatial Bayesian Analysis in Mapping of Gastric Cancer Incidence in Iran

  • Asmarian, Naeimehossadat;Jafari-Koshki, Tohid;Soleimani, Ali;Ayatollahi, Seyyed Mohammad Taghi
    • Asian Pacific Journal of Cancer Prevention
    • /
    • v.17 no.10
    • /
    • pp.4587-4590
    • /
    • 2016
  • Background: Gastric cancer has the highest incidence among gastrointestinal cancers in many countries and is the second most common cancer in Iran. The aim of this study was to identify and map high-risk gastric cancer regions at the county level in Iran. Methods: We analyzed gastric cancer data for Iran for the years 2003-2010. Area-to-area Poisson kriging and Besag, York and Mollie (BYM) spatial models were applied to smooth the standardized incidence ratios of gastric cancer for the 373 counties surveyed in this study. The two methods were compared in terms of accuracy and precision in identifying high-risk regions. Results: The highest smoothed standardized incidence ratio (SIR) according to area-to-area Poisson kriging was in Meshkinshahr county in Ardabil province in north-western Iran (2.4, SD=0.05), while the highest smoothed SIR according to the BYM model was in Ardabil, the capital of that province (2.9, SD=0.09). Conclusion: Both mapping methods, area-to-area Poisson kriging and BYM, showed the gastric cancer incidence rate to be highest in northern and north-western Iran. However, area-to-area Poisson kriging was more precise than the BYM model and required less smoothing. According to these results, preventive measures and treatment programs should be focused on particular counties of Iran.
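
Both smoothing methods in the entry above operate on standardized incidence ratios, which are simply observed cases divided by expected cases per county. The snippet below illustrates only that raw SIR computation with a crude global-rate expectation and made-up counts; it does not reproduce the paper's area-to-area Poisson kriging or BYM smoothing.

```python
# Illustrative computation of raw standardized incidence ratios (SIR) per county.
# County names and counts are made up; the paper's kriging/BYM smoothing is not shown.
counties = {
    # name: (observed_cases, population)
    "County A": (120, 250_000),
    "County B": (40, 180_000),
    "County C": (75, 90_000),
}

total_cases = sum(obs for obs, _ in counties.values())
total_pop = sum(pop for _, pop in counties.values())
overall_rate = total_cases / total_pop  # crude global incidence rate

for name, (obs, pop) in counties.items():
    expected = overall_rate * pop   # expected cases under the global rate
    sir = obs / expected            # SIR > 1 means higher-than-average incidence
    print(f"{name}: SIR = {sir:.2f}")
```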

The History of Volcanic Hazard Map (화산위험지도의 역사)

  • Yun, Sung-Hyo;Chang, Cheolwoo;Ewert, John W.
    • The Journal of the Petrological Society of Korea
    • /
    • v.27 no.1
    • /
    • pp.49-66
    • /
    • 2018
  • Volcano hazard mapping became a focus of scientific inquiry in the 1960s. Dwight Crandell and Don Mullineaux pioneered the geologic-history approach to hazard mapping, built on the concept that the past is the key to the future. The 1978 publication of the Mount St. Helens hazards assessment, which forecast an eruption in the near future, followed by the large eruption in 1980, demonstrated the utility of volcano hazards assessments and triggered huge growth in this area of volcano science. Numerical models of hazardous processes began to be developed and used to identify hazardous areas in the 1980s and have proliferated since the late 1990s. Model outputs are most useful and accurate when they are constrained by geological knowledge of the volcano. Volcanic hazard maps can be broadly categorized into those that portray long-term, unconditional volcanic hazards (maps showing all areas with some degree of hazard) and those that are developed during an unrest or eruption crisis and take into account current monitoring, observation, and forecast information.

The comparative study of PKNU2 Image and Aerial photo & satellite image

  • Lee, Chang-Hun;Choi, Chul-Uong;Kim, Ho-Yong;Jung, Hei-Chul
    • Proceedings of the KSRS Conference
    • /
    • 2003.11a
    • /
    • pp.453-454
    • /
    • 2003
  • Most research data used for digital mapping and digital elevation model (DEM) studies in the fields of remote sensing and aerial photogrammetry are aerial photographs and satellite images. They are also used for national land mapping, national land management, environmental management, military purposes, resource exploration, Earth surface analysis, and so on. Although aerial photographs have high resolution, they are not well suited to environmental monitoring that requires continuous observation, because of acquisition problems and because they are single-spectral and acquired only at long intervals. Satellite images, in turn, are difficult to interpret precisely because they are influenced by atmospheric conditions at the time of imaging and have far lower resolution than existing aerial photographs, although their multispectral nature gives them great practical usability. The PKNU 2 is an aerial photography system designed to compensate for the weaknesses of existing aerial photographs and satellite images. It can capture very high-resolution images using a 6-megapixel color digital camera and a color-infrared camera, and it can take vertical photographs because the system includes equipment that keeps the cameras level. Moreover, image acquisition is inexpensive because an ultralight aircraft is used as the platform, and the resolution is much higher than that of existing aerial photographs and satellite images because the aircraft flies at a low altitude of about 800 m. The PKNU 2 can obtain multispectral images from the visible to the near-infrared bands, which makes it well suited to environmental management and to producing vegetation classification maps.

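Since the PKNU 2 entry above highlights visible-to-near-infrared multispectral imagery for vegetation classification, a common first step with such data is a normalized difference vegetation index (NDVI). The sketch below uses synthetic band arrays and is only an assumed illustration; the paper itself does not describe an NDVI workflow.

```python
# Illustrative NDVI computation from red and near-infrared bands.
# The band arrays here are synthetic; real imagery would be loaded from files.
import numpy as np

red = np.array([[0.10, 0.20], [0.30, 0.25]])   # red reflectance
nir = np.array([[0.60, 0.55], [0.35, 0.70]])   # near-infrared reflectance

# NDVI = (NIR - Red) / (NIR + Red); values near 1 indicate dense vegetation.
ndvi = (nir - red) / (nir + red)

# A simple threshold gives a crude vegetation mask for classification maps.
vegetation_mask = ndvi > 0.4
print(ndvi.round(2))
print(vegetation_mask)
```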

Proposal of ISMS-P-based outsourcing service management method through security control business relevance analysis (보안관제 업무 연관성 분석을 통한 ISMS-P 기반의 외주용역 관리 방법 제안)

  • Ko, Dokyun;Park, Yongsuk
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.26 no.4
    • /
    • pp.582-590
    • /
    • 2022
  • As security threats caused by cyber attacks continue, security control is mainly operated as a specialized service business for rapid detection and response, and a number of studies have been conducted on the operation of security control services. However, because existing research has focused on management outcomes, indicators, and measurement, the work process itself has not been studied in detail; this causes confusion in the field and makes it difficult to respond to security incidents. This paper presents an ISMS-P-based service management method and proposes an outsourcing service management method that is easy for clients to apply, by turning each item derived from mapping the 64 ISMS-P protection requirements through business relevance analysis into a checklist. The approach is also expected to help organizations implement periodic security compliance, acquire and renew ISMS-P certification in the mid to long term, and enhance the security awareness of related personnel.
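
The checklist approach described in the entry above, mapping ISMS-P protection-requirement items to outsourced security-control tasks and ticking them off, can be represented with a very small data structure. The item identifiers and task descriptions below are hypothetical placeholders, not the paper's actual mapping of the 64 ISMS-P requirements.

```python
# Hedged sketch of a checklist derived from mapping ISMS-P items to outsourcing tasks.
# Item identifiers and task descriptions are hypothetical, not taken from the paper.
from dataclasses import dataclass, field

@dataclass
class ChecklistItem:
    ismsp_item: str          # placeholder protection-requirement identifier
    outsourcing_task: str    # related security-control outsourcing activity
    checked: bool = False

@dataclass
class OutsourcingChecklist:
    items: list = field(default_factory=list)

    def check(self, ismsp_item: str) -> None:
        for item in self.items:
            if item.ismsp_item == ismsp_item:
                item.checked = True

    def compliance_rate(self) -> float:
        return sum(i.checked for i in self.items) / len(self.items)

checklist = OutsourcingChecklist([
    ChecklistItem("ITEM-01", "Define security requirements in the outsourcing contract"),
    ChecklistItem("ITEM-02", "Review outsourced monitoring reports periodically"),
])
checklist.check("ITEM-01")
print(f"compliance: {checklist.compliance_rate():.0%}")
```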

Analysis of Abroad Mid- to Long-Term R&D Themes and Market Information in the Geological Information and Mineral Resources Fields (지질정보 및 광물자원 분야 국외 중장기 연구개발 주제 및 시장정보 분석)

  • Ahn, Eun-Young
    • Economic and Environmental Geology
    • /
    • v.52 no.6
    • /
    • pp.637-645
    • /
    • 2019
  • With the transformation to an intelligent information society, rapid changes in our lives and environment are expected. The Ministry of Science and ICT (MSIT) and the National Research Council of Science and Technology (NST) introduced five-year planning and evaluation of government-supported research institutes based on a mid- to long-term perspective. This study gathers international benchmarking information covering industry, academia, and research by collecting mid- and long-term strategy reports from public research institutes, surveying experts at overseas universities and research institutes, and analyzing overseas market information reports. The British Geological Survey (BGS), the U.S. Geological Survey (USGS), and the Japanese geological survey-related institutes (AIST-GSJ) plan for three-dimensional national geological information, prediction of geological and environmental disasters, and development of metals and materials that are important in the low-carbon economic transformation and in the era of the Fourth Industrial Revolution. The surveys of overseas experts, such as those at IPGP-CNRS, emphasize basic and public research on geological information in mid- and long-term programs. Market analysis of the mining automation and digital map sectors identified the fields in which the market expects public research institutes to play a role, such as data collection on land and in the air, mobile or three-dimensional information production, smooth/fast/real-time maps, custom map design, mapping support for various platforms, and geological environmental risk assessment and disaster management information and maps.

Searle's Conception of Social Reality and the Problem of Freestanding Y Terms (설의 사회적 실재와 '비대응 Y항' 문제)

  • Noh, Yang-jin
    • Journal of Korean Philosophical Society
    • /
    • v.141
    • /
    • pp.43-62
    • /
    • 2017
  • The main purpose of this paper is to survey the debates between Searle and Smith over the problem of "freestanding Y terms" in Searle's conception of social reality, and to offer a viable solution drawing on the experientialist conception of symbolic experience. Smith raises against Searle's formula "X counts as Y in C" the problem of "freestanding Y terms": there may be cases where we cannot identify an X term to which a Y term refers. In the case of an abstract concept such as equity, we may not find exactly what it stands for; that is, we cannot identify exactly what (the X term) counts as equity. If there is nothing like an X for a Y term, we can regard anything as equity, which may disrupt Searle's formula. Understandably, Smith does not say that the problem dismantles Searle's whole conception of social reality. Instead, Smith intends to show that Searle's formula is neither complete nor specific enough. Apparently, Searle admits that there may be freestanding Y terms and tries to articulate them within his formula, which does not seem to work. I suggest that the experientialist account of symbolic experience may serve to dissolve Smith's challenge without modifying Searle's original formula. According to the experientialist conception of symbolization, we symbolically map some portion of our experience onto a physical object, which serves as a signifier, and we then understand and experience the signifier "in terms of" the mapped portion of experience. Thus, we experience certain buildings and the relevant people, say students, staff, and professors, in terms of "university." The status functions of a university have been created by means of symbolic mappings, which change the way we understand and experience the buildings and people. In this picture, there need not be any notion such as a "one-to-one correspondence" between X terms and Y terms. In this way, Searle may maintain his original formula while dissolving, rather than answering, Smith's challenge. What Searle needs is a more appropriate theory of symbolization, part of which has been articulated by the experientialist account of symbolic experience.

Standardization and Management of Interface Terminology regarding Chief Complaints, Diagnoses and Procedures for Electronic Medical Records: Experiences of a Four-hospital Consortium (전자의무기록 표준화 용어 관리 프로세스 정립)

  • Kang, Jae-Eun;Kim, Kidong;Lee, Young-Ae;Yoo, Sooyoung;Lee, Ho Young;Hong, Kyung Lan;Hwang, Woo Yeon
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.22 no.3
    • /
    • pp.679-687
    • /
    • 2021
  • The purpose of the present study was to document the standardization and management process for interface terminology regarding chief complaints, diagnoses, and procedures, including surgery, in a four-hospital consortium. The process was proposed, discussed, modified, and finalized in 2016 by the Terminology Standardization Committee (TSC), consisting of personnel from the four hospitals. A request regarding interface terminology was classified into one of four categories: 1) registration of a new term, 2) revision, 3) deletion of an old term and registration of a new term, and 4) deletion. A request was processed in the following order: 1) collecting testimony from the related departments and 2) voting by the TSC, with approval requiring at least five of the seven possible members of the voting pool. Mapping to the reference terminology was performed by three independent medical information managers. All processes were performed online, and the voting and mapping results were collected automatically. This made the decision-making process clear and fast, and it also made users receptive to the decisions of the TSC. In the 16 months after the process was adopted, there were 126 registrations of new terms, 131 revisions, 40 deletions of an old term with registration of a new term, and 1,235 deletions.
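
The decision rule in the entry above is concrete: a terminology request falls into one of four categories and passes when at least five of the seven possible voting-pool members approve. The sketch below expresses just that rule as stated in the abstract; the class and function names are assumptions, not the consortium's actual system.

```python
# Hedged sketch of the terminology-request rule described in the abstract:
# four request categories, approval when at least 5 of the 7-member pool vote yes.
from enum import Enum

class RequestType(Enum):
    REGISTER_NEW = "registration of a new term"
    REVISE = "revision"
    DELETE_AND_REGISTER = "delete an old term and register a new term"
    DELETE = "deletion"

def is_approved(yes_votes: int, pool_size: int = 7, threshold: int = 5) -> bool:
    """A request passes when at least `threshold` of the voting pool approve it."""
    if not 0 <= yes_votes <= pool_size:
        raise ValueError("vote count outside the voting pool")
    return yes_votes >= threshold

request = RequestType.REGISTER_NEW
print(request.value, "->", "approved" if is_approved(yes_votes=6) else "rejected")
```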

Photon Mapping-Based Rendering Technique for Smoke Particles (연기 파티클에 대한 포톤 매핑 기반의 렌더링 기법)

  • Song, Ki-Dong;Ihm, In-Sung
    • Journal of the Korea Computer Graphics Society
    • /
    • v.14 no.4
    • /
    • pp.7-18
    • /
    • 2008
  • To realistically produce fluids such as smoke for visual effects in films or animations, two main processes are needed: physics-based modeling of the smoke and rendering of the smoke simulation data based on light transport theory. In the computer graphics community, physics-based fluid simulation is generally adopted for smoke modeling. Recently, interest in particle-based Lagrangian simulation methods has been increasing because of their advantages at simulation time, in place of the grid-based Eulerian simulation methods that were widely used. Because the smoke rendering technique depends heavily on the modeling method, rendering of particle-based smoke data remains challenging, while research on rendering grid-based smoke data is actively in progress. This paper focuses on realistic rendering of the smoke particles produced by a Lagrangian simulation method. It introduces a technique called the particle map, an extension and modification of the photon mapping technique for particle data, then proposes a novel particle map technique and shows its differences and improvements compared with previous work. In addition, the paper presents an irradiance map technique that pre-computes the multiple scattering term of the volume rendering equation to enhance efficiency at rendering time.

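Photon mapping, which the particle-map technique above extends, estimates radiance (or in-scattered light in a volume) by gathering the nearest stored photons around a query point. The code below is a minimal, assumed density-estimation sketch in that spirit, with random synthetic photons; it is not the paper's particle map or irradiance map.

```python
# Minimal sketch of a photon-map style density estimate: gather the k nearest
# stored photons around a query point and average their power over the
# enclosing sphere. Illustrative only; not the paper's particle/irradiance map.
import numpy as np

rng = np.random.default_rng(1)
photon_positions = rng.uniform(-1.0, 1.0, size=(2000, 3))   # stored photon positions
photon_power = np.full(2000, 1.0 / 2000)                     # equal power per photon

def radiance_estimate(query, k=50):
    """Estimate the density of photon power near `query` from its k nearest photons."""
    dists = np.linalg.norm(photon_positions - query, axis=1)
    nearest = np.argsort(dists)[:k]
    radius = dists[nearest].max()                 # radius of the gathering sphere
    volume = (4.0 / 3.0) * np.pi * radius ** 3    # volumetric estimate for participating media
    return photon_power[nearest].sum() / volume

print(radiance_estimate(np.array([0.0, 0.0, 0.0])))
```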