• Title/Summary/Keyword: 맵핑 규칙 (mapping rules)


The study related to the meta data for the attribute mapping from IFC to CityGML (IFC에서 CityGML로 속성 맵핑을 위한 메타 데이터에 관한 연구)

  • Kang, Tae Wook;Choi, Hyun Sang;Hwang, Jung Rae;Hong, Chang Hee
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.30 no.6_1 / pp.559-565 / 2012
  • The purpose of the present study is to propose metadata for attribute mapping to support interoperability from IFC to CityGML. To this end, we analyzed interoperability issues in neutral information model structures such as IFC and CityGML. To solve the interoperability problem between the IFC model and CityGML, the neutral GIS format, we proposed metadata that includes mapping rules. The metadata consists of the connection information between the BIM and GIS models and mapping rules defined from the perspectives of the use case, the operator, and the attribute. Based on this, an XML representation of the metadata is defined and an information mapping system is developed.
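
The abstract states that the mapping metadata is expressed in XML but does not reproduce the schema, so the following is only a minimal Python sketch of what a use-case/operator/attribute mapping rule from an IFC entity to a CityGML feature might look like. Every element name and sample value (MappingMetadata, MappingRule, IfcWallStandardCase, bldg:WallSurface, and so on) is an assumption for illustration, not the schema proposed by the authors.

```python
# Illustrative sketch only: a made-up XML layout for IFC-to-CityGML mapping metadata.
# Element names and the sample rule are assumptions, not the paper's schema.
import xml.etree.ElementTree as ET

meta = ET.Element("MappingMetadata")
link = ET.SubElement(meta, "ModelLink")                  # connection between BIM and GIS models
ET.SubElement(link, "BimModel").text = "building.ifc"
ET.SubElement(link, "GisModel").text = "city.gml"

rule = ET.SubElement(meta, "MappingRule", {"useCase": "facility-management"})
ET.SubElement(rule, "Source").text = "IfcWallStandardCase"   # IFC entity (assumed example)
ET.SubElement(rule, "Target").text = "bldg:WallSurface"      # CityGML feature (assumed example)
op = ET.SubElement(rule, "Operator", {"name": "copyAttribute"})
ET.SubElement(op, "SourceAttribute").text = "Name"
ET.SubElement(op, "TargetAttribute").text = "gml:name"

print(ET.tostring(meta, encoding="unicode"))
```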

Spatial Database Schema Modeling Using GML Application Schema (GML 응용스키마를 이용한 공간데이터베이스 스키마 모델링)

  • 정호영;이민우;전우제;박수홍
    • Proceedings of the Korean Association of Geographic Information Studies Conference / 2003.11a / pp.30-39 / 2003
  • GML data combines the characteristics of GIS data, which hold both spatial and non-spatial information, with those of structured XML data, so it is difficult to store in a general-purpose DBMS. Databases that can store XML lack spatial data processing capabilities, while spatial databases have difficulty storing XML data. This study proposes a method for modeling a spatial database schema from a GML application schema so that GML data can be stored in a spatial database. To this end, mapping rules for GML schemas were defined using composite attributes and abstract data types (ADTs), which are characteristic features of object-relational databases. The mapping rules store the spatial and non-spatial information of GML data separately, in conformance with the OGC SQL schema. As a result, the stored data can be queried quickly with a variety of spatial and non-spatial queries through the spatial operators/functions and indexes provided by the spatial database.

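
As a rough illustration of the kind of mapping the paper defines, the sketch below separates a GML-like feature's geometry from its non-spatial attributes and emits an OGC Simple Features style table definition plus an insert statement that stores the geometry as WKT. The feature structure, table layout, and column names are assumptions for illustration, not the mapping rules from the paper.

```python
# Illustrative sketch: split a GML-like feature into spatial and non-spatial parts
# and map it onto an OGC Simple Features style SQL schema (all names assumed).
feature = {
    "name": "Road_117",
    "laneCount": 2,                              # non-spatial attributes
    "geometry": ("LINESTRING", [(0.0, 0.0), (10.0, 5.0), (20.0, 5.0)]),
}

def to_wkt(geom):
    kind, coords = geom
    points = ", ".join(f"{x} {y}" for x, y in coords)
    return f"{kind}({points})"

ddl = """CREATE TABLE road (
    name      VARCHAR(64),
    lanecount INTEGER,
    geom      GEOMETRY     -- spatial column handled by the spatial DBMS
);"""

insert = (
    "INSERT INTO road (name, lanecount, geom) VALUES "
    f"('{feature['name']}', {feature['laneCount']}, "
    f"ST_GeomFromText('{to_wkt(feature['geometry'])}'));"
)

print(ddl)
print(insert)
```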

Scan-to-Geometry Mapping Rule Definition for Building Plane Reverse engineering Automation (건축물 평면 형상 역설계 자동화를 위한 Scan-to-Geometry 맵핑 규칙 정의)

  • Kang, Tae-Wook
    • Journal of KIBIM / v.9 no.2 / pp.21-28 / 2019
  • Recently, scan projects for maintenance and construction have been increasing. Scan data contains useful information about facilities and spaces that can be turned into data for a target application. However, modeling the scan data required for such applications is costly; for example, converting the 3D point cloud obtained from a scan into 3D objects is time-consuming, and the modeling work is still largely manual. This research proposes a Scan-to-Geometry Mapping Rule Definition (S2G-MD) that maps point cloud data to geometry for irregular building plane objects. S2G-MD takes user use-case variability into account, and a method for defining the scan-to-geometry mapping rules is proposed. This research supports a semi-automatic reverse engineering process for building planar geometry from the user's perspective.
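
The abstract does not spell out how a mapping rule turns points into plane geometry, so the following is only a minimal sketch of one building block such a rule could rely on: a least-squares plane fit over a point cluster, with a residual tolerance that could act as a rule condition. The tolerance value and function names are assumptions, not the S2G-MD rules themselves.

```python
# Illustrative sketch: fit a plane to a point cluster, a primitive step that a
# scan-to-geometry mapping rule could build on (tolerance values are assumed).
import numpy as np

def fit_plane(points):
    """Return (centroid, unit normal) of the best-fit plane through an (N, 3) array."""
    centroid = points.mean(axis=0)
    # The singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def is_planar(points, tol=0.02):
    """Rule-style condition: accept the cluster as a plane if residuals stay under tol."""
    centroid, normal = fit_plane(points)
    residuals = np.abs((points - centroid) @ normal)
    return bool(residuals.max() < tol)

# Noisy samples of a nearly flat wall segment (synthetic data).
rng = np.random.default_rng(0)
pts = np.c_[rng.uniform(0, 3, 200), rng.uniform(0, 2.5, 200), rng.normal(0, 0.003, 200)]
print(is_planar(pts))   # True for this synthetic, nearly planar cluster
```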

Three-Dimensional Direction Code Patterns for Hand Gesture Recognition (손동작인식을 위한 3차원 방향 코드 패턴)

  • Park, Jung-Hoo;Kim, Young-Ju
    • Proceedings of the Korean Society of Computer Information Conference / 2013.07a / pp.21-22 / 2013
  • This paper proposes a method for detecting feature patterns in which the feature values required for gesture recognition are encoded as three-dimensional direction codes. Line segments are formed between the detected data coordinates, and feature inflection points are extracted using the sum of the angles between those segments. After line segments are generated between the extracted inflection points, each segment is mapped to a 24-direction code that merges the 8-direction code with a depth value. The mapped direction codes are assembled into a single pattern. To remove directional noise that is unnecessary for recognition, rule-based filtering is applied to the generated pattern, and a filtered pattern is extracted. Compared with the '8-direction pattern using banner codes' (배너코드를 이용한 8방향 패턴), the proposed method was confirmed to extract more effective patterns.

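
The abstract describes merging an 8-direction planar code with a depth value to obtain a 24-direction code, but the exact quantization is not given. The sketch below shows one plausible reading (8 angular bins x 3 depth states = 24 codes); the bin boundaries and the depth threshold are assumptions for illustration, not the encoding used in the paper.

```python
# Illustrative sketch: combine an 8-direction planar code with a 3-state depth code
# into a 24-value direction code (binning scheme and threshold are assumed).
import math

def direction_code(p0, p1, depth_eps=5.0):
    dx, dy, dz = (b - a for a, b in zip(p0, p1))
    # 8 planar directions: 45-degree sectors around the segment's XY angle.
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    planar = int((angle + 22.5) // 45) % 8
    # 3 depth states: toward the sensor, flat, away from it (threshold assumed).
    depth = 0 if dz > depth_eps else (1 if dz >= -depth_eps else 2)
    return depth * 8 + planar          # codes 0..23

# Direction codes along a short hand trajectory (x, y, z in arbitrary units).
track = [(0, 0, 0), (10, 2, 1), (18, 12, -9), (18, 25, -20)]
print([direction_code(a, b) for a, b in zip(track, track[1:])])
```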

The External BIM Reference Model Suggestion for Interoperability Between BIM and GIS (BIM과 GIS간 정보상호운용을 위한 외부 BIM 참조 모델 제안)

  • Kang, Tae Wook;Hong, Chang Hee;Hwang, Jung Rae;Choi, Hyun Sang
    • Spatial Information Research / v.20 no.5 / pp.91-98 / 2012
  • The purpose of the present study is to propose an external BIM reference model for interoperability between BIM and GIS. To do this, we surveyed the research progress and use cases related to interoperability, then analyzed the architectures of neutral models such as IFC and CityGML to identify their differences and to extend the CityGML model. Based on these results, we proposed an external BIM reference model that includes metadata defining mapping rules from IFC to CityGML.
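
The abstract summarizes the reference model at a high level, so the sketch below only illustrates the general idea it implies: a GIS-side city object keeps a lightweight reference to its external BIM (IFC) source and to the mapping metadata, rather than embedding the BIM data itself. The field names and sample values are assumptions, not the model defined in the paper.

```python
# Illustrative sketch: a GIS feature that references an external BIM source instead of
# embedding it (field names and sample values are assumed, not the paper's model).
from dataclasses import dataclass

@dataclass
class ExternalBimReference:
    ifc_file: str        # external BIM model file
    ifc_guid: str        # globally unique id of the referenced IFC entity
    mapping_rules: str   # metadata document defining the IFC-to-CityGML mapping

@dataclass
class CityObject:
    gml_id: str
    bim_ref: ExternalBimReference

obj = CityObject(
    gml_id="BLDG_0001",
    bim_ref=ExternalBimReference("towerA.ifc", "2O2Fr$t4X7Zf8NOew3FLOH", "ifc2citygml_rules.xml"),
)
print(obj)
```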

PRAiSE: A Rule-based Process-centered Software Engineering Environment (PRAiSE : 규칙 기반 프로세스 중심 소프트웨어 공학 환경)

  • Lee, Hyung-Won;Lee, Seung-Iin
    • Journal of KIISE: Computing Practices and Letters / v.11 no.3 / pp.246-256 / 2005
  • The rule-based paradigm is one of the principal approaches to software process modeling and enaction, as it provides formality and flexibility sufficient to handle complex processes. However, in systems adopting the rule-based paradigm, process models are hard to define and understand, and when the process language changes the inference engine must be modified or, at worst, redeveloped. In this paper, we describe PRAiSE, a rule-based PSEE (Process-Centered Software Engineering Environment) that resolves these limitations of existing rule-based PSEEs while retaining the merits of the rule-based paradigm, such as the ability to flexibly accommodate the nature of software processes, in which dynamic change and parallelism are pervasive. PRAiSE provides RAiSE, a graphical process modeling language, and the defined process models are interpreted and enacted by a process engine implemented using CLIPS, a rule-based expert system tool.
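
RAiSE models and the CLIPS rule base are not shown in the abstract, so the following is only a generic, minimal forward-chaining loop in Python that conveys how a rule-based engine can enact a process model: a rule fires when its condition holds over the current process state and updates that state. The rules and state fields are invented for illustration and are not PRAiSE itself.

```python
# Illustrative sketch of forward chaining for process enaction, not PRAiSE/CLIPS.
# Each rule fires when its condition holds and applies its effect to the process state.
state = {"design_done": False, "code_done": False, "review_done": False}

rules = [
    ("do_design", lambda s: not s["design_done"],                    {"design_done": True}),
    ("do_coding", lambda s: s["design_done"] and not s["code_done"], {"code_done": True}),
    ("do_review", lambda s: s["code_done"] and not s["review_done"], {"review_done": True}),
]

fired = True
while fired:                      # keep firing until no rule is applicable
    fired = False
    for name, condition, effect in rules:
        if condition(state):
            print("enacting:", name)
            state.update(effect)
            fired = True
            break                 # re-evaluate all rules after each firing

print(state)
```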

Geometry-to-BIM Mapping Rule Definition for Building Plane BIM object (건축물 평면 형상에 대한 형상-to-BIM 맵핑 규칙 정의)

  • Kang, Tae-Wook
    • Journal of the Korea Academia-Industrial cooperation Society / v.20 no.10 / pp.236-242 / 2019
  • Recently, scanning projects have been carried out in various architecture and construction fields for maintenance purposes. The point cloud generated by a scan is composed of a large number of points representing the scanned object. The process of extracting the necessary information, including dimensions, from such scan data is reverse engineering. The reverse engineering process of modeling a point cloud as BIM involves considerable manual work; because the work is so time-consuming, costs increase exponentially when rework is requested, such as after design changes. Reverse engineering automation technology can help mitigate these problems. On the other hand, the reverse engineering deliverable varies with its intended use, and the kind and level of detail of the product may differ. This paper proposes the G2BM (Geometry-to-BIM mapping) rule definition method, which automatically maps primitive geometry to BIM objects. G2BM provides a process definition and a customization method for reverse-engineering BIM objects that take use-case variability into account.
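
The G2BM rules themselves are not listed in the abstract, so the sketch below only illustrates the general pattern such a rule set could follow: conditions on a primitive's orientation and dimensions decide which BIM object type it is mapped to. The thresholds and the type names chosen here are assumptions for illustration, not the rules defined in the paper.

```python
# Illustrative sketch: rule-style mapping of a planar primitive to a BIM object type
# (thresholds and categories are assumed, not the paper's G2BM rules).
def map_plane_to_bim(normal, width, height, thickness):
    nx, ny, nz = normal
    if abs(nz) > 0.9:                       # roughly horizontal plane
        return "IfcSlab" if width > 1.0 and height > 1.0 else "IfcPlate"
    if abs(nz) < 0.1:                       # roughly vertical plane
        if thickness < 0.05 and width > 0.6 and height > 1.8:
            return "IfcDoor-candidate"      # thin, opening-sized vertical panel
        return "IfcWall"
    return "IfcBuildingElementProxy"        # fallback for anything unclassified

print(map_plane_to_bim(normal=(0.02, 0.01, 0.99), width=4.0, height=3.0, thickness=0.2))  # IfcSlab
print(map_plane_to_bim(normal=(0.99, 0.05, 0.02), width=5.0, height=2.7, thickness=0.2))  # IfcWall
```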

An Exploratory Study on Linking ISAD(G) and CIDOC CRM Using KARMA (KARMA를 활용한 ISAD(G)와 CIDOC CRM 연계에 관한 탐색적 연구)

  • Park, Zi-young
    • Journal of Korean Society of Archives and Records Management / v.18 no.2 / pp.189-214 / 2018
  • Archival description is considered a creation and curation process, and the resulting descriptive records can be used for archival information services. Therefore, various archival description standards provide essential guidelines for establishing a semantic and synthetic structure for archival records. In this study, the structural aspects of archival description standards were analyzed, and an experimental mapping was performed between the General International Standard Archival Description (ISAD(G)), the archival standard, and the CIDOC Conceptual Reference Model (CIDOC CRM), the domain ontology of the cultural heritage field. The data structure of ISAD(G) was examined first, and the mapping was performed using Karma as a platform. It was concluded that there is a need to understand ontology-based mapping methods and event-focused domain ontologies, and that developing a CIDOC CRM-compatible archival ontology and restructuring the legacy ISAD(G) are needed.
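
The element-to-class correspondences produced in the study are not reproduced in the abstract, so the sketch below only illustrates the shape of such a crosswalk: a small lookup table from ISAD(G) description elements to CIDOC CRM properties and classes, applied to one record. The specific correspondences and the sample record are assumptions for illustration, not the mapping produced with Karma.

```python
# Illustrative sketch: a crosswalk from ISAD(G) elements to CIDOC CRM terms.
# The correspondences below are assumed examples, not the study's mapping.
ISADG_TO_CRM = {
    "3.1.2 Title":             ("P102_has_title", "E35_Title"),
    "3.1.3 Dates":             ("P4_has_time-span", "E52_Time-Span"),
    "3.2.1 Name of creator":   ("P14_carried_out_by (via E65_Creation)", "E39_Actor"),
    "3.3.1 Scope and content": ("P3_has_note", "E62_String"),
}

record = {
    "3.1.2 Title": "Board meeting minutes",
    "3.1.3 Dates": "1952-1967",
    "3.2.1 Name of creator": "City Planning Office",
}

for element, value in record.items():
    crm_property, crm_class = ISADG_TO_CRM[element]
    print(f"{element!r} -> {crm_property} -> {crm_class}: {value!r}")
```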

A Study for Rule Integration in Vulnerability Assessment and Intrusion Detection using Meaning Based Vulnerability Identification Method (의미기반 취약점 식별자 부여 기법을 사용한 취약점 점검 및 공격 탐지 규칙 통합 방법 연구)

  • Kim, Hyung-Jong;Jung, Tae-In
    • Journal of the Korea Institute of Information Security & Cryptology / v.18 no.3 / pp.121-129 / 2008
  • This paper presents a meaning-based vulnerability identification method that makes use of the concept of atomic vulnerabilities. We also use the decomposition and specialization processes from DEVS/SES to derive identifiers. This vulnerability representation is useful for managing and removing vulnerabilities in an organized way, and it helps relate vulnerability assessment rules and intrusion detection rules at a lower level, enabling security managers to respond more quickly and conveniently. In particular, the paper shows a mapping between Nessus plugins and Snort rules using the meaning-based vulnerability identification method and lists usages based on three goals that a security officer keeps in mind regarding vulnerabilities. The contribution of this work is the meaning-based vulnerability identification method itself and the demonstration of its use for integrating vulnerability assessment and intrusion detection rules.
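
The identifier scheme itself is not reproduced in the abstract, so the sketch below only illustrates the integration idea it describes: when a Nessus plugin and a Snort rule are tagged with the same meaning-based vulnerability identifier, results from one tool can be cross-referenced to the other. The identifier strings, plugin IDs, and Snort SIDs are invented for illustration.

```python
# Illustrative sketch: join assessment and detection rules through a shared,
# meaning-based vulnerability identifier (all IDs below are invented examples).
nessus_plugins = {
    10862: "svc.smtp.overflow.auth",       # Nessus plugin id -> meaning-based id
    11213: "svc.http.traversal.unicode",
}
snort_rules = {
    "svc.smtp.overflow.auth":     [654, 1447],   # meaning-based id -> Snort SIDs
    "svc.http.traversal.unicode": [981],
}

def related_snort_sids(plugin_id):
    """Find detection rules covering the same vulnerability as an assessment plugin."""
    vuln_id = nessus_plugins.get(plugin_id)
    return snort_rules.get(vuln_id, [])

print(related_snort_sids(10862))   # -> [654, 1447]
```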

Development of Intelligent Job Classification System based on Job Posting on Job Sites (구인구직사이트의 구인정보 기반 지능형 직무분류체계의 구축)

  • Lee, Jung Seung
    • Journal of Intelligence and Information Systems / v.25 no.4 / pp.123-139 / 2019
  • The job classification systems of major job sites differ from site to site and also differ from the job classification system of the SQF (Sectoral Qualifications Framework) proposed for the SW field. Therefore, a new job classification system is needed that SW companies, SW job seekers, and job sites can all understand. The purpose of this study is to establish a standard job classification system that reflects market demand by analyzing SQF based on the job posting information of major job sites and the NCS (National Competency Standards). For this purpose, association analysis is conducted between the occupations listed on major job sites, and association rules between SQF and those occupations are derived. Using these association rules, we propose an intelligent job classification system based on data that maps the job classification systems of major job sites to SQF. First, major job sites are selected to obtain information on the job classification systems used in the SW market. Then we identify ways to collect job information from each site and collect the data through open APIs. Focusing on the relationships in the data, only job postings listed on several job sites at the same time are kept, and the remaining postings are discarded. Next, we map the job classification systems of the sites to one another using the association rules derived from the analysis. After completing the mapping between these market classifications, we discuss it with experts, further map it to SQF, and finally propose a new job classification system. As a result, more than 30,000 job listings were collected in XML format through the open APIs of 'WORKNET', 'JOBKOREA', and 'saramin', the main job sites in Korea. After filtering down to about 900 job postings simultaneously posted on multiple job sites, 800 association rules were derived by applying the Apriori algorithm, a frequent-pattern mining method. Based on these 800 rules, the job classification systems of WORKNET, JOBKOREA, and saramin and the SQF job classification system were mapped and organized into first- through fourth-level classifications. In the new job taxonomy, the first primary class, covering IT consulting, computer systems, networks, and security-related jobs, consisted of three secondary, five tertiary, and five fourth-level classifications. The second primary class, covering databases and jobs related to system operation, consisted of three secondary, three tertiary, and four fourth-level classifications. The third primary class, covering web planning, web programming, web design, and games, consisted of four secondary, nine tertiary, and two fourth-level classifications. The last primary class, covering jobs related to ICT management and computer and communication engineering technology, consisted of three secondary and six tertiary classifications. In particular, the new job classification system has a relatively flexible depth of classification, unlike existing systems: WORKNET divides jobs down to a third level; JOBKOREA divides jobs to a second level and then subdivides them into keywords; saramin likewise divides jobs to a second level and subdivides them into keyword form. The newly proposed standard job classification system accepts some keyword-based jobs and treats some product names as jobs.
In the new classification system, some jobs stop at the second level while others are subdivided down to the fourth level, reflecting the idea that not all jobs can be broken down into the same number of levels. We also combined the rules derived from the collected market data through association analysis with experts' opinions. Therefore, the newly proposed job classification system can be regarded as a data-based, intelligent job classification system that reflects market demand, unlike existing systems. This study is meaningful in that it suggests a new job classification system reflecting market demand by mapping occupations on the basis of data through association analysis rather than relying on the intuition of a few experts. However, the study has the limitation that it cannot fully reflect market demand as it changes over time, because the data were collected at a single point in time. As market demand changes over time, including seasonal factors and the timing of major corporate recruitment, continuous data monitoring and repeated experiments are needed to achieve more accurate matching. The results of this study can be used to suggest directions for improving SQF in the SW industry, and the approach is expected to transfer to other industries building on its success in the SW industry.
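
The abstract names the Apriori algorithm applied to co-posted job categories, but the exact encoding is not given, so the sketch below is only a minimal example of that style of association analysis, using the mlxtend implementation of Apriori on a few invented "transactions" (one per posting, listing the categories it carries on each site). The category labels and thresholds are assumptions.

```python
# Illustrative sketch: Apriori over co-assigned job categories (invented data),
# roughly the style of association analysis described in the abstract.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

# Each "transaction" is one posting with the categories it carries on different sites.
postings = [
    ["worknet:web_dev", "jobkorea:frontend", "saramin:web_programming"],
    ["worknet:web_dev", "jobkorea:frontend"],
    ["worknet:db_admin", "jobkorea:dba", "saramin:db_operation"],
    ["worknet:db_admin", "jobkorea:dba"],
    ["worknet:web_dev", "saramin:web_programming"],
]

te = TransactionEncoder()
onehot = pd.DataFrame(te.fit_transform(postings), columns=te.columns_)

frequent = apriori(onehot, min_support=0.3, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.8)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```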