• Title/Summary/Keyword: mapping algorithm


Assessing the Effects of Climate Change on the Geographic Distribution of Pinus densiflora in Korea using Ecological Niche Model (소나무의 지리적 분포 및 생태적 지위 모형을 이용한 기후변화 영향 예측)

  • Chun, Jung Hwa;Lee, Chang-Bae
    • Korean Journal of Agricultural and Forest Meteorology / v.15 no.4 / pp.219-233 / 2013
  • We employed an ecological niche modeling framework using GARP (Genetic Algorithm for Ruleset Production) to model the current and future geographic distribution of Pinus densiflora, based on environmental predictor datasets including climate data under the RCP 8.5 emission scenario, geographic and topographic characteristics, soil and geological properties, and the MODIS enhanced vegetation index (EVI), at 4 $km^2$ resolution. Occurrence and abundance records derived from the National Forest Inventory (NFI) at about 4,000 survey sites across the whole country were used as response variables. The current and future potential geographic distribution of Pinus densiflora, one of the tree species dominating the present Korean forest, was modeled and mapped. Future models under the RCP 8.5 scenario suggest that large areas predicted suitable under current climate conditions may contract by 2090, with range shifts northward and to higher altitudes. The Area Under Curve (AUC) value of the modeled result was 0.67. Overall, the results of this study successfully showed the current distribution of a major tree species and projected its future changes. However, there are still many possible limitations and uncertainties arising from the selection of the presence-absence data and the environmental predictor variables used as model input. Nevertheless, ecological niche modeling can be a useful tool for exploring and mapping the potential response of tree species to climate change. The final models in this study may be used to identify the potential distribution of the species under future climate scenarios, which can help forest managers decide where to allocate effort in managing forest ecosystems under climate change in Korea.
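
Model discrimination in such presence/absence studies is typically summarized by the AUC value reported above; the following is a minimal sketch of how an AUC is computed from predicted suitability scores, using scikit-learn and invented example data (the actual GARP outputs and NFI records are not reproduced here):

```python
# Hypothetical illustration: AUC of a presence/absence suitability model.
# Labels and scores below are invented; in the paper they would come from
# NFI occurrence records and GARP-predicted habitat suitability.
import numpy as np
from sklearn.metrics import roc_auc_score

y_true = np.array([1, 1, 0, 1, 0, 0, 1, 0])   # 1 = species present at site
y_score = np.array([0.8, 0.6, 0.55, 0.4, 0.35, 0.3, 0.7, 0.5])  # suitability

auc = roc_auc_score(y_true, y_score)
print(f"AUC = {auc:.2f}")  # 0.5 = random, 1.0 = perfect discrimination
```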

Finding Stop Position of Taxis using IoV data and road segment algorithm (IoV 데이터와 도로 분할 알고리즘을 이용한 택시 정차위치 파악)

  • Lim, Dong-jin;Onueam, Athita;Jung, Han-min
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2018.10a / pp.590-592 / 2018
  • Taxis that stop illegally on the road to pick up customers can cause traffic congestion and sometimes traffic accidents. Where to stop is usually determined by the long-term experience of taxi drivers. In this study, we provide information to taxi drivers and first-time visitors by identifying taxi stop positions by time of day. To do this, we used Internet of Vehicles (IoV) data collected from sensors installed in 40 taxis. Previous studies approached this by forming clusters around taxis; since such clusters are centered on a taxi, the position of a cluster changes with the taxi's location. In this study, we use a road segmentation algorithm to solve this problem. Unlike in previous studies, clusters are formed around the road, so their positions are fixed and unaffected by the number of taxis, making it possible to identify stop positions in real time. The road is segmented into 30 m units, and the taxi location data, divided into hourly, weekday, and weekend groups, are mapped to the nearest segment point. As a result of the mapping, it was difficult to see a large difference across days of the week because few taxis operated on weekends; on weekdays, however, a clear difference in stop positions between the commuting and night time zones was confirmed. The results of this study suggest that it could be used to prevent illegal taxi stopping and to propose locations for taxi stands.
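
A minimal sketch of the fixed-segment mapping idea, assuming stop points have already been projected to distances along a road centerline; the coordinates and time slots are invented:

```python
# Hypothetical sketch: bucket taxi stop points into fixed 30 m road segments
# and count stops per segment and time slot. Distances are invented; real
# IoV data would first be projected onto the road centerline.
from collections import Counter

SEGMENT_LEN = 30.0  # meters, as in the paper

# (distance along road in meters, hour of day) for each recorded stop
stops = [(12.0, 8), (25.0, 8), (41.0, 8), (35.0, 22), (38.0, 22), (610.0, 8)]

counts = Counter()
for dist_m, hour in stops:
    segment_id = int(dist_m // SEGMENT_LEN)   # fixed segment, independent of taxis
    counts[(segment_id, hour)] += 1

for (seg, hour), n in sorted(counts.items()):
    lo, hi = seg * SEGMENT_LEN, (seg + 1) * SEGMENT_LEN
    print(f"segment {seg} ({lo:.0f}-{hi:.0f} m), {hour:02d}:00 -> {n} stops")
```

Because the segments are anchored to the road rather than to the taxis, counts from different time slots remain directly comparable.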


A Program Transformational Approach for Rule-Based Hangul Automatic Programming (규칙기반 한글 자동 프로그램을 위한 프로그램 변형기법)

  • Hong, Seong-Su;Lee, Sang-Rak;Sim, Jae-Hong
    • The Transactions of the Korea Information Processing Society / v.1 no.1 / pp.114-128 / 1994
  • It is very difficult for a nonprofessional programmer in Korea to write a program in a very high level language such as V, REFINE, GIST, or SETL, because the semantic primitives of these languages are based on predicate calculus, sets, mappings, or restricted natural language, and it takes time to become familiar with them. In this paper, we suggest a method to reduce such difficulties by programming with declarative, procedural, and aggregate constructs, and we design and implement an experimental knowledge-based automatic programming system called HAPS (Hangul Automatic Program System). The input of HAPS is a specification such as a Hangul abstract algorithm and data type or Hangul procedural constructs, and its output is a C program. The method of operation is based on rule-based program transformation techniques, and the problem domain is general. The control structure of HAPS accepts the program specification, transforms it according to the proper rules in the rule base, and stores the transformed specification in the global database. HAPS repeats this procedure until the target C program is fully constructed.
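
As a rough illustration of the transform-until-done control structure described above, here is a toy rule-based rewriter; the rules and the specification syntax are invented stand-ins for HAPS's Hangul rule base:

```python
# Hypothetical sketch of a rule-based program transformer: repeatedly
# rewrite the specification until no rule applies (a fixed point), at
# which point the target C-like form has been constructed.
import re

RULES = [
    (r"set (\w+) to (\w+)", r"\1 = \2;"),           # declarative -> C assignment
    (r"print (\w+)", r'printf("%d\\n", \1);'),      # construct -> C library call
]

def transform(spec: str) -> str:
    changed = True
    while changed:                      # repeat until no rule fires
        changed = False
        for pattern, replacement in RULES:
            new_spec = re.sub(pattern, replacement, spec)
            if new_spec != spec:
                spec, changed = new_spec, True
    return spec

print(transform("set x to 3 print x"))  # -> x = 3; printf("%d\n", x);
```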


Change Reconciliation on XML Repetitive Data (XML 반복부 데이터의 변경 협상 방법)

  • Lee Eunjung
    • The KIPS Transactions:PartA / v.11A no.6 / pp.459-468 / 2004
  • Sharing XML trees on mobile devices has become more and more popular. Optimistic replication of XML trees for mobile devices raises the need to reconcile concurrently modified data. In particular, reconciling modified tree structures requires comparing trees by node mapping, which takes O($n^2$) time, and semantics-based conflict resolution policies are often discussed in the literature. In this research, we focus on an efficient reconciliation method for mobile environments, using edit scripts of the XML data sent from each device. To obtain a simple model for mobile devices, we use an XML list data sharing model, which allows inserting and deleting subtrees only in the repetitive parts of the tree, based on the document type. We also use keys for repetitive-part subtrees; keys are unique among nodes with the same parent. This model not only guarantees that an edit action always results in a valid tree but also allows a linear-time reconciliation algorithm based on key-based list reconciliation. The algorithm proposed in this paper takes time linear in the length of the edit scripts, assuming there is no insertion key conflict. Since previous methods take time linear in the size of the tree, the proposed method is expected to provide a more efficient reconciliation model in the mobile environment.
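
A minimal sketch of key-based reconciliation of two edit scripts, under the paper's no-insertion-key-conflict assumption; the keys and operations are invented, and subtree ordering is ignored for brevity:

```python
# Hypothetical sketch: merge two edit scripts (insert/delete of keyed
# subtrees) over a replicated keyed list, in time linear in the total
# script length. Assumes no two devices insert the same key.
base = ["a", "b", "c"]                      # keys of repetitive-part subtrees
script1 = [("del", "b"), ("ins", "x")]      # edits from device 1
script2 = [("ins", "y"), ("del", "c")]      # edits from device 2

deleted, inserted = set(), []
for op, key in script1 + script2:           # one linear pass over both scripts
    if op == "del":
        deleted.add(key)
    else:
        inserted.append(key)

merged = [k for k in base if k not in deleted] + inserted
print(merged)  # ['a', 'x', 'y']
```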

Robust Vision Based Algorithm for Accident Detection of Crossroad (교차로 사고감지를 위한 강건한 비젼기반 알고리즘)

  • Jeong, Sung-Hwan;Lee, Joon-Whoan
    • The KIPS Transactions:PartB / v.18B no.3 / pp.117-130 / 2011
  • The purpose of this study is to provide a better way to detect crossroad accidents, including an efficient method of producing background images that accounts for object movement and of preserving or deleting candidate accident regions. A prior study proposed using the traffic signal interval within the crossroad to detect accidents, but it can fail to detect an accident if an object occludes the accident site. This study adopted inverse perspective mapping to normalize object scale, and proposed methods for producing background images robust to surrounding noise, generating candidate accident regions from object movement information, and using edge information to preserve or delete candidate accident regions. To measure the performance of the proposed algorithm, a variety of traffic images were recorded and used for experiments (e.g., rush-hour footage from DVRs installed at crossroads, accident footage recorded by day and on rainy nights, and footage including surrounding noise from lighting and shadows). As a result, accidents were detected in all 20 experimental cases, and the effective accident detection rate averaged 76.9%. In addition, the image processing rate ranged from 10~14 frames/sec depending on the area of the detection region, so real-time image processing poses no problem.
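
Inverse perspective mapping warps the camera view onto a top-down road plane so that object scale becomes uniform; a minimal OpenCV sketch with invented corner points (the paper's calibration values are not given):

```python
# Hypothetical sketch: inverse perspective mapping (bird's-eye view) in OpenCV.
# The four source points marking the road region are invented; in practice
# they come from calibrating the camera against the crossroad scene.
import cv2
import numpy as np

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a DVR frame

src = np.float32([[220, 200], [420, 200], [640, 480], [0, 480]])  # road corners
dst = np.float32([[0, 0], [400, 0], [400, 600], [0, 600]])        # top-down view

M = cv2.getPerspectiveTransform(src, dst)
birdseye = cv2.warpPerspective(frame, M, (400, 600))
print(birdseye.shape)  # (600, 400, 3): scale is now uniform on the road plane
```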

Ensemble Deep Network for Dense Vehicle Detection in Large Image

  • Yu, Jae-Hyoung;Han, Youngjoon;Kim, JongKuk;Hahn, Hernsoo
    • Journal of the Korea Society of Computer and Information / v.26 no.1 / pp.45-55 / 2021
  • This paper proposes an algorithm for efficiently detecting dense, small vehicles in large images. It consists of two ensemble deep-learning network stages based on a coarse-to-fine method, so that vehicles can be detected precisely within selected sub-images. In the coarse step, a voting space is built from the results of several deep-learning networks run individually, and these voting spaces are combined into a voting map used to select sub-regions. In the fine step, the sub-regions selected in the coarse step are passed to a final deep-learning network. Sub-regions are defined by dynamic windows; in this paper, a pre-defined mapping table is used to define dynamic windows for perspective road images. The identity of a vehicle moving across sub-regions is determined by the closest center point of the bottom edge of the detected vehicle's bounding box, and vehicles are tracked by their box information across consecutive images. The proposed algorithm was evaluated for detection performance and real-time cost using day and night images captured by CCTV on the road.
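
A minimal numpy sketch of the coarse-step voting idea: per-network confidence maps are combined into a voting map, and cells where enough networks agree become candidate sub-regions (random arrays stand in for real network outputs):

```python
# Hypothetical sketch: combine per-network confidence maps into a voting map
# and pick candidate sub-regions where enough networks agree.
import numpy as np

rng = np.random.default_rng(0)
H, W, N = 60, 80, 3                        # coarse grid size, number of networks
heatmaps = rng.random((N, H, W))           # stand-ins for each network's output

voting_map = (heatmaps > 0.5).sum(axis=0)  # votes: networks firing per cell
candidates = np.argwhere(voting_map >= 2)  # cells where a majority agree

print(f"{len(candidates)} candidate cells out of {H * W}")
# Each candidate cell would be expanded into a dynamic window (via the
# perspective mapping table) and sent to the fine-stage network.
```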

Development of Intelligent Job Classification System based on Job Posting on Job Sites (구인구직사이트의 구인정보 기반 지능형 직무분류체계의 구축)

  • Lee, Jung Seung
    • Journal of Intelligence and Information Systems / v.25 no.4 / pp.123-139 / 2019
  • The job classification systems of major job sites differ from site to site and from the 'SQF (Sectoral Qualifications Framework)' job classification system proposed for the SW field. Therefore, a new job classification system is needed that SW companies, SW job seekers, and job sites can all understand. The purpose of this study is to establish a standard job classification system that reflects market demand by analyzing the SQF based on the job posting information of major job sites and the NCS (National Competency Standards). For this purpose, association analysis between the occupations of major job sites is conducted, and association rules between the SQF and those occupations are derived. Using these association rules, we propose a data-based intelligent job classification system that maps the job classification systems of major job sites to the SQF. First, major job sites were selected to obtain information on the job classification systems used in the SW market. We then identified ways to collect job information from each site and collected the data through open APIs. Focusing on the relationships in the data, only job postings listed on multiple job sites at the same time were kept; other postings were deleted. Next, the job classification systems of the job sites were mapped to one another using the association rules derived from the association analysis. After completing the mapping between these market classifications, we discussed the results with experts, further mapped the SQF, and finally proposed a new job classification system. As a result, more than 30,000 job listings were collected in XML format through the open APIs of 'WORKNET', 'JOBKOREA', and 'saramin', the main job sites in Korea. After filtering down to about 900 job postings simultaneously posted on multiple job sites, 800 association rules were derived by applying the Apriori algorithm, a frequent pattern mining method. Based on these 800 rules, the job classification systems of WORKNET, JOBKOREA, and saramin and the SQF job classification system were mapped and organized into first through fourth classification levels. In the new job taxonomy, the first primary class, covering IT consulting, computer system, network, and security related jobs, consisted of three secondary, five tertiary, and five quaternary classifications. The second primary class, covering database and system operation related jobs, consisted of three secondary, three tertiary, and four quaternary classifications. The third primary class, covering web planning, web programming, web design, and games, was composed of four secondary, nine tertiary, and two quaternary classifications. The last primary class, covering jobs related to ICT management and computer and communication engineering technology, consisted of three secondary and six tertiary classifications. In particular, the new job classification system has a relatively flexible depth of classification, unlike existing systems: WORKNET divides jobs down to a third level, JOBKOREA down to a second level with subdivided jobs as keywords, and saramin likewise down to a second level with subdivided jobs in keyword form. The newly proposed standard job classification system accepts some keyword-based jobs and treats some product names as jobs.
In the classification system, some jobs stop at the second classification level while others are subdivided down to the fourth level, reflecting the idea that not all jobs can be broken down into the same number of steps. We also combined rules derived from association analysis of the collected market data with experts' opinions. Therefore, the newly proposed job classification system can be regarded as a data-based intelligent job classification system that reflects market demand, unlike existing systems. This study is meaningful in that it suggests a new job classification system reflecting market demand by mapping between occupations based on data through association analysis, rather than relying on the intuition of a few experts. However, this study has a limitation in that it cannot fully reflect market demand that changes over time, because the data were collected at a single point in time. As market demands change over time, including seasonal factors and the timing of major corporate public recruitment, continuous data monitoring and repeated experiments are needed to achieve more accurate matching. The results of this study can be used to suggest directions for improving the SQF in the SW industry, and the approach is expected to transfer to other industries given its success in the SW industry.
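
A minimal sketch of deriving association rules between job-site categories with the Apriori algorithm, using the mlxtend library and invented postings (the real input is the roughly 900 cross-posted job ads encoded by their per-site categories):

```python
# Hypothetical sketch: Apriori association rules between job-site categories.
# Each row is one job posting encoded as the categories it carries on each
# site; the data below is invented for illustration.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

postings = [
    ["worknet:web_dev", "jobkorea:web_programming", "sqf:sw_development"],
    ["worknet:web_dev", "jobkorea:web_programming"],
    ["worknet:db_admin", "jobkorea:dba", "sqf:system_operation"],
    ["worknet:db_admin", "jobkorea:dba"],
]

te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(postings).transform(postings), columns=te.columns_)

frequent = apriori(onehot, min_support=0.4, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.9)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```

High-confidence rules such as worknet:web_dev -> jobkorea:web_programming are the kind of cross-site mapping the study then reviews with experts.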

Theoretical Research for Unmanned Aircraft Electromagnetic Survey: Electromagnetic Field Calculation and Analysis by Arbitrary Shaped Transmitter-Loop (무인 항공 전자탐사 이론 연구: 임의 모양의 송신루프에 의한 전자기장 반응 계산 및 분석)

  • Bang, Minkyu;Oh, Seokmin;Seol, Soon Jee;Lee, Ki Ha;Cho, Seong-Jun
    • Geophysics and Geophysical Exploration / v.21 no.3 / pp.150-161 / 2018
  • Recently, unmanned aircraft EM (electromagnetic) surveys based on ICT (Information and Communication Technology) have been widely utilized because of their efficiency in regional surveys. We performed a theoretical study of the unmanned airship EM system developed by KIGAM (Korea Institute of Geoscience and Mineral Resources) as part of the practical application of unmanned aircraft EM surveys. Since this system has different transmitting- and receiving-loop configurations from conventional aircraft EM systems, a new technique is required for appropriate interpretation of the measured responses. Therefore, we proposed a method to calculate the EM field for an arbitrarily shaped transmitter and verified its validity through comparison with the analytic solution for a circular loop. In addition, to simulate the magnetic responses of three-dimensionally (3D) distributed anomalies, we incorporated our algorithm into a 3D frequency-domain EM modeling algorithm based on the edge finite element method (edge FEM). Through analysis of the magnetic field responses from a subsurface anomaly, it was found that the response decreases as the depth of the anomaly or the flight altitude increases. It was also confirmed that the response becomes smaller as the resistivity of the anomaly increases. However, the out-of-phase component shows a nonlinear trend depending on the depth of the anomaly and the frequency used, which makes simple analysis based on mapping the magnitude of the responses difficult and can cause a non-uniqueness problem in calculating the apparent resistivity. Thus, analyzing the appropriate frequency band and flight altitude in light of the survey purpose and site conditions is a prerequisite when conducting a survey with the unmanned aircraft EM system.
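
One standard way to handle an arbitrarily shaped transmitter loop is to discretize it into short segments and superpose their contributions; the following free-space Biot-Savart sketch is a magnetostatic simplification of the paper's frequency-domain formulation, with invented geometry, and mirrors the verification against the analytic circular-loop solution:

```python
# Hypothetical sketch: magnetic field of an arbitrarily shaped current loop
# by summing Biot-Savart contributions of short straight segments (a static,
# free-space simplification of the frequency-domain problem in the paper).
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability [H/m]

def b_field(loop_pts, current, obs):
    """loop_pts: (N,3) vertices of a closed polygonal loop; obs: (3,) point."""
    B = np.zeros(3)
    for a, b in zip(loop_pts, np.roll(loop_pts, -1, axis=0)):
        dl = b - a                     # segment vector
        mid = 0.5 * (a + b)            # segment midpoint (source location)
        r = obs - mid                  # source-to-observation vector
        r3 = np.linalg.norm(r) ** 3
        B += MU0 * current * np.cross(dl, r) / (4 * np.pi * r3)
    return B

# Approximate a circular loop (radius 1 m) by a polygon to check the sketch
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
loop = np.c_[np.cos(theta), np.sin(theta), np.zeros_like(theta)]
Bz = b_field(loop, current=1.0, obs=np.array([0.0, 0.0, 0.0]))[2]
print(Bz, MU0 / 2)  # numeric vs analytic mu0*I/(2a) at the loop center
```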

Water Balance Projection Using Climate Change Scenarios in the Korean Peninsula (기후변화 시나리오를 활용한 미래 한반도 물수급 전망)

  • Kim, Cho-Rong;Kim, Young-Oh;Seo, Seung Beom;Choi, Su-Woong
    • Journal of Korea Water Resources Association / v.46 no.8 / pp.807-819 / 2013
  • This study proposes a new methodology for future water balance projection under climate change by assigning a weight to each scenario instead of feeding GCM-based future streamflows directly into a water balance model. A k-nearest neighbor algorithm was employed to assign the weights, and streamflow in the non-flood period (October to the following June) was selected as the weighting criterion. GCM-driven precipitation was input to the TANK model to simulate future streamflow scenarios, and quantile mapping was applied to correct the bias between the GCM hindcast and historical data. Based on these bias-corrected streamflows, different weights were assigned to each streamflow scenario to calculate water shortage for the projection periods: the 2020s (2010~2039), 2050s (2040~2069), and 2080s (2070~2099). Applying the proposed methodology to the Korean Peninsula, average water shortage for the 2020s is projected to increase by 10~32% compared to the baseline (1967~2003). In addition, as streamflow in the non-flood period gradually decreases toward the 2080s, the average water shortage for the 2080s is projected to increase by up to 97% (516.5 million $m^3/yr$) compared to the baseline. While existing research on climate change projects a radical increase in future water shortage, the results of the weighting method show a more conservative change. This study is significant for its applicability to water balance projection under climate change while keeping the existing framework of national water resources planning, which lessens confusion for decision-makers in water sectors.
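
A minimal sketch of the quantile mapping step: a GCM value is converted to its quantile in the hindcast distribution and replaced by the observed value at the same quantile; the series below are synthetic stand-ins for gauge and GCM streamflows:

```python
# Hypothetical sketch: empirical quantile mapping bias correction.
# A GCM value is mapped to its quantile in the hindcast distribution,
# then replaced by the observed value at that same quantile.
import numpy as np

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 50.0, 1000)        # stand-in for historical streamflow
hindcast = rng.gamma(2.0, 35.0, 1000)   # stand-in for biased GCM hindcast

def quantile_map(x, model_ref, obs_ref):
    # empirical CDF position of x within the model reference series
    q = (np.searchsorted(np.sort(model_ref), x) / len(model_ref)).clip(0, 1)
    return np.quantile(obs_ref, q)      # observed value at the same quantile

future_gcm = np.array([40.0, 90.0, 150.0])      # invented future GCM values
print(quantile_map(future_gcm, hindcast, obs))  # bias-corrected streamflows
```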

A Dynamic Prefetch Filtering Schemes to Enhance Usefulness Of Cache Memory (캐시 메모리의 유용성을 높이는 동적 선인출 필터링 기법)

  • Chon Young-Suk;Lee Byung-Kwon;Lee Chun-Hee;Kim Suk-Il;Jeon Joong-Nam
    • The KIPS Transactions:PartA / v.13A no.2 s.99 / pp.123-136 / 2006
  • Prefetching is an effective way to reduce the latency caused by memory access. However, excessively aggressive prefetching not only leads to cache pollution, canceling out the benefits of prefetching, but also increases bus traffic, degrading overall performance. In this thesis, a prefetch filtering scheme is proposed which dynamically decides whether to commence prefetching by consulting a filtering table, reducing the cache pollution due to unnecessary prefetches. First, a prefetch hashing table 1-bit state-change filtering scheme (PHT1bSC) is analyzed to expose the lock problem of the conventional scheme; like the conventional scheme it uses N:1 mapping, but each entry holds a 1-bit value with two states. A complete block address table filtering scheme (CBAT) is introduced as a reference for the comparative study. A prefetch block address lookup table scheme (PBALT), the main idea of this paper, is then proposed, which exhibits the most exact filtering performance. This scheme has the same table length as PHT1bSC, while each entry has the same fields as in CBAT, and a recently referenced data block address is mapped 1:1 to an entry of the filter table. Simulations were run on commonly used prefetch schemes with general benchmarks and multimedia programs while varying cache parameters. Compared with no filtering, the PBALT scheme showed the greatest enhancement, 22%, and its cache miss ratio decreased by 7.9% owing to improved filtering accuracy compared with the conventional PHT2bSC. The MADT of the proposed PBALT scheme decreased by 6.1% compared with conventional schemes, reducing total execution time.
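
A minimal sketch of the dynamic filtering decision: before a prefetch is issued, the block address is looked up in a small filter table and suppressed if it was recorded as useless; the direct-mapped organization here is a generic illustration, not the exact PBALT entry layout:

```python
# Hypothetical sketch of a prefetch filter table: a direct-mapped table of
# block addresses decides whether a prefetch is likely useless and filters
# it. Generic illustration only, not the exact PBALT entry layout.
TABLE_SIZE = 256

class PrefetchFilter:
    def __init__(self):
        self.tags = [None] * TABLE_SIZE      # block address stored per entry

    def should_prefetch(self, block_addr: int) -> bool:
        entry = self.tags[block_addr % TABLE_SIZE]
        return entry != block_addr           # suppress blocks marked useless

    def record_useless(self, block_addr: int) -> None:
        # called when a prefetched block was evicted without being referenced
        self.tags[block_addr % TABLE_SIZE] = block_addr

f = PrefetchFilter()
f.record_useless(0x1A00)
print(f.should_prefetch(0x1A00), f.should_prefetch(0x2B00))  # False True
```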