• Title/Summary/Keyword: Category Mapping


An Evaluation of the Use of the Texture in Land Cover Classification Accuracy from SPOT HRV Image of Pusan Metropolitan Area (SPOT HRV 영상을 이용한 부산 지역 토지피복분류에 있어서의 질감의 기여에 관한 평가)

  • Jung, In-Chul
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.2 no.1
    • /
    • pp.32-44
    • /
    • 1999
  • Texture features can be incorporated in a classification procedure to resolve class confusion. However, few application-oriented studies have evaluated the relative power of texture analysis methods in a particular environment. This study evaluates the increase in land-cover classification accuracy obtained from texture processing of SPOT HRV multispectral data of the Pusan Metropolitan area. Twenty-four texture measures were derived from the SPOT HRV band 3 image. Each of these features was used in combination with the three spectral images in the classification of 10 land-cover classes. Supervised training and a Gaussian maximum likelihood classifier were used in the classification. It was found that while entropy produces the best empirical results in terms of overall classification, other texture features can also substantially improve the accuracies obtained from the spectral images alone. With the inclusion of texture, the classification of each category improves; in particular, urban built-up areas showed a large increase in accuracy. The results indicate that texture window sizes of 5 by 5 and 7 by 7 may be suitable for land-cover classification of the Pusan Metropolitan area.

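The entropy measure highlighted above can be illustrated with a short sketch. This is a minimal numpy implementation of a moving-window entropy texture, not the paper's exact procedure; the window size, grey-level count, and quantization scheme are illustrative assumptions.

```python
import numpy as np

def window_entropy(band, size=5, levels=16):
    """Shannon entropy texture over a size x size moving window.

    Assumes a non-constant-zero band; grey levels are quantized to
    `levels` bins before counting (an illustrative choice).
    """
    h, w = band.shape
    pad = size // 2
    q = np.floor(band / band.max() * (levels - 1)).astype(int)  # quantize grey levels
    padded = np.pad(q, pad, mode="edge")                        # replicate edges
    out = np.zeros((h, w), dtype=float)
    for i in range(h):
        for j in range(w):
            win = padded[i:i + size, j:j + size]
            p = np.bincount(win.ravel(), minlength=levels) / win.size
            p = p[p > 0]
            out[i, j] = -np.sum(p * np.log2(p))  # entropy of the window histogram
    return out
```

Stacking such a texture band with the three spectral bands gives the combined feature set fed to the maximum likelihood classifier.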

A Study on Application of KORMARC-Integrated Format for Bibliographic Data for Management of Community Archive (마을기록물 관리를 위한 KORMARC-통합서지용 형식 적용에 관한 연구)

  • Kim, Boil
    • Journal of Korean Library and Information Science Society
    • /
    • v.50 no.2
    • /
    • pp.285-310
    • /
    • 2019
  • Integrated management of community archives, incorporating them into the category of library data through the library materials management system, makes it easier for residents to use community archives collected, arranged, and preserved by public libraries than management through a separate system. This study therefore drew 25 descriptive factors by comparatively analyzing the descriptive factors of descriptive standards for community archives, such as ISAD(G) and DC, and integrating common or similar ones. The 25 drawn descriptive factors were empirically tested three times by professionals through the Delphi technique, and 3 descriptive sectors and 21 descriptive factors were finally derived. The derived descriptive factors were divided into mandatory, mandatory-if-applicable, and optional by mapping them to the KORMARC-Integrated Format for Bibliographic Data, and were applied to the library materials management system after adjusting them for the systems.

Evaluation of Neoadjuvant Chemotherapy Effect in Osteosarcoma (골육종에서 술전 항암화학요법의 효과 판정)

  • Joo, Min Wook;Kang, Yong-Koo;Yoo, Ie Ryung;Choi, Woo Hee;Chung, Yang-Guk;Kim, Dong-Hyun;Kang, Jin-Woo
    • The Journal of the Korean bone and joint tumor society
    • /
    • v.20 no.2
    • /
    • pp.66-73
    • /
    • 2014
  • Purpose: Various diagnostic imaging modalities have been used to evaluate the effect of neoadjuvant chemotherapy for osteosarcoma early and noninvasively. We evaluated the effectiveness of plain radiographs and positron-emission tomography/computed tomography (PET/CT) in predicting the neoadjuvant chemotherapy effect for osteosarcoma and tried to establish a general principle for interpreting PET/CT parameters. Materials and Methods: Eighteen patients who underwent two cycles of neoadjuvant chemotherapy and surgical excision for osteosarcoma were enrolled. There were 13 males and 5 females, with a median age of 19 (11-63) years. Fifteen of the 18 patients had American Joint Committee on Cancer (AJCC) stage IIB disease. They underwent plain radiography and PET/CT before and after neoadjuvant chemotherapy. The resected tumor specimens were pathologically examined to determine the histological response grade using a conventional mapping method. Statistical analysis was performed to evaluate the correlation between the histopathological necrosis rate and the radiographic finding category, post-chemotherapy maximum standardized uptake value (SUVmax), average standardized uptake value, and metabolic tumor volume (MTV), as well as their reduction rates. Results: Eight patients were good responders to neoadjuvant chemotherapy based on histological evaluation. The median SUVmax reduction rate was 73 (23-77) % in good responders and 42 (-32-76) % in poor responders. The median MTV reduction rate was 93.5 (62-99) % in good responders and 46 (-81-100) % in poor responders. While the radiographic finding category did not differ according to histological response (p=1.0), the SUVmax reduction rate differed significantly (p=0.041). The difference in MTV reduction rates approached statistical significance as well (p=0.071). Conclusion: While the radiographic finding category was not reliable for assessing the neoadjuvant chemotherapy effect for osteosarcoma, the reduction rate of SUVmax was a useful indicator in this study. As PET/CT parameters can be influenced by various factors of the imaging settings, each center should establish its own standard of judgement with reference to previous studies.
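The SUV and MTV reduction rates discussed above follow the standard percent-change definition; a minimal sketch (the formula itself is an assumption, since the abstract does not state it explicitly):

```python
def reduction_rate(pre, post):
    """Percent reduction of a PET/CT parameter (e.g., SUVmax or MTV)
    from pre- to post-chemotherapy. Negative when uptake or volume
    increased, matching the negative lower bounds (-32%, -81%)
    reported for poor responders."""
    return (pre - post) / pre * 100.0
```

For example, a pre-chemotherapy SUVmax of 10.0 falling to 2.7 gives a 73% reduction, the median reported for good responders.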

A Comparative Study of Fuzzy Relationship and ANN for Landslide Susceptibility in Pohang Area (퍼지관계 기법과 인공신경망 기법을 이용한 포항지역의 산사태 취약성 예측 기법 비교 연구)

  • Kim, Jin Yeob;Park, Hyuck Jin
    • Economic and Environmental Geology
    • /
    • v.46 no.4
    • /
    • pp.301-312
    • /
    • 2013
  • Landslides are caused by complex interactions among a large number of interrelated factors such as topography, geology, forest, and soils. In this study, a comparative study was carried out using the fuzzy relation method and an artificial neural network to evaluate landslide susceptibility. For landslide susceptibility mapping, maps of landslide occurrence locations, slope angle, aspect, curvature, lithology, soil drainage, soil depth, soil texture, forest type, forest age, forest diameter, and forest density were constructed from the spatial data sets. In the fuzzy relation analysis, the membership values for each category of the thematic layers were determined using the cosine amplitude method. The different thematic layers were then integrated by the Cartesian product operation to produce a landslide susceptibility map. In the artificial neural network analysis, the relative weight values for the causative factors were determined by the back-propagation algorithm. The landslide susceptibility maps prepared by the two approaches were validated by the ROC (Receiver Operating Characteristic) curve and the AUC (Area Under the Curve). Based on the validation results, both approaches show excellent performance in predicting landslide susceptibility, but the artificial neural network performed better in this study area.
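The cosine amplitude membership values and the layer integration can be sketched as follows. Using the min operator for the Cartesian product is a common fuzzy-set convention and an assumption here, not necessarily the paper's exact operator.

```python
import numpy as np

def cosine_amplitude(x, y):
    """Cosine amplitude similarity between two data vectors, used to
    derive a fuzzy membership value for a factor category."""
    return np.dot(x, y) / np.sqrt(np.dot(x, x) * np.dot(y, y))

def integrate_layers(memberships):
    """Combine per-layer membership maps into one susceptibility map
    using the min operator (one common Cartesian-product convention)."""
    return np.minimum.reduce(memberships)
```

Identical vectors yield a membership of 1 and orthogonal vectors 0, so categories whose occurrence pattern tracks the landslide inventory receive high membership.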

OpenGL ES 1.1 Implementation Using OpenGL (OpenGL을 이용한 OpenGL ES 1.1 구현)

  • Lee, Hwan-Yong;Baek, Nak-Hoon
    • The KIPS Transactions:PartA
    • /
    • v.16A no.3
    • /
    • pp.159-168
    • /
    • 2009
  • In this paper, we present an efficient way of implementing the OpenGL ES 1.1 standard for environments with a hardware-supported OpenGL API, such as desktop PCs. Although OpenGL ES started from existing OpenGL features, it has become a new three-dimensional graphics library customized for embedded systems, introducing fixed-point arithmetic operations, buffer management with fixed-point data type support, completely new texture mapping functionality, and more. Currently, it is the official three-dimensional graphics library for Google Android, Apple iPhone, PlayStation3, etc. In this paper, we improved the arithmetic operations for the fixed-point number representation, the most characteristic data type of OpenGL ES. For converting fixed-point data types to the floating-point representation of the underlying OpenGL, we show an efficient conversion process that still satisfies the OpenGL ES standard requirements. We also introduce a simple memory management scheme to manage the converted data for buffers containing fixed-point numbers. In the case of texture processing, the requirements of the two standards are quite different, so we used a completely new software implementation. Our final OpenGL ES library provides all of the more than 200 functions in the OpenGL ES 1.1 standard and completely passed its conformance test, demonstrating compliance with the standard. From the efficiency viewpoint, we measured execution times for several OpenGL ES-specific application programs and achieved speed-ups of up to 33.147 times, making it the fastest among the OpenGL ES implementations in the same category.
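The fixed-point conversion at the heart of such an implementation can be sketched in a few lines. The 16.16 layout below is the GLfixed format defined by the OpenGL ES standard; a production C implementation would also guard against overflow, which this Python sketch ignores.

```python
FIXED_ONE = 1 << 16  # OpenGL ES GLfixed: signed 16.16 fixed-point, so 1.0 == 65536

def fixed_to_float(x):
    """Convert a GLfixed value to the float the underlying OpenGL expects."""
    return x / FIXED_ONE

def float_to_fixed(f):
    """Convert a float back to GLfixed, rounding to the nearest representable value."""
    return int(round(f * FIXED_ONE))
```

Because the fractional part has 16 bits, values with at most 16 fractional bits (such as 3.25) round-trip exactly; arbitrary floats are rounded to the nearest 1/65536.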

A Study of Forest Education Concept Mapping of Pre-Service Teachers and In-Service Teachers' for Young Children (숲교육(숲활동)에 대한 유아교사와 예비유아교사의 지식개념 연구)

  • Lee, Youn Sun;Kyun, Ju Youn;Lee, Si Eun;Lee, So Young
    • Korean Journal of Childcare and Education
    • /
    • v.10 no.4
    • /
    • pp.29-49
    • /
    • 2014
  • This study focused on how early childhood in-service teachers and pre-service teachers understood the concept of Forest Education. Applying the analysis of conceptual maps introduced by Novak and Gowin (1983, 1984), this study examined the number of upper categories and subcategories, and the characteristics, hierarchy, and density of teachers' knowledge of Forest Education. 39 early childhood teachers and 60 pre-service teachers participated in this study. First, in-service teachers put 'nature (forest)' and 'living creatures' at the highest level of their knowledge of Forest Education. On the other hand, pre-service teachers put 'the effect of Forest Education' and 'program' as well as 'nature (forest)' at the highest level. In-service teachers seemed to construct their knowledge by understanding Forest Education as curriculum or activities such as math, language, music, or multicultural education; they therefore tended to talk more about specific concepts such as the four seasons, insects, air, or climate change. However, pre-service teachers described 'the interconnectedness between human and nature', 'deep relationship with nature', and 'provision of nature'. This tendency might relate to their prior educational experiences of eco-centered early childhood education. With regard to the density and hierarchy of knowledge of Forest Education, both groups revealed a relatively low degree of density, averaging around 2.00. This result can be interpreted to mean that neither group of teachers had strongly hierarchical and organized knowledge of Forest Education. Teacher education should therefore include more of the philosophical background and practical knowledge of Forest Education.

A Study on the Field Application through the Improvement of Scoring System for HACCP Evaluation Items of Cattle Farm (소 농장 HACCP 평가항목의 점수부여 체계 개선을 통한 현장 적용 연구)

  • Baek, Seung-Hee;Nam, In-Sik
    • Korean Journal of Organic Agriculture
    • /
    • v.26 no.4
    • /
    • pp.759-774
    • /
    • 2018
  • This study was conducted to establish scores according to the importance level of each HACCP evaluation item for cattle farms. The importance level and score of each HACCP evaluation item were derived from the non-compliance rate and the severity level of the hazard. To change the score criteria according to the importance of each HACCP evaluation item, we analyzed the importance of each item using the portfolio mapping method, according to the occurrence frequency and severity level of the hazard. Scores of 3 points, 2 points, and 1 point were assigned by classifying the importance of each category as 'high', 'middle', and 'low', respectively. Accordingly, we established a new scoring system for each HACCP evaluation item through this study. The objectivity of the comparative evaluation was then verified by applying the currently used HACCP evaluation items to cattle farms. In conclusion, implementing the results of this study on cattle farms may help increase objectivity and lead to safer, more hygienic cattle management and raw milk production.
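The portfolio-style grading described above can be sketched as follows. The normalized scales and thresholds are hypothetical; the sketch only illustrates the mapping from frequency and severity to 'high'/'middle'/'low' importance and 3/2/1 scores.

```python
SCORES = {"high": 3, "middle": 2, "low": 1}  # point values from the study

def importance(frequency, severity, hi=0.66, lo=0.33):
    """Grade an evaluation item from its hazard occurrence frequency and
    severity, both normalized to [0, 1]. The averaging and the hi/lo
    cut-offs are illustrative assumptions, not the study's thresholds."""
    m = (frequency + severity) / 2
    if m >= hi:
        return "high"
    if m >= lo:
        return "middle"
    return "low"
```

An item with high frequency and high severity thus receives the maximum 3 points, while a rare, low-severity item receives 1 point.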

Knowledge Visualization and Mapping of Studies on Social Systems Theory in Social Sciences: Focused on Niklas Luhmann (사회과학 분야 사회적 체계 이론 연구의 지식 시각화와 매핑 - Niklas Luhmann을 중심으로 -)

  • Park, Seongwoo;Hong, Soram
    • Journal of the Korean Society for Library and Information Science
    • /
    • v.56 no.1
    • /
    • pp.253-275
    • /
    • 2022
  • Niklas Luhmann is one of the most contentious and difficult theorists in sociology, but follow-up studies on his theory have gradually increased over the past 10 years. The purpose of this study is to observe how follow-up studies use Luhmann's difficult concepts. Unlike previous studies, this study adopted the keyword rather than the article as the unit of analysis, because keywords are linguistic constructs that make concepts observable. The study analyzed the co-occurrence of keywords in 139 articles retrieved from the social sciences category of the Web of Science DB. The key findings were as follows: the most important keywords were the name of Luhmann (Niklas Luhmann) and his theory (social systems); the keywords were grouped into 4 clusters (social systems theory; systems theory; legal system and political system; and the significance of Luhmann's theory from the viewpoint of the history of social theory); and the topic terms were systems theory, communication, autopoiesis, risk, legal system, functional differentiation, environment, social theory, sociological theory, structural coupling, systems, and evolution. The significance of the study is as follows: it offers keywords as useful access points for beginners to Luhmann's theory, and it shows that content analysis by keyword network can be applied to trend analysis of difficult theoretical research.
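The keyword co-occurrence counting underlying such a network analysis can be sketched as follows. This is a generic sketch, not the authors' code, and the sample keywords below are illustrative.

```python
from collections import Counter
from itertools import combinations

def cooccurrence(articles):
    """Count how many articles contain each unordered keyword pair.

    `articles` is a list of keyword lists, one per article; each pair
    is stored in sorted order so (a, b) and (b, a) are counted together.
    """
    pairs = Counter()
    for keywords in articles:
        for a, b in combinations(sorted(set(keywords)), 2):
            pairs[(a, b)] += 1
    return pairs
```

The resulting pair counts form the weighted edge list of the keyword network, which can then be clustered and mapped.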

Development of Intelligent Job Classification System based on Job Posting on Job Sites (구인구직사이트의 구인정보 기반 지능형 직무분류체계의 구축)

  • Lee, Jung Seung
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.4
    • /
    • pp.123-139
    • /
    • 2019
  • The job classification systems of major job sites differ from site to site and from the job classification system of the SQF (Sectoral Qualifications Framework) proposed for the SW field. Therefore, a new job classification system that SW companies, SW job seekers, and job sites can all understand is needed. The purpose of this study is to establish a standard job classification system that reflects market demand by analyzing the SQF based on the job posting information of major job sites and the NCS (National Competency Standards). For this purpose, association analysis between the occupations of major job sites was conducted, and association rules between the SQF and occupations were derived. Using these association rules, we propose an intelligent job classification system based on data that maps the job classification systems of major job sites to the SQF. First, major job sites were selected to obtain information on the job classification systems of the SW market. We then identified how to collect job information from each site and collected the data through open APIs. Focusing on the relationships in the data, only job postings listed on multiple job sites at the same time were kept, and the remaining postings were deleted. Next, the job classification systems of the job sites were mapped using the association rules derived from the association analysis. After completing the mapping between these market systems and discussing it with experts, the SQF was mapped in turn, and a new job classification system was finally proposed. As a result, more than 30,000 job postings were collected in XML format using the open APIs of 'WORKNET', 'JOBKOREA', and 'saramin', the main job sites in Korea. After filtering down to about 900 job postings simultaneously listed on multiple job sites, 800 association rules were derived by applying the Apriori algorithm, a frequent-pattern mining algorithm.
Based on the 800 association rules, the job classification systems of WORKNET, JOBKOREA, and saramin and the SQF job classification system were mapped and organized into first through fourth classification levels. In the new job taxonomy, the first primary class, covering IT consulting, computer systems, networks, and security-related jobs, consisted of three secondary, five tertiary, and five quaternary classifications. The second primary class, covering databases and system-operation jobs, consisted of three secondary, three tertiary, and four quaternary classifications. The third primary class, covering web planning, web programming, web design, and games, consisted of four secondary, nine tertiary, and two quaternary classifications. The last primary class, covering ICT management and computer and communication engineering technology jobs, consisted of three secondary and six tertiary classifications. In particular, the new job classification system has a relatively flexible number of classification levels, unlike existing systems. WORKNET divides jobs into three levels; JOBKOREA divides them into two levels, with further subdivision into keywords; and saramin likewise divides jobs into two levels with keyword-form subdivisions. The newly proposed standard job classification system accepts some keyword-based jobs and treats some product names as jobs. In this system, some jobs stop at the second classification level, while others are subdivided down to the fourth level, reflecting the idea that not all jobs can be broken down into the same number of steps. We also combined the association rules derived from the collected market data with experts' opinions.
Therefore, the newly proposed job classification system can be regarded as a data-based intelligent job classification system that reflects market demand, unlike existing systems. This study is meaningful in that it suggests a new job classification system reflecting market demand by mapping between occupations based on data from association analysis, rather than on the intuition of a few experts. However, this study has the limitation that it cannot fully reflect market demand as it changes over time, because the data were collected at a single point in time. As market demand changes over time, including seasonal factors and the timing of major corporate recruitment, continuous data monitoring and repeated experiments are needed to achieve more accurate matching. The results of this study can be used to suggest directions for improving the SQF in the SW industry, and the approach is expected to transfer to other industries building on its success in the SW industry.
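The Apriori-style rule derivation described above can be sketched as follows. This toy version enumerates 1- and 2-itemsets directly rather than pruning level-wise as full Apriori does, and the sample transactions are illustrative, not the study's data.

```python
from collections import Counter
from itertools import combinations

def frequent_itemsets(transactions, min_support=2, max_len=2):
    """Count 1- and 2-itemsets across transactions and keep those that
    meet min_support. Full Apriori would prune candidates level by level;
    this sketch enumerates them all for brevity."""
    counts = Counter()
    for t in transactions:
        items = sorted(set(t))
        for k in range(1, max_len + 1):
            for combo in combinations(items, k):
                counts[combo] += 1
    return {s: c for s, c in counts.items() if c >= min_support}

def rules(freq, min_conf=0.6):
    """Derive X -> Y rules from frequent pairs: conf = support(X,Y) / support(X)."""
    out = []
    for itemset, sup in freq.items():
        if len(itemset) == 2:
            a, b = itemset
            for x, y in ((a, b), (b, a)):
                conf = sup / freq[(x,)]
                if conf >= min_conf:
                    out.append((x, y, conf))
    return out
```

Applied to postings that appear on multiple sites, rules like "site A label X -> site B label Y" with high confidence drive the cross-site category mapping.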

Comparative Study on the Methodology of Motor Vehicle Emission Calculation by Using Real-Time Traffic Volume in the Kangnam-Gu (자동차 대기오염물질 산정 방법론 설정에 관한 비교 연구 (강남구의 실시간 교통량 자료를 이용하여))

  • 박성규;김신도;이영인
    • Journal of Korean Society of Transportation
    • /
    • v.19 no.4
    • /
    • pp.35-47
    • /
    • 2001
  • Traffic represents one of the largest sources of primary air pollutants in urban areas. As a consequence, numerous abatement strategies are being pursued to decrease the ambient concentration of pollutants. A characteristic of most of these strategies is the requirement for accurate data on both the quantity and the spatial distribution of emissions, in the form of an atmospheric emission inventory database. In the case of traffic pollution, such an inventory must be compiled using activity statistics and emission factors for each vehicle type. The majority of inventories are compiled using passive data from either surveys or transportation models and, by their very nature, tend to be out of date by the time they are compiled. Current trends are towards integrating urban traffic control systems with assessments of the environmental effects of motor vehicles. In this study, a methodology for calculating motor vehicle emissions using real-time traffic data was examined and applied to estimating CO emissions for a test area in Seoul. Traffic data, required on a street-by-street basis, were obtained from the induction loops of the traffic control system. The speed-related mass of CO emitted from vehicle tailpipes was calculated from the traffic-system data, considering traffic volume, vehicle composition, average velocity, and link length. The result was compared with that of an emission calculation method based on the VKT (Vehicle Kilometers Travelled) of each vehicle category.

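The link-based, speed-related emission calculation described above can be sketched as follows. The speed-dependent emission-factor function is hypothetical, standing in for the factors the study would take from published tables.

```python
def link_co_emission_g(volume_veh, link_km, speed_kmh, emission_factor):
    """CO mass (grams) emitted on one link over the counting interval:
    traffic volume (vehicles) x link length (km) x speed-dependent
    emission factor (g per vehicle-km). The factor function is a
    hypothetical stand-in for a published speed-emission curve."""
    return volume_veh * link_km * emission_factor(speed_kmh)

def network_co_emission_g(links, emission_factor):
    """Sum the link emissions over all (volume, length, speed) records,
    e.g. one record per induction-loop-monitored street segment."""
    return sum(link_co_emission_g(v, l, s, emission_factor) for v, l, s in links)
```

A VKT-based inventory instead multiplies total vehicle-kilometers per vehicle category by a single category-average factor, which is what the real-time, speed-resolved method above is compared against.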