• Title/Abstract/Keyword: software process

Search Results: 4,749

Applying Meta-model Formalization of Part-Whole Relationship to UML: Experiment on Classification of Aggregation and Composition (UML의 부분-전체 관계에 대한 메타모델 형식화 이론의 적용: 집합연관 및 복합연관 판별 실험)

  • Kim, Taekyung
    • Journal of Intelligence and Information Systems / v.21 no.1 / pp.99-118 / 2015
  • Object-oriented programming languages have been widely adopted for developing modern information systems. The use of object-oriented (OO) concepts has reduced the effort of reusing pre-existing code, and these concepts have proved useful in interpreting system requirements. In line with this, modern conceptual modeling approaches support features of object-oriented programming. The Unified Modeling Language (UML) has become a de facto standard for information system designers because it provides a set of visual diagrams, comprehensive frameworks, and flexible expressions. In a modeling process, UML users need to consider the relationships between classes. Based on an explicit and clear representation of classes, the conceptual model produced with UML gathers the attributes and methods needed to guide software engineers. In particular, identifying an association between a part class and a whole class is included in the standard grammar of UML. Representing part-whole relationships is natural for real-world domains, since many physical objects are perceived in terms of parts and wholes, and even abstract concepts such as roles are easily identified through part-whole perception. A representation of part-whole in UML therefore seems reasonable and useful. However, it should be admitted that the use of UML is limited by the lack of practical guidelines on how to identify a part-whole relationship and how to classify it as an aggregate or a composite association. Research on developing such procedural knowledge is meaningful and timely, because misleading perceptions of part-whole relationships are hard to filter out during initial conceptual modeling and thus degrade system usability. The current method for identifying and classifying part-whole relationships relies mainly on linguistic expression. This simple approach is rooted in the idea that a phrase expressing has-a establishes a part-whole perception between objects: if the relationship is strong, the association is classified as a composite association; otherwise, it is an aggregate association. Admittedly, linguistic expressions contain clues about part-whole relationships, so the approach is reasonable and cost-effective in general. Nevertheless, it does not address concerns about accuracy and theoretical legitimacy, and research on guidelines for part-whole identification and classification has not accumulated sufficient results to resolve this issue. The purpose of this study is to provide step-by-step guidelines for identifying and classifying part-whole relationships in the context of UML use. Based on theoretical work on Meta-model Formalization, self-check forms that help conceptual modelers work on part-whole classes are developed. To evaluate the suggested idea, an experimental approach was adopted. The findings show that UML users obtain better results with the guidelines based on Meta-model Formalization than with the natural-language classification scheme conventionally recommended by UML theorists. This study contributes to the stream of research on part-whole relationships by extending the applicability of Meta-model Formalization. Compared to traditional approaches that aim to establish criteria for evaluating the result of conceptual modeling, this study expands the scope to the modeling process itself. Traditional theories on evaluating part-whole relationships in conceptual modeling aim to rule out incomplete or wrong representations. Such qualification is still important, but the lack of a practical alternative may reduce the usefulness of posterior inspection for modelers who want to reduce errors or misperceptions in part-whole identification and classification. The findings of this study can be further developed by introducing more comprehensive variables and real-world settings. In addition, it is highly recommended to replicate and extend the suggested idea of utilizing Meta-model Formalization by creating alternative forms of guidelines, including plugins for integrated development environments.
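
As a concrete illustration of the aggregation/composition distinction the paper classifies, the sketch below contrasts the two in Python (the class names are hypothetical and not taken from the paper): in composition the whole creates and owns its part, so the part's lifecycle ends with the whole; in aggregation the whole merely references parts that exist independently.

```python
class Engine:
    """A part whose lifecycle is bound to its whole (composition)."""
    def __init__(self, power_kw: float):
        self.power_kw = power_kw


class Car:
    """Composition: the Car creates and owns its Engine.
    When the Car is discarded, its Engine goes with it."""
    def __init__(self, power_kw: float):
        self.engine = Engine(power_kw)   # part constructed inside the whole


class Student:
    """An independently existing object."""
    def __init__(self, name: str):
        self.name = name


class Course:
    """Aggregation: the Course references Students that exist
    independently and may belong to several courses."""
    def __init__(self, title: str):
        self.title = title
        self.students: list[Student] = []

    def enroll(self, student: Student) -> None:
        self.students.append(student)   # part is shared, not owned


if __name__ == "__main__":
    alice = Student("Alice")
    modeling = Course("Conceptual Modeling")
    modeling.enroll(alice)              # alice outlives the course
    sedan = Car(power_kw=120.0)         # the engine does not outlive the car
```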

A Study on Defense and Attack Model for Cyber Command Control System based Cyber Kill Chain (사이버 킬체인 기반 사이버 지휘통제체계 방어 및 공격 모델 연구)

  • Lee, Jung-Sik;Cho, Sung-Young;Oh, Heang-Rok;Han, Myung-Mook
    • Journal of Internet Computing and Services / v.22 no.1 / pp.41-50 / 2021
  • The cyber kill chain is derived from the kill chain, a traditional military term. A kill chain is "a continuous and cyclical process from detection to destruction of military targets requiring destruction, or the division of that process into several distinct actions." The kill chain evolved from existing operational procedures to deal effectively with time-critical emergency targets, such as nuclear weapons and missiles, that require an immediate response because of changes in location and increased risk. It began with the military concept of defeating an attacker's intent by preventing any one stage of the process from functioning. The basic concept of the cyber kill chain is therefore that a cyber attack consists of distinct stages and the attacker can achieve the attack goal only when every stage succeeds; from a defensive point of view, if a detailed response procedure is prepared for each stage and executed, the chain of attacks is broken and the attack can be neutralized or delayed. Conversely, from an offensive point of view, if a specific procedure is prepared for each stage, the chain of attacks can succeed and the target can be neutralized. A cyber command and control system applies to both defense and attack: it should present defensive and offensive countermeasures that neutralize the enemy's kill chain when defending, and step-by-step procedures that neutralize the enemy when attacking. This paper therefore proposes a cyber kill chain model from the defense and attack perspectives of a cyber command and control system, and also presents a threat classification/analysis/prediction framework for the cyber command and control system from the defense perspective.
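
For illustration only, the sketch below encodes the core kill-chain idea that blocking any single stage breaks the whole chain; the stage names follow the commonly cited Lockheed Martin model and are not necessarily the stages used in this paper.

```python
from enum import Enum


class KillChainStage(Enum):
    # Commonly cited cyber kill chain stages (Lockheed Martin);
    # the paper's own stage model may differ.
    RECONNAISSANCE = 1
    WEAPONIZATION = 2
    DELIVERY = 3
    EXPLOITATION = 4
    INSTALLATION = 5
    COMMAND_AND_CONTROL = 6
    ACTIONS_ON_OBJECTIVES = 7


def attack_succeeds(blocked_stages: set[KillChainStage]) -> bool:
    """The attacker reaches the objective only if every stage succeeds;
    blocking any single stage breaks the chain."""
    return all(stage not in blocked_stages for stage in KillChainStage)


if __name__ == "__main__":
    print(attack_succeeds(set()))                        # True: nothing blocked
    print(attack_succeeds({KillChainStage.DELIVERY}))    # False: chain broken at delivery
```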

Real-time CRM Strategy of Big Data and Smart Offering System: KB Kookmin Card Case (KB국민카드의 빅데이터를 활용한 실시간 CRM 전략: 스마트 오퍼링 시스템)

  • Choi, Jaewon;Sohn, Bongjin;Lim, Hyuna
    • Journal of Intelligence and Information Systems / v.25 no.2 / pp.1-23 / 2019
  • Big data refers to data that is difficult to store, manage, and analyze with existing software. As consumers' lifestyles change and the size and variety of their needs grow, companies invest a great deal of time and money in understanding those needs. Companies in various industries use big data to improve their products and services, analyze unstructured data, and respond to product and service issues in real time. The financial industry operates decision support systems that use financial data to develop financial products and manage customer risk. The use of big data by financial institutions can effectively create added value along the value chain and enables more advanced customer relationship management strategies. Financial institutions can utilize the purchase data and unstructured data generated by credit cards, making it possible to identify and satisfy customer needs. As CRM has grown alongside information and knowledge systems, it has become a granular process that can be measured in real time. With the development of information services and CRM, platforms have changed, and it has become possible to meet consumer needs in a variety of environments. Recently, as consumer needs have diversified, more companies are providing systematic marketing services using data mining and advanced CRM (Customer Relationship Management) techniques. KB Kookmin Card, which started its credit card business in 1980, stabilized its processes and computer systems early and has actively introduced new technologies and systems. In 2011, the credit card business was separated from the bank, and the company led the 'Hye-dam Card' and 'One Card' markets, which departed from existing concepts. In 2017, total domestic credit card and check card usage grew 5.6% year-on-year to 886 trillion won. In 2018, the company received a long-term credit rating of AA+ in its credit card evaluation, confirming that its credit rating was at the top of the industry thanks to effective marketing strategies and services. At present, KB Kookmin Card emphasizes strategies that meet the individual needs of customers and maximize consumer lifetime value by utilizing customer payment data. KB Kookmin Card combines internal and external big data to conduct marketing in real time and to build monitoring systems. Using customer card information, it has built a marketing system that detects real-time behavior from big data such as homepage visits and purchase history. The system is designed to capture customer action events in real time and execute marketing based on the store, location, amount, and usage pattern of card transactions. The company has created more than 280 scenarios based on the customer life cycle and conducts marketing plans that accommodate various customer groups in real time. It operates the Smart Offering System, a highly efficient marketing management system that detects customers' card usage, behavior, and location information in real time and provides further refined services in combination with various apps. This study aims to trace the shift from traditional CRM to the current CRM strategy. Finally, the current CRM strategy is examined through KB Kookmin Card's big data utilization and marketing activities, and a marketing plan for KB Kookmin Card's future CRM strategy is proposed. For the success and continuous growth of the Smart Offering System, KB Kookmin Card should invest in increasingly sophisticated ICT technology and human resources. It also needs to establish, and systematically execute, a strategy for securing profit from a long-term perspective. In particular, given current concerns over privacy violations and personal information leakage, efforts should be made to gain customers' acceptance of marketing that uses their information and to build a corporate image that emphasizes security.
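
As a rough illustration of the kind of real-time scenario matching the Smart Offering System performs, the sketch below matches an incoming card event against a small rule set; the event fields, scenario names, and offers are hypothetical and not taken from the case.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class CardEvent:
    # Hypothetical fields standing in for the transaction attributes the
    # abstract mentions (store, location, amount, usage pattern).
    customer_id: str
    merchant_category: str
    amount: int
    location: str


@dataclass
class Scenario:
    name: str
    condition: Callable[[CardEvent], bool]   # rule that triggers the offer
    offer: str


SCENARIOS = [
    Scenario("coffee_shop_regular",
             lambda e: e.merchant_category == "cafe" and e.amount >= 5000,
             "10% cafe discount coupon"),
    Scenario("travel_spend",
             lambda e: e.merchant_category == "airline",
             "airport lounge voucher"),
]


def match_offers(event: CardEvent) -> list[str]:
    """Return the offers whose scenario conditions the incoming event satisfies."""
    return [s.offer for s in SCENARIOS if s.condition(event)]


if __name__ == "__main__":
    evt = CardEvent("cust-001", "cafe", 6500, "Seoul")
    print(match_offers(evt))   # ['10% cafe discount coupon']
```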

Open Digital Textbook for Smart Education (스마트교육을 위한 오픈 디지털교과서)

  • Koo, Young-Il;Park, Choong-Shik
    • Journal of Intelligence and Information Systems / v.19 no.2 / pp.177-189 / 2013
  • In smart education, digital textbooks play a very important role as the face-to-face medium for learners. Standardization of digital textbooks will promote the industrialization of digital textbooks for content providers and distributors as well as for learners and instructors. This study looks for ways to standardize digital textbooks around the following three objectives. (1) Digital textbooks should take on the role of media for blended learning that supports both online and offline classes, should run on a common EPUB viewer without a special dedicated viewer, and should utilize the existing frameworks for e-learning content and learning management. The reason to consider EPUB as the standard for digital textbooks is that it avoids specifying another standard for the book format and can take advantage of the industrial base of EPUB: standards-rich content and an established distribution structure. (2) Digital textbooks should enable a low-cost open-market service based on currently available standard open software. (3) To provide appropriate learning feedback to students, digital textbooks should provide a foundation that accumulates and manages all learning-activity information on a standard infrastructure for educational big data processing. In this study, a digital textbook in a smart education environment is referred to as an open digital textbook. The components of the open digital textbook service framework are (1) digital textbook terminals such as smart pads, smart TVs, smart phones, and PCs; (2) a digital textbook platform that presents and runs digital content on those terminals; (3) a learning content repository, residing in the cloud, that maintains accredited learning content; (4) an app store through which content developers provide and distribute secondary learning content and learning tools; and (5) an LMS as a learning support/management tool that classroom teachers use to create instructional materials. In addition, locating all of the hardware and software that implement the smart education service in the cloud takes advantage of cloud computing for efficient management and reduced expense. The open digital textbook for smart education can be viewed as providing an e-book-style interface to the LMS for learners. In an open digital textbook, the representation of text, images, audio, video, equations, and so on is a basic function, but painting, writing, and problem solving go beyond the capabilities of a simple e-book. Teacher-to-student, learner-to-learner, and team-to-team communication is also required through the open digital textbook. To represent student demographics, portfolio information, and class information, the standards used in e-learning are desirable. To pass learner tracking information about the learner's activities to the LMS (Learning Management System), the open digital textbook must have a recording function and a function for communicating with the LMS. DRM is a function for protecting various copyrights; currently, e-book DRM is controlled by the corresponding book viewer, so if the open digital textbook accommodates the DRM schemes used by various e-book viewers, the implementation of redundant features can be avoided. Security/privacy functions are required to protect information about study or instruction from third parties. UDL (Universal Design for Learning) is a learning support function for those with disabilities who have difficulty with learning courses. The open digital textbook, which is based on the e-book standard EPUB 3.0, must (1) record learning-activity log information and (2) communicate with the server to support learning activities. While the recording and communication functions, which are not defined in the current standards, can be implemented in JavaScript and used in current EPUB 3.0 viewers, a strategy is needed for proposing these recording and communication functions as part of the next-generation e-book standard or as a special standard (EPUB 3.0 for education). Future research will implement an open-source program based on the proposed open digital textbook standard and present new educational services including big data analysis.
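
The recording and communication functions described above are implemented as JavaScript inside the EPUB 3.0 viewer; purely as an illustration of the kind of activity record and server call involved, the Python sketch below builds a minimal log entry and posts it to a hypothetical LMS endpoint (the field names and URL are assumptions, not part of the proposed standard).

```python
import json
import urllib.request
from datetime import datetime, timezone

# Hypothetical LMS endpoint; the paper implements the equivalent logic in
# JavaScript inside the EPUB 3.0 viewer rather than in Python.
LMS_ENDPOINT = "https://lms.example.org/api/activity-log"


def make_activity_record(learner_id: str, textbook_id: str,
                         page: int, action: str) -> dict:
    """Build a minimal learning-activity log entry (illustrative fields)."""
    return {
        "learner": learner_id,
        "textbook": textbook_id,
        "page": page,
        "action": action,                      # e.g. "open", "highlight", "answer"
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }


def send_to_lms(record: dict) -> int:
    """POST the record to the LMS and return the HTTP status code."""
    req = urllib.request.Request(
        LMS_ENDPOINT,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    rec = make_activity_record("s001", "math-grade7", page=12, action="open")
    print(rec)   # send_to_lms(rec) would transmit it to the (hypothetical) LMS
```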

The Effects on CRM Performance and Relationship Quality of Successful Elements in the Establishment of Customer Relationship Management: Focused on Marketing Approach (CRM구축과정에서 마케팅요인이 관계품질과 CRM성과에 미치는 영향)

  • Jang, Hyeong-Yu
    • Journal of Global Scholars of Marketing Science / v.18 no.4 / pp.119-155 / 2008
  • Customer relationship management (CRM) has become a sustainable competitive edge for many companies. CRM analyzes customer data and customer behavior in order to design and execute targeted marketing and to make decisions about products and services, supported by management information systems. It is critical for companies to acquire and retain profitable customers, and how to manage customer relationships effectively has become an important issue for both academics and practitioners in recent years. However, the existing academic literature and the practical applications of CRM strategies have focused on the technical process and organizational structure of CRM implementation. This limited focus has led to numerous reports of failed CRM projects of various types, and many of these failures are related to the absence of a marketing approach. Identifying success factors and outcomes grounded in the marketing concept before introducing a CRM project is a pre-implementation requirement. Many researchers have attempted to find the factors that contribute to CRM success, but this research is limited in that it does not explain, from a marketing perspective, how marketing-based factors contribute to that success. Understanding how to manage relationships with crucial customers effectively from a marketing perspective has thus become an important topic, yet existing papers do not clearly identify antecedent and outcome factors from that perspective. This paper attempts to validate whether various marketing factors impact relationship quality and CRM performance from a marketing-oriented perspective. More specifically, marketing-oriented factors, namely market orientation, customer orientation, customer information orientation, and core customer orientation, may influence relationship quality (satisfaction and trust) and CRM outcomes (customer retention and customer share). Another major goal of this research is to identify the effect of relationship quality on CRM outcomes, consisting of customer retention and customer share, in order to show the strength of the relationship between the two. A research model was constructed based on a meta-analysis of previous studies, and an empirical study was undertaken to test the hypotheses with data from various companies. Multiple regression analysis and t-tests were employed to test the hypotheses. The reliability and validity of the measurements were tested using Cronbach's alpha coefficient and principal factor analysis, respectively, and seven hypotheses were tested through correlation tests and multiple regression analysis. The first key outcome is a theoretically and empirically sound set of CRM factors (market orientation, customer orientation, customer information orientation, and core customer orientation) from the marketing perspective. The strength of the $\beta$ coefficients among the marketing-based antecedents was not uniform. In particular, the effects of the marketing-based CRM antecedents on customer trust were significantly confirmed, except for core customer orientation: no direct effect of core customer orientation on customer trust was found. This means that customer trust, which is formed firmly through long-term interactions, is not directly linked to core customer orientation; the enduring management of these interactions is probably more important for the successful implementation of CRM. The second key result is that the implementation and operation of a successful CRM process from a marketing perspective have a strong positive association with both relationship quality (customer trust and customer satisfaction) and CRM performance (customer retention and customer share). The final key finding, that relationship quality has a strong positive effect on customer retention and customer share, confirms that improvements in customer satisfaction and trust improve accessibility to customers, provide more consistent service, and ensure value for money within the front office, which results in growth of customer retention and customer share. In particular, customer satisfaction and trust, the main components of relationship quality, are found to be positively related to customer retention and customer share, and the interactive management of these main variables plays a key role in connecting the successful antecedents of CRM with the final outcomes of customer retention and customer share. Based on these results, this paper suggests managerial implications for constructing and executing CRM from a marketing perspective. In general, CRM can be achieved through the recognition of antecedents and outcomes based on the marketing concept. Implementing marketing-concept-oriented CRM involves learning about customers' purchasing habits, opinions, and preferences, profiling individuals and groups to market more effectively and increase sales, and changing the way the company operates to improve customer service and marketing. Benefiting from CRM is not just a question of investing in the right software but of adapting CRM users to the marketing concept, including market orientation, customer orientation, and customer information orientation. No one denies that CRM is a process or methodology used to develop stronger relationships and is composed of many technological components, but thinking about CRM primarily in technological terms is a big mistake. A more useful way to think about and implement CRM is as a process that brings together many pieces of the marketing concept about customers, marketing effectiveness, and market trends. Finally, the real-world setting in which this research was conducted may enable academics and practitioners to understand the antecedents and outcomes from a marketing perspective more clearly.
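
As an illustration of the reliability and regression steps mentioned above, the sketch below computes Cronbach's alpha and OLS beta coefficients on synthetic data; it is not the paper's dataset or its exact analysis.

```python
import numpy as np


def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)


def ols_betas(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Ordinary least squares coefficients (intercept first)."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic scores: 4 antecedents -> customer trust (illustrative only)
    X = rng.normal(size=(200, 4))    # market, customer, info, core-customer orientation
    y = X @ np.array([0.4, 0.3, 0.2, 0.05]) + rng.normal(scale=0.5, size=200)
    # Three survey items measuring one construct, for the reliability check
    scale_items = X[:, :1] + rng.normal(scale=0.3, size=(200, 3))
    print("Cronbach alpha:", round(cronbach_alpha(scale_items), 3))
    print("beta coefficients:", np.round(ols_betas(X, y), 3))
```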


The Implementation of a HACCP System through u-HACCP Application and the Verification of Microbial Quality Improvement in a Small Size Restaurant (소규모 외식업체용 IP-USN을 활용한 HACCP 시스템 적용 및 유효성 검증)

  • Lim, Tae-Hyeon;Choi, Jung-Hwa;Kang, Young-Jae;Kwak, Tong-Kyung
    • Journal of the Korean Society of Food Science and Nutrition / v.42 no.3 / pp.464-477 / 2013
  • There is a great need to develop training programs proven to change behavior and improve knowledge. The purpose of this study was to evaluate employee hygiene knowledge, hygiene practice, and cleanliness before and after HACCP system implementation at one small-size restaurant. The efficiency of the system was analyzed using time-temperature control after implementation of u-HACCP®. Employee hygiene knowledge and practices showed a significant improvement (p<0.05) after HACCP system implementation. In non-heating processes, such as seasoned lettuce, controlling the sanitation of the cooking facility and the chlorination of raw ingredients were identified as the significant CCPs. Sanitizing was an important CCP because total bacteria were reduced by 2~4 log CFU/g after implementation of HACCP. In bean sprouts, microbial levels decreased from 4.20 log CFU/g to 3.26 log CFU/g. There were significant correlations between hygiene knowledge, practice, and microbiological contamination. First, personnel hygiene had a significant correlation with 'total food hygiene knowledge' scores (p<0.05). Second, total food hygiene practice scores had a significant correlation (p<0.05) with improved microbiological quality of the lettuce salad. Third, in the assessment of microbiological quality after 1 month, there were significant (p<0.05) improvements in heating times and in the washing and division processes. After 2 months, on the other hand, microbiological quality was maintained, although only two categories (division process and kitchen floor) were improved. This study also investigated time-temperature control using a ubiquitous sensor network (USN) consisting of a ubi reader (CCP thermometer), a ubi manager (tablet PC), and application software (HACCP monitoring system). Comparison of temperature control before and after introducing the USN showed better thermal management (accuracy, efficiency, and consistency of time control). Based on these results, strict time-temperature control can be an effective method to prevent foodborne illness.
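
As a minimal sketch of the time-temperature monitoring idea behind the u-HACCP system, the code below checks a CCP temperature log against critical limits and monitoring intervals; the limit values and field names are illustrative assumptions, not those of the study.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class TempReading:
    ccp: str                 # e.g. "cold-holding", "heating"
    timestamp: datetime
    celsius: float


# Illustrative critical limits; actual HACCP limits depend on the food item
# and local regulations, and are not taken from the paper.
CRITICAL_LIMITS = {
    "cold-holding": lambda t: t <= 5.0,     # keep chilled foods at or below 5 °C
    "heating":      lambda t: t >= 74.0,    # heat to at least 74 °C core temperature
}


def check_ccp(readings: list[TempReading], max_gap: timedelta) -> list[str]:
    """Flag readings that violate a critical limit or arrive too far apart."""
    alerts = []
    readings = sorted(readings, key=lambda r: r.timestamp)
    for prev, cur in zip(readings, readings[1:]):
        if cur.timestamp - prev.timestamp > max_gap:
            alerts.append(f"monitoring gap before {cur.timestamp:%H:%M}")
    for r in readings:
        if not CRITICAL_LIMITS[r.ccp](r.celsius):
            alerts.append(f"{r.ccp} out of limit: {r.celsius} °C at {r.timestamp:%H:%M}")
    return alerts


if __name__ == "__main__":
    now = datetime(2013, 3, 1, 11, 0)
    log = [TempReading("cold-holding", now, 4.2),
           TempReading("cold-holding", now + timedelta(hours=3), 7.8)]
    print(check_ccp(log, max_gap=timedelta(hours=2)))
```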

Image Watermarking for Copyright Protection of Images on Shopping Mall (쇼핑몰 이미지 저작권보호를 위한 영상 워터마킹)

  • Bae, Kyoung-Yul
    • Journal of Intelligence and Information Systems / v.19 no.4 / pp.147-157 / 2013
  • With the advent of a digital environment that can be accessed anytime and anywhere thanks to high-speed networks, the free distribution and use of digital content have become possible. Ironically, this environment also gives rise to various forms of copyright infringement, and product images used in online shopping malls are frequently pirated. Whether shopping mall images are creative works is controversial. According to a Supreme Court decision in 2001, advertising photographs of ham products were judged to be mere reproductions of the appearance of the objects, intended only to convey the product and not to be creative expression; nevertheless, the photographer's losses were recognized, and damages were estimated based on the typical cost of the advertising photo shoot. According to a Seoul District Court precedent in 2003, if the photographer's personality and creativity appear in the selection of the subject, the composition of the set, the direction and amount of light, the camera angle, the shutter speed and timing, other shooting methods, and the developing and printing process, the work should be protected by copyright law. For shopping mall images to receive copyright protection under the law, they must not simply convey the state of the product; effort is required so that the photographer's personality and creativity can be recognized. Accordingly, the cost of producing mall images increases, and the need for copyright protection grows. Product images in online shopping malls have a very distinctive composition, unlike general pictures such as portraits and landscape photos, so generic image watermarking techniques cannot satisfy their watermarking requirements. Because the background of product images commonly used in shopping malls is white, black, or a grayscale gradient, it is difficult to use that space to embed a watermark, and the area is very sensitive to even slight changes. In this paper, the characteristics of images used in shopping malls are analyzed and a watermarking technique suitable for shopping mall images is proposed. The proposed technique divides a product image into small blocks, transforms each block with the DCT (Discrete Cosine Transform), and then inserts the watermark information by quantizing the DCT coefficients. Because uniform quantization of the DCT coefficients causes visible blocking artifacts, the proposed algorithm uses a weighted mask that quantizes the coefficients near block boundaries finely and those in the center of the block coarsely. This mask improves the subjective visual quality as well as the objective quality of the images. In addition, to improve the security of the algorithm, the blocks in which the watermark is embedded are selected randomly, and a turbo code is used to reduce the BER when extracting the watermark. The PSNR (Peak Signal-to-Noise Ratio) of shopping mall images watermarked by the proposed algorithm is 40.7~48.5 dB, and the BER (Bit Error Rate) after JPEG compression with QF = 70 is 0. This means the watermarked image is of high quality and the algorithm is robust to the JPEG compression generally used at online shopping malls. Also, for a 40% change in size and 40 degrees of rotation, the BER is 0. In general, shopping malls use compressed images with QF higher than 90. Because pirated images are replicated from the original image, the proposed algorithm can identify copyright infringement in most cases. As shown in the experimental results, the proposed algorithm is suitable for shopping mall images with a simple background. However, future work should enhance the robustness of the proposed algorithm, because some robustness is lost after the masking process.
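
A minimal sketch of quantization-based watermark embedding in block DCT coefficients is shown below; it embeds one bit per 8x8 block by quantization index modulation and omits the paper's weighted mask, random block selection, and turbo coding, with the step size and coefficient position chosen only for illustration.

```python
import numpy as np
from scipy.fftpack import dct, idct


def dct2(block: np.ndarray) -> np.ndarray:
    """2-D DCT of a block (orthonormal)."""
    return dct(dct(block.T, norm="ortho").T, norm="ortho")


def idct2(coeffs: np.ndarray) -> np.ndarray:
    """Inverse 2-D DCT."""
    return idct(idct(coeffs.T, norm="ortho").T, norm="ortho")


def embed_bit(block: np.ndarray, bit: int, step: float = 12.0,
              pos: tuple[int, int] = (2, 1)) -> np.ndarray:
    """Embed one watermark bit in an 8x8 block by quantizing a mid-frequency
    DCT coefficient onto a level whose parity encodes the bit."""
    c = dct2(block.astype(float))
    q = np.round(c[pos] / step)
    if int(q) % 2 != bit:           # shift to a level with the right parity
        q += 1
    c[pos] = q * step
    return idct2(c)


def extract_bit(block: np.ndarray, step: float = 12.0,
                pos: tuple[int, int] = (2, 1)) -> int:
    """Recover the embedded bit from the coefficient's quantization parity."""
    c = dct2(block.astype(float))
    return int(np.round(c[pos] / step)) % 2


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    block = rng.integers(0, 256, size=(8, 8))
    marked = embed_bit(block, bit=1)
    print(extract_bit(marked))      # 1
```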

An Ontology Model for Public Service Export Platform (공공 서비스 수출 플랫폼을 위한 온톨로지 모형)

  • Lee, Gang-Won;Park, Sei-Kwon;Ryu, Seung-Wan;Shin, Dong-Cheon
    • Journal of Intelligence and Information Systems / v.20 no.1 / pp.149-161 / 2014
  • The export of domestic public services to overseas markets faces many potential obstacles stemming from differences in export procedures, target services, and socio-economic environments. To alleviate these problems, a business incubation platform, as an open business ecosystem, can be a powerful instrument to support the decisions of participants and stakeholders. In this paper, we propose an ontology model and its implementation processes for a business incubation platform with an open and pervasive architecture to support public service exports. For the conceptual model of the platform ontology, export case studies are used for requirements analysis. The conceptual model shows the basic structure, with a vocabulary and its meaning, the relationships between ontologies, and key attributes. For the implementation and test of the ontology model, the logical structure is edited using the Protégé editor. The core engine of the business incubation platform is the simulator module, where the various contexts of export businesses should be captured, defined, and shared with other modules through ontologies. It is well known that an ontology, with which concepts and their relationships are represented using a shared vocabulary, is an efficient and effective tool for organizing meta-information to develop structural frameworks in a particular domain. The proposed model consists of five ontologies derived from a requirements survey of major stakeholders and their operational scenarios: service, requirements, environment, enterprise, and country. The service ontology contains several components that can find and categorize public services through a case analysis of public service exports. The key attributes of the service ontology comprise the categories objective, requirements, activity, and service. The objective category, which has sub-attributes including the operational body (organization) and the user, acts as a reference for searching and classifying public services. The requirements category relates to the functional needs at a particular phase of system (service) design or operation; its sub-attributes are user, application, platform, architecture, and social overhead. The activity category represents business processes during the operation and maintenance phase and has the sub-attributes facility, software, and project unit. The service category, with sub-attributes such as target, time, and place, acts as a reference for sorting and classifying public services. The requirements ontology is derived from the basic and common components of public services and target countries. Its key attributes are business, technology, and constraints: business requirements represent the needs of the processes and activities for public service export; technology represents the technological requirements for operating the public services; and constraints represent the business laws, regulations, or cultural characteristics of the target country. The environment ontology is derived from case studies of target countries for public service operation. Its key attributes are user, requirements, and activity. A user includes stakeholders in the public service, from citizens to operators and managers; the requirements attribute represents the managerial and physical needs during operation; and the activity attribute represents the business processes in detail. The enterprise ontology is introduced from a previous study, and its attributes are activity, organization, strategy, marketing, and time. The country ontology is derived from the demographic and geopolitical analysis of the target country, and its key attributes are economy, social infrastructure, law, regulation, customs, population, location, and development strategies. The priority list of target services for a certain country and/or the priority list of target countries for a certain public service are generated by a matching algorithm. These lists are used as input seeds to simulate consortium partners and government policies and programs. In the simulation, the environmental differences between Korea and the target country can be customized through a gap analysis and a work-flow optimization process. When the process gap between Korea and the target country is too large for a single corporation to cover, a consortium is considered as an alternative, and various alternatives are derived from the capability indices of the enterprises. For financial packages, a mix of various foreign aid funds can be simulated at this stage. It is expected that the proposed ontology model and the business incubation platform can be used by various participants in the public service export market. They could be especially beneficial to small and medium businesses that have relatively fewer resources and less experience with public service exports. We also expect that the open and pervasive service architecture in a digital business ecosystem will help stakeholders find new opportunities through information sharing and collaboration on business processes.
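
Purely as an illustration of how a matching algorithm could rank target countries for a public service using ontology attributes, the sketch below scores countries by how many of a service's (much simplified) technology and constraint requirements they cover; the attribute sets and scoring rule are assumptions, not the platform's actual algorithm.

```python
from dataclasses import dataclass, field


@dataclass
class PublicService:
    name: str
    # Simplified stand-ins for the requirements ontology: technology needs
    # and regulatory constraints the target country must satisfy.
    required_tech: set[str] = field(default_factory=set)
    constraints: set[str] = field(default_factory=set)


@dataclass
class Country:
    name: str
    available_tech: set[str] = field(default_factory=set)
    satisfied_constraints: set[str] = field(default_factory=set)


def match_score(service: PublicService, country: Country) -> float:
    """Fraction of the service's requirements the country already covers."""
    needs = service.required_tech | service.constraints
    covered = ((service.required_tech & country.available_tech)
               | (service.constraints & country.satisfied_constraints))
    return len(covered) / len(needs) if needs else 1.0


def priority_list(service: PublicService, countries: list[Country]) -> list[tuple[str, float]]:
    """Rank candidate target countries for one public service."""
    return sorted(((c.name, match_score(service, c)) for c in countries),
                  key=lambda t: t[1], reverse=True)


if __name__ == "__main__":
    svc = PublicService("e-customs", {"broadband", "pki"}, {"data-privacy-law"})
    countries = [Country("A", {"broadband"}, set()),
                 Country("B", {"broadband", "pki"}, {"data-privacy-law"})]
    print(priority_list(svc, countries))   # B ranks above A
```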

Effective 3-D GPR Survey for the Exploration of Old Remains (유적지 발굴을 위한 효율적 3차원 GPR 탐사)

  • Kim, Jung-Ho;Yi, Myeong-Jong;Son, Jeong-Sul;Cho, Seong-Jun;Park, Sam-Gyu
    • Geophysics and Geophysical Exploration / v.8 no.4 / pp.262-269 / 2005
  • Since buried cultural relics are three-dimensional (3-D) objects, a 3-D survey is preferable in archaeological exploration. A 3-D ground penetrating radar (GPR) survey, which in principle relies on very dense data, may however require much higher cost and longer exploration time than other geophysical methods commonly used in archaeological exploration, such as magnetic and electromagnetic methods. We developed a small-scale continuous data acquisition system consisting of two sets of GPR antennas and a precise positioning device that automatically and continuously tracks the moving path of the GPR antennas. Since the high cost of field work is partly attributable to establishing many profile lines, we adopted a concept of data acquisition at arbitrary locations rather than along pre-established profile lines. Besides this hardware, we also developed several software packages to effectively process and visualize the 3-D data obtained with the developed system and acquisition concept. Using the developed system, we performed a 3-D GPR survey to investigate possible historical remains of the Baekje Kingdom at Buyeo, South Korea, prior to excavation. Owing to the newly devised system, we could obtain 3-D GPR data covering an area of about $17,000 m^2$ within only six hours of field work. Although the GPR data were acquired at random locations rather than along pre-established profile lines, we obtained high-resolution 3-D images showing many distinctive anomalies, which could be interpreted as old agricultural lands, waterways, and artificial structures or remains. This case history led us to conclude that the 3-D GPR method is very useful not only for examining a small anomalous area but also for investigating a wider region of archaeological interest.
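
As an illustration of the gridding step implied by acquiring data at arbitrary locations, the sketch below interpolates amplitudes from randomly positioned traces onto a regular grid for one time slice using SciPy; it is a generic example, not the authors' processing software.

```python
import numpy as np
from scipy.interpolate import griddata

# Illustrative only: interpolate GPR amplitudes recorded at arbitrary antenna
# positions onto a regular grid, one depth (time) slice at a time.
rng = np.random.default_rng(0)
n_traces = 500
x = rng.uniform(0, 100, n_traces)                 # easting of each trace (m)
y = rng.uniform(0, 100, n_traces)                 # northing of each trace (m)
amplitude = np.sin(x / 10.0) * np.cos(y / 15.0)   # stand-in for one time slice

grid_x, grid_y = np.meshgrid(np.arange(0, 100, 0.5),
                             np.arange(0, 100, 0.5))
slice_grid = griddata((x, y), amplitude, (grid_x, grid_y), method="linear")

# Stacking such slices for every time sample yields a regular 3-D volume
# that can be rendered as depth slices or isosurfaces.
print(slice_grid.shape)    # (200, 200)
```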

Image Analysis of Electrophoresis Gels by Using Region Growing with Multiple Peaks (다중 피크의 영역 성장 기법에 의한 전기영동 젤의 영상 분석)

  • 김영원;전병환
    • Journal of KIISE: Software and Applications / v.30 no.5_6 / pp.444-453 / 2003
  • Recently, interest in biotechnology (BT) has grown, and image analysis techniques for electrophoresis gels are in high demand for analyzing genetic information and searching for new bioactive materials. For this purpose, the location and quantity of each band in a lane should be measured. Most existing techniques search for peaks in the profile of a lane. However, a peak is a poor representative of a band, because its location corresponds to neither the brightest pixel nor the center of gravity. These approaches are also poorly suited to measuring band quantity, because various enhancement processes are commonly applied to the original images to make peak extraction easier. In this paper, we measure the accumulated brightness of each band region, extracted without any process that changes relative brightness, as the band quantity, and calculate the center of gravity of the region as the band location. We first extract lanes with an entropy-based threshold calculated on the gel-image histogram, and then three methods are proposed and applied to extract bands. In the MER method, peaks and valleys are searched for along a vertical line that bisects each lane, and the minimum enclosing rectangle of each band is set between two successive valleys. In the RG-1 method, each band is extracted by region growing with a peak as the seed, separating overlapping neighboring bands. In the RG-2 method, peaks and valleys are searched for along two vertical lines that trisect each lane; the left and right peaks may be paired if they appear to belong to the same band, and each band region is then grown from one peak, or from both peaks if both exist. To compare the three methods, we measured the location and amount of the bands. The average errors in band location for MER, RG-1, and RG-2 were 6%, 3%, and 1%, respectively, when the lane length is normalized to a unit value, and the average errors in band amount were 8%, 5%, and 2%, respectively, when the sum of band amounts is normalized to a unit value. In conclusion, RG-2 proved the most reliable in measuring band location and amount.
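
As a simplified, one-dimensional illustration of the measurement idea, the sketch below grows a band region outward from a peak until a valley is reached, then reports the brightness-weighted center of gravity as the band location and the accumulated brightness as the band quantity; the paper's RG methods operate on 2-D lane regions, not a single profile.

```python
import numpy as np


def grow_band(profile: np.ndarray, seed: int) -> tuple[int, int]:
    """Grow a 1-D band region outward from a peak until brightness stops
    decreasing (i.e. a valley is reached)."""
    lo = hi = seed
    while lo > 0 and profile[lo - 1] <= profile[lo]:
        lo -= 1
    while hi < len(profile) - 1 and profile[hi + 1] <= profile[hi]:
        hi += 1
    return lo, hi


def band_measurements(profile: np.ndarray, seed: int) -> tuple[float, float]:
    """Band location = brightness-weighted center of gravity of the region;
    band quantity = accumulated brightness within the region."""
    lo, hi = grow_band(profile, seed)
    region = profile[lo:hi + 1]
    positions = np.arange(lo, hi + 1)
    quantity = float(region.sum())
    location = float((positions * region).sum() / quantity)
    return location, quantity


if __name__ == "__main__":
    y = np.array([1, 2, 8, 12, 9, 3, 1, 2, 6, 10, 7, 2], dtype=float)
    print(band_measurements(y, seed=3))   # centroid ≈ 3.08, quantity = 36.0
```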