• Title/Summary/Keyword: Checks


Restoration planning of the Seoul Metropolitan area, Korea toward eco-city

  • Lee, Chang Seok
    • Proceedings of the Korea Society of Environmental Biology Conference
    • /
    • 2003.06a
    • /
    • pp.1-5
    • /
    • 2003
  • In order to prepare a basis for ecological restoration of the Seoul Metropolitan area, ecological diagnoses of soil physico-chemical properties and vegetation structure were carried out. Land use patterns, actual vegetation, and biotope patterns were also investigated based on aerial photograph interpretation and field checks. I formulated landscape elements by overlaying those data and evaluated the ecological value of each element. Soil pollution was evaluated by analyzing soil samples collected in each grid of a mesh map divided at 2km $\times$ 2km intervals. Soil samples were collected in forests or grasslands free from direct human interference. Soil pollution, evaluated from the pH and the SO$_4$, Ca, Mg, and Al contents of the soil, was more severe in the urban outskirts than in the urban center. Those soil environmental factors showed significant correlations with each other. Vegetation in the urban area differed in species composition from that in suburban areas and showed lower diversity. The successional process, investigated through the population structure of major species, also differed: the successional trend was normal in suburban areas but retrogressive in urban areas. The landscape ecological map of Seoul indicates that the urban center lacks vegetation and that greenery space is restricted to the urban outskirts. Such an uneven distribution of vegetation has produced a specific urban climate, thereby aggravating air and soil pollution and, in turn, causing vegetation decline. From this result, it was estimated that the uneven distribution of vegetation acted as a trigger that deteriorated the urban environment. I therefore suggested a restoration plan based on landscape ecological principles, which emphasizes connectivity and an even distribution of green areas throughout the whole of Seoul, to solve this complex environmental problem.
In this restoration plan, I first determined the priority order for connecting the fragmented greenery spaces based on their distances from the core reserves, comprised of the green belt and rivers, which serve as wildlife habitats as well as improving the urban environment. Next, on the basis of this priority order, I prepared methods to restore each landscape element included in the paths of the green network to be constructed. Rivers and roads, which provide good connectivity, were chosen as the elements to play important roles in constructing the green network by linking the fragmented greenery spaces.
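The prioritization step described above can be sketched minimally: fragmented green patches are ranked for connection by their distance to the nearest core reserve (green belt or river). The coordinates and names below are hypothetical, not data from the study.

```python
import math

def connection_priority(patches, reserves):
    """patches/reserves: dicts of name -> (x, y). Returns patch names,
    nearest-to-a-core-reserve first (highest connection priority)."""
    def dist_to_nearest_reserve(p):
        return min(math.dist(patches[p], r) for r in reserves.values())
    return sorted(patches, key=dist_to_nearest_reserve)
```

For example, a patch 1 km from a river ranks ahead of one 7 km away, matching the paper's distance-based priority ordering.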

  • PDF

Effective Normalization Method for Fraud Detection Using a Decision Tree (의사결정나무를 이용한 이상금융거래 탐지 정규화 방법에 관한 연구)

  • Park, Jae Hoon;Kim, Huy Kang;Kim, Eunjin
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.25 no.1
    • /
    • pp.133-146
    • /
    • 2015
  • Ever more sophisticated e-finance fraud techniques have led to an increasing number of reported phishing incidents. Financial authorities, in response, have recommended that the existing Fraud Detection Systems (FDS) of banks and other financial institutions be enhanced. An FDS is a system designed to prevent e-finance accidents through real-time access and validity checks on client transactions. The effectiveness of an FDS depends largely on how fast it can analyze and detect abnormalities in large amounts of customer transaction data. In this study we detect fraudulent transaction patterns and establish detection rules through analyses of e-finance accident data. Abnormalities are flagged by comparing individual client transaction patterns with client profiles, using the ruleset. We propose an effective flagging method that uses decision trees to normalize the detection rules. As a demonstration, we extracted customer usage patterns, customer profile information, and detection rules from the e-finance accident data of an actual domestic (Korean) bank. We then compared the results of our decision-tree-normalized detection rules with those of sequential detection and confirmed the efficiency of our method.
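A minimal sketch of the idea in this abstract: detection rules normalized into a decision tree, so each transaction attribute is tested at most once along a path instead of every rule being re-checked sequentially. The attribute names, thresholds, and labels below are hypothetical, not taken from the paper.

```python
def classify(tree, tx):
    """Walk the tree: an internal node is (attribute, threshold,
    low_branch, high_branch); a leaf is a plain label string."""
    while isinstance(tree, tuple):
        attr, threshold, low, high = tree
        tree = low if tx[attr] < threshold else high
    return tree

# Hypothetical normalized ruleset expressed as a decision tree.
fraud_tree = (
    "amount", 1_000_000,      # first split: transaction amount (KRW)
    "normal",                 # small amounts pass
    ("new_device", 1,         # large amounts: check device-novelty flag
     "suspicious",
     "fraud"),
)
```

Evaluating `classify(fraud_tree, {"amount": 2_000_000, "new_device": 1})` follows the amount branch and then the device branch, touching each attribute once, which is the speed advantage over sequential rule checking.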

Design and Implementation of Priority Retrieval Technique based on SIF (SIF기반 우선순위 검색기법의 설계 및 구현)

  • Lee, Eun-Sik;Cho, Dae-Soo
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.14 no.11
    • /
    • pp.2535-2540
    • /
    • 2010
  • In a traditional Publish/Subscribe system, an event is delivered from publisher to subscriber in three steps: the publisher publishes its event to the broker; the broker then checks a simple binary notion of matching, in which an event either matches a subscription or it does not; lastly, the broker delivers the event to the subscribers whose subscriptions it matched. In this system, information delivery is accomplished in one direction only. However, some current applications require two-way delivery between subscriber and publisher. Therefore, we introduce an extended Publish/Subscribe system that supports two-way delivery. The extended Publish/Subscribe system requires the additional functions of delivering subscriptions to the publisher and, in particular, of deciding the top-n subscriptions by priority, because the broker may hold a large number of subscriptions. In this paper, we propose two priority retrieval techniques based on SIF (Specific Interval First), which use an IS-List and decide priority among subscriptions by the SIF criterion. The performance measurements show that the RSO (resulting set sorting) technique performs better in index creation time, while the ITS&IS (insertion time sorting and inverse search using stack) technique performs better in search time.
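A hedged sketch of the broker-side priority step described above. "Specific Interval First" is interpreted here as: a subscription whose value interval is narrower is more specific and therefore ranks higher, and the broker returns only the top-n matches. The data layout and tie-breaking below are assumptions, not the paper's IS-List structures.

```python
def top_n_subscriptions(event_value, subs, n):
    """subs: list of (subscriber_id, low, high) interval subscriptions.
    Returns the ids of the n most specific subscriptions matching the event."""
    matched = [(high - low, sid) for sid, low, high in subs
               if low <= event_value <= high]
    matched.sort()                    # narrowest (most specific) interval first
    return [sid for _, sid in matched[:n]]
```

With subscriptions [0,100], [40,60], and [45,55], an event value of 50 matches all three, but only the two narrowest intervals survive a top-2 query.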

An Improved Signature Hashing Algorithm for High Performance Network Intrusion Prevention System (고성능 네트워크 침입방지시스템을 위한 개선된 시그니처 해싱 알고리즘)

  • Ko, Joong-Sik;Kwak, Hu-Keun;Wang, Jeong-Seok;Kwon, Hui-Ung;Chung, Kyu-Sik
    • The KIPS Transactions:PartC
    • /
    • v.16C no.4
    • /
    • pp.449-460
    • /
    • 2009
  • The signature hashing algorithm [9] provides fast pattern matching for a network IPS (Intrusion Prevention System) using a hash table. It selects 2 bytes from each signature rule and links the rule into the hash table by the hash value. It has the advantage of improved performance because it reduces the number of rules inspected during pattern matching. However, it suffers a performance drop if the number of rules with the same hash value increases, which happens when the number of rules is large and the correlation among rules is strong. In this paper, we propose a method that distributes all rules evenly across the hash table, independent of the number of rules and the correlation among them, to overcome this disadvantage. In the proposed method, before a new rule is linked to a hash value in the hash table, we check whether a rule is already linked to the same hash value. If no rule is assigned, the new rule is linked to that hash value; otherwise, the proposed method recalculates a hash value to place the rule in another position. We implemented the proposed method as a Linux module on a PC and performed experiments using Iperf as a network performance measurement tool. The signature hashing method shows a performance drop when the number of rules is large and the correlation among rules is strong, but the proposed method shows no performance drop regardless of the number of rules and the correlation among them.
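The collision-avoiding placement described in this abstract can be sketched as follows: before linking a rule to a hash slot, check whether the slot is occupied; if so, recompute the hash from a different 2-byte window of the signature so that rules spread evenly. The specific hash function and window-probing order below are assumptions, not the paper's implementation.

```python
def place_rules(rules, table_size):
    table = {}   # slot -> list of rules; shorter lists mean faster matching
    for rule in rules:                     # rule: signature pattern as bytes
        # candidate slots: one per 2-byte window of the signature
        candidates = [((rule[i] << 8) | rule[i + 1]) % table_size
                      for i in range(len(rule) - 1)]
        # prefer an empty slot; otherwise fall back to the first candidate
        slot = next((c for c in candidates if c not in table), candidates[0])
        table.setdefault(slot, []).append(rule)
    return table
```

Two rules sharing their first 2 bytes would collide under the original scheme; here the second rule probes its next window and lands in a separate slot, keeping every per-slot rule list short.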

ARQ Packet Error Control Scheme Using Multiple Threads Based on MMT Protocol (MMT 프로토콜 기반의 다중쓰레드를 활용한 ARQ 패킷 오류 제어 기법)

  • Won, Kwang-eun;Ahn, Eun-bin;Kim, Ayoung;Lee, Hong-rae;Seo, Kwang-deok
    • Journal of Broadcast Engineering
    • /
    • v.23 no.5
    • /
    • pp.682-692
    • /
    • 2018
  • In this paper, we propose an ARQ packet error control scheme that uses multiple threads to deliver massive-capacity multimedia based on the MMT (MPEG Media Transport) protocol. On the sending side, each frame of the video is packetized into MMT packets according to the MMT protocol. The packet header stores the sequence number of the frame contained in the packet and its presentation time information; the packet payload stores the data that make up the frame. The generated MMT packets are transmitted over the IP network. The receiving side checks whether any error occurred in the received packets. For any identified error, it controls the error through the ARQ scheme and reconfigures the frame according to the information stored in the header of the received packet. Here, a multi-threaded transport design is constructed so that each thread handles a single frame, which increases the transmission efficiency for massive-capacity multimedia. The efficiency of the multi-threaded transport method is verified by resolving the problems that arise when packets with errors are retransmitted using a single-threaded approach.
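A hedged sketch of the multi-threaded ARQ idea above: one thread per frame, each retransmitting its packet until the "receiver" accepts it. The lossy channel is simulated deterministically here (the first attempt of every odd-numbered frame fails), and the MMT header is reduced to a bare sequence number; both are assumptions for illustration.

```python
import threading

def deliver_frame(seq, payload, received, lock, attempts):
    while True:
        attempts[seq] = attempts.get(seq, 0) + 1
        error = (seq % 2 == 1) and attempts[seq] == 1  # simulated packet error
        if not error:                                  # ACK: stop retransmitting
            with lock:
                received[seq] = payload                # reassemble by sequence no.
            return

def send_frames(frames):
    received, attempts, lock = {}, {}, threading.Lock()
    threads = [threading.Thread(target=deliver_frame,
                                args=(i, f, received, lock, attempts))
               for i, f in enumerate(frames)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return [received[i] for i in sorted(received)], attempts
```

Each thread touches only its own entry in `attempts`, so retransmission bookkeeping for one frame never blocks delivery of the others, which is the advantage over a single-threaded sender that stalls on every retransmission.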

Applicability of Continuous Process Using Saturated and Superheated Steam for Boxed Heart Square Timber Drying (대단면 수심정각재 건조를 위한 포화-과열증기 연속 건조 공정의 이용가능성 평가)

  • PARK, Yonggun;CHUNG, Hyunwoo;KIM, Hyunbin;YEO, Hwanmyeong
    • Journal of the Korean Wood Science and Technology
    • /
    • v.48 no.2
    • /
    • pp.121-135
    • /
    • 2020
  • This study aims to evaluate the applicability of a continuous drying process using saturated and superheated steam for large-square timber. During drying of the boxed heart square timber, changes in moisture content were examined through slices of the surface, inner, and core layers. The results showed a large moisture content difference between the surface and inner layers during saturated steam drying, and between the inner and core layers during superheated steam drying. However, despite the moisture content differences between the layers, no surface check occurred, and internal checks occurred only near the pith or juvenile parts of the wood. The maximum drying stress of the dried larch boxed heart square timber, calculated from the elastic strain of the slice and the tangential elastic modulus of larch, was 1.30 MPa. The tangential tensile strength of larch was estimated at 5.21 MPa under the temperature and moisture content conditions at which the drying stress was at its maximum. That is, in the continuous drying process, the saturated and superheated steam did not generate surface checks because the drying stress of the wood did not exceed the tangential tensile strength. In further studies, the superheated steam drying conditions will need to be relaxed to suppress the occurrence of internal checks. Such studies would make the continuous drying process using saturated and superheated steam available for drying large-square timber.

The method of development for enhancing reliability of missile assembly test set (유도탄 점검 장비의 신뢰성 향상을 위한 개발 방법)

  • Koh, Sang-Hoon;Han, Seok-Choo;Lee, Kye-Shin;Lee, You-Sang;Kim, Young-Kuk;Park, Dong-Hyun
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.19 no.8
    • /
    • pp.37-43
    • /
    • 2018
  • When faults are detected while inspecting missiles with the missile assembly test set (MATS), the developer isolates the failure, solves the problem, and then resumes testing. In order to identify faults, it is necessary to analyze the data coming from the equipment, but the information received may not be sufficient, depending on the inspection environment. In this case, the developer repeats the test until the problem is reproduced, or checks the performance of each piece of equipment related to the fault. When this task is added, schedule management becomes problematic and development costs rise. To solve this problem, the MATS needs to be designed systematically to increase fault coverage while satisfying the required reliability. By designing the necessary processes for each procedure, it is possible to reduce the fault identification time when a fault is detected during operations. However, 100% fault coverage cannot be guaranteed, so we provide an alternative method by comparing costs and effects. This paper describes a development method to enhance the reliability of the missile assembly test set, the effects expected when it is adopted, and the limitations of the method.

GRINDING OPTIMIZATION MODEL FOR NANOMETRIC SURFACE ROUGHNESS FOR ASPHERIC ASTRONOMICAL OPTICAL SURFACES (천체망원경용 비구면 반사경 표면조도 향상을 위한 최적연삭변수 수치결정모델)

  • Han, Jeong-Yeol;Kim, Sug-Whan;Kim, Geon-Hee;Han, In-Woo;Yang, Sun-Choel
    • Journal of Astronomy and Space Sciences
    • /
    • v.22 no.1
    • /
    • pp.13-20
    • /
    • 2005
  • Bound abrasive grinding is used in the initial fabrication phase of precision aspheric mirrors for both space-based and ground-based astronomical telescopes. We developed a new grinding optimization process that determines the input grinding variables for a target surface roughness, checks the magnitude of the grinding error in the resulting surface roughness, and minimizes the required machining time. Using the machining data collected from previous grinding runs and fed into a multivariable regression engine, the process has an evolving controllability that suggests the optimum set of grinding variables for each target surface roughness. The process model was then used for ten grinding experiments, which resulted in a grinding accuracy of $-0.906{\pm}3.38(\sigma)\;nm$ (Ra) for target surface roughnesses of the Zerodur substrate ranging from 96.1 nm (Ra) to 65.0 nm (Ra). The results imply that the quantitative process optimization technique developed in this study minimizes the machining time and offers nanometric surface roughness controllability superior to the traditional, qualitative, craftsman-based grinding process for astronomical optical surfaces.
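A one-variable sketch of the optimization loop described above: fit a regression from past grinding runs, then invert it to suggest the setting for a target roughness. The paper uses multivariable regression over several grinding variables; here a single hypothetical variable (abrasive grit size) stands in, and all numbers are illustrative.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def suggest_setting(target_roughness, history):
    """history: (grit size, measured roughness in nm Ra) pairs from past runs.
    Returns the grit size the fitted model predicts for the target."""
    xs, ys = zip(*history)
    a, b = fit_line(xs, ys)
    return (target_roughness - b) / a    # invert the fitted model
```

As more runs are appended to `history`, the fit is recomputed, mirroring the "evolving controllability" the abstract describes.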

A study on the O2O Commerce Business Process with Business Model Canvas

  • PARK, Hyun-Sung
    • Journal of Distribution Science
    • /
    • v.18 no.5
    • /
    • pp.89-98
    • /
    • 2020
  • Purpose: The growth of online commerce is now becoming a major threat and a new opportunity for retailers. Existing offline retailers struggle to cope with the threats of new online retailers by utilizing their offline infrastructure, while online retailers extend their online strengths to offline sales by opening offline stores. Many retailers are paying close attention to the O2O business and the resulting changes. Thus, this research focuses on the O2O business models and processes that retailers can adopt. Research design, data and methodology: Considering the features of the products that retailers sell, this paper divides the O2O business process by the following criteria: delivery lead-time and delivery area. This research uses the business model canvas to define the features of the O2O commerce business process, applying its nine key elements to analyze the structure of the O2O commerce business. Results: This paper suggests delivery models by which retailers can respond to offline customer orders and summarizes the following results. (1) Considering characteristics such as logistics process, delivery area, and product type, we define three O2O business models: the wide-area (warehouse) based model, the regional-area (store) based model, and the time-separated model. (2) This study checks the viability of these business models through business cases. (3) This study also analyzes the O2O business models of domestic retail companies using the factors defined in the business model canvas. Conclusions: Retailers can adopt the O2O business process that fits their business requirements and strategy. Online retailers who deal mainly with general consumer products follow the wide-area based O2O business model, which suits retailers who manage inventory centrally.
The time-separated O2O business model can be a good solution for fresh food retailers to operate the logistics process efficiently. To shorten the delivery lead-time of fresh foods, the regional-area based O2O business model fits retailers that utilize their offline logistics or sales infrastructure. It may be much more important for retailers to share inventory information with other branches and to change the role of offline stores.

Measurement of External Leakage Dose from a Mobile Health Screening Vehicle (이동건강검진차량에서 외부의 누설선량 측정)

  • Han, Beom-Hee;Han, Sang-Hyun;Mo, Eun-Hee;Kim, Chong-Yeal
    • The Journal of the Korea Contents Association
    • /
    • v.15 no.3
    • /
    • pp.192-198
    • /
    • 2015
  • Mobile health screening vehicles expose patients and practitioners to radiation, and radiation doses outside the vehicle also increase, yet investigations of such leakage radiation remain scarce. In this study, the measurements gave the following results. Leakage doses appeared at various locations: $1.14{\pm}1.75mR/h$ at the right door, $0.65{\pm}1.25mR/h$ at the top near the X-ray generator, $0.91{\pm}1.25mR/h$ at the bottom away from the X-ray generator, and $96.98{\pm}158.88mR/h$ at the upper part adjacent to the rear detector. Among the measurement positions, the position at the rear of the adjacent detector, at $67.48{\pm}97.03mR/h$, showed the highest leakage dose. Although the safety regulations for diagnostic radiation generators specify a maximum leakage dose per week, the measured hourly leakage doses were not negligible. Therefore, the shielding standards for leakage radiation outside mobile health screening vehicles need to be re-evaluated, and because the hourly dose increases with X-ray use in such vehicles, appropriate protective measures should be selected according to those standards.