• Title/Summary/Keyword: Performance based standard

Search Results: 2,658

Factors Affecting the Survivals of Out-of-hospital Cardiac Arrests by 119 Fire Service (119구급대원의 심폐소생술 성적 분석 - 병원전 심정지를 중심으로 -)

  • Kang, Byung-Woo
    • The Korean Journal of Emergency Medical Services, v.9 no.2, pp.111-128, 2005
  • Background: Cardiac arrest is one of the most critical conditions and is likely to lead to severe cerebral disability or brain death when circulation is not restored within 10 minutes. Saving out-of-hospital cardiac arrest cases has become a recent concern in Korea. Resuscitation has become an important multidisciplinary branch of medicine, demanding a spectrum of skills and attracting a plethora of specialities and organizations. The best survival can be achieved if all of the following links are optimized: rapid access, early CPR, early defibrillation, and early ACLS. Since the "Utstein Style" was advocated in 1991, many reports about out-of-hospital cardiac arrest have been published based on this guideline, but differences in reporting prevent valid inter-hospital and international comparisons. It is also not known how effective resuscitation has been for patients; in other words, there are no guidelines for reviewing, reporting, and conducting research on resuscitation in Korea. This dissertation aims to provide basic data for a unified reporting guideline for resuscitation in Korea and to evaluate the out-of-hospital factors associated with survival to discharge after out-of-hospital cardiac arrest. Methods: This study used data on out-of-hospital cardiac arrests collected in four areas from January 2005 to April 2005. In this retrospective study, 174 cases were analyzed. The data were recorded based on the out-of-hospital Utstein Style. Results: Resuscitation was performed on 174 out-of-hospital cardiac arrest cases in the four areas, and 14 patients (8.1%) recovered spontaneous circulation. Overall, the ROSC rate of the out-of-hospital cardiac arrest patients was 8.1%, poorer than that of Western countries. The gender distribution was 50 females (28.7%) and 124 males (71.3%), approximately twice as many males as females. ROSC of witnessed arrests was found to be 97.7%, and the witnessed arrest group showed better results than the unwitnessed arrest group. Etiology consisted of cardiac (33.5%), non-cardiac (45.7%), trauma (20.1%), and unknown (6.0%) causes; the cardiac group showed the best outcomes. The initial rhythm was Ventricular Tachycardia/pulseless Ventricular Fibrillation in 8 patients (6.0%), asystole in 100 (75.2%), and unknown in 25 (18.8%). The Ventricular Tachycardia/pulseless Ventricular Fibrillation cases showed better results than the other cases. The proportion of cardiogenic causes was 33.5%, only about half that of Western countries, and Ventricular Tachycardia/pulseless Ventricular Fibrillation was relatively rare. These differences were due to the prevalent pattern of out-of-hospital cardiac arrest as well as the immaturity of the EMSS. Bystander CPR was performed on 13 patients (7.52%), and ROSC was achieved in 46.2% of these cases. CPR by EMTs was carried out in 167 cases (96.5%); ACLS by EMTs was rare. From collapse, 4 cases (2.6%) arrived at the ED within 6 minutes, 13 (8.6%) within 10 minutes, and 49 (32.5%) after more than 31 minutes. The sooner the patients arrived, the higher the rates of ROSC and survival to discharge, and the same held for the time from collapse to ROSC. Logistic regression analysis showed that ROSC was strongly influenced by the time from collapse to ED arrival and by Ventricular Tachycardia/pulseless Ventricular Fibrillation. Therefore, ROSC depends not on any single factor but on various intervention factors.
Conclusion: This dissertation presents the following suggestions and directions for future study. First, as the first link in the chain of survival, the EMSS should be activated early by phone as soon as a cardiac arrest is witnessed. Second, the number of emergency medical technicians should be increased through emergency education for the public. Third, an emergency transportation system needs to be established. Fourth, most Koreans have little understanding of EMTs, and the present operating systems have many problems that should be fundamentally changed. Fifth, active medical control over out-of-hospital CPR is required, and proper psychological support should be given not only to patients and their families but also to the individuals engaged in emergency situations. Finally, future studies using nationwide, comprehensive, and standardized forms should examine physiological measures of the human body and the causes and trends of cardiac arrests in order to enhance the survival rate of out-of-hospital cardiac arrests. Korean guidelines for cardiopulmonary resuscitation need to be established.
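
A minimal, hypothetical sketch of the kind of logistic regression analysis reported above, modelling ROSC as a function of out-of-hospital factors such as the collapse-to-ED interval and an initial VT/pulseless VF rhythm. The variable names and the synthetic data are illustrative assumptions, not the study's dataset.

```python
# Hypothetical sketch: logistic regression of ROSC on out-of-hospital factors.
# All data below are synthetic stand-ins, not the study's 174 cases.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 174
collapse_to_ed_min = rng.uniform(4, 40, n)        # minutes from collapse to ED arrival (assumed)
vt_vf = rng.binomial(1, 0.06, n)                  # 1 if initial rhythm was VT/pulseless VF
witnessed = rng.binomial(1, 0.6, n)               # 1 if the arrest was witnessed
true_logit = -1.5 - 0.08 * collapse_to_ed_min + 1.5 * vt_vf + 0.5 * witnessed
rosc = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))   # 1 if circulation returned

X = sm.add_constant(np.column_stack([collapse_to_ed_min, vt_vf, witnessed]))
fit = sm.Logit(rosc, X).fit(disp=False)
print(fit.summary())                              # coefficients and p-values per factor
print("Odds ratios:", np.exp(fit.params))
```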

A Machine Learning-based Total Production Time Prediction Method for Customized-Manufacturing Companies (주문생산 기업을 위한 기계학습 기반 총생산시간 예측 기법)

  • Park, Do-Myung;Choi, HyungRim;Park, Byung-Kwon
    • Journal of Intelligence and Information Systems, v.27 no.1, pp.177-190, 2021
  • With the development of fourth industrial revolution technologies, efforts are being made to use artificial intelligence techniques such as machine learning to improve on areas that humans cannot handle well. On-demand production companies also want to reduce corporate risks such as delivery delays by predicting the total production time of orders, but they have difficulty doing so because the total production time differs for every order. The Theory of Constraints (TOC) was developed to find the least efficient areas in order to increase order throughput and reduce total order cost, but it does not provide a forecast of total production time. Because production varies from order to order according to customer needs, the total production time of an individual order can be measured after the fact but is difficult to predict in advance. The measured total production times of existing orders also differ from one another, so they cannot be used as a standard time. As a result, experienced managers rely on intuition rather than on the system, while inexperienced managers use simple management indicators (e.g., 60 days of total production time for raw materials, 90 days for steel plates, etc.). Work instructions issued too early on the basis of intuition or such indicators cause congestion, which degrades productivity, while instructions issued too late increase production costs or cause missed delivery dates due to emergency processing. Failure to meet a deadline results in compensation for the delay or adversely affects sales and collections. To address these problems, this study seeks a machine learning model that estimates the total production time of new orders for a company operating an order-based production system, using order, production, and process performance data as the training material. We compared and analyzed the OLS, GLM Gamma, Extra Trees, and Random Forest algorithms to find the best algorithm for estimating total production time and present the results.
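
A minimal sketch, under assumed data, of the comparison described above: OLS, a Gamma GLM, Extra Trees, and Random Forest regressors for total production time. The synthetic features stand in for the paper's order, production, and process performance data.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.ensemble import ExtraTreesRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.integers(1, 20, n),                  # assumed: number of processes in the order
    rng.uniform(1, 100, n),                  # assumed: order quantity
    rng.uniform(0.5, 5.0, n),                # assumed: material weight (tons)
])
y = 10 + 3 * X[:, 0] + 0.4 * X[:, 1] + rng.gamma(shape=2.0, scale=5.0, size=n)   # days, positive

# OLS and the tree ensembles via scikit-learn, scored by mean absolute error (days)
for name, model in [("OLS", LinearRegression()),
                    ("Extra Trees", ExtraTreesRegressor(n_estimators=200, random_state=0)),
                    ("Random Forest", RandomForestRegressor(n_estimators=200, random_state=0))]:
    mae = -cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error").mean()
    print(f"{name}: MAE = {mae:.1f} days")

# Gamma GLM with a log link via statsmodels, suited to positive, right-skewed durations
glm = sm.GLM(y, sm.add_constant(X), family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(glm.summary())
```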

Development of Prediction Model for the Na Content of Leaves of Spring Potatoes Using Hyperspectral Imagery (초분광 영상을 이용한 봄감자의 잎 Na 함량 예측 모델 개발)

  • Park, Jun-Woo;Kang, Ye-Seong;Ryu, Chan-Seok;Jang, Si-Hyeong;Kang, Kyung-Suk;Kim, Tae-Yang;Park, Min-Jun;Baek, Hyeon-Chan;Song, Hye-Young;Jun, Sae-Rom;Lee, Su-Hwan
    • Korean Journal of Agricultural and Forest Meteorology, v.23 no.4, pp.316-328, 2021
  • In this study, a leaf Na content prediction model for spring potatoes was established using a 400-1000 nm hyperspectral sensor, as a step toward developing a multispectral sensor for salinity monitoring in reclaimed land. The irrigation conditions were standard, drought, and salinity (2, 4, 8 dS/m), and the irrigation amount was calculated based on the amount of evaporation. Leaf Na content was measured in the first and second weeks after the start of irrigation in the vegetative, tuber formation, and tuber growth periods, respectively. Leaf reflectance was converted from 5 nm to 10 nm, 25 nm, and 50 nm FWHM (full width at half maximum) at 10 nm wavelength intervals. Using the variable importance in projection of partial least squares regression (PLSR-VIP), ten band ratios were selected as variables to predict salinity damage levels from the Na content of spring potato leaves. MLR (multiple linear regression) models were then estimated by removing the band ratios one by one, in order of lowest weight among the ten. The performance of the models was compared not only by R2 and MAPE but also by the number of band ratios and the optimal FWHM, with a view to developing a compact multispectral sensor. Using a 25 nm FWHM was advantageous for predicting leaf Na content of spring potatoes in the first and second weeks of the vegetative and tuber formation periods and in the second week of the tuber growth period. The selected bandpass filters comprised 15 bands, mainly in the red and red-edge regions: 430/440, 490/500, 500/510, 550/560, 570/580, 590/600, 640/650, 650/660, 670/680, 680/690, 690/700, 700/710, 710/720, 720/730, and 730/740 nm.
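
A minimal sketch, on synthetic data, of the band-selection and model-reduction procedure described above: VIP scores are computed from a fitted PLS regression (using the standard VIP formula), the ten band ratios with the highest scores are kept, and MLR models are refit while the lowest-weighted ratio is dropped one at a time. The synthetic ratios and Na values are illustrative assumptions, not the study's measurements.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression

def vip_scores(pls, X):
    """Variable importance in projection (VIP) for a fitted PLSRegression (standard formula)."""
    t = pls.transform(X)                                  # x scores, (n_samples, n_components)
    w = pls.x_weights_                                    # (n_features, n_components)
    q = pls.y_loadings_                                   # (n_targets, n_components)
    p = w.shape[0]
    ss = np.sum(t ** 2, axis=0) * np.sum(q ** 2, axis=0)  # y-variance explained per component
    w_norm2 = (w / np.linalg.norm(w, axis=0)) ** 2
    return np.sqrt(p * (w_norm2 @ ss) / ss.sum())

rng = np.random.default_rng(0)
band_ratios = rng.uniform(0.2, 1.5, size=(60, 30))        # stand-in for candidate band ratios
na_content = band_ratios[:, :3] @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.1, 60)

pls = PLSRegression(n_components=2).fit(band_ratios, na_content)
vip = vip_scores(pls, band_ratios)
top10 = list(np.argsort(vip)[::-1][:10])                  # ten band ratios with the highest VIP

# Backward elimination: drop the remaining ratio with the lowest VIP, refit MLR each time
selected = list(top10)
for weakest in sorted(top10, key=lambda j: vip[j])[:-1]:
    mlr = LinearRegression().fit(band_ratios[:, selected], na_content)
    print(f"{len(selected)} band ratios: R2 = {mlr.score(band_ratios[:, selected], na_content):.3f}")
    selected.remove(weakest)
```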

A Study on the Direction of Planting Renewal in the Green Area of Seoul Children's Grand Park Reflecting Functional Changes (기능변화를 반영한 서울어린이대공원 조성녹지의 식재 리뉴얼 방향성 연구)

  • Park, Jeong-Ah;Han, Bong-Ho;Park, Seok-Cheol
    • Journal of the Korean Institute of Landscape Architecture, v.51 no.3, pp.21-36, 2023
  • As a solution to environmental issues such as climate change response, the carbon neutrality strategy, urban heat islands, fine dust, and biodiversity enhancement, the value of urban green spaces and trees is becoming more important, and various studies on the effects of trees on environmental improvement are being conducted. This study comprehensively considers preceding studies on planting species, planting structure, planting density, and planting base in order to propose a direction for the planting renewal of green areas in urban parks, and applies the findings to a renewal plan to improve the urban environment through landscape trees. A field survey was conducted on the planting status of Seoul Children's Grand Park, a large-scale neighborhood park in Seoul, and based on the survey data a planting function evaluation was carried out and areas needing improvement in planting function were identified. The planting function evaluation considered the park function setting, the planting concept according to spatial function, and the planting status. As a result of the study, a direction for planting renewal according to functional change was derived for each stage of the planting function evaluation. Increasing the green area ratio is a priority in setting up park functions, but user convenience should also be considered. As planting concepts, visual landscape planting involves species with beautiful tree shapes, high carbon absorption, and fine dust reduction effects; ecological landscape planting should create a multi-layered planting site on slopes; buffer planting should be created as multi-layered forests to improve carbon absorption and fine dust reduction; and green planting should consist of broad-leaved trees and herbaceous layers and aim for the natural planting of herbaceous species. For plant species, species with high urban environment improvement effects, local native species, and species preferred by wild birds should be selected. As for planting structure, landscape planting sites and green planting sites should be composed of tree and herbaceous layers, while sites that emphasize ecology or require a multi-layered buffer function should be composed of trees, shrubs, and herbaceous layers. For planting density, a higher standard based on planting interval is applied. Installing a rainwater recycling facility and using loam soil for the planting base improve performance. The results of this study are meaningful in that they can be applied to derive areas needing functional improvement by performing planting function evaluation when planning the planting renewal of aging urban parks, and can suggest renewal directions that reflect the paradigm of functional change of created green areas.

Vitamin D analysis in the Korean total diet study and UV/sun light irradiated mushrooms (한국형 총식이조사 및 UV/태양광 조사 버섯에서의 비타민 D 분석)

  • Min-Jeong Seo;In-Hwa Roh;Jee-Yeon Lee;Sung-Ok Kwon;Cho-Il Kim;Gae-Ho Lee
    • Food Science and Preservation, v.30 no.1, pp.109-121, 2023
  • This study was conducted to evaluate the vitamin D intake of Koreans in a total diet study (TDS) and to determine the effect of irradiation on vitamin D synthesis in mushrooms. For analysis, samples were saponified and extracted with hexane, and vitamin D was quantified by liquid chromatography-tandem mass spectrometry (LC-MS/MS). In the validation, the recovery of the National Institute of Standards and Technology (NIST) standard reference material (SRM) 1849a was 96.7%, and a z-score of -1.6 was obtained in the Food Analysis Performance Assessment Scheme (FAPAS) proficiency test (PT) 21115. Vitamin D2 was not detected in any samples, and the highest levels of vitamin D3 were detected in mackerel and anchovies, ranging from 24.2 to 120.2 ㎍/kg. The mean daily intake of vitamin D was 0.99 ㎍/day, as estimated from the vitamin D contents of the analyzed foods and their corresponding intakes. The adequate intake (AI) of vitamin D based on the Dietary Reference Intakes for Koreans provided by the Ministry of Health and Welfare is 5-15 ㎍/day for Koreans aged 6 to 75 years. Compared with this AI, the vitamin D intake of Koreans estimated in this study was inadequate. Accordingly, the increased vitamin D content in ultraviolet (UV)/sunlight-irradiated mushrooms warrants further research as a way to increase Koreans' dietary vitamin D intake.
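
A minimal sketch of the intake arithmetic used in a total diet study: the estimated mean daily intake is the sum over foods of the analyzed vitamin D content (㎍/kg) times the mean daily consumption of that food. The food items and numbers below are illustrative assumptions, not the study's data.

```python
# Illustrative TDS-style intake estimate; contents in ug/kg, consumption in g/day.
foods = {
    # food: (vitamin D3 content, mean daily consumption)
    "mackerel": (120.2, 5.0),
    "anchovy": (24.2, 2.0),
    "egg": (10.0, 30.0),
}

intake_ug_per_day = sum(content * grams / 1000.0 for content, grams in foods.values())
print(f"Estimated mean daily vitamin D intake: {intake_ug_per_day:.2f} ug/day")

AI_RANGE = (5.0, 15.0)  # adequate intake for Koreans aged 6-75, ug/day
print("Below the adequate intake" if intake_ug_per_day < AI_RANGE[0] else "Within or above the AI")
```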

Discovering Promising Convergence Technologies Using Network Analysis of Maturity and Dependency of Technology (기술 성숙도 및 의존도의 네트워크 분석을 통한 유망 융합 기술 발굴 방법론)

  • Choi, Hochang;Kwahk, Kee-Young;Kim, Namgyu
    • Journal of Intelligence and Information Systems, v.24 no.1, pp.101-124, 2018
  • Recently, most technologies have developed in various forms, through the advancement of a single technology or through interaction with other technologies. In particular, these technologies show the characteristic of convergence, arising from the interaction between two or more technologies. In addition, efforts to respond to technological change in advance, by forecasting the promising convergence technologies that will emerge in the near future, are continuously increasing. Accordingly, many researchers are attempting various analyses to forecast promising convergence technologies. A convergence technology carries the characteristics of the various technologies from which it was generated; therefore, forecasting promising convergence technologies is much more difficult than forecasting general technologies with high growth potential. Nevertheless, some progress has been made in forecasting promising technologies using big data analysis and social network analysis. Data-driven studies of convergence technology are actively conducted with the themes of discovering new convergence technologies and analyzing their trends, and information about new convergence technologies is therefore provided more abundantly than in the past. However, existing methods for analyzing convergence technology have several limitations. Firstly, most studies of convergence technology analyze data through predefined technology classifications. Recent technologies tend to have convergence characteristics and thus consist of technologies from various fields; in other words, a new convergence technology may not belong to any defined class, so the existing methods do not properly reflect the dynamic change of the convergence phenomenon. Secondly, to forecast promising convergence technologies, most existing analysis methods use general-purpose indicators, which do not fully exploit the specificity of the convergence phenomenon. A new convergence technology is highly dependent on the existing technologies from which it originates; depending on how those technologies change, it can grow into an independent field or disappear rapidly. In existing analyses, the growth potential of a convergence technology is judged through traditional general-purpose indicators, but these indicators do not reflect the principle of convergence, namely that new technologies emerge from two or more mature technologies and that grown technologies in turn affect the creation of further technologies. Thirdly, previous studies do not provide objective methods for evaluating the accuracy of models that forecast promising convergence technologies. Because of the complexity of the field, relatively little work has addressed forecasting promising convergence technologies, so it is difficult to find a method for evaluating the accuracy of such models. To activate this research field, it is important to establish a method for objectively verifying and evaluating the accuracy of the model proposed by each study.
To overcome these limitations, we propose a new method for analyzing convergence technologies. First, through topic modeling, we derive a new technology classification in terms of text content, which reflects the dynamic change of the actual technology market rather than an existing fixed classification standard. We then identify the influence relationships between technologies through the topic correspondence weights of each document and structure them into a network. In addition, we devise a centrality indicator, potential growth centrality (PGC), to forecast the future growth of each technology from its centrality information; it reflects the convergence characteristics of each technology according to technology maturity and the interdependence between technologies. Along with this, we propose a method to evaluate the accuracy of the forecasting model by measuring the growth rate of promising technologies, based on the variation of potential growth centrality by period. In this paper, we conduct experiments with 13,477 patent documents to evaluate the performance and practical applicability of the proposed method. As a result, we confirm that the forecasting model based on the proposed centrality indicator achieves a forecast accuracy up to about 2.88 times higher than that of forecasting models based on currently used network indicators.
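
A hedged sketch of the pipeline described above: topics are derived from patent text with LDA, topics are linked by how strongly they co-occur within documents (their topic correspondence weights), and each topic is ranked by a network centrality. The paper's own potential growth centrality (PGC) is not defined in this abstract, so weighted degree centrality is used here purely as a stand-in, and the tiny corpus is illustrative.

```python
import networkx as nx
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "battery electrode coating process",
    "wireless battery charging circuit",
    "image sensor pixel readout circuit",
    "optical coating for sensor film",
]  # toy stand-in for the 13,477 patent documents

tf = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(tf)
doc_topic = lda.transform(tf)                 # topic correspondence weights per document

# Edge weight between two topics: summed co-presence of the pair across documents
G = nx.Graph()
n_topics = doc_topic.shape[1]
for i in range(n_topics):
    for j in range(i + 1, n_topics):
        G.add_edge(i, j, weight=float(np.sum(doc_topic[:, i] * doc_topic[:, j])))

centrality = dict(G.degree(weight="weight"))  # stand-in for the paper's PGC indicator
print(sorted(centrality.items(), key=lambda kv: -kv[1]))
```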

A study on the prediction of korean NPL market return (한국 NPL시장 수익률 예측에 관한 연구)

  • Lee, Hyeon Su;Jeong, Seung Hwan;Oh, Kyong Joo
    • Journal of Intelligence and Information Systems, v.25 no.2, pp.123-139, 2019
  • The Korean NPL market was formed by the government and foreign capital shortly after the 1997 IMF crisis. However, the market's history is short, and bad debt began to increase after the global financial crisis in 2009 due to the recession in the real economy. NPL has become a major investment vehicle in recent years as investment capital from the domestic capital market began to enter the NPL market in earnest. Although the domestic NPL market has received considerable attention due to its recent overheating, research on the NPL market remains scarce because the history of capital market investment in the domestic NPL market is short. In addition, declining profitability and price fluctuations driven by the real estate cycle call for decision-making based on more scientific and systematic analysis. In this study, we propose a prediction model that determines whether the benchmark yield will be achieved, using NPL-market-related data, in response to this market demand. To build the model, we used Korean NPL data covering about 4 years, from December 2013 to December 2017, comprising a total of 2,291 property records. As independent variables, only the variables related to the dependent variable were selected from the 11 variables that describe the characteristics of the real estate. To select the variables, one-to-one t-tests, stepwise logistic regression, and decision trees were used, and seven independent variables were chosen: purchase year, SPC (Special Purpose Company), municipality, appraisal value, purchase cost, OPB (Outstanding Principal Balance), and HP (Holding Period). The dependent variable is a binary variable indicating whether the benchmark rate of return is reached. This is because a model predicting a binary variable is more accurate than one predicting a continuous variable, and this accuracy is directly related to the effectiveness of the model. In addition, for a special purpose company the main concern is whether or not to purchase a property, so knowing whether a certain level of return will be achieved is enough to make a decision. To ascertain whether 12%, the standard rate of return used in the industry, is a meaningful reference value, we constructed and compared predictive models while varying the threshold used to compute the dependent variable. As a result, the average hit ratio of the predictive model built with the dependent variable calculated at the 12% standard rate of return was the best, at 64.60%. To propose an optimal prediction model based on the chosen dependent variable and the 7 independent variables, we built prediction models using five methodologies, namely discriminant analysis, logistic regression, decision tree, artificial neural network, and a genetic algorithm linear model, and compared them. To do this, 10 pairs of training and testing sets were extracted using the 10-fold validation method. After building the models on these data, the hit ratio of each set was averaged and the performance was compared. As a result, the average hit ratios of the prediction models constructed using discriminant analysis, logistic regression, decision tree, artificial neural network, and the genetic algorithm linear model were 64.40%, 65.12%, 63.54%, 67.40%, and 60.51%, respectively.
The artificial neural network model was confirmed to be the best. This study shows that using the 7 independent variables with an artificial neural network prediction model is effective in the NPL market. The proposed model predicts in advance whether a new property will achieve the 12% return, which will help special purpose companies make investment decisions. Furthermore, we anticipate that the NPL market will become more liquid as transactions proceed at appropriate prices.
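
A hedged sketch of the model comparison described above: classifiers predicting whether a property reaches the 12% benchmark return, compared by the average hit ratio (accuracy) over 10-fold cross-validation. The synthetic features stand in for the seven selected variables, and the genetic algorithm linear model from the paper is omitted here.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(2291, 7))                 # stand-in for the 7 selected variables
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=2291) > 0).astype(int)   # 1 = reaches 12%

models = {
    "Discriminant analysis": LinearDiscriminantAnalysis(),
    "Logistic regression": LogisticRegression(max_iter=1000),
    "Decision tree": DecisionTreeClassifier(max_depth=5, random_state=0),
    "Artificial neural network": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
}
for name, model in models.items():
    hit_ratio = cross_val_score(model, X, y, cv=10, scoring="accuracy").mean()
    print(f"{name}: average hit ratio = {hit_ratio:.2%}")
```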

A Deep Learning Based Approach to Recognizing Accompanying Status of Smartphone Users Using Multimodal Data (스마트폰 다종 데이터를 활용한 딥러닝 기반의 사용자 동행 상태 인식)

  • Kim, Kilho;Choi, Sangwoo;Chae, Moon-jung;Park, Heewoong;Lee, Jaehong;Park, Jonghun
    • Journal of Intelligence and Information Systems, v.25 no.1, pp.163-177, 2019
  • As smartphones become more widely used, human activity recognition (HAR) tasks that recognize the personal activities of smartphone users from multimodal data have been actively studied. The research area is expanding from recognizing the simple body movements of an individual user to recognizing low-level and high-level behavior. However, HAR tasks that recognize interaction behavior with other people, such as whether the user is accompanying or communicating with someone else, have received less attention so far. Previous research on recognizing interaction behavior has usually depended on audio, Bluetooth, and Wi-Fi sensors, which are vulnerable to privacy issues and require much time to collect enough data, whereas physical sensors such as the accelerometer, magnetic field sensor, and gyroscope are less vulnerable to privacy issues and can collect a large amount of data within a short time. In this paper, a deep-learning-based method for detecting accompanying status using only multimodal physical sensor data (accelerometer, magnetic field, and gyroscope) is proposed. Accompanying status was defined by redefining part of the user's interaction behavior: whether the user is accompanying an acquaintance at close distance and whether the user is actively communicating with that acquaintance. A framework based on convolutional neural networks (CNN) and long short-term memory (LSTM) recurrent networks for classifying accompanying and conversation was proposed. First, a data preprocessing method consisting of time synchronization of the multimodal data from the different physical sensors, data normalization, and sequence data generation was introduced. Nearest-neighbor interpolation was applied to synchronize the timestamps of data collected from different sensors. Normalization was performed for each x, y, and z axis value of the sensor data, and the sequence data were generated with the sliding window method. The sequence data then became the input of the CNN, which extracts feature maps representing local dependencies of the original sequence. The CNN consisted of 3 convolutional layers and had no pooling layer, in order to maintain the temporal information of the sequence data. Next, LSTM recurrent networks received the feature maps, learned long-term dependencies from them, and extracted features. The LSTM recurrent networks consisted of two layers, each with 128 cells. Finally, the extracted features were used for classification by a softmax classifier. The loss function of the model was cross entropy, and the weights of the model were randomly initialized from a normal distribution with a mean of 0 and a standard deviation of 0.1. The model was trained using the adaptive moment estimation (ADAM) optimization algorithm with a mini-batch size of 128. Dropout was applied to the input values of the LSTM recurrent networks to prevent overfitting. The initial learning rate was set to 0.001 and decreased exponentially by a factor of 0.99 at the end of each training epoch. An Android smartphone application was developed and released to collect data, and smartphone data were collected from a total of 18 subjects. Using these data, the model classified accompanying and conversation with 98.74% and 98.83% accuracy, respectively. Both the F1 score and the accuracy of the model were higher than those of a majority vote classifier, a support vector machine, and a deep recurrent neural network.
In future research, we will focus on more rigorous multimodal sensor data synchronization methods that minimize timestamp differences. In addition, we will further study transfer learning methods that enable models trained on the training data to transfer to evaluation data that follows a different distribution. We expect to obtain a model whose recognition performance is robust to changes in data that were not considered in the model training stage.
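
A hedged sketch of the CNN-LSTM classifier described above: three 1-D convolutional layers with no pooling, dropout applied to the LSTM inputs, two 128-cell LSTM layers, and a softmax output, trained with cross entropy and ADAM (initial learning rate 0.001, decayed by 0.99 per epoch, mini-batch size 128, weights initialized from N(0, 0.1)). The window length, channel count, filter sizes, and dropout rate are assumptions not stated in the abstract.

```python
from tensorflow import keras
from tensorflow.keras import layers

WINDOW, CHANNELS, NUM_CLASSES = 128, 9, 2     # assumed: 3 sensors x 3 axes per time step
init = keras.initializers.RandomNormal(mean=0.0, stddev=0.1)   # N(0, 0.1) weight init

model = keras.Sequential([
    keras.Input(shape=(WINDOW, CHANNELS)),
    layers.Conv1D(64, 5, padding="same", activation="relu", kernel_initializer=init),
    layers.Conv1D(64, 5, padding="same", activation="relu", kernel_initializer=init),
    layers.Conv1D(64, 5, padding="same", activation="relu", kernel_initializer=init),  # no pooling
    layers.Dropout(0.5),                      # dropout on the LSTM input values (rate assumed)
    layers.LSTM(128, return_sequences=True),
    layers.LSTM(128),
    layers.Dense(NUM_CLASSES, activation="softmax", kernel_initializer=init),
])

model.compile(optimizer=keras.optimizers.Adam(learning_rate=0.001),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Learning rate decays by a factor of 0.99 at the end of every epoch; mini-batch size 128.
decay = keras.callbacks.LearningRateScheduler(lambda epoch, lr: 0.001 * (0.99 ** epoch))
# model.fit(x_windows, y_labels, batch_size=128, epochs=50, callbacks=[decay])
```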

DEVELOPMENT OF STATEWIDE TRUCK TRAFFIC FORECASTING METHOD BY USING LIMITED O-D SURVEY DATA (한정된 O-D조사자료를 이용한 주 전체의 트럭교통예측방법 개발)

  • 박만배
    • Proceedings of the KOR-KST Conference, 1995.02a, pp.101-113, 1995
  • The objective of this research is to test the feasibility of developing a statewide truck traffic forecasting methodology for Wisconsin by using Origin-Destination surveys, traffic counts, classification counts, and other data that are routinely collected by the Wisconsin Department of Transportation (WisDOT). Development of a feasible model will permit estimation of future truck traffic for every major link in the network. This will provide the basis for improved estimation of future pavement deterioration. Pavement damage rises exponentially as axle weight increases, and trucks are responsible for most of the traffic-induced damage to pavement. Consequently, forecasts of truck traffic are critical to pavement management systems. The Pavement Management Decision Supporting System (PMDSS) prepared by WisDOT in May 1990 combines pavement inventory and performance data with a knowledge base consisting of rules for evaluation, problem identification and rehabilitation recommendation. Without a reasonable truck traffic forecasting methodology, PMDSS is not able to project pavement performance trends in order to make assessments and recommendations in future years. However, none of WisDOT's existing forecasting methodologies has been designed specifically for predicting truck movements on a statewide highway network. For this research, the Origin-Destination survey data available from WisDOT, including two stateline areas, one county, and five cities, are analyzed and the zone-to-zone truck trip tables are developed. The resulting Origin-Destination Trip Length Frequency (OD TLF) distributions by trip type are applied to the Gravity Model (GM) for comparison with comparable TLFs from the GM. The gravity model is calibrated to obtain friction factor curves for the three trip types: Internal-Internal (I-I), Internal-External (I-E), and External-External (E-E). Both "macro-scale" calibration and "micro-scale" calibration are performed. The comparison of the statewide GM TLF with the OD TLF for the macro-scale calibration does not provide suitable results because the available OD survey data do not represent an unbiased sample of statewide truck trips. For the "micro-scale" calibration, "partial" GM trip tables that correspond to the OD survey trip tables are extracted from the full statewide GM trip table. These "partial" GM trip tables are then merged and a partial GM TLF is created. The GM friction factor curves are adjusted until the partial GM TLF matches the OD TLF. Three friction factor curves, one for each trip type, resulting from the micro-scale calibration produce a reasonable GM truck trip model. A key methodological issue for GM calibration involves the use of multiple friction factor curves versus a single friction factor curve for each trip type in order to estimate truck trips with reasonable accuracy. A single friction factor curve for each of the three trip types was found to reproduce the OD TLFs from the calibration data base. Given the very limited trip generation data available for this research, additional refinement of the gravity model using multiple friction factor curves for each trip type was not warranted. In traditional urban transportation planning studies, the zonal trip productions and attractions and region-wide OD TLFs are available. However, for this research, the information available for the development of the GM model is limited to Ground Counts (GC) and a limited set of OD TLFs.
The GM is calibrated using the limited OD data, but the OD data are not adequate to obtain good estimates of truck trip productions and attractions. Consequently, zonal productions and attractions are estimated using zonal population as a first approximation. Then, Selected Link based (SELINK) analyses are used to adjust the productions and attractions and possibly recalibrate the GM. The SELINK adjustment process involves identifying the origins and destinations of all truck trips that are assigned to a specified "selected link" as the result of a standard traffic assignment. A link adjustment factor is computed as the ratio of the actual volume for the link (ground count) to the total assigned volume. This link adjustment factor is then applied to all of the origin and destination zones of the trips using that "selected link". Selected link based analyses are conducted by using both 16 selected links and 32 selected links. The SELINK analysis using 32 selected links provides the least %RMSE in the screenline volume analysis. In addition, the stability of the GM truck estimating model is preserved by using 32 selected links with three SELINK adjustments; that is, the GM remains calibrated despite substantial changes in the input productions and attractions. The coverage of zones provided by 32 selected links is satisfactory. Increasing the number of repetitions beyond four is not reasonable because the stability of the GM model in reproducing the OD TLF reaches its limits. The total volume of truck traffic captured by 32 selected links is 107% of total trip productions. But more importantly, SELINK adjustment factors for all of the zones can be computed. Evaluation of the travel demand model resulting from the SELINK adjustments is conducted by using screenline volume analysis, functional class and route specific volume analysis, area specific volume analysis, production and attraction analysis, and Vehicle Miles of Travel (VMT) analysis. Screenline volume analysis using four screenlines with 28 check points is used for evaluation of the adequacy of the overall model. The total trucks crossing the screenlines are compared to the ground count totals. LV/GC ratios of 0.958 by using 32 selected links and 1.001 by using 16 selected links are obtained. The %RMSE for the four screenlines is inversely proportional to the average ground count totals by screenline. The magnitude of %RMSE for the four screenlines resulting from the fourth and last GM run by using 32 and 16 selected links is 22% and 31% respectively. These results are similar to the overall %RMSE achieved for the 32 and 16 selected links themselves of 19% and 33% respectively. This implies that the SELINK analysis results are reasonable for all sections of the state. Functional class and route specific volume analysis is possible by using the available 154 classification count check points. The truck traffic crossing the Interstate highways (ISH) with 37 check points, the US highways (USH) with 50 check points, and the State highways (STH) with 67 check points is compared to the actual ground count totals. The magnitude of the overall link volume to ground count ratio by route does not provide any specific pattern of over- or underestimation. However, the %RMSE for the ISH shows the least value while that for the STH shows the largest value. This pattern is consistent with the screenline analysis and the overall relationship between %RMSE and ground count volume groups.
Area specific volume analysis provides another broad statewide measure of the performance of the overall model. The truck traffic in the North area with 26 check points, the West area with 36 check points, the East area with 29 check points, and the South area with 64 check points is compared to the actual ground count totals. The four areas show similar results. No specific patterns in the LV/GC ratio by area are found. In addition, the %RMSE is computed for each of the four areas. The %RMSEs for the North, West, East, and South areas are 92%, 49%, 27%, and 35% respectively, whereas the average ground counts are 481, 1383, 1532, and 3154 respectively. As for the screenline and volume range analyses, the %RMSE is inversely related to average link volume. The SELINK adjustments of productions and attractions resulted in a very substantial reduction in the total in-state zonal productions and attractions. The initial in-state zonal trip generation model can now be revised with a new trip production rate (total adjusted productions/total population) and a new trip attraction rate. Revised zonal production and attraction adjustment factors can then be developed that only reflect the impact of the SELINK adjustments that cause increases or decreases from the revised zonal estimates of productions and attractions. Analysis of the revised production adjustment factors is conducted by plotting the factors on the state map. The east area of the state, including the counties of Brown, Outagamie, Shawano, Winnebago, Fond du Lac, and Marathon, shows comparatively large values of the revised adjustment factors. Overall, both small and large values of the revised adjustment factors are scattered around Wisconsin. This suggests that more independent variables beyond just population are needed for the development of the heavy truck trip generation model. More independent variables, including zonal employment data (office employees and manufacturing employees) by industry type, zonal private trucks owned, and zonal income data, which are not available currently, should be considered. A plot of the frequency distribution of the in-state zones as a function of the revised production and attraction adjustment factors shows the overall adjustment resulting from the SELINK analysis process. Overall, the revised SELINK adjustments show that the productions for many zones are reduced by a factor of 0.5 to 0.8, while the productions for a relatively few zones are increased by factors from 1.1 to 4, with most of the factors in the 3.0 range. No obvious explanation for the frequency distribution could be found. The revised SELINK adjustments overall appear to be reasonable. The heavy truck VMT analysis is conducted by comparing the 1990 heavy truck VMT forecasted by the GM truck forecasting model, 2.975 billion, with the WisDOT computed data. This gives an estimate that is 18.3% less than the WisDOT computation of 3.642 billion VMT. The WisDOT estimates are based on sampling the link volumes for USH, STH, and CTH, which implies potential error in sampling the average link volume. The WisDOT estimate of heavy truck VMT cannot be tabulated by the three trip types, I-I, I-E (E-I), and E-E. In contrast, the GM forecasting model shows that the proportion of E-E VMT out of total VMT is 21.24%. In addition, tabulation of heavy truck VMT by route functional class shows that the proportion of truck traffic traversing the freeways and expressways is 76.5%.
Only 14.1% of total freeway truck traffic is I-I trips, while 80% of total collector truck traffic is I-I trips. This implies that freeways are traversed mainly by I-E and E-E truck traffic while collectors are used mainly by I-I truck traffic. Other tabulations, such as average heavy truck speed by trip type, average travel distance by trip type, and the VMT distribution by trip type, route functional class, and travel speed, are useful information for highway planners to understand the characteristics of statewide heavy truck trip patterns. Heavy truck volumes for the target year 2010 are forecasted by using the GM truck forecasting model. Four scenarios are used. For better forecasting, ground count-based segment adjustment factors are developed and applied. ISH 90 & 94 and USH 41 are used as example routes. The forecasting results obtained by using the ground count-based segment adjustment factors are satisfactory for long range planning purposes, but additional ground counts would be useful for USH 41. Sensitivity analysis provides estimates of the impacts of the alternative growth rates, including information about changes in the trip types using key routes. The network-based GM can easily model scenarios with different rates of growth in rural versus urban areas, small versus large cities, and in-state zones versus external stations.
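
A minimal sketch of the SELINK adjustment step described above: for a selected link, the adjustment factor is the ground count divided by the total assigned volume, and it is applied to the productions and attractions of every zone whose trips use that link. The data structures and numbers below are illustrative assumptions.

```python
# Assigned truck trips that use the selected link, keyed by (origin_zone, destination_zone)
selected_link_trips = {(1, 4): 120.0, (2, 4): 80.0, (3, 5): 50.0}
ground_count = 300.0                              # observed truck volume on the link

assigned_volume = sum(selected_link_trips.values())
factor = ground_count / assigned_volume           # link adjustment factor (1.2 here)

productions = {1: 500.0, 2: 400.0, 3: 350.0}      # zonal productions (population-based first guess)
attractions = {4: 700.0, 5: 450.0}                # zonal attractions

# Apply the factor once to every origin and destination zone of trips on this link
for zone in {o for o, _ in selected_link_trips}:
    productions[zone] *= factor
for zone in {d for _, d in selected_link_trips}:
    attractions[zone] *= factor

print(round(factor, 3), productions, attractions)
```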

A Study on The Enhancement of Aviation Safety in Airport Planning & Construction from a Legal Perspective (공항개발계획과 사업에서의 항공안전성 제고에 대한 법률적 소고)

  • Kim, Tae-Han
    • The Korean Journal of Air & Space Law and Policy, v.27 no.2, pp.67-106, 2012
  • Today, air traffic at airports has become complicated, with a significant increase in the volume of air transport, and aviation accidents occur constantly. We should therefore recognize anew the importance of air traffic safety, the core value of air traffic. Nothing matters more than the location of the airport, the basic infrastructure of air traffic, and securing the safety of its facilities and equipment. From this standpoint, I analyze the step-by-step safety factors that are taken into account in airport development projects, from the construction or improvement of an airport within the current laws and institutions, and give my opinion on enhancing safety in the design and construction of airports. The safety of air traffic, as well as of the airport itself, depends on the location, development, design, construction, inspection, and management of the airport and its facilities, because the state must fulfill its responsibility to protect the general public from the risks of large social overhead capital in modern society through legislation on the involvement of specialists and on locational criteria for aviation safety from the planning stage of airport development. In addition, well-defined installation standards for airports and air navigation facilities, the key points of the airport development phase, can ensure the safety of the airport and airport facilities. Of course, the installation standards for airports and air navigation facilities are based on global standards owing to the nature of air traffic. However, to prevent confusion over the safety standards in their design, construction, and inspection, and to ensure aviation safety, the safety standards must be further detailed in the course of domestic legislation. The installation criteria for air navigation facilities are regulated in the most detail. However, to ensure the safe operation of air navigation facilities, the system for proving that their performance is suitable for safety must change from a voluntary requirement to a mandatory one and must be applied to foreign producers as well as domestic producers. Of course, pilot negligence and defective aircraft maintenance account for a large portion of aviation accidents. However, I think that air traffic accidents can be reduced if the airport and its facilities are sound enough to ensure safety. Therefore, legal and institutional supplements that prioritize aviation safety from the airport development stage may be necessary.
