• Title/Summary/Keyword: two-time-scale system

Search Results: 439

A Comparative Efficacy of Propacetamol and Ketorolac in Postoperative Patient Controlled Analgesia

  • Heo, Bong Ha;Park, Ji Hun;Choi, Jung Il;Kim, Woong Mo;Lee, Hyoung Gon;Cho, Soo Young;Yoon, Myoung Ha
    • The Korean Journal of Pain, v.28 no.3, pp.203-209, 2015
  • Background: Ketorolac has been used as a postoperative analgesic in combination with opioids. However, ketorolac may produce serious side effects in vulnerable patients. Propacetamol is known to induce fewer side effects than ketorolac because it acts mainly on the central nervous system. We compared the analgesic effects and patient satisfaction levels of each drug when combined with fentanyl patient-controlled analgesia (PCA). Methods: The patients were divided into two groups, each with n = 46. The patients in each group were given 60 mg of ketorolac or 2 g of propacetamol (mixed with fentanyl) over 10 minutes. The patients were then given 180 mg of ketorolac or 8 g of propacetamol (mixed with fentanyl and ramosetron) through PCA. We assessed the visual analogue pain scale (VAS) score immediately before administration (baseline) and at 15, 30, and 60 minutes and 24 hours after administration. The side effects of each regimen and each patient's degree of satisfaction were also assessed. Results: There was a significant decline in the VAS score in both groups (P < 0.05). However, there were no significant differences in the VAS scores between the groups at each time point. Satisfaction scores between the groups showed no significant difference. Conclusions: The efficacy of propacetamol is comparable to that of ketorolac in postoperative PCA with fentanyl.

Verification and validation of isotope inventory prediction for back-end cycle management using two-step method

  • Jang, Jaerim;Ebiwonjumi, Bamidele;Kim, Wonkyeong;Cherezov, Alexey;Park, Jinsu;Lee, Deokjung
    • Nuclear Engineering and Technology, v.53 no.7, pp.2104-2125, 2021
  • This paper presents the verification and validation (V&V) of a calculation module for isotope inventory prediction for back-end cycle management of spent nuclear fuel (SNF). The calculation method presented herein was implemented in a two-step code system comprising the lattice code STREAM and the nodal diffusion code RAST-K. STREAM generates cross sections and provides the number density information using branch/history depletion calculations, whereas RAST-K supplies the power history and three history indices (boron concentration, moderator temperature, and fuel temperature). As its primary feature, this method can directly consider three-dimensional core simulation conditions using history indices of the operating conditions. Therefore, this method reduces the computation time by avoiding a recalculation of the fuel depletion. The isotope inventory module calculates the number densities using the Lagrange interpolation method and power history correction factors, which are applied to correct the effects of the decay and fission products generated at different power levels. To assess the reliability of the developed code system for back-end cycle analysis, a validation study was performed with 58 measured samples of pressurized water reactor (PWR) SNF, and a code-to-code comparison was conducted with STREAM-SNF, HELIOS-1.6, and SCALE 5.1. The V&V results showed that the developed code system can provide reasonable results with comparable confidence intervals. This paper thus demonstrates that the isotope inventory prediction code system can be used for spent nuclear fuel analysis.
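The Lagrange interpolation step used by the isotope inventory module can be sketched as follows. The burnup grid, number densities, and correction factor below are invented toy values for illustration, not STREAM/RAST-K data.

```python
def lagrange_interpolate(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# Toy example: number density of one nuclide tabulated at burnup points (GWd/tU).
burnup = [0.0, 10.0, 20.0, 30.0]
density = [1.00, 0.82, 0.67, 0.55]   # invented values, arbitrary units

# Interpolate at an off-grid burnup, then apply a (hypothetical) power
# history correction factor, as the abstract describes.
nd = lagrange_interpolate(burnup, density, 15.0)
corrected = nd * 0.98   # hypothetical correction factor
```

The interpolant reproduces the tabulated values exactly at the grid points, which is a quick sanity check for this kind of module.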

Instrumentation and system identification of a typical school building in Istanbul

  • Bakir, Pelin Gundes
    • Structural Engineering and Mechanics, v.43 no.2, pp.179-197, 2012
  • This study presents the findings of the structural health monitoring and real-time system identification of one of the first large-scale building instrumentations for earthquake safety in Turkey. Within this context, a thorough review of the steps in instrumentation and monitoring is presented, and a seismic performance evaluation of the structure using both nonlinear pushover and nonlinear dynamic time history analysis is carried out. The sensor locations are determined using the optimal sensor placement techniques used by NASA for on-orbit modal identification of large space structures. System identification is carried out via the stochastic subspace technique. The results of the study show that under ambient vibrations, stocky buildings can be substantially stiffer than predicted by finite element models due to the presence of a large number of partitioning walls. However, in a severe earthquake it is not safe to rely on this resistance, because once the partitioning walls crack, the bare frame alone contributes to the lateral stiffness of the building. Consequently, the periods obtained from system identification will be closer to those obtained from the FE analysis. A technique to check the validity of the proportional damping assumption is employed, based on the presence of a phase difference between the displacements of different stories obtained from band-pass-filtered records; it is confirmed that the proportional damping assumption is valid for this structure. Two different techniques are implemented for identifying the influence of soil-structure interaction. The first uses the transfer function between the roof and the basement in both directions. The second applies a pre-whitening filter to the data from both the basement and the roof, and then computes the impulse response function from the scaled cross-correlation between the input and the output. The overall results show that the structure will satisfy the life safety performance level in a future earthquake, but some soil-structure interaction effects should be expected in the north-south direction.
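The proportional-damping check described above rests on a simple idea: under classical damping, different stories vibrate in phase at a given mode. A minimal sketch of the phase comparison, using a direct DFT on synthetic in-phase story records (invented signals, not the instrumented building's data):

```python
import cmath
import math

def phase_at_freq(signal, k):
    """Phase (radians) of the k-th DFT bin of a real-valued signal."""
    n = len(signal)
    coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
    return cmath.phase(coeff)

# Synthetic story displacements sharing one dominant frequency and zero
# phase lag, as classical (proportional) damping would imply.
n, k = 256, 8   # number of samples, dominant DFT bin (both invented)
roof  = [1.0 * math.sin(2 * math.pi * k * t / n) for t in range(n)]
story = [0.6 * math.sin(2 * math.pi * k * t / n) for t in range(n)]

dphi = phase_at_freq(roof, k) - phase_at_freq(story, k)
# A phase difference near zero is consistent with proportional damping;
# a substantial lag would argue for non-classical damping.
```

In practice the records would first be band-pass filtered around the mode of interest, as the abstract notes; the DFT-bin comparison above is the core of the check.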

A Study on the Architecture Design of Road and Facility Operation Management System for 3D Spatial Data Processing (3차원 공간데이터 처리를 위한 차로 및 시설물 운영 관리 시스템 아키텍처 설계 연구)

  • KIM, Duck-Ho;KIM, Sung-Jin;LEE, Jung-Uck
    • Journal of the Korean Association of Geographic Information Studies, v.24 no.4, pp.136-147, 2021
  • Autonomous driving technologies are advancing step by step with the level of driving automation, and the operational management technology for the roads on which autonomous vehicles travel must develop in step with them. However, road operation is currently managed using only two-dimensional information, which limits the systematic operation, management, and maintenance of lane and facility information. This study proposed an operation management system architecture capable of 3D spatial-information-based operation management, by designing a convergence database that can process real-time big data together with high-definition road map data. When a high-definition road map based operation management system is used for lane and facility maintenance in the future, it becomes possible to visualize and manage facilities, let multiple users edit and analyze data, link various GIS software, and efficiently process large-scale real-time data.

Direct Pars Repair Surgery Using Two Different Surgical Methods : Pedicle Screw with Universal Hook System and Direct Pars Screw Fixation in Symptomatic Lumbar Spondylolysis Patients

  • Shin, Myung-Hoon;Ryu, Kyeong-Sik;Rathi, Nitesh Kumar;Park, Chun-Kun
    • Journal of Korean Neurosurgical Society, v.51 no.1, pp.14-19, 2012
  • Objective : The authors performed a retrospective study to assess the clinical and radiological outcomes in symptomatic lumbar spondylolysis patients who underwent direct pars repair surgery using one of two surgical methods, pedicle screw with universal hook system (PSUH) or direct pars screw fixation (DPSF), and compared the results between the two groups. Methods : Forty-seven consecutive patients (PSUH: 23, DPSF: 15) with symptomatic lumbar spondylolysis who underwent direct pars repair surgery were included. The average follow-up period was 37 months in the PSUH group and 28 months in the DPSF group. Clinical outcome was measured using the visual analogue pain scale (VAS) and the Oswestry disability index (ODI). Operation time, blood loss, duration of hospital stay, surgical complications, and fusion status were also assessed. Results : Compared to the DPSF group, the PSUH group showed smaller improvements from the average preoperative VAS and ODI scores at the last follow-up (PSUH group; back VAS : 4.9 vs. 3.0, leg VAS : 6.8 vs. 2.2, ODI : 50.6% vs. 24.6%; DPSF group; back VAS : 5.7 vs. 1.1, leg VAS : 6.1 vs. 1.2, ODI : 57.4% vs. 18.2%). The average operation time was 174.9 minutes in the PSUH group and 141.7 minutes in the DPSF group. The average blood loss during operation was 468.8 cc in the PSUH group and 298.8 cc in the DPSF group. The average hospital stay after operation was 8.9 days in the PSUH group and 7 days in the DPSF group. In the PSUH group, there was one case of screw misplacement requiring revision surgery. In the DPSF group, one patient suffered from transient leg pain. The successful bone fusion rate was 78.3% in the PSUH group and 93.3% in the DPSF group. Conclusion : The present study suggests that the technique using direct pars screws is more effective than the method using pedicle screws with a lamina hook system, in terms of shorter operation time, less blood loss, shorter hospital stay, higher fusion success rate, and better clinical outcome.

Camera Calibration for Machine Vision Based Autonomous Vehicles (머신비젼 기반의 자율주행 차량을 위한 카메라 교정)

  • Lee, Mun-Gyu;An, Taek-Jin
    • Journal of Institute of Control, Robotics and Systems, v.8 no.9, pp.803-811, 2002
  • Machine vision systems are usually used to identify traffic lanes and then determine the steering angle of an autonomous vehicle in real time. The steering angle is calculated using a geometric model of various parameters including the orientation, position, and hardware specification of a camera in the machine vision system. To find accurate values of these parameters, camera calibration is required. This paper presents a new camera-calibration algorithm using known traffic lane features, line thickness and lane width. The camera parameters considered are divided into two groups: Group I (the camera orientation, the uncertainty image scale factor, and the focal length) and Group II (the camera position). First, six control points are extracted from an image of two traffic lines, and eight nonlinear equations are generated from the points. The least squares method is used to find estimates of the Group I parameters. Finally, values of the Group II parameters are determined using point correspondences between the image and the corresponding real world. Experimental results prove the feasibility of the proposed algorithm.
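The least-squares step for the Group I parameters can be illustrated with a generic sketch: solving an overdetermined linear system via the normal equations. The coefficients below are invented and unrelated to the paper's eight nonlinear equations, which would first have to be linearized around an initial guess.

```python
def solve_normal_equations(A, b):
    """Least-squares solution of A x ~= b via the normal equations A^T A x = A^T b."""
    m, n = len(A), len(A[0])
    # Build A^T A and A^T b.
    ata = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
           for i in range(n)]
    atb = [sum(A[k][i] * b[k] for k in range(m)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        atb[col], atb[piv] = atb[piv], atb[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            atb[r] -= f * atb[col]
    # Back substitution.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (atb[i] - sum(ata[i][j] * x[j] for j in range(i + 1, n))) / ata[i][i]
    return x

# Toy overdetermined system: four observations, two unknowns (invented data).
A = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]]
b = [6.0, 5.0, 7.0, 10.0]
params = solve_normal_equations(A, b)   # fits b ~= params[0] + params[1] * t
```

With more equations than unknowns, as in the paper's eight equations over the Group I parameters, the normal equations give the minimum-residual estimate.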

Organization of Circular Motion Accuracy Measuring System of NC Lathe using Linear Scales (리니어 스케일을 이용한 NC 선반의 원 운동정도 측정 시스템의 구성)

  • Kim Young Seuk;Kim Jae Yeol;Kim Jong Kwan;Han Ji Hee;Jung Jung Pyo
    • Transactions of the Korean Society of Machine Tool Engineers, v.13 no.5, pp.1-6, 2004
  • Measurements of the circular motion accuracy of NC lathes have been performed with the ball-bar system proposed by Bryan, but ball-bar systems influence the measured data through the accuracy of the balls and the contact between the balls and bar seats. In this study, therefore, error data during circular motion of the ATC (Automatic Tool Changer) of an NC lathe are acquired by reading z-x plane coordinates from two optical linear scales. The two optical linear scales of the measuring unit are fixed on the z-x plane of the NC lathe, the moving part is fixed to the ATC, and the coordinates of the ATC are sampled at constant time intervals using tick pulses from a computer. Radial error data of the circular motion are then calculated from the readings, the circular motion is plotted, and the results are analysed by statistical treatment of circularity, mean, standard deviation, etc.
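The statistical treatment of the radial error data can be sketched as follows, assuming sampled (z, x) coordinates on a nominal circle. The centre estimate (centroid, adequate for roughly uniform sampling) and the sample data are illustrative inventions, not the paper's measurements.

```python
import math

def circularity(points):
    """Radial-deviation statistics of sampled (z, x) points on a nominal circle.

    Returns (mean radius, standard deviation of radius, circularity),
    where circularity is max radius minus min radius about the centroid.
    """
    n = len(points)
    cz = sum(z for z, _ in points) / n
    cx = sum(x for _, x in points) / n
    radii = [math.hypot(z - cz, x - cx) for z, x in points]
    mean_r = sum(radii) / n
    std_r = math.sqrt(sum((r - mean_r) ** 2 for r in radii) / n)
    return mean_r, std_r, max(radii) - min(radii)

# Toy data: a nominal 50 mm circle with a small invented 5-lobe radial error.
pts = [((50 + 0.01 * math.sin(5 * a)) * math.cos(a),
        (50 + 0.01 * math.sin(5 * a)) * math.sin(a))
       for a in [2 * math.pi * i / 360 for i in range(360)]]
mean_r, std_r, circ = circularity(pts)
```

For the toy data the circularity recovers the 0.02 mm peak-to-valley amplitude of the imposed radial error.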

Seismic assessment of base-isolated nuclear power plants

  • Farmanbordar, Babak;Adnan, Azlan Bin;Tahir, Mahmood Md.;Faridmehr, Iman
    • Advances in Computational Design, v.2 no.3, pp.211-223, 2017
  • This research presents a numerical and experimental study on the seismic performance of first-generation base-isolated and fixed-base nuclear power plants (NPPs). Three types of base isolation system were applied to rehabilitate the first-generation plants: frictional pendulum (FP), high-damping rubber (HDR), and lead-rubber (LR) isolation. An Excel program was also developed for the design of the abovementioned base isolators in accordance with UBC 97 and the Japan Society of Base Isolation regulations. The seismic assessment was performed using pushover and nonlinear time history analysis methods in accordance with FEMA 356. To validate the adequacy of the proposed design procedure, two small-scale NPPs were constructed at Universiti Teknologi Malaysia's structural laboratory and subjected to a pushover test under two different base conditions, fixed and HDR-isolated. The results showed that the base-isolated structures achieved adequate seismic performance compared with the fixed-base one, and all three isolators led to a significant reduction in the containment's tension, overturning moment, and base shear.
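The kind of preliminary sizing such an isolator-design spreadsheet performs can be sketched as below. The formula shapes follow the UBC 97 style (effective stiffness from a target isolated period; design displacement from a 1 s spectral coefficient and a damping reduction coefficient), but the coefficient values, weight, and function name are assumptions for illustration, not the paper's Excel program.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def isolator_sizing(weight_kn, target_period_s, sd1=0.4, damping_coeff=1.35):
    """Hypothetical UBC 97-style preliminary isolation-system sizing.

    weight_kn       : supported seismic weight (kN)
    target_period_s : target effective isolated period (s)
    sd1             : spectral coefficient at 1 s (assumed value)
    damping_coeff   : damping reduction coefficient B (assumed value)
    Returns (effective stiffness in kN/m, design displacement in m).
    """
    mass = weight_kn / G                                  # tonnes
    # Effective stiffness from T = 2*pi*sqrt(m/k)  =>  k = m*(2*pi/T)^2
    k_eff = mass * (2 * math.pi / target_period_s) ** 2   # kN/m
    # UBC 97-style design displacement: D = (g / 4*pi^2) * S_D1 * T / B
    d_design = (G / (4 * math.pi ** 2)) * sd1 * target_period_s / damping_coeff
    return k_eff, d_design

k_eff, d = isolator_sizing(weight_kn=50000.0, target_period_s=2.5)
```

Lengthening the target period softens the required stiffness but grows the displacement demand, which is the basic trade-off an isolator design iterates on.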

Developing an Intrusion Detection Framework for High-Speed Big Data Networks: A Comprehensive Approach

  • Siddique, Kamran;Akhtar, Zahid;Khan, Muhammad Ashfaq;Jung, Yong-Hwan;Kim, Yangwoo
    • KSII Transactions on Internet and Information Systems (TIIS), v.12 no.8, pp.4021-4037, 2018
  • In network intrusion detection research, two characteristics are generally considered vital to building efficient intrusion detection systems (IDSs): an optimal feature selection technique and robust classification schemes. However, the emergence of sophisticated network attacks and the advent of big data concepts in intrusion detection domains require two more significant aspects to be addressed: employing an appropriate big data computing framework and utilizing a contemporary dataset to deal with ongoing advancements. As such, we present a comprehensive approach to building an efficient IDS with the aim of strengthening academic anomaly detection research in real-world operational environments. The proposed system has the following four characteristics: (i) it performs optimal feature selection using information gain and branch-and-bound algorithms; (ii) it employs machine learning techniques for classification, namely, Logistic Regression, Naïve Bayes, and Random Forest; (iii) it introduces bulk synchronous parallel processing to handle the computational requirements of large-scale networks; and (iv) it utilizes a real-time contemporary dataset generated by the Information Security Centre of Excellence at the University of New Brunswick (ISCX-UNB) to validate its efficacy. Experimental analysis shows the effectiveness of the proposed framework, which is able to achieve high accuracy, low computational cost, and reduced false alarms.
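The information-gain step of the feature selection stage can be sketched as follows. The toy traffic records and the single discrete feature are invented for illustration, not drawn from the ISCX-UNB dataset.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """Reduction in label entropy achieved by splitting on a discrete feature."""
    n = len(labels)
    split = {}
    for v, y in zip(feature_values, labels):
        split.setdefault(v, []).append(y)
    remainder = sum(len(ys) / n * entropy(ys) for ys in split.values())
    return entropy(labels) - remainder

# Toy traffic records (invented): protocol field vs. normal/attack label.
proto = ["tcp", "tcp", "udp", "udp", "icmp", "icmp"]
label = ["normal", "normal", "attack", "normal", "attack", "attack"]
ig = information_gain(proto, label)   # how informative 'protocol' is
```

Ranking features by this score, then refining the subset with branch-and-bound search, is the general shape of the selection stage the abstract names.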

Robust Segmentation for Low Quality Cell Images from Blood and Bone Marrow

  • Pan Chen;Fang Yi;Yan Xiang-Guo;Zheng Chong-Xun
    • International Journal of Control, Automation, and Systems, v.4 no.5, pp.637-644, 2006
  • Biomedical images are often complex. An applied image analysis system should cope with images that are of quite low quality and challenging to segment. This paper presents a framework for color cell image segmentation by online learning and classification. It is a robust two-stage scheme using a kernel method and the watershed transform. In the first stage, a two-class SVM is employed to discriminate object pixels from the background; the SVM is trained on data that have been analyzed using the mean shift procedure. A real-time training strategy for the SVM is also developed. In the second stage, as post-processing, a local watershed transform is used to separate clustered cells. A comparison with the SSF (scale space filter) and a classical watershed-based algorithm, both often employed for cell image segmentation, is given. Experimental results demonstrate that the new method is more accurate and robust than the compared methods.
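The first-stage pixel classifier can be illustrated with a stand-in: a nearest-class-mean rule replaces the online-trained SVM here (a deliberate simplification, not the paper's method), but the role is the same, assigning each pixel to object or background by colour. The RGB samples are invented.

```python
def nearest_mean_classifier(train_fg, train_bg):
    """Build a per-pixel two-class rule from labelled colour samples.

    A nearest-class-mean rule stands in for a trained SVM: each pixel is
    assigned to whichever class mean is closer in RGB space.
    """
    def mean(samples):
        n = len(samples)
        return tuple(sum(s[i] for s in samples) / n for i in range(3))

    mu_fg, mu_bg = mean(train_fg), mean(train_bg)

    def classify(pixel):
        d_fg = sum((p - m) ** 2 for p, m in zip(pixel, mu_fg))
        d_bg = sum((p - m) ** 2 for p, m in zip(pixel, mu_bg))
        return "object" if d_fg < d_bg else "background"

    return classify

# Invented RGB samples: stained cell pixels vs. pale background pixels.
fg = [(120, 60, 160), (110, 55, 150), (130, 70, 170)]
bg = [(230, 225, 235), (240, 238, 242), (220, 215, 228)]
classify = nearest_mean_classifier(fg, bg)
result = classify((125, 64, 158))
```

The second stage would then run a local watershed on the resulting foreground mask to split touching cells; that post-processing is omitted here.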