• Title/Summary/Keyword: Analysis algorithm


Comparison of Dosimetrical and Radiobiological Parameters on Three VMAT Techniques for Left-Sided Breast Cancer

  • Kang, Seong-Hee;Chung, Jin-Beom;Kim, Kyung-Hyeon;Kang, Sang-Won;Eom, Keun-Yong;Song, Changhoon;Kim, In-Ah;Kim, Jae-Sung
    • Progress in Medical Physics / v.30 no.1 / pp.7-13 / 2019
  • Purpose: To compare dosimetric and radiobiological parameters among volumetric modulated arc therapy (VMAT) techniques using restricted and continuous arc beams for left-sided breast cancer. Materials and Methods: Ten patients with left-sided breast cancer without regional node involvement were retrospectively selected and prescribed a dose of 42.6 Gy in 16 fractions to the planning target volume (PTV). For each patient, three plans were generated in the $Eclipse^{TM}$ system (Varian Medical Systems, Palo Alto, CA) with one partial arc (1pVMAT), two partial arcs (2pVMAT), and two tangential arcs (2tVMAT). All plans were calculated with the anisotropic analytical algorithm and Photon Optimizer using a 6 MV photon beam of $VitalBEAM^{TM}$. The same dose objectives were used for each plan to ensure a fair comparison during optimization. Results: For the PTV, dosimetric parameters such as the homogeneity index, conformity index, and conformal number were superior for 2pVMAT compared with the other two techniques. $V_{95\%}$, which indicates PTV coverage, was 91.86%, 96.60%, and 96.65% for 1pVMAT, 2pVMAT, and 2tVMAT, respectively. In most organs at risk (OARs), 2pVMAT significantly reduced the delivered dose compared with the other techniques, except for the dose to the contralateral lung. In the analysis of radiobiological parameters, a significant difference in normal tissue complication probability was observed for the ipsilateral lung, while no difference was observed for the other OARs. Conclusions: Our study showed that 2pVMAT provided better plan quality and normal tissue sparing than 1pVMAT and 2tVMAT, although not for all parameters. Therefore, 2pVMAT could be considered the first choice for treatment planning of left-sided breast cancer.
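
The abstract does not define the indices it reports; as a reference, commonly used forms in the planning literature (which may differ from the paper's exact choices) are the ICRU-83 homogeneity index $HI=(D_{2\%}-D_{98\%})/D_{50\%}$, the conformity index $CI=V_{RI}/TV$ (volume enclosed by the reference isodose over the target volume), and the van't Riet conformation number $CN=(TV_{RI}/TV)\times(TV_{RI}/V_{RI})$, where $TV_{RI}$ is the part of the target volume covered by the reference isodose.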

An Inventory Model for Deteriorating Products with Ordering Cost inclusive of a Freight Cost under Trade Credit (신용거래 하에 운송비용이 포함된 주문 비용을 고려한 퇴화성 제품의 재고 모형)

  • Shinn, Seong-Whan
    • The Journal of the Convergence on Culture Technology / v.5 no.1 / pp.353-360 / 2019
  • Trade credit is used as a price discrimination strategy by suppliers in order to increase customer demand. From the customer's viewpoint, if delayed payment is allowed for a certain period of time by the supplier, the resulting reduction in inventory carrying cost positively affects the customer's order quantity. Also, in deriving the economic order quantity (EOQ) formula, it is tacitly assumed that the customer's ordering cost is a fixed cost. In many business transactions, however, the customer pays the freight cost for the transportation of the order, so the ordering cost contains not only a fixed cost but also a freight cost that is a function of the order size. Therefore, in this study, we analyze an inventory model in which the customer's ordering cost includes both a fixed cost and a freight cost that depends on the order size when the supplier permits a delay in payments. For the analysis, it is also assumed that inventory is depleted not only by customer demand but also by deterioration. Investigation of the properties of an optimal solution allows us to develop an algorithm whose validity is illustrated with an example problem.
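
As a minimal illustration of the trade-off the abstract describes, the sketch below numerically minimizes an annual cost in which each order pays a fixed cost plus a freight cost that depends on the order size (a per-truckload freight charge is assumed purely for illustration). The paper's full model, which additionally handles deterioration and permissible delay in payments, is not reproduced here, and all parameter values are hypothetical.

```python
import math

# Hypothetical illustrative parameters (not taken from the paper)
D = 1200.0          # annual demand (units/year)
S = 50.0            # fixed ordering cost per order
h = 2.0             # inventory carrying cost per unit per year
truck_cap = 100     # units per truckload (freight depends on order size)
truck_cost = 80.0   # freight cost per truckload

def annual_cost(Q):
    """Annual cost when the ordering cost includes a size-dependent freight component."""
    freight = math.ceil(Q / truck_cap) * truck_cost
    ordering = (D / Q) * (S + freight)   # orders per year times cost per order
    holding = h * Q / 2.0                # average inventory carrying cost
    return ordering + holding

# Simple numerical search over candidate order quantities
best_Q = min(range(1, 1001), key=annual_cost)
print(f"best order quantity ~ {best_Q}, annual cost ~ {annual_cost(best_Q):.1f}")
```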

Three Dimensional Measurement of Ideal Trajectory of Pedicle Screws of Subaxial Cervical Spine Using the Algorithm Could Be Applied for Robotic Screw Insertion

  • Huh, Jisoon;Hyun, Jae Hwan;Park, Hyeong Geon;Kwak, Ho-Young
    • Journal of Korean Neurosurgical Society / v.62 no.4 / pp.376-381 / 2019
  • Objective : To define an optimal method for calculating a safe direction of cervical pedicle screw placement using a computed tomography (CT) image-based three-dimensional (3D) cortical shell model of the human cervical spine. Methods : Cortical shell models of the cervical spine from C3 to C6 were built after segmentation of in vivo CT image data of 44 volunteers. The 3D Cartesian coordinates of all points constituting the surface of the whole vertebra, the bilateral pedicles, and the posterior wall were acquired. The ideal trajectory of pedicle screw insertion was defined as the viewing direction at which the inner area of the pedicle becomes largest when looking through the biconcave tubular pedicle. The ideal trajectories of 352 pedicles (eight pedicles for each of the 44 subjects) were calculated using a custom-made program and converted from global to local coordinates according to the 3D position of the posterior wall of each vertebral body. The transverse and sagittal angles of the trajectory were defined as the angles between the ideal trajectory line and the perpendicular line of the posterior wall in the horizontal and sagittal planes. The averages and standard deviations of all measurements were calculated. Results : The average transverse angles were $50.60^{\circ}{\pm}6.22^{\circ}$ at C3, $51.42^{\circ}{\pm}7.44^{\circ}$ at C4, $47.79^{\circ}{\pm}7.61^{\circ}$ at C5, and $41.24^{\circ}{\pm}7.76^{\circ}$ at C6; the transverse angle becomes steeper from C3 to C6. The mean sagittal angles were $9.72^{\circ}{\pm}6.73^{\circ}$ downward at C3, $5.09^{\circ}{\pm}6.39^{\circ}$ downward at C4, $0.08^{\circ}{\pm}6.06^{\circ}$ downward at C5, and $1.67^{\circ}{\pm}6.06^{\circ}$ upward at C6; the sagittal angle changes from caudad to cephalad from C3 to C6. Conclusion : The absolute values of the transverse and sagittal angles in our study were not the same as in previous studies, but the trends of change were similar. Because the 3D coordinates of all points constituting the cortical shell of the cervical vertebrae are known, the 3D model can easily be reconstructed and manipulated freely with a computer program, allowing more flexible measurement of morphological characteristics than direct inspection of raw bone. Furthermore, this measurement concept could be used in the computing program of automated robotic screw insertion.
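
A minimal sketch of the angle computation described above, assuming a coordinate convention (x lateral, y antero-posterior, z cranio-caudal) that is not stated in the abstract; the trajectory and wall-normal vectors are hypothetical examples, not measured data.

```python
import numpy as np

def projected_angle(traj, normal, drop_axis):
    """Angle (degrees) between trajectory and posterior-wall normal after projecting
    both onto a plane: drop_axis=2 -> horizontal (axial) plane, drop_axis=0 -> sagittal plane."""
    keep = [i for i in range(3) if i != drop_axis]
    a, b = traj[keep], normal[keep]
    cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Hypothetical vectors: an ideal screw trajectory and the posterior-wall perpendicular
ideal_traj = np.array([0.62, 0.77, -0.15])
wall_normal = np.array([0.0, 1.0, 0.0])

transverse_angle = projected_angle(ideal_traj, wall_normal, drop_axis=2)
sagittal_angle = projected_angle(ideal_traj, wall_normal, drop_axis=0)
print(f"transverse {transverse_angle:.1f} deg, sagittal {sagittal_angle:.1f} deg")
```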

A System Recovery using Hyper-Ledger Fabric BlockChain (하이퍼레저 패브릭 블록체인을 활용한 시스템 복구 기법)

  • Bae, Su-Hwan;Cho, Sun-Ok;Shin, Yong-Tae
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.12 no.2 / pp.155-161 / 2019
  • Currently, numerous companies and institutes provide services over the Internet and establish and operate information systems to manage them efficiently and reliably. An information system carries the risk of losing the ability to provide normal services due to a disaster or failure, and organizations prepare for this with disaster recovery systems. However, existing disaster recovery systems cannot perform normal recovery if the files needed for system recovery are corrupted. In this paper, we propose a system that verifies the integrity of the system recovery file and proceeds with recovery by utilizing a Hyperledger Fabric blockchain. The PBFT consensus algorithm is used to generate blocks and is performed by the leader node of the blockchain network. In the event of a failure, the integrity of the recovery file is verified by comparing its hash value with the hash value stored in the blockchain, and recovery then proceeds. For evaluation, the proposed technique was compared with existing system recovery techniques on four items: data consistency, data retention capability, recovery file integrity, and the amount of traffic generated, to determine whether it is actually applicable.
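
A minimal sketch of the integrity check described above: the recovery file's SHA-256 hash is compared with the hash recorded on the ledger before recovery proceeds. The ledger query is left as a stub because the actual chaincode and Fabric SDK calls used in the paper are not specified.

```python
import hashlib

def file_sha256(path: str) -> str:
    """Compute the SHA-256 digest of a recovery file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def query_hash_from_ledger(file_id: str) -> str:
    """Placeholder for a query against the Hyperledger Fabric ledger; in a real
    deployment this would call chaincode via the Fabric SDK to read the stored hash."""
    raise NotImplementedError

def verify_before_recovery(path: str, file_id: str) -> bool:
    """Proceed with recovery only if the local file's hash matches the on-chain hash."""
    return file_sha256(path) == query_hash_from_ledger(file_id)
```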

Apriori Based Big Data Processing System for Improve Sensor Data Throughput in IoT Environments (IoT 환경에서 센서 데이터 처리율 향상을 위한 Apriori 기반 빅데이터 처리 시스템)

  • Song, Jin Su;Kim, Soo Jin;Shin, Young Tae
    • KIPS Transactions on Computer and Communication Systems / v.10 no.10 / pp.277-284 / 2021
  • Recently, the smart home environment has been expected to become a platform that collects, integrates, and utilizes various data through convergence with wireless information and communication technology. In fact, the number of smart devices with various sensors inside smart homes is increasing, the amount of data that must be processed for these devices is also increasing, and big data processing systems are actively being introduced to handle it effectively. However, in traditional big data processing systems all requests are directed to the cluster driver before being allocated to distributed nodes, so the cluster driver that manages task partitioning becomes a bottleneck and cluster-wide performance degrades. The delay is particularly large for smart home devices that constantly request processing of small amounts of data. Thus, in this paper, we design an Apriori-based big data system for effective data processing in smart home environments where frequent requests occur at the same time. According to the performance evaluation, the data processing time of the proposed system was reduced by at least 19.2% and up to 38.6% compared with the existing system. This result is related to the type of data being measured: because the amount of data collected in a smart home environment is large, the use of cache servers plays a major role in data processing, and association analysis with the Apriori algorithm stores highly related sensor data in the cache.
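
A minimal sketch of the Apriori-style association step the abstract mentions, using hypothetical sensor-event transactions: pairs of sensor events whose support exceeds a threshold are the candidates to keep together in the cache. This illustrates the idea only; the paper's actual cluster and cache-server architecture is not reproduced.

```python
from collections import Counter
from itertools import combinations

def apriori_frequent_pairs(transactions, min_support):
    """One Apriori-style pass: frequent single items, then frequent item pairs."""
    n = len(transactions)
    item_counts = Counter(item for t in transactions for item in set(t))
    frequent_items = {i for i, c in item_counts.items() if c / n >= min_support}

    pair_counts = Counter()
    for t in transactions:
        items = sorted(set(t) & frequent_items)
        pair_counts.update(combinations(items, 2))
    return {p: c / n for p, c in pair_counts.items() if c / n >= min_support}

# Hypothetical smart-home sensor transactions (illustrative only)
logs = [("door", "light"), ("door", "light", "thermostat"),
        ("light", "thermostat"), ("door", "light")]
print(apriori_frequent_pairs(logs, min_support=0.5))
# {('door', 'light'): 0.75, ('light', 'thermostat'): 0.5} -> cache these together
```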

Comparison of Sizes of Anatomical Structures according to Scan Position Changes in Patients with Interstitial Lung Disease Using High-Resolution Thoracic CT (고해상도 흉부 전산화단층촬영을 이용한 간질성 폐질환을 가진 환자의 자세에 따른 해부학적 구조물 크기 비교)

  • Lee, Jae-min;Park, Je-heon;Kim, Ju-seong;Lim, Cheong-Hwan;Lee, Ki-Baek
    • Journal of radiological science and technology / v.44 no.2 / pp.91-100 / 2021
  • High-resolution thoracic CT (HRCT) is a scanning protocol in which a thin slice thickness and a sharp reconstruction algorithm are used to enhance image resolution for the diagnosis and assessment of interstitial lung disease (ILD). This examination is sometimes performed in both the supine and prone positions to improve sensitivity to early changes of these conditions. The anatomical structures (sizes of the lung fields, heart, and descending aorta) of 150 patients who underwent HRCT in two positions (supine and prone) were retrospectively compared. Data were divided into five groups according to patient body weight (from 40 kg to more than 80 kg, in 10 kg intervals, 60 patients per group). Quantitative analysis was performed with the ImageJ program. With the supine position defined as the control group, the average values of the lung fields, heart size, and aorta were compared against the prone position defined as the experimental group. The size of the lungs was larger in the supine position, and a statistically significant difference was confirmed in patients over 70 kg (p<0.05). In addition, both the heart and the descending aorta were larger in the prone position, but for the heart there was no correlation with the presence or absence of ILD (p>0.05). The descending aorta area was also larger in the prone position than in the supine position, but the difference was not statistically significant (p>0.05). In conclusion, when ILD was severe there was no statistically significant difference in area between the supine and prone positions, so these findings are considered helpful for diagnostic decisions.
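
The abstract does not state which statistical test produced the p-values; the sketch below assumes a paired comparison (a paired t-test) of the same patients' areas measured in the two positions, with hypothetical numbers, purely to illustrate the analysis design.

```python
import numpy as np
from scipy import stats

# Hypothetical lung-field areas (cm^2) measured in ImageJ for the same patients
supine = np.array([215.0, 230.5, 198.2, 240.1, 221.7])
prone = np.array([210.3, 224.8, 196.9, 233.4, 219.0])

# Paired comparison of supine vs prone areas
t_stat, p_value = stats.ttest_rel(supine, prone)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}, significant at 0.05: {p_value < 0.05}")
```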

Improved VFM Method for High Accuracy Flight Simulation (고정밀 비행 시뮬레이션을 위한 개선 VFM 기법 연구)

  • Lee, Chiho;Kim, Mukyeom;Lee, Jae-Lyun;Jeon, Kwon-Su;Tyan, Maxim;Lee, Jae-Woo
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.49 no.9 / pp.709-719 / 2021
  • Recent progress in analysis and flight simulation methods enables wider use of virtual certification and reduces the number of certification flight tests. The aerodynamic database (AeroDB) is one of the most important components of flight simulation; it is composed of aerodynamic coefficients over a range of flight conditions and control deflections. This paper proposes an efficient method for constructing an AeroDB that combines Gaussian process-based variable fidelity modeling with an adaptive sampling algorithm. A case study of virtual certification of an F-16 fighter is presented. Four AeroDBs were constructed using different numbers and distributions of high-fidelity data points. The constructed databases were then used to simulate gliding, short pitch, and roll responses, and compliance with certification regulations was checked. The case study demonstrates that the proposed method can significantly reduce the number of high-fidelity data points while maintaining high simulation accuracy.
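
A minimal sketch of Gaussian-process-based variable fidelity modeling under an additive-correction assumption: one GP is fit to many cheap low-fidelity samples, and a second GP models the discrepancy at the few high-fidelity points. The functions, sample points, and kernels below are hypothetical, and the adaptive sampling step (e.g., adding high-fidelity points where predictive variance is largest) is omitted; the paper's exact formulation may differ.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical low-/high-fidelity lift-coefficient models (illustrative only)
def cl_low(a):  return 0.10 * a                    # cheap, approximate estimate
def cl_high(a): return 0.11 * a - 0.002 * a ** 2   # expensive, "true" value

alpha_lo = np.linspace(-5, 15, 41).reshape(-1, 1)                   # many cheap samples
alpha_hi = np.array([-5.0, 0.0, 5.0, 10.0, 15.0]).reshape(-1, 1)    # few expensive samples

gp_low = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True)
gp_low.fit(alpha_lo, cl_low(alpha_lo).ravel())

# Additive bridge function: model the fidelity discrepancy at high-fidelity points
disc = cl_high(alpha_hi).ravel() - gp_low.predict(alpha_hi)
gp_disc = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True)
gp_disc.fit(alpha_hi, disc)

def cl_vfm(alpha):
    """Variable-fidelity prediction = low-fidelity GP + GP-modelled correction."""
    a = np.asarray(alpha, dtype=float).reshape(-1, 1)
    return gp_low.predict(a) + gp_disc.predict(a)

print(cl_vfm([2.5, 7.5]))
```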

The Analysis of Changes in East Coast Tourism using Topic Modeling (토핑 모델링을 활용한 동해안 관광의 변화 분석)

  • Jeong, Eun-Hee
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.13 no.6 / pp.489-495 / 2020
  • The amount of data is increasing through various IT devices in the hyper-connected society of the Fourth Industrial Revolution, and new value can be created by analyzing that data. In this paper, a total of 1,526 articles published from 2017 to 2019 in national and economic magazines, regional associations, and major broadcasting companies were collected through BigKinds with the keyword "(East Coast tourism or East Coast travel) and Gangwon-do". Topic modeling was performed on the collected 1,526 articles using the LDA algorithm implemented in the R language. Keywords were extracted for each year from 2017 to 2019, and high-frequency keywords were classified and compared by year. The optimal number of topics was set to eight using log-likelihood and perplexity, and eight topics were then inferred using Gibbs sampling. The inferred topics were Gangneung and beaches, Goseong and Mt. Geumgang, KTX and the Donghae-Bukbu line, weekend sea tours, Sokcho and the Unification Observatory, Yangyang and surfing, experience tours, and transportation network infrastructure. Changes in the articles on East Coast tourism were analyzed using the proportions of the eight inferred topics. As a result, compared with 2017, the proportions of the Unification Observatory and Mt. Geumgang topics showed no significant change in 2018, the proportions of the KTX and experience tour topics increased, and the proportions of the other topics decreased. In 2019, the proportions of the KTX and experience tour topics decreased, while the proportions of the other topics showed no significant change.
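
The paper performs LDA in R with Gibbs sampling; as a rough illustration of the same workflow in Python (using a different inference method, online variational Bayes, and a toy corpus with two topics instead of the paper's eight), the sketch below builds a document-term matrix and prints the top terms per topic.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical toy corpus standing in for the 1,526 collected articles
docs = [
    "Gangneung beach summer tourists",
    "KTX Donghae Bukbu line opening travel",
    "Sokcho Unification Observatory visitors",
    "Yangyang surfing beach lessons",
]

vec = CountVectorizer()
X = vec.fit_transform(docs)

# scikit-learn's LDA uses variational Bayes rather than Gibbs sampling
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    top = [terms[i] for i in comp.argsort()[::-1][:3]]
    print(f"topic {k}: {top}")
```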

A Study on Prediction of EPB shield TBM Advance Rate using Machine Learning Technique and TBM Construction Information (머신러닝 기법과 TBM 시공정보를 활용한 토압식 쉴드TBM 굴진율 예측 연구)

  • Kang, Tae-Ho;Choi, Soon-Wook;Lee, Chulho;Chang, Soo-Ho
    • Tunnel and Underground Space / v.30 no.6 / pp.540-550 / 2020
  • Machine learning has been actively used in the field of automation owing to the development and establishment of AI technology. The important point in utilizing machine learning is that appropriate algorithms exist depending on the data characteristics, so the dataset must be analyzed before applying machine learning techniques. In this study, the advance rate is predicted using geotechnical and machine data from a TBM tunnel section passing through soil ground below a stream. The linear regression model posed no problems in terms of applying the statistical technique, but its coefficient of determination was 0.76, whereas the ensemble model and the support vector machine showed prediction performance of 0.88 or higher, indicating that the model most suitable for predicting the advance rate of the EPB shield TBM on the analyzed dataset was the support vector machine. As a result, a prediction model using data that includes both machine data and ground information is judged to be highly suitable. In addition, further research is needed to increase the diversity of ground conditions and the amount of data.
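
A minimal sketch of the support-vector-regression setup the abstract identifies as best performing, using synthetic stand-ins for the machine and ground features; the actual TBM features, preprocessing, and hyperparameters are not given in the abstract.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Hypothetical features (e.g., thrust, torque, RPM, face pressure, soil stiffness)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.3, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", r2_score(y_te, model.predict(X_te)))
```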

Analysis on Handicaps of Automated Vehicle and Their Causes using IPA and FGI (IPA 및 FGI 분석을 통한 자율주행차량 핸디캡과 발생원인 분석)

  • Jeon, Hyeonmyeong;Kim, Jisoo
    • The Journal of The Korea Institute of Intelligent Transport Systems / v.20 no.3 / pp.34-46 / 2021
  • In order to accelerate the commercialization of self-driving cars, it is necessary to accurately identify the causes that degrade the driving safety of current self-driving cars and to improve them. This study surveyed experts studying autonomous driving in Korea to identify the causes of problems in the driving safety of autonomous vehicles and the level of autonomous driving technology in Korea. According to the survey, satisfaction with the current technology level was lower than the rated importance for construction zones, heavy rain/heavy snow, fine dust, and the presence of potholes, so these items require priority research and development. Among them, failures of roads and road facilities and the performance of the sensors themselves were identified as the main causes for construction zones and potholes, while sensor performance and the absence of suitable algorithms were identified as the main causes for the weather-related situations. To realize safe autonomous driving as soon as possible, it is necessary to continuously identify and resolve the causes that hinder the driving safety of autonomous vehicles.
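
A minimal sketch of the importance-performance analysis (IPA) quadrant logic described above, with hypothetical survey scores rather than the paper's data: items whose importance is above average but whose satisfaction is below average fall into the "concentrate here" quadrant, i.e., the priority R&D targets.

```python
# Hypothetical mean survey scores per driving situation (illustrative only)
items = {
    "construction zone":  {"importance": 4.6, "satisfaction": 2.9},
    "heavy rain/snow":    {"importance": 4.5, "satisfaction": 2.7},
    "fine dust":          {"importance": 4.0, "satisfaction": 3.1},
    "pothole":            {"importance": 4.3, "satisfaction": 3.0},
    "clear daytime road": {"importance": 3.5, "satisfaction": 4.2},
}

imp_mean = sum(v["importance"] for v in items.values()) / len(items)
sat_mean = sum(v["satisfaction"] for v in items.values()) / len(items)

for name, v in items.items():
    if v["importance"] >= imp_mean and v["satisfaction"] < sat_mean:
        quadrant = "Concentrate here (priority R&D)"
    elif v["importance"] >= imp_mean:
        quadrant = "Keep up the good work"
    elif v["satisfaction"] < sat_mean:
        quadrant = "Low priority"
    else:
        quadrant = "Possible overkill"
    print(f"{name}: {quadrant}")
```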