• Title/Summary/Keyword: $A^*$ Algorithm

Search Result 54,473

K-DEV: A Borehole Deviation Logging Probe Applicable to Steel-cased Holes (철재 케이싱이 설치된 시추공에서도 적용가능한 공곡검층기 K-DEV)

  • Song, Yoonho;Jo, Yeonguk;Kim, Seungdo;Lee, Tae Jong;Kim, Myungsun;Park, In-Hwa;Lee, Heuisoon
    • Geophysics and Geophysical Exploration
    • /
    • v.25 no.4
    • /
    • pp.167-176
    • /
    • 2022
  • We designed K-DEV, a borehole deviation survey tool applicable to steel-cased holes, and developed a prototype rated to a depth of 500 m, aiming to develop the in-house equipment required to secure deep subsurface characterization technologies. K-DEV is equipped with sensors that provide digital output with verified high performance, and it is compatible with the logging winch systems used in Korea. The K-DEV prototype has a nonmagnetic stainless-steel housing with an outer diameter of 48.3 mm, which was tested in the laboratory for water resistance up to 20 MPa and for durability by running it into a 1-km-deep borehole. We confirmed the operational stability and data repeatability of the prototype by repeatedly logging up and down to a depth of 600 m. A high-precision micro-electro-mechanical system (MEMS) gyroscope was used as the gyro sensor of the K-DEV prototype, which is crucial for azimuth determination in cased holes. Additionally, we devised an accurate trajectory survey algorithm employing unscented Kalman filtering and data fusion for optimization. A borehole test with K-DEV and a commercial logging tool produced sufficiently similar results. Furthermore, the error accumulation caused by drift of the MEMS gyro over time was successfully overcome by compensating with stationary measurements taken at the same attitude at the wellhead before and after logging, as demonstrated by results nearly identical to those from the open hole. We believe that these test applications confirm the K-DEV development methodology as well as the operational stability and data reliability of the prototype.
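
The drift-compensation step described in this abstract — stationary wellhead measurements taken at the same attitude before and after the run — lends itself to a simple illustration. The sketch below is not the K-DEV firmware; it only shows, under the assumption of an approximately linear bias drift over the logging run, how pre- and post-run stationary readings could be used to remove accumulated drift from time-stamped azimuth estimates. Function names, variable names, and the example numbers are hypothetical.

```python
import numpy as np

def compensate_gyro_drift(times, azimuths, t_pre, az_pre, t_post, az_post):
    """Remove an assumed-linear gyro drift from azimuth estimates.

    times, azimuths : logged time stamps (s) and azimuth estimates (deg)
    t_pre, az_pre   : time and azimuth of the stationary wellhead reading before the run
    t_post, az_post : time and azimuth of the stationary reading after the run,
                      taken at the same physical attitude as the first one
    """
    times = np.asarray(times, dtype=float)
    azimuths = np.asarray(azimuths, dtype=float)

    # The tool attitude is identical for both stationary checks, so any apparent
    # change in azimuth between them is attributed to sensor drift.
    drift_rate = (az_post - az_pre) / (t_post - t_pre)   # deg per second

    # Subtract the drift accumulated since the pre-run check at each sample.
    return azimuths - drift_rate * (times - t_pre)

# Hypothetical usage: a 30-minute run whose wellhead checks disagree by 0.6 deg.
t = np.linspace(0.0, 1800.0, 7)
az = np.array([120.0, 121.5, 123.0, 124.8, 126.0, 127.1, 128.2])
print(compensate_gyro_drift(t, az, t_pre=0.0, az_pre=120.0,
                            t_post=1800.0, az_post=120.6))
```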

Evaluating efficiency of Coaxial MLC VMAT plan for spine SBRT (Spine SBRT 치료시 Coaxial MLC VMAT plan의 유용성 평가)

  • Son, Sang Jun;Mun, Jun Ki;Kim, Dae Ho;Yoo, Suk Hyun
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.26 no.2
    • /
    • pp.313-320
    • /
    • 2014
  • Purpose: The purpose of this study is to evaluate the efficiency of the coaxial MLC VMAT plan (using $273^{\circ}$ and $350^{\circ}$ collimator angles), in which the leaf motion direction is aligned with the axis of the OAR (organ at risk; in this study, the spinal cord or cauda equina), compared with the universal MLC VMAT plan (using $30^{\circ}$ and $330^{\circ}$ collimator angles) for spine SBRT. Materials and Methods: Ten spine SBRT cases treated with coaxial MLC VMAT plans on a Varian TBX were enrolled. The cases were planned with Eclipse (Ver. 10.0.42, Varian, USA), PRO3 (Progressive Resolution Optimizer 10.0.28), and AAA (Anisotropic Analytical Algorithm Ver. 10.0.28) using coplanar $360^{\circ}$ arcs and a 10 MV FFF (flattening-filter-free) beam. The two arcs used collimator angles of $273^{\circ}$ and $350^{\circ}$, respectively. The universal MLC VMAT plans were based on the existing treatment plans and shared all of their parameters except the collimator angles. To minimize dose differences that appear randomly during optimization, all plans were optimized and calculated twice. The calculation grid was 0.2 cm, and all plans were normalized so that the target V100% = 90%. The evaluation indexes were the V10Gy, D0.03cc, and Dmean of the OAR, the H.I. (homogeneity index) of the target, and the total MU. All coaxial VMAT plans were verified by gamma test with Mapcheck2 (Sun Nuclear Co., USA), Mapphan (Sun Nuclear Co., USA), and SNC patient (Sun Nuclear Co., USA, Ver. 6.1.2.18513). Results: The differences between the coaxial and universal VMAT plans were as follows. The coaxial VMAT plan was better in the V10Gy of the OAR, with differences of up to 4.1%, at least 0.4%, and 1.9% on average, and in the D0.03cc of the OAR, with differences of up to 83.6 cGy, at least 2.2 cGy, and 33.3 cGy on average. For Dmean, the differences were up to 34.8 cGy, at least -13.0 cGy, and 9.6 cGy on average, indicating that the coaxial VMAT plans were better except in a few cases. The H.I. differences were up to 0.04, at least 0.01, and 0.02 on average, and the coaxial MLC VMAT plan used 74.1 MU less on average. All IMRT verification gamma tests for the coaxial MLC VMAT plans passed with rates above 90.0% at 1 mm/2%. Conclusion: The coaxial MLC VMAT plan appeared more favorable than the universal MLC VMAT plan in most cases and was especially effective in lowering the OAR V10Gy. As a result, the coaxial MLC VMAT plan can be better than the universal MLC VMAT plan at the same MU.
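
The plan verification in this abstract relies on a gamma test at 1 mm/2%. As a point of reference, the snippet below sketches a brute-force global gamma pass-rate calculation for a 1D dose profile. It is an illustrative simplification (real Mapcheck2/SNC patient analyses work on 2D measurement planes with interpolation), and the function names, criteria handling, and example values are assumptions, not the study's software.

```python
import numpy as np

def gamma_pass_rate(ref_pos, ref_dose, eval_pos, eval_dose,
                    dta_mm=1.0, dose_tol=0.02, low_dose_cutoff=0.1):
    """Brute-force global gamma analysis for 1D dose profiles.

    dta_mm   : distance-to-agreement criterion (mm)
    dose_tol : dose-difference criterion as a fraction of the maximum reference dose
    Points below low_dose_cutoff * max(ref_dose) are excluded from scoring.
    """
    ref_pos, ref_dose = np.asarray(ref_pos, float), np.asarray(ref_dose, float)
    eval_pos, eval_dose = np.asarray(eval_pos, float), np.asarray(eval_dose, float)
    dose_norm = dose_tol * ref_dose.max()          # global (not local) dose criterion

    gammas = []
    for p, d in zip(ref_pos, ref_dose):
        if d < low_dose_cutoff * ref_dose.max():
            continue
        dist_term = ((eval_pos - p) / dta_mm) ** 2
        dose_term = ((eval_dose - d) / dose_norm) ** 2
        gammas.append(np.sqrt(np.min(dist_term + dose_term)))  # min over evaluated points

    gammas = np.array(gammas)
    return 100.0 * np.mean(gammas <= 1.0)

# Hypothetical example: measured profile shifted by 0.3 mm and scaled by 1%.
x = np.arange(0.0, 50.0, 0.5)
ref = np.exp(-((x - 25.0) / 8.0) ** 2)
meas = 1.01 * np.exp(-((x - 25.3) / 8.0) ** 2)
print(f"pass rate: {gamma_pass_rate(x, ref, x, meas):.1f}%")
```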

A study of the plan dosimetric evaluation on the rectal cancer treatment (직장암 치료 시 치료계획에 따른 선량평가 연구)

  • Jeong, Hyun Hak;An, Beom Seok;Kim, Dae Il;Lee, Yang Hoon;Lee, Je hee
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.28 no.2
    • /
    • pp.171-178
    • /
    • 2016
  • Purpose: To minimize the femoral head dose with an appropriate treatment plan for rectal cancer radiation therapy, we compared and evaluated the usefulness of 3-field 3D conformal radiation therapy (hereafter 3fCRT), which is the conventional treatment method, 5-field 3D conformal radiation therapy (hereafter 5fCRT), and volumetric modulated arc therapy (VMAT). Materials and Methods: Ten rectal cancer cases treated on a 21EX were enrolled. The cases were planned with Eclipse (Ver. 10.0.42, Varian, USA), PRO3 (Progressive Resolution Optimizer 10.0.28), and AAA (Anisotropic Analytical Algorithm Ver. 10.0.28). The 3fCRT and 5fCRT plans used gantry angles of $0^{\circ}$, $270^{\circ}$, $90^{\circ}$ and $0^{\circ}$, $95^{\circ}$, $45^{\circ}$, $315^{\circ}$, $265^{\circ}$, respectively. The VMAT plans consisted of a single 15 MV coplanar $360^{\circ}$ arc. The prescription delivered 54 Gy to the rectum in 30 fractions. To minimize dose differences that appear randomly during optimization, the VMAT plans were optimized and calculated twice and were normalized so that the target V100% = 95%. The evaluation indexes were the D of both femoral heads and acetabular fossae, the total MU, and the H.I. (homogeneity index) and C.I. (conformity index) of the PTV. All VMAT plans were verified by gamma test with portal dosimetry using the EPID. Results: The D of the right femoral head was 53.08 Gy, 50.27 Gy, and 30.92 Gy for the 3fCRT, 5fCRT, and VMAT plans, respectively. Likewise, the left femoral head averaged 53.68 Gy, 51.01 Gy, and 29.23 Gy in the same order. The D of the right acetabular fossa was 54.86 Gy, 52.40 Gy, and 30.37 Gy for the 3fCRT, 5fCRT, and VMAT plans, respectively; likewise, the left femoral head averaged 53.68 Gy, 51.01 Gy, and 29.23 Gy in the same order. The maximum dose of both the femoral heads and the acetabular fossae was highest for 3fCRT, followed by 5fCRT and VMAT. The C.I. averaged 1.64, 1.48, and 0.99 for 3fCRT, 5fCRT, and VMAT, respectively, with the VMAT plan lowest. There was no significant difference in the H.I. of the PTV among the three plans. In total MU, the VMAT plan used 124.4 MU and 299 MU more than the 3fCRT and 5fCRT plans, respectively. IMRT verification gamma tests for the VMAT plans passed with rates above 90.0% at 2 mm/2%. Conclusion: In rectal cancer treatment, the VMAT plan was advantageous in most evaluation indexes compared with the 3D plans and greatly reduced the femoral head dose. However, practical limitations may make a VMAT plan difficult to select in some cases. 5fCRT has the advantage of reducing the femoral head dose compared with the existing 3fCRT without additional complications. Therefore, selecting the treatment plan according to each hospital's situation would improve radiation therapy efficiency and thereby benefit both survival time and overall quality of life.
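
For readers unfamiliar with the plan-quality indexes quoted in this abstract, the sketch below computes a homogeneity index and a conformity index from a dose grid and a PTV mask. Definitions of H.I. and C.I. vary between papers; the ones used here (H.I. = D2%/D98%, C.I. = volume receiving the prescription dose divided by the PTV volume) are common choices and are assumed, not taken from this study. The toy dose grid is invented.

```python
import numpy as np

def homogeneity_index(ptv_doses):
    """H.I. = D2% / D98% of the PTV (one common definition; assumed here)."""
    d2 = np.percentile(ptv_doses, 98)    # dose received by the hottest 2% of voxels
    d98 = np.percentile(ptv_doses, 2)    # dose covering 98% of the PTV
    return d2 / d98

def conformity_index(dose, ptv_mask, prescription):
    """C.I. = V_prescription / V_PTV (assumed definition; others exist, e.g. Paddick)."""
    v_presc = np.count_nonzero(dose >= prescription)   # voxels at or above Rx anywhere
    v_ptv = np.count_nonzero(ptv_mask)
    return v_presc / v_ptv

# Hypothetical toy dose grid: PTV near 54 Gy, some dose spill outside it.
rng = np.random.default_rng(0)
dose = rng.normal(30.0, 10.0, size=(40, 40, 40)).clip(0)
ptv = np.zeros_like(dose, dtype=bool)
ptv[15:25, 15:25, 15:25] = True
dose[ptv] = rng.normal(54.0, 1.0, size=ptv.sum())

print("H.I. =", round(homogeneity_index(dose[ptv]), 3))
print("C.I. =", round(conformity_index(dose, ptv, prescription=54.0), 3))
```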


The Evaluation of TrueX Reconstruction Method in Low Dose (저선량에서의 TrueX 재구성 방법에 의한 유용성 평가)

  • Oh, Se-Moon;Kim, Kye-Hwan;Kim, Seung-Jeong;Lee, Hong-Jae;Kim, Jin-Eui
    • The Korean Journal of Nuclear Medicine Technology
    • /
    • v.15 no.2
    • /
    • pp.83-87
    • /
    • 2011
  • Purpose: PET/CT is now used in a variety of diagnostic areas, including oncology, cardiology, and neurology. As the importance of PET/CT grows, various studies have examined image quality in relation to the reconstruction method. We compared the iterative 2D reconstruction method with the TrueX reconstruction method by Siemens through phantom experiments to assess how the clinical usefulness of PET/CT can be increased. Materials and Methods: We measured the contrast ratio and FWHM to evaluate images at different doses using a Biograph 40 True Point PET/CT (Siemens, Germany). The NEMA IEC PET body phantom (Data Spectrum Corp.) and a capillary tube were used to obtain the contrast ratio and FWHM, respectively. All images were acquired for 10 minutes and reconstructed with both the current TrueX algorithm and the previous iterative 2D algorithm. The clinically established parameters were applied for iterative 2D, and the Siemens-recommended parameters for TrueX. Results: In the FWHM test using the capillary tube, TrueX gave smaller FWHM values than iterative 2D, and the difference became larger at low doses. In the contrast-ratio test using the NEMA IEC PET body phantom, TrueX performed better than iterative 2D; however, there was no difference across doses. Conclusion: In this experiment, TrueX yielded a better contrast ratio and spatial resolution than iterative 2D. In particular, TrueX showed better resolution than iterative 2D at low doses, whereas the contrast ratio showed no notable difference. In other words, the TrueX reconstruction method has higher clinical value in PET/CT because it can reduce patient exposure while providing better image quality.
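
Since the resolution comparison above hinges on the FWHM of a capillary-tube line profile, the following sketch shows one straightforward way to estimate FWHM from a sampled profile by linearly interpolating the half-maximum crossings. It is a generic illustration, not the vendor software used in the study; the function name and the Gaussian test profile are assumptions.

```python
import numpy as np

def fwhm(positions, profile):
    """Estimate FWHM of a single-peaked profile by interpolating half-max crossings."""
    positions = np.asarray(positions, float)
    profile = np.asarray(profile, float) - np.min(profile)   # remove background offset
    half = profile.max() / 2.0
    peak = int(np.argmax(profile))

    # Walk outward from the peak to the first sample below half maximum,
    # then linearly interpolate the exact crossing position.
    def crossing(idx_range):
        for i in idx_range:
            if profile[i] < half:
                j = i + 1 if i < peak else i - 1   # neighbour on the peak side
                frac = (half - profile[i]) / (profile[j] - profile[i])
                return positions[i] + frac * (positions[j] - positions[i])
        raise ValueError("profile does not fall below half maximum")

    left = crossing(range(peak, -1, -1))
    right = crossing(range(peak, len(profile)))
    return right - left

# Hypothetical Gaussian profile with sigma = 2.0 mm -> expected FWHM ~ 4.71 mm.
x = np.linspace(-10, 10, 201)
y = np.exp(-x**2 / (2 * 2.0**2))
print(round(fwhm(x, y), 2))
```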


A Development of Traffic Queue Length Measuring Algorithm Using ILD(Inductive Loop Detector) Based on COSMOS (실시간 신호제어시스템의 대기길이 추정 알고리즘 개발)

  • Seong, Ki-Ju;Lee, Choul-Ki;Jeong, Jun-Ha;Lee, Young-In;Park, Dae-Hyun
    • The Journal of The Korea Institute of Intelligent Transport Systems
    • /
    • v.3 no.1 s.4
    • /
    • pp.85-96
    • /
    • 2004
  • The study begins from the basic concept that the occupancy time of a vehicle detector is directly proportional to vehicle delay; that is, vehicle delay is inferred from the detector occupancy time. The results of the study were far superior for queue-length estimation, and a notable advantage is that the operator does not need to optimize s1, s2, or Thdoc. Thdoc (the critical congestion degree) was changed from 0.7 to 0.2-0.3. However, the method still has problems when vehicles that experience delay do not occupy the vehicle detector. In conclusion, it is necessary to lengthen the queue detector or to install paired queue detectors. Follow-up studies related to this work should also proceed, because traffic signal control under congestion is required.
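
The core idea stated in this abstract — that loop-detector occupancy is proportional to vehicle delay, with a critical congestion degree Thdoc deciding whether a standing queue has reached the detector — can be illustrated with a very small sketch. This is an interpretation for illustration only: the actual COSMOS algorithm, its parameters (s1, s2, Thdoc), and its queue-length formula are not reproduced here, and all names and numbers below are assumptions.

```python
# Illustrative only: flag a standing queue at an inductive loop detector when the
# degree of congestion (occupied time / cycle time) exceeds a critical threshold,
# in the spirit of the Thdoc parameter discussed above (0.2-0.3 instead of 0.7).

def degree_of_congestion(occupied_seconds: float, cycle_seconds: float) -> float:
    """Fraction of the signal cycle during which the loop detector was occupied."""
    return occupied_seconds / cycle_seconds

def queue_reaches_detector(occupied_seconds: float, cycle_seconds: float,
                           thdoc: float = 0.25) -> bool:
    """True if the queue is judged to extend back to the detector location."""
    return degree_of_congestion(occupied_seconds, cycle_seconds) >= thdoc

# Hypothetical cycle: detector occupied for 42 s of a 150 s signal cycle.
print(degree_of_congestion(42.0, 150.0))    # 0.28
print(queue_reaches_detector(42.0, 150.0))  # True with Thdoc = 0.25
```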


Reconstruction of Metabolic Pathway for the Chicken Genome (닭 특이 대사 경로 재확립)

  • Kim, Woon-Su;Lee, Se-Young;Park, Hye-Sun;Baik, Woon-Kee;Lee, Jun-Heon;Seo, Seong-Won
    • Korean Journal of Poultry Science
    • /
    • v.37 no.3
    • /
    • pp.275-282
    • /
    • 2010
  • The chicken is an important livestock species, valuable both as a biomedical model and as food for humans, and there is a strong rationale for improving our understanding of the metabolism and physiology of this organism. The first draft of the chicken genome assembly was released in 2004, enabling elaboration of the linkage between the genetic and metabolic traits of the chicken. The objectives of this study were thus to reconstruct the metabolic pathways of the chicken genome and to construct a chicken-specific pathway genome database (PGDB). We developed a comprehensive genome database for the chicken by integrating all the known annotations for chicken genes and proteins using a pipeline written in Perl. Based on these comprehensive genome annotations, the metabolic pathways of the chicken genome were reconstructed using the PathoLogic algorithm in the Pathway Tools software. We identified a total of 212 metabolic pathways, 2,709 enzymes, 71 transporters, 1,698 enzymatic reactions, 8 transport reactions, and 1,360 compounds in the current chicken genome build, Gallus_gallus-2.1. Comparative metabolic analysis with the human, mouse, and cattle genomes revealed that core metabolic pathways are highly conserved in the chicken genome. The results also indicated that the quality of the assembly and annotations of the chicken genome needs to be improved and that more research is required to improve our understanding of the function of genes and metabolic pathways in avian species. We conclude that the chicken PGDB is useful for studies on avian and chicken metabolism and provides a platform for comparative genomic and metabolic analysis in animal biology and biomedicine.
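
The database-building step described above (integrating existing gene and protein annotations before handing them to PathoLogic) is conceptually a merge keyed on gene identifiers. The authors' pipeline was written in Perl; the fragment below is only a small Python illustration of that merge step, with invented record fields and no connection to the actual Pathway Tools input format.

```python
from collections import defaultdict

# Hypothetical annotation records from different sources, keyed by gene ID.
source_a = {"gene_001": {"symbol": "enzA"}, "gene_002": {"symbol": "enzB"}}
source_b = {"gene_001": {"ec_number": "1.1.1.1"}, "gene_003": {"ec_number": "2.7.1.1"}}

def merge_annotations(*sources):
    """Combine per-gene annotation dictionaries into one record per gene."""
    merged = defaultdict(dict)
    for source in sources:
        for gene_id, fields in source.items():
            merged[gene_id].update(fields)
    return dict(merged)

for gene_id, record in sorted(merge_annotations(source_a, source_b).items()):
    print(gene_id, record)
```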

The Evaluation of Image Correction Methods for SPECT/CT in Various Radioisotopes with Different Energy Levels (SPECT/CT에서 서로 다른 에너지의 방사성동위원소 사용시 영상보정기법의 유용성 평가)

  • Shin, Byung Ho;Kim, Seung Jeong;Yun, Seok Hwan;Kim, Tae Yeop;Lim, Jung Jin;Woo, Jae Ryong;Oh, So Won;Kim, Yu Kyeong
    • The Korean Journal of Nuclear Medicine Technology
    • /
    • v.17 no.2
    • /
    • pp.53-58
    • /
    • 2013
  • Purpose: To optimize the correction method for SPECT/CT, image quality in terms of resolution and contrast was evaluated using three radioisotopes ($^{99m}Tc$, $^{201}Tl$, and $^{131}I$) and three different correction methods: attenuation correction (AC), scatter correction (SC), and combined attenuation and scatter correction (ACSC). Materials and Methods: Images were acquired with a SPECT/CT scanner and a conventional CT protocol, using an OSEM reconstruction algorithm (2 iterations and 10 subsets). For resolution measurement, a fixed radioactivity (2.22 kBq) was infused into a spatial resolution phantom, and the full width at half maximum (FWHM) was measured using vendor-provided software. For contrast evaluation, a radioactive source with a 1:8 ratio to the background was filled into a flanged Jaszczak phantom, and the percent contrast (%) was calculated. All image quality parameters were compared with the non-correction (NC) method. Results: Compared with NC, the image resolution of all three isotopes was significantly improved by AC and ACSC, but not by SC. In particular, ACSC showed better resolution than AC alone for $^{99m}Tc$ and $^{201}Tl$. The image contrast of all three radioisotopes in the largest-diameter sphere was enhanced by all correction methods. ACSC showed the highest contrast for all three radioisotopes and was most accurate for $^{99m}Tc$ (85.9%). Conclusion: The image quality of SPECT/CT was improved for all radioisotopes by the CT-based attenuation correction methods, but not by SC alone. SC failed to improve resolution for any radioisotope, but it was effective for contrast enhancement. ACSC would be the best correction method, as it improved both resolution and contrast in the radioisotopes with low energy levels. However, for the radioisotope with a high energy level, AC would be better than ACSC for resolution improvement.
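
To make the "percent contrast" figure of merit above concrete, the sketch below applies one common NEMA-style definition, normalized by the true sphere-to-background activity ratio (1:8 in the phantom setup described). The exact formula used by the study's software is not stated in the abstract, so this definition, the function names, and the ROI counts are assumptions.

```python
import numpy as np

def percent_contrast(sphere_roi, background_rois, true_ratio):
    """Percent contrast of a sphere relative to background.

    One common (NEMA-style) definition, assumed here:
        Q = (C_sphere / C_bkg - 1) / (R_true - 1) * 100
    where R_true is the true sphere-to-background activity ratio
    (e.g. 1/8 for the 1:8 filling described in the abstract).
    A perfectly recovered sphere yields 100%.
    """
    c_sphere = float(np.mean(sphere_roi))
    c_bkg = float(np.mean(np.concatenate([np.ravel(r) for r in background_rois])))
    return (c_sphere / c_bkg - 1.0) / (true_ratio - 1.0) * 100.0

# Hypothetical ROI means: background ~80 counts, sphere ~18 counts, 1:8 filling.
rng = np.random.default_rng(1)
sphere = rng.normal(18.0, 1.0, size=200)
bkg = [rng.normal(80.0, 2.0, size=500) for _ in range(4)]
print(round(percent_contrast(sphere, bkg, true_ratio=1 / 8), 1))
```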


The Behavior Analysis of Exhibition Visitors using Data Mining Technique at the KIDS & EDU EXPO for Children (유아교육 박람회에서 데이터마이닝 기법을 이용한 전시 관람 행동 패턴 분석)

  • Jung, Min-Kyu;Kim, Hyea-Kyeong;Choi, Il-Young;Lee, Kyoung-Jun;Kim, Jae-Kyeong
    • Journal of Intelligence and Information Systems
    • /
    • v.17 no.2
    • /
    • pp.77-96
    • /
    • 2011
  • An exhibition is defined as a market event of specific duration at which exhibitors present their main products to business or private visitors, and it plays a key role as an effective marketing channel. As exhibitions have become more and more important, the domestic exhibition industry has achieved great quantitative growth; in contrast, its qualitative growth has not kept pace. To improve the quality of exhibitions, we need to understand visitors' preferences and behavioral characteristics and, through that understanding, increase visitors' attention and satisfaction. In this paper, we therefore used the observation survey method, a type of field research, to understand visitors and to collect real data for behavior-pattern analysis. This research proposes a methodological framework consisting of three steps. The first step is to select a suitable exhibition to which our method can be applied. The second step is to carry out the observation survey method and collect the real data for further analysis. We conducted the observation survey at the KIDS & EDU EXPO for Children in SETEC, covering 160 visitors and 78 booths from November 4th to 6th, 2010. The last step is to analyze the data recorded through observation. In this step, we first analyze the features of the exhibition using the demographic characteristics collected by the observation survey, and then analyze individual booth features from the records of visited booths. Through the analysis of individual booth features, we can determine what kinds of events attract visitors' attention and what kinds of marketing activities affect visitors' behavior patterns. However, since previous research considered only individual features influenced by the exhibition, little work has been done on the correlations among features. In this research, additional analysis is therefore carried out with data mining techniques to supplement the existing research, and we analyze the relations among booths to identify visitors' behavior patterns. Among data mining techniques, we make use of two: clustering analysis and ARM (association rule mining). In the clustering analysis, we use the K-means algorithm to examine the correlations among booths. Through these data mining techniques, we find that two features importantly affect visitors' behavior patterns in an exhibition: the geographical features of the booths and the exhibit contents of the booths. These features should be considered when the organizer plans the next exhibition. Therefore, the results of our analysis are expected to provide guidelines for understanding visitors and valuable insights for the exhibition from the earliest phases of exhibition planning, and this research offers a good way to increase visitor satisfaction: visitors' movement paths, booth locations, and the distances between booths can be considered in advance when planning the next exhibition. This research was conducted at the KIDS & EDU EXPO for Children in SETEC (Seoul Trade Exhibition & Convention), so there are constraints on applying it directly to other exhibitions, and the results were derived from a limited number of data samples. To obtain more accurate and reliable results, it is necessary to conduct more experiments based on larger data samples and on exhibitions of a variety of genres.
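
Since the booth analysis above relies on K-means clustering (alongside association rule mining), the sketch below shows how booths described by simple visit features could be grouped with scikit-learn's KMeans. The feature names and numbers are invented for illustration; they are not the study's observation data.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-booth features: [visit count, mean dwell time (min),
# distance from entrance (m)] -- stand-ins for the observation-survey records.
booth_features = np.array([
    [120, 3.5,  5.0],
    [ 95, 2.8,  8.0],
    [ 40, 1.2, 35.0],
    [ 35, 1.0, 40.0],
    [110, 4.1,  7.5],
    [ 30, 0.9, 42.0],
], dtype=float)

# Standardize so that count, time, and distance contribute on comparable scales.
mean, std = booth_features.mean(axis=0), booth_features.std(axis=0)
scaled = (booth_features - mean) / std

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(scaled)
for booth_id, label in enumerate(kmeans.labels_):
    print(f"booth {booth_id}: cluster {label}")
```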

Dual Codec Based Joint Bit Rate Control Scheme for Terrestrial Stereoscopic 3DTV Broadcast (지상파 스테레오스코픽 3DTV 방송을 위한 이종 부호화기 기반 합동 비트율 제어 연구)

  • Chang, Yong-Jun;Kim, Mun-Churl
    • Journal of Broadcast Engineering
    • /
    • v.16 no.2
    • /
    • pp.216-225
    • /
    • 2011
  • Following the proliferation of three-dimensional video content and displays, many terrestrial broadcasting companies have been preparing for stereoscopic 3DTV service. In terrestrial stereoscopic broadcasting, it is difficult to code and transmit two video sequences while sustaining quality as high as that of 2DTV broadcasting, owing to the limited bandwidth defined by existing digital TV standards such as ATSC. Thus, terrestrial 3DTV broadcasting with a heterogeneous video codec system, in which the left and right images are coded with MPEG-2 and H.264/AVC, respectively, is considered in order to achieve both a high-quality broadcasting service and compatibility with existing 2DTV viewers. Without significant changes to the current terrestrial broadcasting systems, we propose a joint rate control scheme for stereoscopic 3DTV service based on this heterogeneous dual-codec system. The proposed joint rate control scheme applies to the MPEG-2 encoder the quadratic rate-quantization model adopted in H.264/AVC. The controller is then designed so that the sum of the left and right bitstreams meets the bandwidth requirement of the broadcasting standards while the sum of the image distortions is minimized by adjusting the quantization parameters obtained from the proposed optimization scheme. In addition, the optimization includes a condition that keeps the quality difference between the left and right images around a desired level in order to mitigate negative effects on the human visual system. Experimental results demonstrate that the proposed bit rate control scheme outperforms the rate control method in which each video coding standard uses its own bit rate control algorithm independently, increasing PSNR by 2.02%, decreasing the average absolute quality difference by 77.6%, and reducing the variance of the quality difference by 74.38%.
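
The abstract above states that the MPEG-2 encoder is given the quadratic rate-quantization model used by H.264/AVC-style rate control, roughly R(Q) = x1·MAD/Q + x2·MAD/Q², and that quantization parameters are chosen so the combined left-plus-right bit rate meets the channel budget. The sketch below only solves that quadratic model for the quantization step given a per-frame bit target; the joint left/right allocation and the quality-difference constraint of the actual paper are not reproduced, and the model coefficients and numbers are invented.

```python
import math

def q_step_from_target_bits(target_bits, mad, x1, x2):
    """Solve the quadratic rate-quantization model for the quantization step Q.

    Model (as used in H.264/AVC-style rate control, assumed here):
        R(Q) = x1 * MAD / Q + x2 * MAD / Q^2
    Rearranged: target * Q^2 - x1*MAD*Q - x2*MAD = 0  ->  take the positive root.
    """
    a, b, c = target_bits, -x1 * mad, -x2 * mad
    disc = b * b - 4.0 * a * c
    return (-b + math.sqrt(disc)) / (2.0 * a)

# Hypothetical numbers: 600 kbit frame budget, MAD = 4.0, fitted x1 and x2.
q = q_step_from_target_bits(target_bits=600_000, mad=4.0, x1=80_000.0, x2=300_000.0)
print(round(q, 3))
# Sanity check: plugging Q back into the model should reproduce ~600 kbit.
print(round(80_000.0 * 4.0 / q + 300_000.0 * 4.0 / q**2))
```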

Predictive Clustering-based Collaborative Filtering Technique for Performance-Stability of Recommendation System (추천 시스템의 성능 안정성을 위한 예측적 군집화 기반 협업 필터링 기법)

  • Lee, O-Joun;You, Eun-Soon
    • Journal of Intelligence and Information Systems
    • /
    • v.21 no.1
    • /
    • pp.119-142
    • /
    • 2015
  • With the explosive growth in the volume of information, Internet users experience considerable difficulty in obtaining the information they need online. Against this backdrop, ever-greater importance is being placed on recommender systems that provide information catered to user preferences and tastes in an attempt to address information overload. To this end, a number of techniques have been proposed, including content-based filtering (CBF), demographic filtering (DF), and collaborative filtering (CF). Among them, CBF and DF require external information and thus cannot be applied to a wide variety of domains. CF, on the other hand, is widely used since it is relatively free from this domain constraint. The CF technique is broadly classified into memory-based CF, model-based CF, and hybrid CF. Model-based CF addresses the drawbacks of CF by considering a Bayesian model, clustering model, or dependency network model. This filtering technique not only alleviates the sparsity and scalability issues but also boosts predictive performance. However, it involves expensive model building and results in a tradeoff between performance and scalability. This tradeoff is attributed to reduced coverage, which is a type of sparsity issue. In addition, expensive model building may lead to performance instability, since changes in the domain environment cannot be immediately incorporated into the model because of the high costs involved; cumulative changes in the domain environment that fail to be reflected eventually undermine system performance. This study incorporates a Markov model of transition probabilities and the concept of fuzzy clustering into CBCF to propose predictive clustering-based CF (PCCF), which addresses the issues of reduced coverage and unstable performance. The method mitigates performance instability by tracking changes in user preferences and bridging the gap between the static model and dynamic users. Furthermore, the issue of reduced coverage is addressed by expanding the coverage based on transition probabilities and clustering probabilities. The proposed method consists of four processes. First, user preferences are normalized in preference clustering. Second, changes in user preferences are detected from review score entries during preference transition detection. Third, user propensities are normalized using the patterns of change (propensities) in user preferences in propensity clustering. Lastly, a preference prediction model is developed to predict user preferences for items during preference prediction. The proposed method was validated by testing its robustness against performance instability and the scalability-performance tradeoff. The initial test compared and analyzed the performance of individual recommender systems enabled by IBCF, CBCF, ICFEC, and PCCF in an environment where data sparsity had been minimized. The following test adjusted the optimal number of clusters in CBCF, ICFEC, and PCCF for a comparative analysis of the subsequent changes in system performance. The test results revealed that the proposed method produced an insignificant improvement in performance compared with the existing techniques, and it failed to achieve a significant improvement in the standard deviation, which indicates the degree of data fluctuation. Nevertheless, it resulted in a marked improvement over the existing techniques in terms of range, which indicates the level of performance fluctuation. The level of performance fluctuation before and after model generation improved by 51.31% in the initial test, and in the following test there was a 36.05% improvement in the level of performance fluctuation driven by the changes in the number of clusters. This signifies that the proposed method, despite its slight performance improvement, clearly offers better performance stability than the existing techniques. Further research will be directed toward enhancing the recommendation performance, which failed to demonstrate significant improvement over the existing techniques, and will consider introducing a high-dimensional parameter-free clustering algorithm or a deep-learning-based model to improve recommendation performance.
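
As a rough illustration of the two ingredients named in this abstract — soft cluster membership probabilities and Markov transition probabilities between preference clusters — the sketch below predicts a user's next-period affinity to each cluster by propagating memberships through a transition matrix and then weighting per-cluster item ratings. It is not the authors' PCCF pipeline; the matrices, the membership vector, and all names are assumptions.

```python
import numpy as np

# Hypothetical preference clusters (e.g. genres a user's ratings gravitate toward).
clusters = ["action", "drama", "documentary"]

# Soft (fuzzy) membership of one user in each cluster, summing to 1.
membership = np.array([0.6, 0.3, 0.1])

# Markov transition probabilities between clusters, estimated elsewhere from
# how users' review scores shifted between periods (each row sums to 1).
transition = np.array([
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.2, 0.3, 0.5],
])

# Mean item ratings given by each cluster's members for three candidate items.
cluster_item_ratings = np.array([
    [4.5, 2.0, 3.0],   # action cluster
    [2.5, 4.2, 3.5],   # drama cluster
    [3.0, 3.5, 4.8],   # documentary cluster
])

# Predicted membership for the next period, then cluster-weighted rating predictions.
next_membership = membership @ transition
predicted_ratings = next_membership @ cluster_item_ratings

print("next-period membership:", np.round(next_membership, 3))
print("predicted ratings:", np.round(predicted_ratings, 2))
```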