• Title/Summary/Keyword: MAR algorithm

Search Results: 38

Mission planning and performance verification of an unmanned surface vehicle using a genetic algorithm

  • Park, Jihoon;Kim, Sukkeun;Noh, Geemoon;Kim, Hyeongmin;Lee, Daewoo;Lee, Inwon
    • International Journal of Naval Architecture and Ocean Engineering / v.13 no.1 / pp.575-584 / 2021
  • This study describes the process of developing a Mission Planning System (MPS) for a USV that can be applied in real situations and of verifying it through hardware-in-the-loop simulation (HILS). In this study, we set the scenario of a single USV with a limited operating time. Since the USV may not be able to perform some missions within the limited operating time, an objective function was defined to maximize the Mission Achievement Rate (MAR). We used a genetic algorithm to solve the problem model and proposed a method using a 3-D population. The simulation showed that the probability of the mission planning algorithm deriving the global optimal solution was 96.6% and that the computation time was 1.6 s. Furthermore, the USV was shown to perform missions according to the results of the MPS. We expect that the MPS developed in this study can be applied to real environments where a USV performs missions under limited time conditions.
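
A minimal, hypothetical sketch of how a genetic algorithm can maximize a MAR-style objective under a limited operating time (the mission durations, values, and GA parameters below are illustrative assumptions, not the paper's MPS or its 3-D population scheme):

```python
# Hypothetical sketch: a genetic algorithm that selects missions to maximize a
# mission-achievement-rate-style objective under a limited operating time.
# Mission data and GA parameters are illustrative, not from the paper.
import random

MISSIONS = [  # (duration in minutes, value)
    (30, 5), (45, 8), (20, 3), (60, 10), (25, 4), (50, 7), (40, 6), (15, 2)
]
TIME_LIMIT = 150
TOTAL_VALUE = sum(v for _, v in MISSIONS)

def fitness(chromosome):
    """MAR-like objective: fraction of total mission value achieved,
    zeroed out when the selected missions exceed the operating time."""
    time = sum(d for gene, (d, _) in zip(chromosome, MISSIONS) if gene)
    value = sum(v for gene, (_, v) in zip(chromosome, MISSIONS) if gene)
    return value / TOTAL_VALUE if time <= TIME_LIMIT else 0.0

def evolve(pop_size=60, generations=200, p_mut=0.05, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in MISSIONS] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # tournament selection
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, len(MISSIONS))               # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [g ^ (rng.random() < p_mut) for g in child]  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    best = max(pop, key=fitness)
    return best, fitness(best)

if __name__ == "__main__":
    plan, mar = evolve()
    print("selected missions:", [i for i, g in enumerate(plan) if g], "MAR =", round(mar, 3))
```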

A Parallel Algorithm for Merging Relaxed Min-Max Heaps (Relaxed min-max 힙을 병합하는 병렬 알고리즘)

  • Min, Yong-Sik
    • The Transactions of the Korea Information Processing Society / v.5 no.5 / pp.1162-1171 / 1998
  • This paper presents a data structure that implements a mergeable double-ended priority queue: an improved relaxed min-max-pair heap. By means of this new data structure, we suggest a parallel algorithm to merge priority queues organized in two relaxed heaps of different sizes, n and k, respectively. This new data structure eliminates the blossomed tree and the lazying method used to merge the relaxed min-max heaps in [9]. As a result, employing $\max(2^{i-1}, \lceil (m+1)/4 \rceil)$ processors, this algorithm requires $O(\log(\log(n/k)) \times \log n)$ time. Also, on the MasPar machine, this method achieves a 35.205-fold speedup with 64 processors when merging 8 million data items consisting of two relaxed min-max heaps of different sizes.
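
As a hedged illustration of the underlying data structure only (the paper's parallel merge is not reproduced), a min-max heap stores elements in an array whose even-depth levels obey a min ordering and odd-depth levels a max ordering, which is what makes both ends of the priority queue cheap to access; a toy invariant check might look like this:

```python
# Hypothetical sketch: the min-max heap invariant behind a double-ended
# priority queue. Even-depth levels are "min" levels (every node <= all of its
# descendants); odd-depth levels are "max" levels (every node >= descendants).
# Merging two such heaps efficiently is the part the paper parallelizes.
def level(i):
    """Depth of 1-based array index i in a complete binary tree."""
    return i.bit_length() - 1

def is_min_max_heap(a):
    """Check the invariant for a 1-based array a (a[0] unused)."""
    n = len(a) - 1
    for i in range(1, n + 1):
        for child in (2 * i, 2 * i + 1):
            if child > n:
                continue
            # Checking children and grandchildren locally is sufficient.
            for d in (child, 2 * child, 2 * child + 1):
                if d > n:
                    continue
                if level(i) % 2 == 0 and a[i] > a[d]:   # min level violated
                    return False
                if level(i) % 2 == 1 and a[i] < a[d]:   # max level violated
                    return False
    return True

if __name__ == "__main__":
    # A valid min-max heap (index 0 unused): min levels hold {1, 10..40},
    # max levels hold {90, 80, 50..70}.
    print(is_min_max_heap([None, 1, 90, 80, 10, 20, 30, 40, 50, 60, 70]))  # True
    print(is_min_max_heap([None, 5, 1, 9]))  # False: 5 on a min level exceeds child 1
```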


Multiple imputation for competing risks survival data via pseudo-observations

  • Han, Seungbong;Andrei, Adin-Cristian;Tsui, Kam-Wah
    • Communications for Statistical Applications and Methods / v.25 no.4 / pp.385-396 / 2018
  • Competing risks are commonly encountered in biomedical research. Regression models for competing risks data can be developed based on data routinely collected in hospitals or general practices. However, these data sets usually contain the covariate missing values. To overcome this problem, multiple imputation is often used to fit regression models under a MAR assumption. Here, we introduce a multivariate imputation in a chained equations algorithm to deal with competing risks survival data. Using pseudo-observations, we make use of the available outcome information by accommodating the competing risk structure. Lastly, we illustrate the practical advantages of our approach using simulations and two data examples from a coronary artery disease data and hepatocellular carcinoma data.
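
A hedged sketch of the general chained-equations (MICE) idea the abstract builds on, using scikit-learn's IterativeImputer; the paper's pseudo-observation step for the competing-risks outcome is not reproduced, and the covariate matrix and missingness mechanism below are illustrative assumptions:

```python
# Hypothetical sketch: multiple imputation by chained equations (MICE) under a
# MAR assumption, using scikit-learn's IterativeImputer. X below is just a
# covariate matrix with values missing at random.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
n = 200
X_full = rng.normal(size=(n, 3))
X_full[:, 2] += 0.8 * X_full[:, 0]          # correlated covariates help imputation

# Make column 2 missing at random: missingness depends only on observed column 0.
X = X_full.copy()
miss = rng.random(n) < 1 / (1 + np.exp(-X_full[:, 0]))
X[miss, 2] = np.nan

# Draw several imputed data sets (different seeds), as in multiple imputation;
# downstream regression models would be fit on each and the estimates pooled
# (e.g., with Rubin's rules).
imputations = [
    IterativeImputer(sample_posterior=True, random_state=m).fit_transform(X)
    for m in range(5)
]
print("mean of imputed column across 5 data sets:",
      [round(float(imp[:, 2].mean()), 3) for imp in imputations])
```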

A Study on an Inductive Motion Edit Methodology using a Uniform Posture Map (균등 자세 지도를 이용한 귀납적 동작 편집 기법에 관한 연구)

  • 이범로;정진현
    • The Journal of Korean Institute of Communications and Information Sciences / v.28 no.2C / pp.162-171 / 2003
  • It is difficult to reuse captured motion data because the data is hard to edit. In this paper, a uniform posture map (UPM) algorithm, an unsupervised learning neural network, is proposed to edit captured motion data. Because it requires much less computation than other motion editing algorithms, it is suitable for real-time applications. The UPM algorithm prevents unrealistic postures from being generated during the learning phase; it not only produces more realistic motion curves, but also contributes to making more natural motions. Above all, it complements the weakness of existing algorithms, whose computational cost increases in proportion to the number of constraints when solving problems of highly articulated bodies. In this paper, two applications are shown as visible instances of the UPM algorithm: a motion transition editing system and an inductive inverse kinematics system. This method can be applied to key-frame-based 3D character animation, 3D games, virtual reality, and so on.
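
The UPM is described only as an unsupervised learning neural network; as a hedged illustration, a self-organizing-map-style "posture map" can be sketched as below. The grid size, learning schedule, and random joint-angle data are assumptions, and this is not the paper's UPM formulation:

```python
# Hypothetical sketch: a tiny self-organizing-map-style posture map that
# clusters joint-angle vectors onto a 2-D grid of prototype postures.
import numpy as np

def train_posture_map(postures, grid=(8, 8), epochs=20, seed=0):
    rng = np.random.default_rng(seed)
    h, w = grid
    dim = postures.shape[1]
    weights = rng.normal(size=(h, w, dim))          # prototype postures
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), -1)
    for epoch in range(epochs):
        lr = 0.5 * (1 - epoch / epochs) + 0.01      # decaying learning rate
        sigma = max(1.0, (max(h, w) / 2) * (1 - epoch / epochs))
        for x in rng.permutation(postures):
            # Best-matching unit: the prototype closest to this posture.
            d = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(d), (h, w))
            # Pull the BMU and its grid neighbours toward the posture.
            g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=2) / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
    return weights

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    fake_postures = rng.uniform(-np.pi, np.pi, size=(500, 15))  # 15 joint angles
    upm = train_posture_map(fake_postures)
    print("prototype grid shape:", upm.shape)       # (8, 8, 15)
```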

Source Identification of PM-10 in Suwon Using the Method of Positive Matrix Factorization (PMF 방법론을 이용한 수원지역 PM-10의 오염원 확인)

  • 황인조;김태오;김동술
    • Journal of Korean Society for Atmospheric Environment / v.17 no.2 / pp.133-145 / 2001
  • Receptor modeling is one of the statistical methods used to achieve reasonable air pollution control strategies. The purpose of this study was to survey the concentration variability of inorganic elements and ionic species in PM-10 particles and to qualitatively characterize emission sources using an advanced algorithm called positive matrix factorization (PMF), a receptor model that can strictly provide results in every loading matrix. A total of 254 samples were collected with a PM-10 high-volume air sampler from Mar. 1997 to Feb. 1998 at Kyung Hee University, Suwon Campus. Fourteen chemical species (Zn, Cu, Fe, Pb, Al, Mn, $Na^{+}$, $NH_4^{+}$, $K^{+}$, $Mg^{2+}$, $Ca^{2+}$, $SO_4^{2-}$, $NO_3^{-}$, and $Cl^{-}$) were determined by AAS and IC methods. The results showed that the average monthly concentrations of PM-10 particles were 86.3 ${\mu}g/m^3$ in March (maximum) and 28.5 ${\mu}g/m^3$ in August (minimum). The concentrations of $Na^{+}$, $NH_4^{+}$, $K^{+}$ and $Cl^{-}$ in winter, $Mg^{2+}$, $Ca^{2+}$ and $NO_3^{-}$ in spring, and $SO_4^{2-}$ in summer showed the largest peak concentrations for the respective seasons. Through application of the PMF program to the PM-10 concentration data from Suwon, nine sources were qualitatively identified, such as an incineration source, an oil burning source, a soil-related source, an open burning source, an automobile source, a coal burning source, a secondary sulfate-related source, and a secondary nitrate-related source.
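
A hedged sketch of the factorization idea behind PMF, approximated here with plain non-negative matrix factorization; true PMF additionally minimizes an uncertainty-weighted objective, which is not shown, and the concentration matrix below is a synthetic placeholder rather than the Suwon data:

```python
# Hypothetical sketch: approximating receptor-model source apportionment with
# non-negative matrix factorization (X ≈ G·F, all entries non-negative).
import numpy as np
from sklearn.decomposition import NMF

species = ["Zn", "Cu", "Fe", "Pb", "Al", "Mn", "Na+", "NH4+", "K+",
           "Mg2+", "Ca2+", "SO4 2-", "NO3-", "Cl-"]
rng = np.random.default_rng(0)
X = rng.gamma(shape=2.0, scale=1.0, size=(254, len(species)))  # placeholder for 254 samples

n_sources = 9
model = NMF(n_components=n_sources, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)        # source contributions: samples x sources
F = model.components_             # source profiles:      sources x species

# Inspect which species dominate each factor to label it (soil, sea salt, ...).
for k, profile in enumerate(F):
    top = np.argsort(profile)[::-1][:3]
    print(f"source {k + 1}: " + ", ".join(species[i] for i in top))
```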


Reconstruction of Necrosis Following Total Knee Replacement Arthroplasty (슬관절 전치환술 후 발생한 피부 괴사부의 재건)

  • Ahn, Hee Chang;Lim, Young Soo;Kim, Chang Yeon;Hwang, Weon Joong
    • Archives of Plastic Surgery / v.32 no.1 / pp.93-99 / 2005
  • In spite of a properly performed total knee replacement arthroplasty, some patients suffer from skin necrosis just above the implant. From Mar. 2000 to Jan. 2004, the authors performed reconstruction of knee skin defects after total knee replacement arthroplasty. A total of 6 flap surgeries were performed in patients ranging from 43 to 82 years old. Rectus femoris perforator based reversed adipofascial flaps were used in 2 cases, medial gastrocnemius muscular island flaps in 2 cases, and a sural artery based adipofascial rotation flap in 1 case. One patient with extended necrosis underwent reconstruction with dual flaps: a sural artery based adipofascial rotation flap and a medial gastrocnemius muscular island flap. There were no distinctive complications requiring additional procedures in any case during long-term follow-up. Reconstruction of necrosis following total knee replacement arthroplasty has several characteristics that differ from those of a simple knee defect. The patients may have a history of long-term steroid use, excessive skin tension due to the implant, and underlying diseases such as diabetes and rheumatoid disease. In addition, early ambulation is mandatory in these total knee replacement arthroplasty patients. With regard to these special considerations, a single-stage, reliable operation is needed. The authors introduce various reconstruction methods and an algorithm that may aid decision making.

Development of a C++ Compiler and Programming Environment (C++컴파일러 및 프로그래밍 환경 개발)

  • Jang, Cheon-Hyeon;O, Se-Man
    • The Transactions of the Korea Information Processing Society / v.4 no.3 / pp.831-845 / 1997
  • In this paper, we propose and develop a compiler and interactive programming environment for C++, one of the most noteworthy object-oriented languages. To develop the C++ compiler, we adopted a front-end/back-end model using the EM virtual machine. In developing the front-end, we formalized the C++ grammar with the context-sensitive tokens that must be handled by the lexical scanner, and designed an AST class library consisting of a hierarchy of AST node classes with well-defined interfaces among them. In developing the back-end, we proposed a model for three major components: the code optimizer, the code generator, and the run-time environment. We emphasized a retargetable back-end that can be systematically reconfigured to generate code for a variety of distinct target computers. We also developed a tree pattern matching algorithm and implemented a target code generator that produces SPARC code. We further proposed a theory and model for constructing interactive programming environments. To represent language features, we adopt the AST as the internal representation and propose an incremental analysis algorithm and visual diagrams. We also studied unparsing schemes, visual diagrams, and graphical user interfaces to generate interactive environments automatically. The results of our research will be very useful for developing compilers and programming environments, and can also be used in compilers for parallel and distributed environments.
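
As a hedged illustration of the general tree-pattern-matching approach to instruction selection mentioned above (not the paper's EM-based back-end; the node kinds and SPARC-like mnemonics are assumptions):

```python
# Hypothetical sketch: maximal-munch tree pattern matching for instruction
# selection over a toy expression AST, emitting SPARC-like pseudo-assembly.
from dataclasses import dataclass, field
from itertools import count

@dataclass
class Node:
    op: str                           # "CONST", "REG", "ADD", "MUL"
    kids: list = field(default_factory=list)
    value: object = None              # literal for CONST, register name for REG

_temps = count()

def new_temp():
    return f"%t{next(_temps)}"        # fresh pseudo-register

def select(node, code):
    """Return the register holding node's value, appending instructions to code."""
    if node.op == "REG":
        return node.value
    if node.op == "CONST":
        dst = new_temp()
        code.append(f"mov   {node.value}, {dst}")
        return dst
    if node.op in ("ADD", "MUL"):
        lhs = select(node.kids[0], code)
        # Largest pattern first: fold a constant right operand into the
        # instruction as an immediate instead of loading it separately.
        if node.kids[1].op == "CONST":
            rhs = node.kids[1].value
        else:
            rhs = select(node.kids[1], code)
        dst = new_temp()
        code.append(f"{'add' if node.op == 'ADD' else 'smul'}  {lhs}, {rhs}, {dst}")
        return dst
    raise ValueError(f"no pattern matches node {node.op}")

if __name__ == "__main__":
    # (a + 4) * b, with a in %i0 and b in %i1
    tree = Node("MUL", [Node("ADD", [Node("REG", value="%i0"),
                                     Node("CONST", value=4)]),
                        Node("REG", value="%i1")])
    code = []
    result = select(tree, code)
    print("\n".join(code) + f"\n; result in {result}")
```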


Effects of Iterative Reconstruction Algorithm, Automatic Exposure Control on Image Quality, and Radiation Dose: Phantom Experiments with Coronary CT Angiography Protocols (반복적 재구성 알고리즘과 관전류 자동 노출 조정 기법의 CT 영상 화질과 선량에 미치는 영향: 관상동맥 CT 조영 영상 프로토콜 기반의 팬텀 실험)

  • Ha, Seongmin;Jung, Sunghee;Chang, Hyuk-Jae;Park, Eun-Ah;Shim, Hackjoon
    • Progress in Medical Physics / v.26 no.1 / pp.28-35 / 2015
  • In this study, we investigated the effects of an iterative reconstruction algorithm and an automatic exposure control (AEC) technique on image quality and radiation dose through phantom experiments with coronary computed tomography (CT) angiography protocols. We scanned the AAPM CT performance phantom using a 320 multi-detector-row CT scanner. At tube voltages of 80, 100, and 120 kVp, the scanning was repeated with two settings of the AEC technique, i.e., with target standard deviation (SD) values of 33 (the higher tube current) and 44 (the lower tube current). The scanned projection data were also reconstructed in two ways: with filtered back projection (FBP) and with an iterative reconstruction technique (AIDR-3D). Image quality was evaluated quantitatively with the noise standard deviation, the modulation transfer function, and the contrast-to-noise ratio (CNR). More specifically, we analyzed the influence of the selection of tube voltage and reconstruction algorithm on tube current modulation and consequently on radiation dose. The reduction of image noise by the iterative reconstruction algorithm compared with FBP was especially evident with the lower tube current protocols: noise decreased by 46% and 38% when the AEC was set to the lower dose (target SD = 44) and the higher dose (target SD = 33), respectively. As a side effect of iterative reconstruction, the spatial resolution decreased, but only to a degree that could not mar the remarkable gains in noise reduction. Consequently, if coronary CT angiography is scanned and reconstructed using both automatic exposure control and iterative reconstruction techniques, it is anticipated that, in comparison with a conventional acquisition method, image noise can be reduced significantly with only a slight decrease in spatial resolution, implying the clinical advantage of radiation dose reduction while remaining faithful to the ALARA principle.
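
A hedged sketch of the kind of ROI-based noise and CNR measurements described above; the synthetic images, ROI positions, and noise levels are placeholders rather than the phantom data or the study's measured values:

```python
# Hypothetical sketch: computing noise standard deviation and contrast-to-noise
# ratio (CNR) from regions of interest, the kind of metrics compared between
# FBP and iterative reconstructions.
import numpy as np

def roi_stats(image, center, half):
    """Mean and SD inside a square ROI of half-width `half` pixels."""
    y, x = center
    roi = image[y - half:y + half, x - half:x + half]
    return float(roi.mean()), float(roi.std())

def cnr(image, target_center, background_center, half=10):
    mu_t, _ = roi_stats(image, target_center, half)
    mu_b, sd_b = roi_stats(image, background_center, half)
    return abs(mu_t - mu_b) / sd_b

rng = np.random.default_rng(0)
# Synthetic "FBP-like" (noisier) and "iterative-like" (smoother) images of a
# 100 HU insert on a 0 HU background.
base = np.zeros((256, 256))
base[100:140, 100:140] = 100.0
fbp_like = base + rng.normal(scale=30.0, size=base.shape)
iterative_like = base + rng.normal(scale=18.0, size=base.shape)

for name, img in [("FBP-like", fbp_like), ("iterative-like", iterative_like)]:
    _, noise = roi_stats(img, (200, 200), 10)
    print(f"{name}: noise SD = {noise:.1f} HU, CNR = {cnr(img, (120, 120), (200, 200)):.2f}")
```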