• Title/Summary/Keyword: histogram data

Search results: 488

Design of Optimized pRBFNNs-based Face Recognition Algorithm Using Two-dimensional Image and ASM Algorithm (최적 pRBFNNs 패턴분류기 기반 2차원 영상과 ASM 알고리즘을 이용한 얼굴인식 알고리즘 설계)

  • Oh, Sung-Kwun;Ma, Chang-Min;Yoo, Sung-Hoon
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.21 no.6
    • /
    • pp.749-754
    • /
    • 2011
  • In this study, we propose the design of an optimized pRBFNNs-based face recognition system using two-dimensional images and the ASM algorithm. Existing two-dimensional face recognition methods are usually affected by scale changes, position variation, and the background of the image. In this paper, the face region information obtained from the detected face region is used to compensate for these defects. We use a CCD camera to obtain picture frames directly. Histogram equalization partially enhances images distorted by natural as well as artificial illumination. The AdaBoost algorithm is used to separate face regions from non-face regions. A personal profile is built by extracting both the face contour and shape using the ASM (Active Shape Model), and the dimensionality of the image data is then reduced using PCA. The proposed pRBFNNs consist of three functional modules: the condition part, the conclusion part, and the inference part. In the condition part of the fuzzy rules, the input space is partitioned with Fuzzy C-Means clustering. In the conclusion part of the rules, the connection weights of the RBFNNs are represented as three kinds of polynomials: constant, linear, and quadratic. The essential design parameters of the networks (including learning rate, momentum coefficient, and fuzzification coefficient) are optimized by means of Differential Evolution. The proposed pRBFNNs are applied to a real-time face image database and evaluated in terms of output performance and recognition rate.
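
The histogram-equalization preprocessing step described in this abstract can be sketched as follows. This is a minimal NumPy illustration of the classic CDF-remapping transform, not the authors' implementation; the function name and the test image are hypothetical.

```python
import numpy as np

def equalize_histogram(image):
    """Remap gray levels through the normalized cumulative histogram."""
    hist = np.bincount(image.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # first nonzero CDF value
    # Classic equalization transform: rescale the CDF to span [0, 255].
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255),
                  0, 255).astype(np.uint8)
    return lut[image]

# Hypothetical dark, low-contrast frame: values clustered in [40, 80].
rng = np.random.default_rng(0)
img = rng.integers(40, 81, size=(64, 64), dtype=np.uint8)
eq = equalize_histogram(img)
```

After equalization the clustered gray levels are stretched over the full 8-bit range, which is what partially compensates for uneven illumination.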

A Randomized Controlled Trial about the Levels of Radiation Exposure Depends on the Use of Collimation C-arm Fluoroscopic-guided Medial Branch Block

  • Baek, Seung Woo;Ryu, Jae Sung;Jung, Cheol Hee;Lee, Joo Han;Kwon, Won Kyoung;Woo, Nam Sik;Kim, Hae Kyoung;Kim, Jae Hun
    • The Korean Journal of Pain
    • /
    • v.26 no.2
    • /
    • pp.148-153
    • /
    • 2013
  • Background: The C-arm fluoroscope has been widely used to promote more effective pain management; however, unwanted radiation exposure for operators is inevitable. We prospectively investigated the differences in radiation exposure related to collimation in medial branch block (MBB). Methods: This study was a randomized controlled trial of 62 MBBs at L3, 4 and 5. After the patient was laid in the prone position on the operating table, MBB was conducted and only AP projections of the fluoroscope were used. Based on a concealed random number table, MBB was performed with (collimation group) and without (control group) collimation. The data on the patient's age, height, gender, laterality (right/left), radiation absorbed dose (RAD), exposure time, distance from the center of the field to the operator, and effective dose (ED) at the side of the table and at the operator's chest were collected. The brightness of the fluoroscopic image was evaluated with the histogram in Photoshop. Results: There were no significant differences in age, height, weight, male to female ratio, laterality, time, distance, and brightness of the fluoroscopic image. The area of the fluoroscopic image with collimation was 67% of the conventional image. The RAD (29.9 ± 13.0, P = 0.001) and the ED at the left chest of the operators (0.53 ± 0.71, P = 0.042) and beside the table (5.69 ± 4.6, P = 0.025) in the collimation group were lower than those of the control group (44.6 ± 19.0, 0.97 ± 0.92, and 9.53 ± 8.16, respectively). Conclusions: Collimation reduced radiation exposure and maintained the image quality. Therefore, the proper use of collimation will be beneficial to both patients and operators.
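
The brightness comparison the authors performed with Photoshop's histogram panel amounts to comparing the mean gray level computed from each image's intensity histogram. A small NumPy sketch on synthetic stand-in frames (all names and data hypothetical):

```python
import numpy as np

def mean_brightness_from_histogram(image):
    """Mean gray level computed from the intensity histogram,
    i.e. the kind of mean value a histogram panel reports."""
    hist = np.bincount(image.ravel(), minlength=256)
    return (hist * np.arange(256)).sum() / hist.sum()

# Synthetic 8-bit frames standing in for the fluoroscopic images
# with and without collimation (hypothetical data).
rng = np.random.default_rng(1)
full_field = rng.normal(120, 10, (100, 100)).clip(0, 255).astype(np.uint8)
collimated = rng.normal(119, 10, (100, 100)).clip(0, 255).astype(np.uint8)

brightness_gap = abs(mean_brightness_from_histogram(full_field)
                     - mean_brightness_from_histogram(collimated))
```

A small `brightness_gap` corresponds to the study's finding that collimation did not significantly change image brightness.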

Detection of The Real-time Weather Information from a Vehicle Black Box (차량용 블랙박스 영상에서의 실시간 기상정보 검지)

  • Kang, Ju-mi;Lee, Jaesung
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2014.10a
    • /
    • pp.320-323
    • /
    • 2014
  • Today, with the advancement of intelligent transportation systems and the popularization of the vehicle black box, safe and convenient services can be provided through mobile devices in the traffic environment. Traffic flow constantly changes due to a variety of causes; drivers are often unprepared for external factors beyond their control, which can lead to major accidents. To prevent this, a system is needed that delivers real-time weather information to drivers. This paper proposes an algorithm that detects and delivers real-time weather information. The weather condition is detected using the motion of the wiper and the contrast difference, obtained from the histogram, relative to a clear day. In general, the wiper operates in bad weather, so frames will have different contrast values due to rain or snow. Clear, snowy, and rainy situations were considered. First, an ROI (Region Of Interest) covering the minimum area in which the wiper can be detected is designated to reduce the amount of computation, and the wiper is detected by thresholding the brightness within the ROI during its operation. In addition, the contrast value is used to distinguish each weather situation. Detection rates of about 80% were obtained for the snowy and rainy situations.
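
The ROI-thresholding step for wiper detection can be sketched as below. This is a hypothetical NumPy illustration of brightness thresholding inside a fixed ROI, not the authors' code; the dark level and dark-fraction cutoff are assumed values.

```python
import numpy as np

def detect_wiper(frame, roi, dark_level=60, min_dark_fraction=0.3):
    """Return True when the dark wiper blade covers a large fraction
    of the ROI in a grayscale frame (assumed thresholds)."""
    y0, y1, x0, x1 = roi
    patch = frame[y0:y1, x0:x1]
    dark_fraction = (patch < dark_level).mean()
    return bool(dark_fraction > min_dark_fraction)

# Hypothetical 8-bit frames: a bright road scene, and the same scene
# crossed by the wiper blade.
scene = np.full((120, 160), 150, dtype=np.uint8)
with_wiper = scene.copy()
with_wiper[40:80, :] = 20  # dark band where the blade sweeps

roi = (30, 90, 0, 160)  # minimal region the blade always crosses
```

Restricting the test to the ROI is what keeps the per-frame computation cheap enough for real-time use.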


Parallel Processing of Satellite Images using CUDA Library: Focused on NDVI Calculation (CUDA 라이브러리를 이용한 위성영상 병렬처리 : NDVI 연산을 중심으로)

  • LEE, Kang-Hun;JO, Myung-Hee;LEE, Won-Hee
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.19 no.3
    • /
    • pp.29-42
    • /
    • 2016
  • Remote sensing allows the acquisition of information across a large area without contacting objects and has thus been rapidly developed for application in different areas. With the development of remote sensing, satellite image resolution has advanced rapidly, and satellites using remote sensing have been applied in research across many areas of the world. However, while remote sensing research spans various areas, research on data processing remains insufficient; as satellite resources develop further, data processing continues to lag behind. Accordingly, this paper discusses plans to maximize the performance of satellite image processing by utilizing NVIDIA's CUDA (Compute Unified Device Architecture) library, a parallel processing technique. The discussion proceeds as follows. First, standard KOMPSAT (Korea Multi-Purpose Satellite) images of various sizes are subdivided into five types, and the NDVI (Normalized Difference Vegetation Index) is applied to the subdivided images. Next, ArcMap and two implementations, based on the CPU and the GPU respectively, are used to compute the NDVI. The histograms of the resulting images are then compared to verify the implementations and to analyze the different processing speeds of the CPU and GPU. The results indicate that both the CPU and GPU images are equal to the ArcMap images and, after the histogram comparison, that the NDVI code was correctly implemented. In terms of processing speed, the GPU was 5 times faster than the CPU. Accordingly, this research shows that a parallel processing technique using the CUDA library can enhance the processing speed of satellite images, and that remote sensing techniques more advanced than a simple per-pixel computation like NDVI would benefit even more.
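
The per-pixel computation that both the CPU and GPU versions implement is the standard formula NDVI = (NIR − Red) / (NIR + Red). A minimal CPU reference in NumPy, on tiny synthetic bands (the study used full KOMPSAT scenes and a CUDA kernel for the GPU version):

```python
import numpy as np

def ndvi(nir, red):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red), with 0 where
    both bands are zero to avoid division by zero."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# Tiny synthetic NIR/Red bands (hypothetical values).
nir = np.array([[200, 50], [120, 0]], dtype=np.uint16)
red = np.array([[100, 50], [40, 0]], dtype=np.uint16)
v = ndvi(nir, red)
```

Because every pixel is independent, the computation is embarrassingly parallel, which is why a GPU implementation maps onto it so directly.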

Characterizing Geomorphological Properties of Western Pacific Seamounts for Cobalt-rich Ferromanganese Crust Resource Assessment (서태평양 해저산의 망간각 자원평가를 위한 해저지형 특성 분석)

  • Joo, Jongmin;Kim, Jonguk;Ko, Youngtak;Kim, Seung-Sep;Son, Juwon;Pak, Sang Joon;Ham, Dong-Jin;Son, Seung Kyu
    • Economic and Environmental Geology
    • /
    • v.49 no.2
    • /
    • pp.121-134
    • /
    • 2016
  • We characterize the spatial distribution of cobalt-rich ferromanganese crusts covering the summit and slopes of a seamount in the western Pacific, using acoustic backscatter from multibeam echo sounders (MBES) and seafloor video observation. Based on multibeam bathymetric data, we identify that ~70% of the summit area of this flat-topped seamount has slope gradients less than 5°. The histogram of the backscatter intensity data shows a bi-modal distribution, indicating significant variations in seabed hardness. On the one hand, visual inspection of the seafloor using deep-sea camera data shows that the steep slope areas with high backscatter are mainly covered by manganese crusts. On the other hand, the visual analyses for the summit reveal that the summit areas with relatively low backscatter are covered by sediments. The other summit areas, however, exhibit high acoustic reflectivity due to the coexistence of manganese crusts and sediments. Comparison between seafloor video images and acoustic backscatter intensity suggests that the central summit has relatively flat topography and low backscatter intensity resulting from unconsolidated sediments, whereas the rim of the summit and the slopes show high acoustic reflectivity because of manganese crusts and/or bedrock outcrops with little sediment. Therefore, we find a strong correlation between the acoustic backscatter data acquired from the sea-surface multibeam survey and the spatial distribution of sediments and manganese crusts. We propose that analyzing acoustic backscatter can be one of the practical methods for selecting optimal minable areas of ferromanganese crusts on seamounts for future mining.
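
Separating the two modes of a bi-modal backscatter histogram like the one described above can be done with Otsu's method, which picks the threshold maximizing between-class variance. A hypothetical NumPy sketch on synthetic backscatter values (not the survey data):

```python
import numpy as np

def otsu_threshold(values, bins=64):
    """Threshold that best separates a bi-modal distribution by
    maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = centers[0], -1.0
    for k in range(1, bins):
        w0, w1 = p[:k].sum(), p[k:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (p[:k] * centers[:k]).sum() / w0  # mean of low class
        m1 = (p[k:] * centers[k:]).sum() / w1  # mean of high class
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, centers[k]
    return best_t

# Synthetic backscatter in dB: soft sediment near -40, hard crust near -20.
rng = np.random.default_rng(2)
sediment = rng.normal(-40, 2, 500)
crust = rng.normal(-20, 2, 500)
t = otsu_threshold(np.concatenate([sediment, crust]))
```

Pixels above the threshold would then be classified as the hard (crust/outcrop) class and those below as unconsolidated sediment.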

Rough Set Analysis for Stock Market Timing (러프집합분석을 이용한 매매시점 결정)

  • Huh, Jin-Nyung;Kim, Kyoung-Jae;Han, In-Goo
    • Journal of Intelligence and Information Systems
    • /
    • v.16 no.3
    • /
    • pp.77-97
    • /
    • 2010
  • Market timing is an investment strategy used to obtain excess return from the financial market. In general, detecting market timing means determining when to buy and sell to get excess return from trading. In many market timing systems, trading rules have been used as an engine to generate trade signals. On the other hand, some researchers have proposed rough set analysis as a proper tool for market timing because, by using the control function, it does not generate a trade signal when the pattern of the market is uncertain. Numeric data must be discretized for rough set analysis because rough sets only accept categorical data. Discretization searches for proper "cuts" for numeric data that determine intervals; all values that lie within each interval are transformed into the same value. In general, there are four methods for data discretization in rough set analysis: equal frequency scaling, expert's knowledge-based discretization, minimum entropy scaling, and naïve and Boolean reasoning-based discretization. Equal frequency scaling fixes a number of intervals, examines the histogram of each variable, and then determines cuts so that approximately the same number of samples falls into each interval. Expert's knowledge-based discretization determines cuts according to the knowledge of domain experts, obtained through literature review or interviews with experts. Minimum entropy scaling implements an algorithm that recursively partitions the value set of each variable so that a local measure of entropy is optimized. Naïve and Boolean reasoning-based discretization finds categorical values by naïvely scaling the data, then finds the optimized discretization thresholds through Boolean reasoning.
Although rough set analysis is promising for market timing, there is little research on the impact of the various data discretization methods on trading performance using rough set analysis. In this study, we compare stock market timing models using rough set analysis with various data discretization methods. The research data used in this study are the KOSPI 200 from May 1996 to October 1998. The KOSPI 200 is the underlying index of the KOSPI 200 futures, the first derivative instrument in the Korean stock market. It is a market-value-weighted index consisting of 200 stocks selected by criteria on liquidity and status in their corresponding industries, including manufacturing, construction, communication, electricity and gas, distribution and services, and financing. The total number of samples is 660 trading days. In addition, this study uses popular technical indicators as independent variables. The experimental results show that the most profitable method for the training sample is naïve and Boolean reasoning-based discretization, but expert's knowledge-based discretization is the most profitable method for the validation sample. In addition, expert's knowledge-based discretization produced robust performance for both the training and validation samples. We also compared rough set analysis and decision trees, experimenting with C4.5 for comparison purposes. The results show that rough set analysis with expert's knowledge-based discretization produced more profitable rules than C4.5.
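
The equal frequency scaling method described above can be sketched in a few lines: the cuts are simply quantiles of the variable, so roughly the same number of samples falls into each interval. A hypothetical NumPy illustration (not from the paper):

```python
import numpy as np

def equal_frequency_cuts(values, n_intervals):
    """Cut points so that roughly the same number of samples falls
    in each interval (equal frequency scaling)."""
    qs = np.linspace(0, 1, n_intervals + 1)[1:-1]  # interior quantiles
    return np.quantile(values, qs)

def discretize(values, cuts):
    """Map each numeric value to its interval index (categorical code)."""
    return np.searchsorted(cuts, values, side="right")

# Synthetic stand-in for one technical indicator.
rng = np.random.default_rng(3)
x = rng.normal(0, 1, 1000)
cuts = equal_frequency_cuts(x, 4)
codes = discretize(x, cuts)
counts = np.bincount(codes, minlength=4)
```

The resulting integer codes are the categorical attributes a rough set analyzer would consume in place of the raw indicator values.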

Evaluation of the Usefulness of Restricted Respiratory Period at the Time of Radiotherapy for Non-Small Cell Lung Cancer Patient (비소세포성 폐암 환자의 방사선 치료 시 제한 호흡 주기의 유용성 평가)

  • Park, So-Yeon;Ahn, Jong-Ho;Suh, Jung-Min;Kim, Yung-Il;Kim, Jin-Man;Choi, Byung-Ki;Pyo, Hong-Ryul;Song, Ki-Won
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.24 no.2
    • /
    • pp.123-135
    • /
    • 2012
  • Purpose: It is essential to minimize tumor movement due to respiration during respiration-controlled radiotherapy of non-small cell lung cancer patients. Accordingly, this study aims to evaluate the usefulness of a restricted respiratory period by comparing and analyzing treatment plans that apply the free and restricted respiratory periods respectively. Materials and Methods: After training 9 non-small cell lung cancer patients (tumor n=10) from April to December 2011 using the 'signal monitored-breathing (guided-breathing)' method for the 'free respiratory period' measured on the basis of each patient's regular respiratory period and for a 'restricted respiratory period' that was intentionally reduced, a total of 10 CT images for each respiratory phase were acquired by 4D CT for treatment planning using RPM and a 4-dimensional computed tomography simulator. The gross tumor volume (GTV) and internal target volume (ITV) that observer 1 and observer 2 each set were measured and compared on the CT image of each respiratory interval. Moreover, the amplitude of tumor movement was measured from the center of mass (COM) at the 0% phase, which is end-inspiration (EI), and at the 50% phase, which is end-exhalation (EE). In addition, both observers established treatment plans applying the two respiratory periods, and the mean dose to normal lung (MDTNL) was compared and analyzed through the dose-volume histogram (DVH). Moreover, the normal tissue complication probability (NTCP) of the normal lung volume was compared using a dose-volume histogram analysis program (DVH analyzer v.1), and statistical analysis was performed for quantitative evaluation of the measured data.
Results: Analysis of the treatment plans that applied the 'restricted respiratory period' showed a reduction of 38.75% in the 3-dimensional movement of the tumor compared to the 'free respiratory period' for observer 1, and a reduction of 41.10% for observer 2. Measurement and comparison of the volumes showed reduction rates of 14.96 ± 9.44% for observer 1 and 19.86 ± 10.62% for observer 2 in the case of GTV, and 8.91 ± 5.91% for observer 1 and 15.52 ± 9.01% for observer 2 in the case of ITV. Analysis and comparison of MDTNL and NTCP showed reduction rates of 3.98 ± 5.62% for observer 1 and 7.62 ± 10.29% for observer 2 in the case of MDTNL, and 21.70 ± 28.27% for observer 1 and 37.83 ± 49.93% for observer 2 in the case of NTCP. In addition, analysis of the correlation between the two observers' results showed a significant difference between the observers for the 'free respiratory period', but no significantly different reduction rates between the observers for the 'restricted respiratory period'. Conclusion: The usefulness and appropriateness of the 'restricted respiratory period' in respiration-controlled radiotherapy of non-small cell lung cancer patients could be verified, as the treatment plan applying the 'restricted respiratory period' showed relative reductions in the evaluation factors compared to the 'free respiratory period'.
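
The dose-volume histogram (DVH) and mean dose to normal lung (MDTNL) used as evaluation factors above can be illustrated as follows. This is a generic sketch on hypothetical per-voxel doses, not the DVH analyzer program the authors used:

```python
import numpy as np

def cumulative_dvh(dose, bin_width=0.5):
    """Cumulative DVH: fraction of the structure volume receiving
    at least each dose level."""
    levels = np.arange(0.0, dose.max() + bin_width, bin_width)
    volume_fraction = np.array([(dose >= d).mean() for d in levels])
    return levels, volume_fraction

# Hypothetical per-voxel doses (Gy) in a normal-lung structure.
rng = np.random.default_rng(4)
lung_dose = rng.gamma(shape=2.0, scale=4.0, size=5000)

levels, vf = cumulative_dvh(lung_dose)
mdtnl = lung_dose.mean()  # mean dose to normal lung (MDTNL)
```

A cumulative DVH always starts at 100% of the volume at zero dose and decreases monotonically, which is why comparing two plans' DVH curves directly exposes dose differences to an organ.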


INVESTIGATION OF THE EFFECT OF AN ANTIBIOTIC "P" ON POTATOES ("감자에 대한 항생제(抗生劑) 피마리신의 통계적(統計的) 효과(效果) 분석(分析)")

  • Kim, Jong-Hoon
    • Journal of Korean Society for Quality Management
    • /
    • v.5 no.2
    • /
    • pp.59-120
    • /
    • 1977
  • An antibiotic 'P', one of the products of Gist-Brocades N.V., is being tested by its research department as a fungicide on seed potatoes. For this testing they designed experiments with two control groups, one competitor's product, eight formulations of the antibiotic to be tested in different concentrations, and one mercury treatment which cannot be used in practice. The treated potatoes were planted in three different regions, where different conditions prevail. After several months the harvested potatoes were divided into groups according to their diameter, and potato illness was analysed and counted. These data were summarised in percentages and given to us for analysis. We approached and analysed the data by the following methods: a. Computation of the mean and standard deviation of the percentage of good results in each size group and treatment. b. Computation of the experimental errors by subtraction of each treatment mean from the observed data. c. Description of the frequency table, and plotting of a histogram and a normal curve on the same graph to check normality. d. A normal-probability-paper test and a chi-square test to check the goodness of fit to a normal curve. e. Tests for homogeneity of variance in each treatment with Cochran's test and Hartley's test. f. Analysis of variance for testing the means by one-way classification. g. Drawing of graphs with upper and lower confidence limits to show the effect of the different treatments. h. t-test and F-test on the two control means and variances to form one control for Dunnett's test. i. Dunnett's test and calculations for numerical comparison of the different treatments with one control. In region R, where the potatoes were planted, it was very dry this year and rather bad conditions for growing potatoes prevailed during the experimental period. The results of this investigation show that treatments No. 2, 3 and 4 are significantly different from the other treatments and the control groups (untreated, i.e. in the natural state).
Treatment No. 2 is the useless mercury formulation, so only No. 3 and 4, which have high concentrations of antibiotic 'P', gave a good effect on the potatoes. The competitor's product and the middle- and low-concentration formulations are not significantly different from the control groups in any size group. In region W, where the potatoes got the same treatments as in region R, better weather conditions prevailed and enough water was obtainable from the lake. The results in this region showed that treatments No. 2, 3, 4 and 5 are significantly different from the other treatments and the control groups. Again, No. 2 is the mercury treatment in this investigation. Not only the highly concentrated formulation of antibiotic 'P' but also the competitor's product gave good results, but the effect of 'P' was better than that of the competitor's product. In region G, where the potatoes got the same treatments as in regions R and W and the climate conditions were equal to region R, the results showed that most of the treatments are not significantly different from the control groups. Only treatment No. 3 was a little different from the others, but not significantly so. It seems to us that the difference between the results in the three regions was caused by conditions such as the nature of the soil, the degree of moisture and the hours of sunshine, but we are not sure of that. As a conclusion, we can say that antibiotic 'P' has a good effect on potatoes, but in most investigations a rather high concentration of 'P' was required in the formulations.
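
Step f above, the one-way analysis of variance for testing treatment means, can be sketched as follows. This is a generic F-statistic computation on hypothetical potato data, not the authors' dataset:

```python
import numpy as np

def one_way_anova_f(groups):
    """F statistic for a one-way classification: between-treatment
    mean square over within-treatment mean square."""
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical % healthy potatoes: a control and two formulations.
rng = np.random.default_rng(5)
control = rng.normal(60, 5, 12)
treat_3 = rng.normal(75, 5, 12)  # high-concentration 'P' (assumed effect)
treat_7 = rng.normal(61, 5, 12)  # low concentration (assumed no effect)
f_stat = one_way_anova_f([control, treat_3, treat_7])
```

A large F relative to the F(df_between, df_within) critical value indicates that at least one treatment mean differs; Dunnett's test would then compare each treatment against the single control.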


Evaluation of Dose Distributions Recalculated with Per-field Measurement Data under the Condition of Respiratory Motion during IMRT for Liver Cancer (간암 환자의 세기조절방사선치료 시 호흡에 의한 움직임 조건에서 측정된 조사면 별 선량결과를 기반으로 재계산한 체내 선량분포 평가)

  • Song, Ju-Young;Kim, Yong-Hyeob;Jeong, Jae-Uk;Yoon, Mee Sun;Ahn, Sung-Ja;Chung, Woong-Ki;Nam, Taek-Keun
    • Progress in Medical Physics
    • /
    • v.25 no.2
    • /
    • pp.79-88
    • /
    • 2014
  • The dose distributions within the real volumes of tumor targets and critical organs during internal target volume-based intensity-modulated radiation therapy (ITV-IMRT) for liver cancer were recalculated by applying the effects of actual respiratory organ motion, and the dosimetric features were analyzed through comparison with the gating IMRT (Gate-IMRT) plan results. The ITV was created using MIM software, and a moving phantom was used to simulate respiratory motion. The doses were recalculated with a 3-dimensional dose-volume histogram (3DVH) program based on the per-field data measured with a MapCHECK2 2-dimensional diode detector array. Although a sufficient prescription dose covered the PTV during ITV-IMRT delivery, the dose homogeneity in the PTV was inferior to that of the Gate-IMRT plan. We confirmed that there were higher doses to the organs-at-risk (OARs) with ITV-IMRT, as expected when using an enlarged field, but the increased dose to the spinal cord was not significant, and the increased doses to the liver and kidney can be considered minor when reinforced constraints are applied during IMRT plan optimization. Because the Gate-IMRT method also has disadvantages, such as unexpected dosimetric variations when applying the gating system and an increased treatment time, it is better to first analyze the patient's respiratory condition and the importance and fulfillment of the IMRT plan dose constraints in order to select an optimal IMRT method with which to correct for respiratory organ motion.

Quality Assurance of Leaf Speed for Dynamic Multileaf Collimator (MLC) Using Dynalog Files (Dynalog file을 이용한 동적다엽조준기의 Leaf 속도 정도관리 평가)

  • Kim, Joo Seob;Ahn, Woo Sang;Lee, Woo Suk;Park, Sung Ho;Choi, Wonsik;Shin, Seong Soo
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.26 no.2
    • /
    • pp.305-312
    • /
    • 2014
  • Purpose : The purpose of this study is to analyze the mechanical accuracy and leaf speed of the dynamic multileaf collimator (DMLC) and to determine an appropriate period for quality assurance (QA). Materials and Methods : QA of the DMLC equipped with Millennium 120 leaves was performed a total of 92 times from January 2012 to June 2014. The accuracy of leaf position and the isocenter coincidence of the MLC were checked using graph paper and Gafchromic EBT film, respectively. The stability of leaf speed was verified using a test file requiring the leaves to reach maximum speed during gantry rotation. At the end of every leaf speed QA, the dynalog files created by the MLC controller were analyzed using the dynalog file viewer software. These files contain the planned versus actual positions of all leaves and provide the error RMS (root-mean-square) of individual leaf deviations and an error histogram of all leaf deviations. In this study, the data obtained from the leaf speed QA were used to screen for performance degradation of leaf speed and to determine the need for motor replacement. Results : The leaf position accuracy and isocenter coincidence of the MLC were within the tolerance range recommended by the TG-142 report. A total of 56 motors were replaced over the whole QA period. Among the motors replaced as a result of QA, gradually increasing patterns of error RMS values were much more common than suddenly increasing patterns. The average error RMS values of the gradually and suddenly increasing patterns were 0.298 cm and 0.273 cm, respectively. Although these average error RMS values were within the 0.35 cm limit recommended by the vendor, the motors were replaced according to the criterion of no counts with misplacement > 1 cm. On average, motors showing a gradually increasing pattern of error RMS values were replaced after 22 days. The other 28 motors were replaced independently of the leaf speed QA.
Conclusion : This study performed periodic MLC QA to analyze the mechanical accuracy and leaf speed of the dynamic multileaf collimator (DMLC). The leaf position accuracy and isocenter coincidence of the MLC were within the tolerance values recommended by the TG-142 report. Based on the results obtained from the leaf speed QA, we conclude that leaf speed QA for the DMLC should be performed at least bimonthly in order to screen the performance of leaf speed. This periodic QA protocol can help ensure the delivery of accurate IMRT treatment to patients by maintaining the performance of leaf speed.
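
The per-leaf error RMS screening described above can be sketched as follows. This is a hypothetical NumPy illustration of computing RMS deviations from planned versus actual leaf positions and flagging motors, not the actual dynalog file format or the vendor's software; the replacement rule is an assumed stand-in.

```python
import numpy as np

def leaf_error_rms(planned, actual):
    """Per-leaf RMS deviation (cm) between planned and actual positions,
    as summarized in a dynalog error histogram."""
    err = actual - planned
    return np.sqrt((err ** 2).mean(axis=1))

def needs_replacement(rms, history_rms, rms_limit=0.35):
    """Flag a motor whose RMS error exceeds the vendor limit or has
    grown monotonically over recent QA sessions (assumed rule)."""
    growing = len(history_rms) >= 3 and bool(np.all(np.diff(history_rms) > 0))
    return bool(rms > rms_limit) or growing

# Hypothetical positions for 3 leaves over 200 control samples;
# the third leaf's motor tracks poorly.
rng = np.random.default_rng(6)
planned = np.cumsum(rng.normal(0, 0.05, (3, 200)), axis=1)
actual = planned + rng.normal(0, [[0.05], [0.10], [0.40]], (3, 200))
rms = leaf_error_rms(planned, actual)
```

Tracking each leaf's RMS across sessions is what makes it possible to catch the gradual-degradation pattern the study found to be the common failure mode.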